A sophisticated AI agent framework built with TypeScript that combines Large Language Models (LLMs), Retrieval-Augmented Generation (RAG), and Model Context Protocol (MCP) for enhanced AI capabilities.
- AI Agent: Autonomous agent with tool-calling capabilities
- RAG System: Retrieval-Augmented Generation with embedding-based search
- MCP Integration: Model Context Protocol for external tool integration
- Vector Store: In-memory vector database with cosine similarity search
- Streaming Support: Real-time streaming responses from OpenAI
- Multiple Tools: File system operations, web fetching, and more
- TypeScript: Full type safety and modern development experience
```
┌──────────────────┐     ┌──────────────────┐     ┌──────────────────┐
│      Agent       │────▶│    ChatOpenAI    │────▶│    OpenAI API    │
│  (Coordinator)   │     │ (LLM Interface)  │     │                  │
└──────────────────┘     └──────────────────┘     └──────────────────┘
         │
         ▼
┌──────────────────┐     ┌──────────────────┐
│    MCPClient     │────▶│   Tool Servers   │
│ (Tool Interface) │     │  (File, Web...)  │
└──────────────────┘     └──────────────────┘
         │
         ▼
┌──────────────────┐     ┌──────────────────┐
│EmbeddingRetriever│────▶│   Vector Store   │
│   (RAG System)   │     │   (Similarity)   │
└──────────────────┘     └──────────────────┘
```
```
src/
├── Agent.ts              # Main agent coordinator
├── ChatOpenAI.ts         # OpenAI API wrapper with streaming
├── MCPClient.ts          # Model Context Protocol client
├── EmbeddingRetriever.ts # RAG embedding and retrieval
├── VectorStore.ts        # In-memory vector database
├── utils.ts              # Utility functions
└── index.ts              # Example implementation
```
```bash
# Clone the repository
git clone <repository-url>
cd myagent

# Install dependencies
pnpm install

# Set up environment variables
cp .env.example .env
```

Create a .env file with the following variables:
```bash
# OpenAI Configuration
OPENAI_API_KEY=your_openai_api_key
OPENAI_API_BASE_URL=https://api.openai.com/v1

# Embedding Service Configuration
EMBEDDING_BASE_URL=your_embedding_service_url
EMBEDDING_KEY=your_embedding_api_key

# Proxy Configuration (optional)
HTTPS_PROXY=http://127.0.0.1:7890
```

```typescript
import Agent from "./Agent";
import MCPClient from "./MCPClient";
// Create MCP clients for different tools
const fileMCP = new MCPClient("file-server", "npx", ['-y', '@modelcontextprotocol/server-filesystem', './workspace']);
const fetchMCP = new MCPClient("fetch-server", "uvx", ['mcp-server-fetch']);
// Initialize agent
const agent = new Agent('gpt-4o-mini', [fileMCP, fetchMCP]);
await agent.init();
// Use the agent
const response = await agent.invoke("Please read the README.md file and summarize it");
console.log(response);
// Clean up
await agent.close();
```

```typescript
import EmbeddingRetriever from "./EmbeddingRetriever";
// Initialize retriever
const retriever = new EmbeddingRetriever("BAAI/bge-m3");
// Add documents
await retriever.embedDocument("Your document content here");
// Retrieve relevant content
const results = await retriever.retrieve("Your query", 3);
console.log(results);
```

```typescript
import ChatOpenAI from "./ChatOpenAI";
const llm = new ChatOpenAI("gpt-4o-mini", "You are a helpful assistant");
const { content, toolCalls } = await llm.chat("Hello, how are you?");
console.log(content); // Streams in real-time
```

Agent.ts is the main coordinator that orchestrates interactions between LLMs and tools.
Key Features:
- Autonomous tool calling
- Multi-round conversations
- Automatic tool result handling
- Resource management
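At its core the agent runs a loop: send the prompt to the LLM, execute any requested tool calls through the MCP clients, feed the results back, and repeat until the model answers without tools. The sketch below illustrates that loop with simplified stand-in interfaces; `LLM`, `ToolClient`, `appendToolResult`, `getTools`, and `callTool` are illustrative names, not the project's actual APIs.

```typescript
interface ToolCall {
  id: string;
  name: string;
  arguments: string; // JSON-encoded arguments produced by the model
}

interface LLM {
  chat(prompt?: string): Promise<{ content: string; toolCalls: ToolCall[] }>;
  appendToolResult(toolCallId: string, result: string): void; // hypothetical helper
}

interface ToolClient {
  getTools(): { name: string }[];                          // hypothetical helper
  callTool(name: string, args: unknown): Promise<string>;  // hypothetical helper
}

async function invoke(llm: LLM, toolClients: ToolClient[], prompt: string): Promise<string> {
  let { content, toolCalls } = await llm.chat(prompt);

  // Keep going for as long as the model keeps requesting tools.
  while (toolCalls.length > 0) {
    for (const call of toolCalls) {
      // Route each call to whichever MCP client exposes that tool.
      const client = toolClients.find((c) =>
        c.getTools().some((t) => t.name === call.name)
      );
      const result = client
        ? await client.callTool(call.name, JSON.parse(call.arguments))
        : `Tool ${call.name} not found`;
      llm.appendToolResult(call.id, result);
    }
    // Ask the model to continue with the tool results now in its context.
    ({ content, toolCalls } = await llm.chat());
  }
  return content;
}
```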
ChatOpenAI.ts is an OpenAI API wrapper with streaming support and tool integration.
Key Features:
- Real-time streaming responses
- Tool call processing
- Conversation history management
- Proxy support for restricted networks
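For reference, this is roughly how streaming looks with the official `openai` Node SDK (v4); the project's ChatOpenAI class wraps a loop like this and additionally accumulates tool-call deltas. The sketch is illustrative, not a copy of `ChatOpenAI.ts`.

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_API_BASE_URL,
});

async function streamChat(model: string, prompt: string): Promise<string> {
  // stream: true returns an async iterable of chunks instead of one response.
  const stream = await client.chat.completions.create({
    model,
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  let content = "";
  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta?.content ?? "";
    content += delta;
    process.stdout.write(delta); // print tokens as they arrive
  }
  return content;
}
```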
MCPClient.ts is a Model Context Protocol client for external tool integration.
Key Features:
- Stdio transport for tool servers
- Dynamic tool discovery
- Tool execution management
- Connection lifecycle handling
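A rough sketch of what such a client does, assuming the official `@modelcontextprotocol/sdk` package (the import paths and options shown are that SDK's, and may not match `MCPClient.ts` exactly):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function runTool(command: string, args: string[]) {
  // The transport spawns the tool server as a child process and talks to it
  // over stdin/stdout.
  const transport = new StdioClientTransport({ command, args });
  const client = new Client({ name: "myagent", version: "1.0.0" });

  await client.connect(transport);

  // Dynamic tool discovery: ask the server what it can do.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Execute a discovered tool by name with JSON arguments.
  const result = await client.callTool({ name: tools[0].name, arguments: {} });

  await client.close(); // shut down the child process
  return result;
}
```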
EmbeddingRetriever.ts implements the RAG system with embedding-based document retrieval.
Key Features:
- Document embedding
- Similarity search
- Top-k retrieval
- Multiple embedding model support
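The retrieval flow is: embed each document, store the vectors, then embed the query and rank stored documents by similarity. The embedding call itself might look like the sketch below, assuming the service behind EMBEDDING_BASE_URL exposes an OpenAI-compatible /embeddings endpoint; the real request shape may differ.

```typescript
// Assumes an OpenAI-compatible /embeddings endpoint behind EMBEDDING_BASE_URL.
async function embed(text: string, model = "BAAI/bge-m3"): Promise<number[]> {
  const response = await fetch(`${process.env.EMBEDDING_BASE_URL}/embeddings`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.EMBEDDING_KEY}`,
    },
    body: JSON.stringify({ model, input: text }),
  });
  const { data } = await response.json();
  return data[0].embedding; // one dense vector for the input text
}
```

The retriever stores each document's text alongside its vector and delegates the actual ranking to the vector store.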
VectorStore.ts is an in-memory vector database with similarity search.
Key Features:
- Cosine similarity calculation
- Efficient vector storage
- Top-k search results
- Simple API interface
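A minimal version of such a store fits in a few lines; the sketch below is illustrative rather than a copy of `VectorStore.ts`:

```typescript
// Illustrative in-memory store: keeps (vector, document) pairs and returns the
// top-k documents ranked by cosine similarity.
class SimpleVectorStore {
  private items: { vector: number[]; document: string }[] = [];

  add(vector: number[], document: string): void {
    this.items.push({ vector, document });
  }

  search(queryVector: number[], topK = 3): string[] {
    return this.items
      .map((item) => ({
        document: item.document,
        score: cosineSimilarity(queryVector, item.vector),
      }))
      .sort((a, b) => b.score - a.score) // highest similarity first
      .slice(0, topK)
      .map((item) => item.document);
  }
}

// Cosine similarity: dot(a, b) / (|a| * |b|)
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```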
```bash
# Development
pnpm dev      # Run in development mode

# Production
pnpm build    # Compile TypeScript
pnpm start    # Run compiled JavaScript

# Example Usage
pnpm dev      # Runs the example in src/index.ts
```

The project supports various MCP servers for different functionalities:
- File System: @modelcontextprotocol/server-filesystem
- Web Fetching: mcp-server-fetch
- Database: Various database connectors
- Custom Tools: Easy to integrate custom MCP servers (see the sketch below)
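For example, a custom stdio-based MCP server can be plugged in the same way as the bundled ones; the server path below is hypothetical.

```typescript
import Agent from "./Agent";
import MCPClient from "./MCPClient";

// Hypothetical custom server: any executable that speaks MCP over stdio works.
const customMCP = new MCPClient("my-tools", "node", ["./servers/my-tool-server.js"]);

const agent = new Agent("gpt-4o-mini", [customMCP]);
await agent.init();
```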
- Document Analysis: RAG-powered document Q&A system
- Web Research: Autonomous web scraping and analysis
- File Management: AI-powered file operations
- Content Generation: Context-aware content creation
- Data Processing: Automated data analysis workflows
- API keys are loaded from environment variables
- Proxy support for restricted network environments (wiring sketched below)
- Tool execution is sandboxed through the MCP protocol
- No sensitive data is logged
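As an illustration, proxy-aware OpenAI access can be wired up roughly like this with the `openai` (v4) and `https-proxy-agent` packages; this is a sketch, not necessarily how `ChatOpenAI.ts` does it:

```typescript
import OpenAI from "openai";
import { HttpsProxyAgent } from "https-proxy-agent";

// Route OpenAI traffic through HTTPS_PROXY when it is set; otherwise connect directly.
const httpAgent = process.env.HTTPS_PROXY
  ? new HttpsProxyAgent(process.env.HTTPS_PROXY)
  : undefined;

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY, // read from the environment, never hard-coded
  baseURL: process.env.OPENAI_API_BASE_URL,
  httpAgent,
});
```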
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
ISC License - see LICENSE file for details
- OpenAI for the GPT models
- Model Context Protocol for tool integration
- TypeScript for type safety
- Various MCP server implementations