A platform for building, running, and interacting with agentic workflows powered by Large Language Models (LLMs). This project provides a modular backend for orchestrating agent reasoning, planning, and tool use, along with a modern React frontend for interactive user experiences.
- Agentic Reasoning: Agents can plan, decompose, and execute complex tasks using a set of modular tools.
- LLM Integration: Supports local, HuggingFace, and OpenAI LLMs for planning, summarization, coding, and more.
- Pluggable Tools: Summarization, citation generation, open-access journal search, and code generation.
- Memory System: Short-term and long-term memory for contextual, stateful agent behavior.
- Modern Frontend: React + Vite UI for submitting queries, viewing plans, and interacting with agent results.
- API-First: FastAPI backend with REST endpoints for agent queries and results.
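As a sketch of what an API-first interaction might look like, the snippet below submits a query from a Python client. The endpoint path `/agent/query` and the payload fields are illustrative assumptions, not the backend's actual schema:

```python
import json
import urllib.request

def build_query_payload(query: str, role: str) -> bytes:
    """Build a JSON payload for an agent query.
    Field names here are assumptions for illustration."""
    return json.dumps({"query": query, "role": role}).encode("utf-8")

def submit_query(base_url: str, query: str, role: str) -> dict:
    """POST the query to a hypothetical /agent/query endpoint."""
    req = urllib.request.Request(
        base_url + "/agent/query",  # hypothetical route
        data=build_query_payload(query, role),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires a running backend):
# result = submit_query("http://localhost:8000", "Summarize this paper", "Researcher")
```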
```
┌────────────┐      HTTP API      ┌──────────────┐
│  Frontend  │ ◄───────────────►  │   Backend    │
│  (React)   │                    │  (FastAPI)   │
└────────────┘                    │   Agentic    │
                                  │   Workflow   │
                                  └──────────────┘
```
- frontend/: React app (Vite, TypeScript) for user interaction
- backend/: FastAPI app with agent core, planning, memory, and tool modules
```bash
cd backend
pip install -r requirements.txt
# Configure environment variables in a .env file (see below)
uvicorn src.main:app --reload
# App runs at http://localhost:8000
```

```env
# Example for local LLM
LLM_CLIENT_TYPE=local
LLM_API_URL=http://localhost:1234/api/v1/chat
LLM_MODEL=qwen2.5-coder-32b-instruct

# Or for OpenAI
# LLM_CLIENT_TYPE=openai
# OPENAI_API_KEY=sk-...
# OPENAI_MODEL=gpt-3.5-turbo
```
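The backend presumably uses these variables to pick an LLM client at startup. A minimal sketch of such a factory, assuming `LLM_CLIENT_TYPE` is the selector (the client class names are illustrative, not the actual `core/clients/` API):

```python
import os

class LocalClient:
    """Stand-in for a local LLM client (illustrative)."""
    def __init__(self, api_url: str, model: str):
        self.api_url, self.model = api_url, model

class OpenAIClient:
    """Stand-in for an OpenAI client (illustrative)."""
    def __init__(self, api_key: str, model: str):
        self.api_key, self.model = api_key, model

def make_llm_client(env=os.environ):
    """Select an LLM client from environment variables (hypothetical factory)."""
    client_type = env.get("LLM_CLIENT_TYPE", "local")
    if client_type == "local":
        return LocalClient(
            env.get("LLM_API_URL", "http://localhost:1234/api/v1/chat"),
            env.get("LLM_MODEL", "qwen2.5-coder-32b-instruct"),
        )
    if client_type == "openai":
        return OpenAIClient(
            env["OPENAI_API_KEY"],
            env.get("OPENAI_MODEL", "gpt-3.5-turbo"),
        )
    raise ValueError(f"Unknown LLM_CLIENT_TYPE: {client_type}")
```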
```bash
cd frontend
npm install
npm run dev
# App runs at http://localhost:3000
```

- Start the backend and frontend servers.
- Open the frontend in your browser.
- Enter a query and select an agent role (e.g., Researcher, Coder).
- The agent will plan, execute, and display results, including reasoning steps and memory.
- core/agent.py: Main agent class (planning, memory, tool use)
- core/planner.py: Planner interface and LLM-based planner
- core/tools.py: Modular tools (summarization, citation, search, coding, formatting)
- core/memory.py: Short-term and long-term memory system
- core/clients/: LLM client interfaces (local, HuggingFace, OpenAI)
- src/routes/: FastAPI routes for agent query and result endpoints
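The two-tier memory design behind `core/memory.py` can be sketched as a bounded short-term buffer backed by an unbounded long-term store. The interface below is an assumption for illustration, not the module's actual API:

```python
from collections import deque

class AgentMemory:
    """Sketch of a two-tier agent memory (names are illustrative)."""

    def __init__(self, short_term_size: int = 5):
        # Short-term: only the most recent items, for prompt context.
        self.short_term = deque(maxlen=short_term_size)
        # Long-term: everything the agent has seen, for later retrieval.
        self.long_term: list[str] = []

    def remember(self, item: str) -> None:
        # Every item goes to long-term; the deque evicts old
        # short-term entries automatically once full.
        self.short_term.append(item)
        self.long_term.append(item)

    def context(self) -> str:
        # Recent items joined for inclusion in the next LLM prompt.
        return "\n".join(self.short_term)
```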
- src/pages/App.tsx: Main app page
- src/components/AgentInteraction.tsx: Query form and result display
- src/api/agent.ts: API client for backend
