tjkessler/agentic-workflow-system


Agentic Workflow System

A platform for building, running, and interacting with agentic workflows powered by Large Language Models (LLMs). This project provides a modular backend for orchestrating agent reasoning, planning, and tool use, along with a modern React frontend for interactive user experiences.

(Screenshot: persona selection UI)


Features

  • Agentic Reasoning: Agents can plan, decompose, and execute complex tasks using a set of modular tools.
  • LLM Integration: Supports local, HuggingFace, and OpenAI LLMs for planning, summarization, coding, and more.
  • Pluggable Tools: Summarization, citation generation, open-access journal search, and code generation.
  • Memory System: Short-term and long-term memory for contextual, stateful agent behavior.
  • Modern Frontend: React + Vite UI for submitting queries, viewing plans, and interacting with agent results.
  • API-First: FastAPI backend with REST endpoints for agent queries and results.
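The memory feature above can be sketched as a two-tier store: a bounded window of recent interactions plus an unbounded list of persisted facts. This is a hypothetical illustration of the pattern, not the actual core/memory.py implementation.

```python
from collections import deque


class AgentMemory:
    """Illustrative short-/long-term memory store (hypothetical sketch)."""

    def __init__(self, short_term_size: int = 10):
        # Short-term memory: a bounded window of recent interactions.
        self.short_term = deque(maxlen=short_term_size)
        # Long-term memory: an unbounded list of persisted facts.
        self.long_term: list[str] = []

    def remember(self, item: str, persist: bool = False) -> None:
        self.short_term.append(item)
        if persist:
            self.long_term.append(item)

    def context(self) -> str:
        # Combine both stores into a prompt-ready context string.
        return "\n".join(list(self.long_term) + list(self.short_term))


mem = AgentMemory(short_term_size=2)
mem.remember("user prefers Python", persist=True)
mem.remember("step 1 done")
mem.remember("step 2 done")  # evicts "step 1 done" from short-term memory
print(mem.context())
```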

Architecture

┌────────────┐      HTTP API      ┌──────────────┐
│  Frontend  │ ◄────────────────► │   Backend    │
│  (React)   │                    │  (FastAPI)   │
└────────────┘                    │   Agentic    │
                                  │   Workflow   │
                                  └──────────────┘
  • frontend/: React app (Vite, TypeScript) for user interaction
  • backend/: FastAPI app with agent core, planning, memory, and tool modules

Quickstart

1. Backend Setup

cd backend
pip install -r requirements.txt
# Configure environment variables in a .env file (see below)
uvicorn src.main:app --reload
# App runs at http://localhost:8000

Environment Variables (.env)

# Example for local LLM
LLM_CLIENT_TYPE=local
LLM_API_URL=http://localhost:1234/api/v1/chat
LLM_MODEL=qwen2.5-coder-32b-instruct

# Or for OpenAI
# LLM_CLIENT_TYPE=openai
# OPENAI_API_KEY=sk-...
# OPENAI_MODEL=gpt-3.5-turbo
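Reading these variables to select a client can be sketched as a small factory. This is illustrative only: the real factory lives in core/clients/, and the returned dicts stand in for actual client objects.

```python
import os


def make_llm_client(env: dict[str, str]) -> dict:
    """Pick an LLM client config from environment settings (hypothetical sketch)."""
    client_type = env.get("LLM_CLIENT_TYPE", "local")
    if client_type == "local":
        return {
            "kind": "local",
            "url": env.get("LLM_API_URL", "http://localhost:1234/api/v1/chat"),
            "model": env.get("LLM_MODEL", ""),
        }
    if client_type == "openai":
        return {
            "kind": "openai",
            "api_key": env["OPENAI_API_KEY"],
            "model": env.get("OPENAI_MODEL", "gpt-3.5-turbo"),
        }
    raise ValueError(f"Unknown LLM_CLIENT_TYPE: {client_type}")


# In the app this would read the real environment, e.g.:
# client = make_llm_client(dict(os.environ))
```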

2. Frontend Setup

cd frontend
npm install
npm run dev
# App runs at http://localhost:3000

Usage

  1. Start the backend and frontend servers.
  2. Open the frontend in your browser.
  3. Enter a query and select an agent role (e.g., Researcher, Coder).
  4. The agent will plan, execute, and display results, including reasoning steps and memory.
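The same flow can be driven without the frontend by posting a query to the backend directly. The endpoint path and JSON field names below are assumptions; check src/routes/ for the actual contract.

```python
import json
import urllib.request


def build_agent_query(query: str, role: str = "Researcher",
                      base_url: str = "http://localhost:8000") -> urllib.request.Request:
    """Build an HTTP request for the (hypothetical) agent query endpoint."""
    payload = json.dumps({"query": query, "role": role}).encode()
    return urllib.request.Request(
        f"{base_url}/agent/query",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# With the backend running, send it like this:
# with urllib.request.urlopen(build_agent_query("Summarize recent LLM papers")) as resp:
#     print(json.load(resp))
```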

Backend Overview

  • core/agent.py: Main agent class (planning, memory, tool use)
  • core/planner.py: Planner interface and LLM-based planner
  • core/tools.py: Modular tools (summarization, citation, search, coding, formatting)
  • core/memory.py: Short-term and long-term memory system
  • core/clients/: LLM client interfaces (local, HuggingFace, OpenAI)
  • src/routes/: FastAPI routes for agent query and result endpoints
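The pluggable-tool idea in core/tools.py can be sketched as a registry that maps tool names to callables. The registry, decorator, and toy summarizer below are hypothetical stand-ins, not the project's actual interfaces.

```python
from typing import Callable

# Hypothetical tool registry illustrating the "pluggable tools" idea.
TOOLS: dict[str, Callable[[str], str]] = {}


def register_tool(name: str):
    """Decorator that registers a function under a tool name."""
    def decorator(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[name] = fn
        return fn
    return decorator


@register_tool("summarize")
def summarize(text: str) -> str:
    # Stand-in for an LLM-backed summarizer: keep only the first sentence.
    return text.split(". ")[0] + "."


def run_tool(name: str, payload: str) -> str:
    if name not in TOOLS:
        raise KeyError(f"No such tool: {name}")
    return TOOLS[name](payload)


print(run_tool("summarize", "Agents plan tasks. They also use tools."))
# → Agents plan tasks.
```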

Frontend Overview

  • src/pages/App.tsx: Main app page
  • src/components/AgentInteraction.tsx: Query form and result display
  • src/api/agent.ts: API client for backend
