Iris Inference API 🪻


A FastAPI service for predicting iris flower species using scikit-learn's RandomForestClassifier.

The model is trained on the classic Iris flower dataset. Given flower measurements (sepal and petal dimensions) as input, the API returns the most likely species classification with a probability score.

Available as a Docker image.
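Under the hood this is the standard scikit-learn classification workflow. A minimal sketch of the kind of model the service wraps (hyperparameters and variable names here are illustrative, not the repo's actual code):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Load the classic Iris dataset (150 samples, 4 features, 3 species)
iris = load_iris()
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(iris.data, iris.target)

# Predict species and class probabilities for one measurement:
# (sepal_length, sepal_width, petal_length, petal_width)
sample = [[5.1, 3.5, 1.4, 0.2]]
pred = clf.predict(sample)[0]
proba = clf.predict_proba(sample)[0]
species = iris.target_names[pred]   # species name, e.g. "setosa"
confidence = proba[pred]            # probability of the predicted class
```

The API's probability score corresponds to `predict_proba`, which for a random forest is the fraction of trees voting for each class.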

🪻 API Reference

| Endpoint | Method | Description |
| --- | --- | --- |
| `/` | GET | API info and available endpoints |
| `/model/info` | GET | Classifier details |
| `/predict` | POST | Single iris flower prediction |
| `/predict/batch` | POST | Batch iris flower predictions |

Try all API endpoints with the included script:

./try_api.sh

Example API Requests

Root Endpoint

curl http://localhost:8000/

Model Info Endpoint

curl http://localhost:8000/model/info

Single Prediction

curl -X POST http://localhost:8000/predict \
  -H "Content-Type: application/json" \
  -d '{
    "sepal_length": 5.1,
    "sepal_width": 3.5,
    "petal_length": 1.4,
    "petal_width": 0.2
  }'
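The request body maps one-to-one onto the four measurements. A hypothetical Pydantic schema matching the fields in the example above (the repo's actual model name and constraints may differ; Pydantic v2 API assumed):

```python
from pydantic import BaseModel, Field

class IrisFeatures(BaseModel):
    """Request body for /predict -- field names taken from the curl example."""
    sepal_length: float = Field(gt=0, description="Sepal length in cm")
    sepal_width: float = Field(gt=0, description="Sepal width in cm")
    petal_length: float = Field(gt=0, description="Petal length in cm")
    petal_width: float = Field(gt=0, description="Petal width in cm")

features = IrisFeatures(sepal_length=5.1, sepal_width=3.5,
                        petal_length=1.4, petal_width=0.2)
payload = features.model_dump()  # dict ready to send as the JSON body
```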

Batch Prediction

curl -X POST http://localhost:8000/predict/batch \
  -H "Content-Type: application/json" \
  -d '{
    "samples": [
      {
        "sepal_length": 5.1,
        "sepal_width": 3.5,
        "petal_length": 1.4,
        "petal_width": 0.2
      },
      {
        "sepal_length": 6.2,
        "sepal_width": 2.9,
        "petal_length": 4.3,
        "petal_width": 1.3
      }
    ]
  }'
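Programmatically, the batch body is just the single-prediction objects collected under a `samples` key (structure taken directly from the example above):

```python
import json

samples = [
    {"sepal_length": 5.1, "sepal_width": 3.5, "petal_length": 1.4, "petal_width": 0.2},
    {"sepal_length": 6.2, "sepal_width": 2.9, "petal_length": 4.3, "petal_width": 1.3},
]
body = json.dumps({"samples": samples})  # equivalent of the string passed to curl -d
```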

🪻 Development Setup

# Clone repo
git clone https://github.com/junjie-w/ml-iris-inference-fastapi.git
cd ml-iris-inference-fastapi

# Install dependencies
pip install -r requirements.txt

# Train model (creates iris_model.pkl)
python model_training.py

# Run the API
python run.py
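The training step presumably looks something like the following sketch. The output filename `iris_model.pkl` comes from the comment above; the model choice matches the README, but everything else (pickling format, exact parameters) is an assumption about `model_training.py`, not its actual contents:

```python
import pickle
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train on the full Iris dataset and persist the fitted model
iris = load_iris()
clf = RandomForestClassifier(random_state=42)
clf.fit(iris.data, iris.target)

with open("iris_model.pkl", "wb") as f:
    pickle.dump(clf, f)

# The API process can then load the pickle once at startup
with open("iris_model.pkl", "rb") as f:
    model = pickle.load(f)
```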

🪻 Docker Usage

Pre-built Image from Docker Hub

# Pull image from Docker Hub
docker pull junjiewu0/iris-inference-api

# For ARM-based machines (Apple Silicon, etc.)
docker pull --platform linux/amd64 junjiewu0/iris-inference-api

# Run container
docker run -p 8000:8000 junjiewu0/iris-inference-api

# For ARM-based machines (Apple Silicon, etc.)
docker run --platform linux/amd64 -p 8000:8000 junjiewu0/iris-inference-api

Build Image Locally

# Build image
docker build -t iris-inference-api .

# Run container
docker run -p 8000:8000 iris-inference-api
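The local build assumes a `Dockerfile` at the repo root. A minimal hypothetical example of what such a Dockerfile might look like for this service (base image, file layout, and the `run.py` entrypoint are assumptions inferred from the commands above, not the repo's actual Dockerfile):

```dockerfile
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .
# Bake the trained model into the image
RUN python model_training.py

EXPOSE 8000
CMD ["python", "run.py"]
```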

🪻 Run Tests

# Run the test suite
pytest

# For test coverage
pytest --cov=app tests/

🪻 Makefile Commands

make run                     # Start the API server
make dev                     # Start the server with auto-reload
make test                    # Run tests
make coverage                # Run tests with coverage report
make train                   # Train the model (creates iris_model.pkl)
make docker-build            # Build the Docker image
make docker-run              # Run container from local image
make docker-pull-remote      # Pull pre-built image from Docker Hub
make docker-run-remote       # Run container from pre-built Docker Hub image
