netrome/openai-impl


OpenAI Impl - a macro addressing the problem of long compilation times in Rust

...by making them LONGER!

A Rust procedural macro that generates method implementations using OpenAI's API.

Overview

This macro allows you to define trait implementations where the method bodies are generated automatically by OpenAI's language models. Simply provide the method signatures and optional hints, and the macro will call the OpenAI API to generate appropriate implementations.

Installation

Add this to your Cargo.toml:

[dependencies]
openai-macro = { path = "crates/openai-macro" }

Usage

use openai_macro::openai_impl;

trait Calculator {
    fn add(&self, a: i32, b: i32) -> i32;
    fn multiply(&self, a: i32, b: i32) -> i32;
}

struct MyCalculator;

#[openai_impl(
    model = "gpt-4o-mini",
    prompt = "Implement basic arithmetic operations"
)]
impl Calculator for MyCalculator {
    fn add(&self, a: i32, b: i32) -> i32 {
        // Implementation generated by OpenAI
    }

    fn multiply(&self, a: i32, b: i32) -> i32 {
        // Implementation generated by OpenAI
    }
}
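After expansion, the generated methods behave like any hand-written implementation. The sketch below shows the expected call site, using a hand-written stand-in for the generated bodies (actual expansion requires an API key at compile time, and what the model produces is not guaranteed):

```rust
trait Calculator {
    fn add(&self, a: i32, b: i32) -> i32;
    fn multiply(&self, a: i32, b: i32) -> i32;
}

struct MyCalculator;

// Hand-written stand-in for what a successful expansion of the
// #[openai_impl] block above might produce.
impl Calculator for MyCalculator {
    fn add(&self, a: i32, b: i32) -> i32 {
        a + b
    }
    fn multiply(&self, a: i32, b: i32) -> i32 {
        a * b
    }
}

fn main() {
    let calc = MyCalculator;
    println!("{}", calc.add(2, 3)); // 5
    println!("{}", calc.multiply(4, 5)); // 20
}
```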

Configuration

Environment Variables

  • OPENAI_API_KEY (required): Your OpenAI API key
  • OPENAI_BASE_URL (optional): Custom API endpoint (defaults to OpenAI's API)
  • OPENAI_OFFLINE=1: Use only cached implementations, no network requests
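As a rough sketch of how the macro could resolve this configuration at compile time: only the three environment variable names above come from this README; the helper function, its signature, and the exact default URL are illustrative assumptions.

```rust
use std::env;

// Hypothetical config resolution: base URL falls back to OpenAI's public
// endpoint (assumed here to be https://api.openai.com/v1), and offline
// mode is enabled only by the literal value "1".
fn resolve(base_url: Option<String>, offline: Option<String>) -> (String, bool) {
    let url = base_url.unwrap_or_else(|| "https://api.openai.com/v1".to_string());
    let offline = offline.as_deref() == Some("1");
    (url, offline)
}

fn main() {
    // The API key is only required when a live generation is needed.
    let api_key = env::var("OPENAI_API_KEY").ok();
    let (url, offline) = resolve(
        env::var("OPENAI_BASE_URL").ok(),
        env::var("OPENAI_OFFLINE").ok(),
    );
    println!("key set: {}, url: {}, offline: {}", api_key.is_some(), url, offline);
}
```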

Macro Parameters

  • model (optional): OpenAI model to use (default: "gpt-4o-mini")
  • prompt (optional): Additional context or instructions for the AI

Features

  • no-network: Compile-time flag to disable network requests and use only cached implementations

Enabling the feature in Cargo.toml has the same effect as OPENAI_OFFLINE=1, but baked in at compile time.

Caching

Generated implementations are automatically cached based on a hash of:

  • Trait signatures
  • Self type
  • Model name
  • Prompt hint

The cache is stored in your build output directory and persists across builds.
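The four inputs listed above can be combined into a single key roughly as follows. This is an illustrative sketch, not the macro's actual hashing code, and std's DefaultHasher output is not guaranteed stable across Rust releases:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hypothetical cache key: hash the trait signatures, self type, model
// name, and prompt hint together, so changing any one of them produces
// a different key and forces a fresh generation.
fn cache_key(trait_sigs: &[&str], self_ty: &str, model: &str, prompt: &str) -> u64 {
    let mut h = DefaultHasher::new();
    trait_sigs.hash(&mut h);
    self_ty.hash(&mut h);
    model.hash(&mut h);
    prompt.hash(&mut h);
    h.finish()
}

fn main() {
    let sigs = ["fn add(&self, a: i32, b: i32) -> i32"];
    let a = cache_key(&sigs, "MyCalculator", "gpt-4o-mini", "arithmetic");
    let b = cache_key(&sigs, "MyCalculator", "gpt-4o-mini", "different hint");
    // Editing only the prompt hint already yields a new key, so a stale
    // cached implementation is never reused for changed inputs.
    assert_ne!(a, b);
    println!("keys differ: {a} vs {b}");
}
```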

Workspace Structure

This is a Cargo workspace containing:

  • crates/openai-macro/: The main procedural macro crate
  • examples/abuse/: Example usage demonstrating the macro

Development

Building

cargo build

Testing with Examples

# Set your API key
export OPENAI_API_KEY=your_api_key_here

# Build the example
cargo build -p abuse

# Or test in offline mode (uses cache only)
OPENAI_OFFLINE=1 cargo build -p abuse

Running in CI

For CI environments, use the no-network feature or OPENAI_OFFLINE=1 to avoid network calls:

cargo build --features no-network

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT License

at your option.
