Getting Started
Get MAC up and running in minutes.
Install
Python 3.10+ is required. Optional: `pip install datasets` for the benchmark examples.
API Keys
- Cloud models (OpenAI, OpenRouter): set the provider's API key in your environment.
- Local models (vLLM): no key needed, just set `base_url="http://localhost:8000/v1"`.
You can also put keys in a `.env` file (see `.env.example`).
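For the cloud providers, the usual pattern is to export keys as environment variables before running your script. A sketch with placeholder values (the exact variable names your provider setup expects may differ):

```shell
# Placeholder keys for illustration; substitute your real credentials.
export OPENAI_API_KEY="sk-..."
export OPENROUTER_API_KEY="sk-or-..."
```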
Usage
MAC works in four steps:
- Define your task - provide a `task_description` and `rule_type`
- Provide examples - a few `(input, output)` pairs for training and holdout
- Define a metric - any function that scores a prediction against gold
- Run MAC - `compile()` learns the constitution; the result is a callable model
```python
from mac import Example, MAC, CompiledMAC

train = [
    Example(input="Find all integer bases b>9 where 17_b divides 97_b.", output="70"),
    Example(input="How many ordered pairs (x,y) in [-100,100] satisfy 12x²-xy-6y²=0?", output="117"),
]
holdout = [
    Example(input="Sum of positive integers n where n+2 divides 3(n+3)(n²+9)?", output="49"),
]

def metric(pred, gold):
    try:
        return 1.0 if float(str(pred).strip()) == float(str(gold).strip()) else 0.0
    except ValueError:
        return 0.0

optimized = MAC(
    model="gpt-4o",
    task_description="Solve AIME math problems. Return only the integer answer.",
    rule_type="math reasoning rules",
    num_epochs=1,
    batch_size=2,
).compile(trainset=train, holdout=holdout, metric=metric)

answer = optimized("Find all integer bases b>9 where 17_b divides 97_b.")
optimized.overview()          # Rich panel: baseline → final, rules tree
optimized.save("rules.json")  # Save; load later with CompiledMAC.load()
```
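Because the metric is plain Python, you can sanity-check it on its own before running `compile()`. A minimal check of the numeric-equality metric from the snippet above (nothing from the library itself is needed):

```python
def metric(pred, gold):
    """Score 1.0 when pred and gold are numerically equal, else 0.0."""
    try:
        return 1.0 if float(str(pred).strip()) == float(str(gold).strip()) else 0.0
    except ValueError:
        return 0.0

print(metric("  70 ", "70"))         # -> 1.0 (whitespace and string forms both pass)
print(metric("71", "70"))            # -> 0.0 (wrong answer)
print(metric("not a number", "70"))  # -> 0.0 (unparseable prediction scores zero)
```

The `ValueError` guard matters in practice: a model that returns prose instead of a number should score 0.0, not crash the training loop.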
Next Steps
- Usage Modes - Custom prompt vs auto-adapt
- Model Configuration - Three-tier setup, provider support
- API Reference - Full constructor and method reference
- Examples - Runnable benchmark scripts (GSM8K, HotpotQA, HoVer)
- See MAC in Action - Watch real training runs