I'm excited to open-source Fury: a lightweight Python package that makes it easy to build with LLMs, even if it's your first time.

https://github.com/huwprosser/fury

With Fury, a streaming chat loop works out of the box:

import asyncio
from fury import Agent

async def main():
    # Initialize the agent. Works with any OpenAI-compatible endpoint.
    agent = Agent(
        model="your-model-name",  # e.g., "gpt-4o" or a local model
        system_prompt="You are a helpful assistant.",
        base_url="http://127.0.0.1:8080/v1",  # or https://openrouter.ai/api/v1, https://api.openai.com/v1
        api_key="your-api-key"
    )

    history = []

    # Simple chat loop
    while True:
        user_input = input("> ")
        history.append({"role": "user", "content": user_input})

        print()
        # agent.chat streams (chunk, reasoning, tool_call) tuples
        response = ""
        async for chunk, reasoning, tool_call in agent.chat(history):
            if chunk:
                response += chunk
                print(chunk, end="", flush=True)
        print("\n")

        # Append the assistant's reply so the model keeps conversation context
        history.append({"role": "assistant", "content": response})

if __name__ == "__main__":
    asyncio.run(main())


A pip package is coming soon! For now, install directly from GitHub.
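Installing from GitHub can be done with pip's git support (this assumes the repo has a standard package layout; check the README for the exact command):

    # Install Fury directly from the GitHub repo (assumes a standard
    # pyproject.toml/setup.py layout at the repo root)
    pip install git+https://github.com/huwprosser/fury.git
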

Build neat stuff 🚀