Prerequisites
- Python 3.8+
- cycls package installed
- Docker installed (for local testing)
- OpenAI API key
Note: This guide uses OpenAI, but LangChain and Cycls support many providers including Anthropic, Google Gemini, Mistral, Cohere, and more. Simply swap the pip dependency and model name.
Step 1: Import Cycls
Create a new file called agent.py and import the cycls package:
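Assuming the module name follows the package name, the file starts with a single import (a sketch; the rest of the file is built up in the following steps):

```python
# agent.py
import cycls
```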
Step 2: Configure Environment
Create a file named .env in the same directory to store your API key safely:
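For OpenAI, one line is enough. OPENAI_API_KEY is the environment variable the LangChain OpenAI integration reads by default; replace the placeholder with your real key:

```
OPENAI_API_KEY=sk-your-key-here
```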
Step 3: Initialize the Agent
Initialize the agent with LangChain dependencies and include your .env file.
- pip: Installs required Python packages.
- copy: Copies your .env file to the agent’s environment so it can access your keys.
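A sketch of the initialization, assuming a cycls.Agent constructor that accepts pip and copy keyword arguments as described above (verify the exact names against the Cycls documentation):

```python
import cycls

agent = cycls.Agent(
    # Assumed dependency list: LangChain core, the OpenAI
    # integration, and python-dotenv for reading the .env file
    pip=["langchain", "langchain-openai", "python-dotenv"],
    # Copy the .env file into the agent's environment
    copy=[".env"],
)
```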
Step 4: Define the Agent Logic
Use the @agent decorator to register your async handler. We’ll load the environment variables, initialize the chat model, and stream the response.
Important: Import dependencies inside the function body to ensure they work in the remote environment.
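A sketch of the handler. The handler name, its signature, and how Cycls consumes the yielded chunks are assumptions; load_dotenv, ChatOpenAI, and astream are real python-dotenv and LangChain APIs:

```python
@agent
async def handler(messages):
    # Imports live inside the function body so they resolve
    # in the remote environment, not just on your machine.
    from dotenv import load_dotenv
    from langchain_openai import ChatOpenAI

    load_dotenv()  # picks up OPENAI_API_KEY from the copied .env file
    llm = ChatOpenAI(model="gpt-4o")

    # Stream the model's response chunk by chunk
    async for chunk in llm.astream(messages):
        yield chunk.content
```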
Step 5: Deploy the Agent
Add the deployment call at the end of your file:

| Parameter | Description |
|---|---|
| prod=False | Development mode (local testing) |
| prod=True | Production deployment |
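The call might look like this (the method name run is an assumption based on this guide's description; use whatever deployment entry point the Cycls docs specify):

```python
agent.run(prod=False)  # switch to prod=True for production deployment
```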
Full Code
Here is the complete agent.py file:
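Putting the steps together (same caveats as above: the cycls.Agent constructor, the pip/copy parameters, and the run method are sketched from this guide's descriptions, not verified against the Cycls reference):

```python
# agent.py
import cycls

agent = cycls.Agent(
    pip=["langchain", "langchain-openai", "python-dotenv"],
    copy=[".env"],
)

@agent
async def handler(messages):
    # Imports inside the function body so they resolve remotely
    from dotenv import load_dotenv
    from langchain_openai import ChatOpenAI

    load_dotenv()
    llm = ChatOpenAI(model="gpt-4o")
    async for chunk in llm.astream(messages):
        yield chunk.content

agent.run(prod=False)  # prod=True for production
```

With prod=False, running this file starts the agent in development mode for local testing.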
Step 6: Run the Agent
Execute your agent by running python agent.py. With prod=False, the agent runs locally in development mode.

Using Other LLM Providers
Swap the dependency and model name to use a different provider:

| Provider | Pip Package | Model Example |
|---|---|---|
| OpenAI | langchain-openai | gpt-4o |
| Anthropic | langchain-anthropic | claude-sonnet-4-5-20250929 |
| Google | langchain-google-genai | gemini-3.0-pro |
| Mistral | langchain-mistralai | mistral-large-latest |
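As a concrete example, switching to Anthropic changes only the dependency list and the chat-model import; the rest of the handler stays the same. ChatAnthropic reads ANTHROPIC_API_KEY from the environment, so update your .env file accordingly:

```python
# In the Agent constructor, swap the dependency:
#   pip=["langchain", "langchain-anthropic", "python-dotenv"]
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-sonnet-4-5-20250929")
```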