Steps for running cli from docker
This commit is contained in:
parent 0c266d450e
commit 5064e8dfbb
@@ -59,11 +59,23 @@ docker run --rm \
```
Adjust environment variables as needed for your local setup.

### Run the TradingAgents CLI

To run the CLI interface:

```bash
# Add other necessary environment variables for main.py as needed
docker run --rm \
  -e LLM_PROVIDER="ollama" \
  -e LLM_BACKEND_URL="http://localhost:11434/v1" \
  tradingagents python -m cli.main
```

Adjust environment variables as needed for your local setup.

### Using Docker Compose

For a more streamlined local development experience, you can use Docker Compose. The `docker-compose.yml` file in the project root is configured to use the existing `Dockerfile`.

**Build and Run Tests:**

The default command in `docker-compose.yml` is set to run the test suite.

```bash
docker-compose up --build
@@ -78,12 +90,20 @@ docker-compose run --rm app python test_ollama_connection.py
```

**Run the Main Application:**

To run the `main.py` script, you can override the default command:

```bash
docker-compose run --rm app python main.py
```

Or, you can modify the `command` in `docker-compose.yml` if you primarily want `docker-compose up` to run the main application.
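
If you prefer the second route, the override is a one-line change. The fragment below is an illustrative sketch only (the service name `app` and the mounts come from the snippets in this document; the rest of the real `docker-compose.yml` is not reproduced here):

```yaml
# Sketch: make `docker-compose up` run main.py by default.
services:
  app:
    build: .
    command: python main.py   # overrides the Dockerfile's CMD
    volumes:
      - .:/app
    env_file:
      - .env
```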

**Run the TradingAgents CLI Application:**

To run the `cli/main.py` script, you can override the default command:

```bash
docker-compose run --rm app python -m cli.main
```
**Environment Variables:**
The necessary environment variables (like `LLM_PROVIDER`, `LLM_BACKEND_URL`, model names, etc.) are pre-configured in the `docker-compose.yml` for the `app` service. Ollama is started by the entrypoint script within the same container, so `LLM_BACKEND_URL` is set to `http://localhost:11434/v1`.
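
A minimal sketch of how an application can read these variables, falling back to the defaults described above (the variable names come from this document; the helper itself is illustrative, not TradingAgents' actual configuration code):

```python
import os

# Fallbacks mirror the docker-compose.yml defaults described above.
DEFAULTS = {
    "LLM_PROVIDER": "ollama",
    "LLM_BACKEND_URL": "http://localhost:11434/v1",
}

def llm_settings(environ=os.environ):
    """Return LLM settings, preferring the environment over the defaults."""
    return {key: environ.get(key, default) for key, default in DEFAULTS.items()}

# With no overrides set, the compose defaults apply:
print(llm_settings({}))
# {'LLM_PROVIDER': 'ollama', 'LLM_BACKEND_URL': 'http://localhost:11434/v1'}
```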
@@ -151,6 +151,7 @@ def select_shallow_thinking_agent(provider) -> str:
        ],
        "ollama": [
            ("llama3.2 local", "llama3.2"),
            ("qwen3 local", "qwen3:0.6b"),
        ]
    }
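
The table above maps each provider to `(display label, model id)` pairs. A sketch of how such a table might be queried (the function name and lookup logic here are illustrative, not the actual `cli/utils.py` code; the "ollama" entries are taken from the diff above):

```python
# Hypothetical provider -> [(display label, model id), ...] table,
# mirroring the "ollama" entry shown in the diff above.
SHALLOW_AGENT_OPTIONS = {
    "ollama": [
        ("llama3.2 local", "llama3.2"),
        ("qwen3 local", "qwen3:0.6b"),
    ],
}

def model_choices(provider):
    """Return the model ids offered for a provider (empty list if unknown)."""
    return [model for _label, model in SHALLOW_AGENT_OPTIONS.get(provider.lower(), [])]

print(model_choices("ollama"))  # ['llama3.2', 'qwen3:0.6b']
```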
@@ -6,6 +6,7 @@ services:
    volumes:
      - .:/app # Mount current directory to /app in container for live code changes
      - ./.ollama:/app/.ollama # Cache Ollama models
      - ./data:/app/data # Mount data directory for data files
    #environment:
    #  - LLM_PROVIDER=ollama
    #  - LLM_BACKEND_URL=http://localhost:11434/v1
@@ -20,8 +21,7 @@ services:
    env_file:
      - .env # Load environment variables from the .env file
    #command: python test_ollama_connection.py # Uncomment to run a specific test script
    # If you want `docker-compose up` to run tests and then exit, this command is fine.
    # If you want `docker-compose up` to run main.py, change the command or remove it to use the Dockerfile's CMD.
    #command: python -m cli.main # Uncomment to run the CLI interface
    # For more flexibility, users can override the command when using `docker-compose run`.
    ports:
      - "11434:11434" # Expose port 11434 for Ollama
@@ -2,7 +2,10 @@ import os

 DEFAULT_CONFIG = {
     "project_dir": os.path.abspath(os.path.join(os.path.dirname(__file__), ".")),
-    "data_dir": "/Users/yluo/Documents/Code/ScAI/FR1-data",
+    "data_dir": os.path.join(
+        os.path.abspath(os.path.join(os.path.dirname(__file__), ".")),
+        "data",
+    ),
     "data_cache_dir": os.path.join(
         os.path.abspath(os.path.join(os.path.dirname(__file__), ".")),
         "dataflows/data_cache",