Switch LLM calls to the new-api approach
@@ -2,7 +2,7 @@
 This project exposes a FastAPI-based microservice that provides:
 
-- A unified chat completions gateway supporting multiple LLM providers (OpenAI, Anthropic, OpenRouter, Gemini, Qwen, DeepSeek, etc.)
+- A unified chat completions gateway that now forwards requests to the internal `new-api` service (default `http://localhost:3000`) while preserving the same client-facing schema.
 - An asynchronous data import analysis pipeline that orchestrates LLM calls to produce structured metadata and processing recommendations
 
 The following instructions cover environment setup, dependency installation, and running the backend service.
@@ -56,6 +56,7 @@ Copy `.env.example` to `.env` (if provided) or edit `.env` to supply API keys an
 - `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `OPENROUTER_API_KEY`, etc.
 - `HTTP_CLIENT_TIMEOUT`, `IMPORT_CHAT_TIMEOUT_SECONDS`
 - `LOG_LEVEL`, `LOG_FORMAT` for logging
+- `NEW_API_BASE_URL` (defaults to `http://localhost:3000`) and optional `NEW_API_AUTH_TOKEN` if the new-api component enforces authentication.
 
 ## Run the Backend Service
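A minimal sketch of how these environment variables might be read, assuming the defaults stated in the diff; the timeout defaults and the helper function are hypothetical, not taken from the project:

```python
import os

# Variable names and the base-URL default come from the README diff.
NEW_API_BASE_URL = os.environ.get("NEW_API_BASE_URL", "http://localhost:3000")
NEW_API_AUTH_TOKEN = os.environ.get("NEW_API_AUTH_TOKEN")  # optional

# Fallback values here are assumptions; the service's real defaults may differ.
HTTP_CLIENT_TIMEOUT = float(os.environ.get("HTTP_CLIENT_TIMEOUT", "30"))
IMPORT_CHAT_TIMEOUT_SECONDS = float(os.environ.get("IMPORT_CHAT_TIMEOUT_SECONDS", "120"))


def auth_headers() -> dict:
    """Build the Authorization header only when a token is configured,
    since new-api may or may not enforce authentication."""
    if NEW_API_AUTH_TOKEN:
        return {"Authorization": f"Bearer {NEW_API_AUTH_TOKEN}"}
    return {}
```

With no variables set, the gateway targets `http://localhost:3000` and sends no `Authorization` header, matching the defaults described above.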
@@ -84,4 +85,4 @@ Or use a process manager such as `pm2`, `supervisor`, or systemd for production
 
 - Run the data import analysis example: `python test/data_import_analysis_example.py`
 - Test the OpenRouter demo: `python test/openrouter_chat_example.py`
 - Send a DeepSeek chat request script: `python scripts/deepseek_request.py`
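A request like the one these scripts send would follow the OpenAI-style schema the gateway preserves. This is only an illustration of the payload shape; the model name and helper function are assumptions, not contents of `scripts/deepseek_request.py`:

```python
import json


def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat completion payload; the gateway
    forwards this shape to new-api unchanged."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }


# "deepseek-chat" is a placeholder model name, not confirmed by the repo.
payload = build_chat_request("deepseek-chat", "Hello")
print(json.dumps(payload, ensure_ascii=False))
```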