quick fix to use openai instead of ollama, in vetor_storage.py
@@ -105,12 +105,19 @@ The project is organized into 8 development phases as outlined in `PLANNING.md`:
The project uses environment variables for configuration:

```env
# Embedding configuration
OLLAMA_EMBEDDING_MODEL=MODEL # Name of the Ollama model for embeddings
+OPENAI_EMBEDDING_MODEL=MODEL # Name of the OpenAI model for embeddings (default: text-embedding-ada-002)
+OPENAI_EMBEDDING_BASE_URL=URL # OpenAI-compatible API URL for embeddings
+OPENAI_EMBEDDING_API_KEY=KEY # API key for OpenAI-compatible embedding service
+EMBEDDING_STRATEGY=ollama # Strategy to use for embeddings: "ollama" (default) or "openai"

# Chat model configuration
OLLAMA_CHAT_MODEL=MODEL # Name of the Ollama model for chat
-OPENAI_CHAT_URL=URL # OpenAI-compatible API URL
-OPENAI_CHAT_KEY=KEY # Authorization token for OpenAI-compatible API
-OPENAI_CHAT_MODEL=MODEL # Name of the OpenAI-compatible model to use
-CHAT_MODEL_STRATEGY=ollama # Strategy to use: "ollama" (default) or "openai"
+OPENAI_CHAT_URL=URL # OpenAI-compatible API URL for chat
+OPENAI_CHAT_KEY=KEY # Authorization token for OpenAI-compatible API for chat
+OPENAI_CHAT_MODEL=MODEL # Name of the OpenAI-compatible model to use for chat
+CHAT_MODEL_STRATEGY=ollama # Strategy to use for chat: "ollama" (default) or "openai"
```
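The hunk above documents only the new variables; the strategy switch itself lives in vetor_storage.py, which this diff does not show. Below is a minimal sketch of how such a switch could look, assuming the variables are already loaded into the environment; the function name, the use of the openai and requests packages, and the default Ollama URL are illustrative assumptions, not the project's actual code.

```python
import os
from typing import List

import requests
from openai import OpenAI


def embed_texts(texts: List[str]) -> List[List[float]]:
    """Return one embedding vector per text, choosing the backend
    from EMBEDDING_STRATEGY ("ollama" by default, or "openai")."""
    strategy = os.getenv("EMBEDDING_STRATEGY", "ollama").lower()

    if strategy == "openai":
        # OpenAI-compatible endpoint: URL, key, and model come from the .env values above
        client = OpenAI(
            base_url=os.getenv("OPENAI_EMBEDDING_BASE_URL"),
            api_key=os.getenv("OPENAI_EMBEDDING_API_KEY"),
        )
        model = os.getenv("OPENAI_EMBEDDING_MODEL", "text-embedding-ada-002")
        response = client.embeddings.create(model=model, input=texts)
        return [item.embedding for item in response.data]

    # Default path: a local Ollama server (default URL assumed here)
    model = os.getenv("OLLAMA_EMBEDDING_MODEL")
    vectors = []
    for text in texts:
        reply = requests.post(
            "http://localhost:11434/api/embeddings",
            json={"model": model, "prompt": text},
            timeout=60,
        )
        reply.raise_for_status()
        vectors.append(reply.json()["embedding"])
    return vectors
```

The chat-side variables (OPENAI_CHAT_URL, OPENAI_CHAT_KEY, OPENAI_CHAT_MODEL, CHAT_MODEL_STRATEGY) lend themselves to the same pattern, with CHAT_MODEL_STRATEGY selecting the chat backend.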

## Building and Running
@@ -234,4 +241,6 @@ The project is in early development phase. The virtual environment is set up and
### Troubleshooting Notes
- If encountering "No module named 'unstructured_inference'" error, install unstructured-inference
- If seeing OCR-related errors, ensure tesseract is installed at the system level and unstructured-pytesseract is available
- For language detection issues, verify that appropriate spaCy models are downloaded
+- If getting Ollama connection errors when using OpenAI strategy, ensure EMBEDDING_STRATEGY is set correctly in .env
+- When deploying without Ollama, set both CHAT_MODEL_STRATEGY and EMBEDDING_STRATEGY to "openai" in your .env file
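The last two notes come down to keeping the *_STRATEGY variables consistent with what is actually running. A small, hypothetical sanity check such as the one below (not part of the project; it assumes python-dotenv is used to load .env and that Ollama listens on its default port 11434) can surface the mismatch before any documents are processed.

```python
import os
import socket

from dotenv import load_dotenv  # assumes python-dotenv is installed


def check_model_strategies() -> None:
    """Print the effective strategy settings and warn if an Ollama-backed
    strategy is selected but nothing answers on the default Ollama port."""
    load_dotenv()  # pull EMBEDDING_STRATEGY / CHAT_MODEL_STRATEGY from .env
    strategies = {
        "EMBEDDING_STRATEGY": os.getenv("EMBEDDING_STRATEGY", "ollama").lower(),
        "CHAT_MODEL_STRATEGY": os.getenv("CHAT_MODEL_STRATEGY", "ollama").lower(),
    }
    for name, value in strategies.items():
        print(f"{name} = {value}")

    if "ollama" in strategies.values():
        # Default Ollama port assumed; adjust if your server runs elsewhere.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(2)
            if sock.connect_ex(("localhost", 11434)) != 0:
                print(
                    "Warning: an 'ollama' strategy is selected but no Ollama server "
                    "is reachable on localhost:11434. Start Ollama, or set both "
                    'strategies to "openai" in .env.'
                )


if __name__ == "__main__":
    check_model_strategies()
```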