LLM Providers #52
Pull Request Overview
This PR adds support for multiple LLM providers (OpenAI, Mistral, DeepSeek) alongside the existing Gemini and Ollama providers. The changes refactor model validation logic by moving it from the main Config class to the ModelsConfig class for better encapsulation.

Reviewed Changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 2 comments.
@@ -54,6 +58,8 @@ class AppModel(BaseModel):
output_schema=output_schema

The warning message is generic and doesn't identify which provider is affected. Consider including the provider name or class name in the message for easier debugging, e.g., f"No {key} set in environment variables for {clazz.__name__}."

@@ -76,0 +117,4 @@
model.model = clazz
def __validate_ollama_models(self) -> None:
    """

The docstring mentions 'like Gemini' as an example, but this method now validates multiple providers (OpenAI, Mistral, DeepSeek, Gemini). Update the description to be provider-agnostic, e.g., 'Validate models for online API-based providers.'
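The reviewer's first suggestion can be illustrated with a minimal sketch. Note that the class names, env-var keys, and the `validate_api_providers` helper below are illustrative assumptions, not the PR's actual code; the point is simply that each missing-key warning should name the affected provider class.

```python
# Hypothetical provider classes standing in for the PR's OpenAI/Mistral/
# DeepSeek model wrappers (names are assumptions for illustration).
class OpenAIModel: ...
class MistralModel: ...

# Map each provider class to the environment variable holding its API key.
PROVIDER_KEYS = {
    OpenAIModel: "OPENAI_API_KEY",
    MistralModel: "MISTRAL_API_KEY",
}

def validate_api_providers(env: dict) -> list:
    """Return one warning per provider whose API key is not set in `env`.

    Per the review comment, each message includes both the missing key and
    the provider class name, so the affected provider is easy to identify.
    """
    messages = []
    for clazz, key in PROVIDER_KEYS.items():
        if not env.get(key):
            messages.append(
                f"No {key} set in environment variables for {clazz.__name__}."
            )
    return messages
```

For example, calling `validate_api_providers({"OPENAI_API_KEY": "sk-test"})` would warn only about `MISTRAL_API_KEY` being missing for `MistralModel`, rather than emitting a generic message that leaves the reader guessing which provider failed validation.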