Bug description
I am trying to use MetaGPT via a third-party URL wrapper. It works for me with OpenAI and Claude:
llm: api_type: "openai" # or azure / ollama / groq etc. model: "gpt-4-turbo" # or gpt-3.5-turbo base_url: "http://localhost:8989/openai" api_key: "xxxx"
But when I configure Ollama, I have a problem:
llm: api_type: "ollama" # or azure / ollama / groq etc. model: "llama2" # or gpt-3.5-turbo base_url: "http://localhost:8989/ollama"
It fails with a 404 Not Found, because the request path needs to end in /api/chat, but the client only appends /chat:
```python
@property
def api_suffix(self) -> str:
    return "/chat"
```
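For illustration, here is a minimal sketch of how that suffix appears to produce the wrong path (the variable names are mine, not MetaGPT's internals):

```python
# Sketch of the suspected URL construction behind the 404 (illustrative,
# not MetaGPT's actual code).
base_url = "http://localhost:8989/ollama"
api_suffix = "/chat"  # value returned by the api_suffix property above

url = base_url + api_suffix
print(url)  # http://localhost:8989/ollama/chat -> 404 Not Found
# Ollama's native chat endpoint is /api/chat, so the wrapper would need
# to receive http://localhost:8989/ollama/api/chat instead.
```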
I also tried to set up a proxy:
llm: api_type: "ollama" # or azure / ollama / groq etc. model: "llama2" # or gpt-3.5-turbo base_url: "http://localhost:11434/api" # or forward url / other llm url proxy: "http://localhost:8989"
But it doesn't even seem to be picked up. How can I configure Ollama via this wrapper URL?
Environment information

- macOS (Apple M2)
- LLM type: ollama
Fix geekan#1721: Add support for Ollama with third-party URL wrappers (commit a317881)
llm: api_type: "ollama" model: "llama3.2" # or gpt-3.5-turbo base_url: "http://localhost:11434/api" api_key: "any string will be ok"
Solved by: #1710