
🚀 Auto-Update Model List from OpenRouter API for Latest Choices! 🧠 #4875

Closed
champ2050 opened this issue Nov 10, 2024 · 5 comments
Labels
enhancement New feature or request

Comments

champ2050 commented Nov 10, 2024

Feature Request:

The model dropdown in OpenHands is missing several models available on OpenRouter, such as Gemini 1.5 Exp, Gemini 1.5 Flash, and others. Additionally, some outdated models still appear in the list, like Deepseek Coder V2 (which is deprecated by OpenRouter), while the latest model, Deepseek Coder V2.5, is not included. To ensure access to all available models, please add functionality to dynamically retrieve model options from the OpenRouter Models API.

Proposed Solution:

  • API Integration: Fetch model options from OpenRouter's API to auto-populate the dropdown.
  • Automatic Updates: Update the model list whenever OpenRouter adds new models, eliminating manual updates.
  • Fallback Option: In case of API issues, revert to a default model list.
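The three steps above (API integration, automatic updates, fallback) could be sketched roughly as follows. This is a minimal illustration, not OpenHands code; the `DEFAULT_MODELS` values are hypothetical placeholders, while the endpoint URL and the `{"data": [{"id": ...}]}` response shape come from OpenRouter's public models API.

```python
import json
import urllib.request

# Hypothetical fallback list, used when the OpenRouter API is unreachable.
DEFAULT_MODELS = [
    "openrouter/anthropic/claude-3.5-sonnet",
    "openrouter/google/gemini-flash-1.5",
]

def fetch_openrouter_models(url="https://openrouter.ai/api/v1/models", timeout=5):
    """Fetch model IDs from OpenRouter's models endpoint; fall back to a static list."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            payload = json.load(resp)
        # OpenRouter returns {"data": [{"id": "<provider>/<model>", ...}, ...]}
        return [m["id"] for m in payload.get("data", [])] or DEFAULT_MODELS
    except Exception:
        # Network or parsing failure: revert to the default model list.
        return DEFAULT_MODELS
```

Calling this at dropdown render time (or on a periodic refresh) would keep the list current without manual updates.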

Benefits:

  • Up-to-date Model Access: Users get immediate access to new models.
  • Reduced Maintenance: No need for frequent updates to the model list.

This would make OpenHands more adaptable and useful for model testing.

@champ2050 champ2050 added the enhancement New feature or request label Nov 10, 2024
mamoodi (Collaborator) commented Nov 10, 2024

Regardless of the models available in the dropdown, you can enable 'Advanced Options' and then specify your model by typing it into the box, so it doesn't have to be in the dropdown list.
That said, having an up-to-date model list would be more convenient.

champ2050 (Author) commented

Used Advanced Options, but it seems to have an issue with LiteLLM:

Custom Model: google/gemini-flash-1.5-exp
Base Url: https://openrouter.ai/api/v1
API: Entered.
This showed the following error:

litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=google/gemini-flash-1.5-exp Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers


mamoodi (Collaborator) commented Nov 10, 2024

https://docs.all-hands.dev/modules/usage/llms/openrouter

Try this for model: openrouter/google/gemini-flash-1.5-exp

See if that works. You likely don't need to set the base url.
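The fix works because LiteLLM routes requests by a provider prefix on the model string: `openrouter/google/gemini-flash-1.5-exp` tells LiteLLM to send the full OpenRouter model ID to the OpenRouter provider, whereas the bare `google/gemini-flash-1.5-exp` makes it look for a provider it can't resolve. A rough illustrative sketch of that prefix split (not LiteLLM's actual implementation):

```python
def split_provider(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model_id).

    Illustration only: the first path segment names the provider,
    and the remainder is passed through as the provider's model ID.
    """
    provider, _, model_id = model.partition("/")
    return provider, model_id
```

So `split_provider("openrouter/google/gemini-flash-1.5-exp")` yields the `openrouter` provider with `google/gemini-flash-1.5-exp` as the model ID, which is why the prefixed form resolves and the bare form does not.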

champ2050 (Author) commented

Perfect, worked like a charm, thanks 👍.

Used Advanced Options:

Custom Model: openrouter/google/gemini-flash-1.5-exp
Base Url: https://openrouter.ai/api/v1
API: Entered.


champ2050 (Author) commented

Used Advanced Options and this workaround works, so I'm good.

Custom Model: openrouter/google/gemini-flash-1.5-exp
Base Url: https://openrouter.ai/api/v1
API: Entered.
