TinyChat is a lightweight desktop client for modern language models, designed to be easy to understand. It supports the OpenAI, Anthropic, Meta, Mistral, Google, and Cohere APIs.
I would like the OpenAI API base URL to be configurable in the settings.
The default value should be https://api.openai.com/v1, but it should also allow custom values to be specified.
I would also like to see a similar setting for the Google Gemini API base, with a default value of https://generativelanguage.googleapis.com or https://generativelanguage.googleapis.com/v1beta.
This should not be a fixed option, as there may be a need to switch APIs dynamically.
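As a rough sketch of what this setting could look like (all names here are hypothetical, not TinyChat's actual code): store one base URL per provider, and fall back to the defaults above when the user has not set a custom value.

```python
# Hypothetical settings sketch: per-provider API base URLs with defaults.
# Function and key names are illustrative assumptions, not TinyChat's code.

DEFAULT_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta",
}

def get_base_url(provider: str, user_settings: dict) -> str:
    """Return the user-configured base URL, or the provider's default."""
    custom = user_settings.get(f"{provider}_base_url")
    return custom or DEFAULT_BASE_URLS[provider]

# A custom value (e.g. a local proxy) overrides the default:
settings = {"openai_base_url": "http://localhost:8080/v1"}
print(get_base_url("openai", settings))  # http://localhost:8080/v1
print(get_base_url("gemini", settings))  # falls back to the Gemini default
```

Because the lookup happens per request, switching APIs dynamically is just a matter of changing the stored value.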
Please note:
Different APIs typically do not use the same API keys.
Sometimes, certain APIs do not require API keys at all; more importantly, passing a redundant value as an API key could trigger an error.
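One way to handle this (a minimal sketch, assuming a bearer-token style of authentication; the function name is hypothetical) is to attach the key header only when a key is actually configured, so keyless endpoints never see a redundant value:

```python
# Hypothetical sketch: build request headers, attaching an API key only
# when one is configured, since some endpoints reject redundant keys.
from typing import Optional

def build_headers(api_key: Optional[str]) -> dict:
    headers = {"Content-Type": "application/json"}
    if api_key:  # omit the header entirely for keyless endpoints
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

print(build_headers(None))       # no Authorization header at all
print(build_headers("sk-test"))  # Authorization: Bearer sk-test
```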
Adding support for the HF Inference API will unlock access to numerous models. Let me see if I can find the time to play with your awesome little project and add support myself.
Currently, there are two distinct OpenAI options in the providers list: "gpt-3.5 turbo" and "gpt-4 turbo". It would be more efficient to consolidate these into a single item, with the capability to switch between different models. Naturally, the ability to assign custom model names is necessary.
The same improvement should also be made for Claude, Llama, and Mistral.
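The consolidation described above could be sketched like this (a hypothetical data shape, not TinyChat's actual structure): one entry per provider carrying a switchable model list, plus an optional user-entered model name that overrides the selection.

```python
# Hypothetical sketch: one provider entry with a switchable model list and
# an optional custom model name, instead of one list entry per model.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Provider:
    name: str
    models: List[str]               # built-in choices shown in the UI
    custom_model: Optional[str] = None  # user-entered override, if any

    def active_model(self, selected: str) -> str:
        """The custom name, when set, wins over the dropdown selection."""
        return self.custom_model or selected

openai = Provider("OpenAI", ["gpt-3.5-turbo", "gpt-4-turbo"])
print(openai.active_model("gpt-4-turbo"))  # gpt-4-turbo

openai.custom_model = "my-finetuned-model"  # illustrative name
print(openai.active_model("gpt-4-turbo"))   # my-finetuned-model
```

The same `Provider` shape would cover Claude, Llama, and Mistral by swapping in their model lists.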