This is a straightforward, zero-dependency CLI tool to interact with any LLM service.
It is available in several flavors:
- Python version. Compatible with CPython or PyPy, v3.10 or higher.
- JavaScript version. Compatible with Node.js (>= v18) or Bun (>= v1.0).
- Clojure version. Compatible with Babashka (>= 1.3).
- Go version. Compatible with Go, v1.19 or higher.
Ask LLM is compatible with either a cloud-based (managed) LLM service (e.g. OpenAI GPT models, Groq, OpenRouter, etc.) or a locally hosted LLM server (e.g. llama.cpp, LocalAI, Ollama, etc.). Please continue reading for detailed instructions.
Interact with the LLM by running:
./ask-llm.py       # for Python users
./ask-llm.js       # for Node.js users
./ask-llm.clj      # for Clojure users
go run ask-llm.go  # for Go users
or pipe the question directly to get an immediate answer:
echo "Why is the sky blue?" | ./ask-llm.py
or request the LLM to perform a certain task:
echo "Translate into German: thank you" | ./ask-llm.py
Supported local LLM servers include llama.cpp, Jan, Ollama, and LocalAI.
To use llama.cpp locally with its inference server, load a quantized model such as Phi-3 Mini, Llama-3 8B, or OpenHermes 2.5, then set the environment variable LLM_API_BASE_URL accordingly:
/path/to/llama.cpp/server -m Phi-3-mini-4k-instruct-q4.gguf
export LLM_API_BASE_URL=http://127.0.0.1:8080/v1
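With the server running and the variable set, the usual invocation works unchanged, for example:
echo "Why is the sky blue?" | ./ask-llm.py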
To use Jan with its local API server, refer to its documentation, load a model such as Phi-3 Mini, Llama-3 8B, or OpenHermes 2.5, and set the environment variable LLM_API_BASE_URL:
export LLM_API_BASE_URL=http://127.0.0.1:1337/v1
export LLM_CHAT_MODEL='llama3-8b-instruct'
To use Ollama locally, load a model and configure the environment variable LLM_API_BASE_URL:
ollama pull phi3
export LLM_API_BASE_URL=http://127.0.0.1:11434/v1
export LLM_CHAT_MODEL='phi3'
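Any other model already pulled into Ollama should work the same way; for example, assuming the llama3 model is preferred instead of phi3:
ollama pull llama3
export LLM_CHAT_MODEL='llama3'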
For LocalAI, start its container and set the environment variable LLM_API_BASE_URL (the container below exposes port 8080, so the base URL points there):
docker run -ti -p 8080:8080 localai/localai tinyllama-chat
export LLM_API_BASE_URL=http://localhost:8080/v1
To use an OpenAI GPT model, set the environment variable OPENAI_API_KEY to your API key:
export OPENAI_API_KEY="sk-yourownapikey"
To utilize other LLM services, populate the relevant environment variables as demonstrated in the following examples:
Anyscale:
export LLM_API_BASE_URL=https://api.endpoints.anyscale.com/v1
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="meta-llama/Llama-3-8b-chat-hf"

Deep Infra:
export LLM_API_BASE_URL=https://api.deepinfra.com/v1/openai
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="mistralai/Mistral-7B-Instruct-v0.1"

Fireworks AI:
export LLM_API_BASE_URL=https://api.fireworks.ai/inference/v1
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="accounts/fireworks/models/llama-v3-8b-instruct"

Groq:
export LLM_API_BASE_URL=https://api.groq.com/openai/v1
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="gemma-7b-it"

Lepton AI:
export LLM_API_BASE_URL=https://mixtral-8x7b.lepton.run/api/v1/
export LLM_API_KEY="yourownapikey"

Novita AI:
export LLM_API_BASE_URL=https://api.novita.ai/v3/openai
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="meta-llama/llama-3-8b-instruct"

OctoAI:
export LLM_API_BASE_URL=https://text.octoai.run/v1/
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="hermes-2-pro-mistral-7b"

OpenRouter:
export LLM_API_BASE_URL=https://openrouter.ai/api/v1
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="meta-llama/llama-3-8b-instruct:free"

Together AI:
export LLM_API_BASE_URL=https://api.together.xyz/v1
export LLM_API_KEY="yourownapikey"
export LLM_CHAT_MODEL="meta-llama/Llama-3-8b-chat-hf"
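Whichever service is configured, a quick sanity check is to call the OpenAI-compatible chat completions endpoint directly with curl before running the tool; the request body below is only an illustrative sketch:
curl -s "$LLM_API_BASE_URL/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{"model": "'"$LLM_CHAT_MODEL"'", "messages": [{"role": "user", "content": "Hello"}]}'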