TokenCost

Client-side token counting + price estimation for LLM apps and AI agents. Tokencost helps calculate the USD cost of using major Large Language Model (LLM) APIs by estimating the cost of prompts and completions.

Building AI agents? Check out AgentOps

Features

  • LLM price tracking: Major LLM providers frequently add new models and update pricing. This repo tracks the latest price changes.
  • Token counting: Accurately count prompt tokens before sending OpenAI requests.
  • Easy integration: Get the cost of a prompt or completion with a single function.

Example usage:

from tokencost import calculate_prompt_cost, calculate_completion_cost

model = "gpt-3.5-turbo"
prompt = [{ "role": "user", "content": "Hello world"}]
completion = "How may I assist you today?"

prompt_cost = calculate_prompt_cost(prompt, model)
completion_cost = calculate_completion_cost(completion, model)

print(f"{prompt_cost} + {completion_cost} = {prompt_cost + completion_cost}")
# 135 + 140 = 275 ($0.0000275)
# Priced in TPUs (token price units), which is 1/10,000,000th of a USD.
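
The TPU values above convert to dollars by dividing by 10,000,000. A minimal sanity check of the example's arithmetic (plain Python, no tokencost call involved):

total_tpu = 135 + 140            # prompt cost + completion cost, in TPUs
print(total_tpu / 10_000_000)    # 2.75e-05, i.e. $0.0000275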

Installation

Recommended: install from PyPI:

pip install tokencost

Usage

Cost estimates

Calculating the cost of prompts and completions from OpenAI requests

from openai import OpenAI
from tokencost import calculate_prompt_cost, calculate_completion_cost

client = OpenAI()
model = "gpt-3.5-turbo"
prompt = [{ "role": "user", "content": "Say this is a test"}]

chat_completion = client.chat.completions.create(
    messages=prompt, model=model
)

completion = chat_completion.choices[0].message.content
# "This is a test."

prompt_cost = calculate_prompt_cost(prompt, model)
completion_cost = calculate_completion_cost(completion, model)
print(f"{prompt_cost} + {completion_cost} = {prompt_cost + completion_cost}")
# 180 + 100 = 280 ($0.0000280)

from tokencost import USD_PER_TPU
print(f"Cost USD: ${(prompt_cost + completion_cost)/USD_PER_TPU}")
# $2.8e-05

Calculating cost using string prompts instead of messages:

prompt_string = "Hello world" 
response = "How may I assist you today?"
model = "gpt-3.5-turbo"

prompt_cost = calculate_prompt_cost(prompt_string, model)
print(f"Cost: ${prompt_cost/USD_PER_TPU}")
# Cost: $2e-07

Counting tokens

from tokencost import count_message_tokens, count_string_tokens

message_prompt = [{ "role": "user", "content": "Hello world"}]
# Counting tokens in prompts formatted as message lists
print(count_message_tokens(message_prompt, model="gpt-3.5-turbo"))
# 9

# Alternatively, counting tokens in string prompts
print(count_string_tokens(prompt="Hello world", model="gpt-3.5-turbo"))
# 2
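
The message-based count (9) is larger than the string count (2) because chat-formatted prompts include per-message framing tokens on top of the content itself. A small sketch comparing the two on a multi-message conversation (the exact overhead depends on the model's chat format, so no fixed totals are assumed here):

from tokencost import count_message_tokens, count_string_tokens

conversation = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello world"},
]

# Counts the fully chat-formatted prompt, including per-message overhead
print(count_message_tokens(conversation, model="gpt-3.5-turbo"))

# Counts only the raw message contents, so the total comes out smaller
print(sum(count_string_tokens(prompt=m["content"], model="gpt-3.5-turbo")
          for m in conversation))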

Cost table

Units denominated in TPUs (Token Price Units), where 1 TPU = 1/10,000,000 USD. Prompt and completion costs are per token.

| Model Name             | Prompt Cost (TPU/token) | Completion Cost (TPU/token) | Max Prompt Tokens |
| ---------------------- | ----------------------- | --------------------------- | ----------------- |
| gpt-3.5-turbo          | 15                      | 20                          | 4097              |
| gpt-3.5-turbo-0301     | 15                      | 20                          | 4097              |
| gpt-3.5-turbo-0613     | 15                      | 20                          | 4097              |
| gpt-3.5-turbo-16k      | 30                      | 40                          | 16385             |
| gpt-3.5-turbo-16k-0613 | 30                      | 40                          | 16385             |
| gpt-3.5-turbo-1106     | 10                      | 20                          | 16385             |
| gpt-3.5-turbo-instruct | 15                      | 20                          | 4096              |
| gpt-4                  | 300                     | 600                         | 8192              |
| gpt-4-0314             | 300                     | 600                         | 8192              |
| gpt-4-0613             | 300                     | 600                         | 8192              |
| gpt-4-32k              | 600                     | 1200                        | 32768             |
| gpt-4-32k-0314         | 600                     | 1200                        | 32768             |
| gpt-4-32k-0613         | 600                     | 1200                        | 32768             |
| gpt-4-1106-preview     | 100                     | 300                         | 128000            |
| gpt-4-vision-preview   | 100                     | 300                         | 128000            |
| text-embedding-ada-002 | 1                       | N/A                         | 8192              |
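
As a cross-check, the gpt-3.5-turbo row reproduces the example costs shown earlier; the token counts below are the ones implied by that example (135 / 15 prompt tokens, 140 / 20 completion tokens), and the rest is plain arithmetic:

prompt_tokens = 9        # "Hello world" counted as a chat message
completion_tokens = 7    # "How may I assist you today?"
print(prompt_tokens * 15)        # 135 TPU prompt cost
print(completion_tokens * 20)    # 140 TPU completion cost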

Running locally

Installation via GitHub:

git clone git@github.com:AgentOps-AI/tokencost.git
cd tokencost
pip install -e .
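
A quick smoke test to confirm the editable install works, using the string-prompt API shown earlier (the exact TPU value printed depends on the current price data):

python -c "from tokencost import calculate_prompt_cost; print(calculate_prompt_cost('Hello world', 'gpt-3.5-turbo'))"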

Running tests

  1. Install pytest if you don't have it already:
pip install pytest
  2. Run the tests/ folder while in the parent directory:
pytest tests

This repo also supports tox; simply run python -m tox.

Contributing

Contributions to TokenCost are welcome! Feel free to create an issue for any bug reports, complaints, or feature suggestions.

License

TokenCost is released under the MIT License.
