langfuse / langfuse-python

🪢 Langfuse Python SDK - Instrument your LLM app with decorators or low-level SDK and get detailed tracing/observability. Works with any LLM or framework

Home Page: https://langfuse.com/docs/sdk/python

License: MIT License

Language: Python 100.00%
Topics: decorators, langfuse, pydantic, pydantic-v2, python

langfuse-python's Introduction

GitHub Banner

Langfuse Python SDK

MIT License CI test status PyPI Version GitHub Repo stars Discord YC W23

Installation

Important

The SDK was rewritten in v2 and released on December 17, 2023. Refer to the v2 migration guide for instructions on updating your code.

pip install langfuse

Docs

Interfaces

  • @observe() decorator (docs)
  • Low-level tracing SDK (docs)
  • Wrapper of Langfuse public API
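
As a minimal sketch of the two tracing interfaces (assuming the v2 package layout with langfuse.decorators and credentials supplied via the LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, and LANGFUSE_HOST environment variables; model and function names are illustrative):

from langfuse import Langfuse
from langfuse.decorators import observe

# Decorator interface: wraps the function call in a trace automatically.
@observe()
def answer_question(question: str) -> str:
    return f"Answer to: {question}"

answer_question("What is Langfuse?")

# Low-level interface: create traces, spans, and generations explicitly.
langfuse = Langfuse()  # reads the LANGFUSE_* environment variables
trace = langfuse.trace(name="qa-request")
generation = trace.generation(name="llm-call", model="gpt-3.5-turbo")
generation.end(output="Answer to: What is Langfuse?")
langfuse.flush()  # ensure queued events are sent before the process exits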

Integrations

langfuse-python's People

Contributors

bell-steven, brandonkzw, christho23, davidlms, dependabot[bot], dev-khant, hassiebp, hubert-springbok, hugomichard, kobrinartem, marcklingen, maxdeichmann, noble-varghese, richardkruemmel, rubms, samyxdev, singhcoder, yigitbey


langfuse-python's Issues

Missing `backoff` dependency

backoff is missing when installing langfuse in a fresh environment.

sh-3.2$ python3 -m venv env
sh-3.2$ source env/bin/activate
(env) sh-3.2$ python3 -V
Python 3.11.4
(env) sh-3.2$ python3 -m pip install langfuse
Collecting langfuse
  Using cached langfuse-1.0.18-py3-none-any.whl (45 kB)
Collecting ... (skipping a few lines here)
Installing collected packages: pytz, urllib3, typing-extensions, tenacity, sniffio, six, PyYAML, packaging, numpy, mypy-extensions, multidict, idna, h11, frozenlist, charset-normalizer, certifi, attrs, async-timeout, yarl, typing-inspect, SQLAlchemy, requests, python-dateutil, pydantic, numexpr, marshmallow, anyio, aiosignal, langsmith, httpcore, dataclasses-json, aiohttp, langchain, httpx, langfuse
Successfully installed PyYAML-6.0.1 SQLAlchemy-2.0.20 aiohttp-3.8.5 aiosignal-1.3.1 anyio-4.0.0 async-timeout-4.0.3 attrs-23.1.0 certifi-2023.7.22 charset-normalizer-3.2.0 dataclasses-json-0.5.14 frozenlist-1.4.0 h11-0.14.0 httpcore-0.17.3 httpx-0.24.1 idna-3.4 langchain-0.0.286 langfuse-1.0.18 langsmith-0.0.35 marshmallow-3.20.1 multidict-6.0.4 mypy-extensions-1.0.0 numexpr-2.8.6 numpy-1.25.2 packaging-23.1 pydantic-1.10.12 python-dateutil-2.8.2 pytz-2023.3.post1 requests-2.31.0 six-1.16.0 sniffio-1.3.0 tenacity-8.2.3 typing-extensions-4.7.1 typing-inspect-0.9.0 urllib3-2.0.4 yarl-1.9.2
(env) sh-3.2$ python3 -c "from langfuse.model import CreateTrace"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/__init__.py", line 1, in <module>
    from .client import Langfuse
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/client.py", line 28, in <module>
    from langfuse.task_manager import TaskManager
  File "/Users/alex/env/lib/python3.11/site-packages/langfuse/task_manager.py", line 8, in <module>
    import backoff
ModuleNotFoundError: No module named 'backoff'
(env) sh-3.2$ python3 -m pip install backoff
Collecting backoff
  Using cached backoff-2.2.1-py3-none-any.whl (15 kB)
Installing collected packages: backoff
Successfully installed backoff-2.2.1
(env) sh-3.2$ python3 -c "from langfuse.model import CreateTrace"
(env) sh-3.2$

Python SDK not compatible with Pydantic 2.0

poetry add langfuse           
The currently activated Python version 3.10.10 is not supported by the project (^3.11).
Trying to find and use a compatible version. 
Using python3 (3.11.0)
Using version ^1.1.2 for langfuse

Updating dependencies
Resolving dependencies... (0.7s)

Because no versions of pydantic-settings match >2.0.2,<2.0.3 || >2.0.3,<3.0.0
 and pydantic-settings (2.0.2) depends on pydantic (>=2.0.1), pydantic-settings (>=2.0.2,<2.0.3 || >2.0.3,<3.0.0) requires pydantic (>=2.0.1).
And because pydantic-settings (2.0.3) depends on pydantic (>=2.0.1), pydantic-settings (>=2.0.2,<3.0.0) requires pydantic (>=2.0.1).
Because no versions of langfuse match >1.1.2,<2.0.0
 and langfuse (1.1.2) depends on pydantic (>=1.10.7,<2.0), langfuse (>=1.1.2,<2.0.0) requires pydantic (>=1.10.7,<2.0).
Thus, langfuse (>=1.1.2,<2.0.0) is incompatible with pydantic-settings (>=2.0.2,<3.0.0).
So, because my-llm-app depends on both pydantic-settings (^2.0.2) and langfuse (^1.1.2), version solving failed.

Amazon Bedrock and Anthropic Claude issue with missing model_name

How can I set the model name so that Langfuse can pick it up? I get these errors:

'model_name'
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/langfuse/callback.py", line 479, in __on_llm_action
    model_name = kwargs["invocation_params"]["model_name"]
KeyError: 'model_name'
run not found
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/langfuse/callback.py", line 541, in on_llm_end
    raise Exception("run not found")
Exception: run not found
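
A plausible direction for a fix (a sketch only, not the SDK's current implementation): instead of indexing invocation_params["model_name"] directly, the callback could fall back through other keys that Bedrock/Claude wrappers may expose; the exact key names below are assumptions.

def extract_model_name(kwargs: dict) -> str:
    # Hypothetical helper: fall back through common parameter names
    # instead of raising KeyError when "model_name" is absent.
    invocation_params = kwargs.get("invocation_params") or {}
    return (
        invocation_params.get("model_name")
        or invocation_params.get("model_id")   # assumed key for Bedrock-style models
        or invocation_params.get("model")
        or "unknown"
    )

print(extract_model_name({"invocation_params": {"model_id": "anthropic.claude-v2"}}))  # anthropic.claude-v2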

bug: `meta.update()` fails when `meta = None`

This was incorrectly filed against the langfuse repo first (see langfuse/langfuse#1775)

Describe the bug

I believe commit da264a9 from last week introduced a bug at https://github.com/langfuse/langfuse-python/blame/main/langfuse/callback/langchain.py#L722 by allowing the function to return None (edit: the version before the commit also allowed returning None, so it is interesting that I only see this now). The None value then causes a failure at https://github.com/langfuse/langfuse-python/blame/main/langfuse/callback/langchain.py#L445, since None has no update() method.

To reproduce

Run a callback with no metadata or tags and you will hit an error:

File "/..../lib/python3.10/site-packages/langfuse/callback/langchain.py", line 443, in on_tool_start
    meta.update(
    โ”” None
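
A minimal sketch of the defensive pattern that would avoid the crash (safe_update is a hypothetical helper, not the SDK's actual code):

def safe_update(meta, extra):
    # Guard against meta being None before calling update().
    meta = meta or {}
    meta.update(extra)
    return meta

print(safe_update(None, {"tool_input": "query"}))  # {'tool_input': 'query'}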

Additional information

No response

Replace dependency on langchain with langchain-core

langchain has been split into several slim packages, including langchain-core and langchain-<aws, ...>. Users, especially those using LCEL, don't need to install the whole langchain package in their working environment.

However, the langfuse CallbackHandler still takes a hard dependency on langchain instead of langchain-core. See https://github.com/langfuse/langfuse-python/blob/main/langfuse/callback/langchain.py#L7

The modules imported there could come from langchain_core instead, as sketched below.
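
For illustration, the callback's imports could plausibly be swapped one-for-one for their langchain_core equivalents (a sketch of the idea, not a verified mapping of the full import list):

# Instead of e.g. `from langchain.callbacks.base import BaseCallbackHandler`,
# the slim core package provides the same building blocks:
from langchain_core.callbacks.base import BaseCallbackHandler
from langchain_core.documents import Document
from langchain_core.messages import BaseMessage
from langchain_core.outputs import LLMResult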

[Langchain Integration] Support HuggingFaceHub as LLM

A user got the following error when using the Langchain integration with HuggingFaceHub for LLMs.

ERROR:root:'model_name'
ERROR:root:run not found

Steps

  • Investigate the issue to find the root cause
  • Implement a fix
  • Add additional tests to the langchain test suite

The user's implementation


# Imports assumed from the snippet's usage (langchain 0.0.x-era package layout)
import os

from dotenv import load_dotenv
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceHub
from langchain.prompts import PromptTemplate
from langfuse.callback import CallbackHandler


def initialize_huggingface_llm(prompt: PromptTemplate, temperature: float, max_length: int) -> LLMChain:
    repo_id = "google/flan-t5-xxl"

    # Experiment with the max_length parameter and temperature
    llm = HuggingFaceHub(
        repo_id=repo_id, model_kwargs={"temperature": temperature, "max_length": max_length}
    )
    return LLMChain(prompt=prompt, llm=llm)

def generate_prompt() -> PromptTemplate:
    # You can play around with the prompt, see the results change if you make small changes to the prompt
    template = """Given the name of the country, give the languages that are spoken in that country. 
    Start with the official languages of the country and continue with the other languages of that country.
    Country: {country}?
    Languages: 
    """

    return PromptTemplate(template=template, input_variables=["country"])


if __name__ == '__main__':
    load_dotenv()

    handler = CallbackHandler(os.getenv('LANGFUSE_PUBLIC_KEY'),
                              os.getenv('LANGFUSE_SECRET_KEY'),
                              os.getenv('LANGFUSE_HOST'))

    # Try other values to see impact on results
    country = "belgium"
    country_max_length = 100
    country_temperature = 0.1

    country_prompt = generate_prompt()

    hugging_chain = initialize_huggingface_llm(prompt=country_prompt,
                                               temperature=country_temperature,
                                               max_length=country_max_length)
    
    print("HuggingFace")
    print(hugging_chain.run(country, callbacks=[handler]))

Host on CallbackHandler should be optional

from langfuse.callback import CallbackHandler
handler = CallbackHandler(PUBLIC_KEY, SECRET_KEY)

Currently, users have to manually pass host=None because it is a required positional argument.
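
A possible shape for the fix (a sketch, not the SDK's actual constructor): make host a keyword argument that falls back to the LANGFUSE_HOST environment variable and then to Langfuse Cloud.

import os
from typing import Optional

class CallbackHandler:
    def __init__(self, public_key: str, secret_key: str, host: Optional[str] = None):
        # Falling back to the environment and then to the hosted default lets
        # callers simply write CallbackHandler(PUBLIC_KEY, SECRET_KEY).
        self.public_key = public_key
        self.secret_key = secret_key
        self.host = host or os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com")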

Add end() method to spans and generations

Users want to be able to quickly end spans and generations.

generation = langfuse.generation(...)
generation.end()

  • For StatefulGenerationClient and StatefulSpanClient, add end() methods that send endTime to the Langfuse server. Under the hood, they can reuse the existing update functions (see the sketch below).
  • Add test coverage for both functions.
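
Assuming the existing update method accepts an end-time field, end() could be a thin wrapper around it; a rough sketch (names simplified, not the SDK's actual internals):

from datetime import datetime, timezone

class StatefulGenerationClient:
    def update(self, **kwargs):
        ...  # existing update logic that sends fields to the Langfuse server

    def end(self, **kwargs):
        # end() is shorthand for an update that stamps the end time
        # (assumes update() accepts an end_time field).
        return self.update(end_time=datetime.now(timezone.utc), **kwargs)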

Testing with local Langfuse server

Context
Currently, tests depend on an instance of a Langfuse server and require HOST, LF_PK, and LF_SK environment secrets.

Goal
Tests of the SDK should not require any secrets, so they can safely be run on forks of this project (except for E2E tests that use external APIs, e.g. OpenAI, Hugging Face Hub).

Potential solution
Run a dockerized Langfuse (langfuse/langfuse) in CI, according to the instructions here: https://langfuse.com/docs/deployment/local

Blocked by
A DB seeder needs to be added to langfuse/langfuse to create a default user, project, public key, and secret key.

Failure to format api errors

I'm having trouble trying out langfuse due to an error while formatting non-2xx status codes in this block:

if 200 <= _response.status_code < 300:
    return pydantic.parse_obj_as(Trace, _response.json())  # type: ignore
if _response.status_code == 400:
    raise Error(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 401:
    raise UnauthorizedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 403:
    raise AccessDeniedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 405:
    raise MethodNotAllowedError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore
if _response.status_code == 404:
    raise NotFoundError(pydantic.parse_obj_as(str, _response.json()))  # type: ignore

  1. httpx.Response.json() surprisingly returns a dict object, not a str (meanwhile, it is typed as Any):

https://github.com/encode/httpx/blob/053bc57c3799801ff11273dd393cb0715e63ecf9/httpx/_models.py#L756

  2. This causes pydantic.parse_obj_as(str, _) to fail:

pydantic.error_wrappers.ValidationError: 1 validation error for ParsingModel[str]
__root__
  str type expected (type=type_error.str)

This pattern is used a lot throughout the Python SDK; I found 110 instances of "parse_obj_as(str".
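
One way this could be made more robust (error_message below is a hypothetical helper, not part of the SDK): parse the body once and serialize it when it is not already a string, instead of forcing it through parse_obj_as(str, ...).

import json

def error_message(response) -> str:
    # httpx's response.json() usually returns a dict for API errors;
    # serialize it rather than validating it as a str.
    try:
        body = response.json()
    except json.JSONDecodeError:
        return response.text
    return body if isinstance(body, str) else json.dumps(body)

# e.g. raise UnauthorizedError(error_message(_response))
# instead of raise UnauthorizedError(pydantic.parse_obj_as(str, _response.json()))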
