apssouza22 / chatflow

Leveraging LLM to build Conversational UIs

License: BSD 2-Clause "Simplified" License

Dockerfile 0.25% HTML 1.26% JavaScript 1.85% TypeScript 60.10% CSS 0.14% Python 35.47% Shell 0.03% PowerShell 0.91%
ai aiagents llm natual-language-inference rag-llm

chatflow's Introduction

Hey 👋, this is Alexsandro Souza


Coding


Gmail Badge Linkedin Badge Github Badge Portfolio Badge

I am passionate about software development and agile methods. I love solving team and company problems from both a tactical and a strategic point of view.

I help teams and companies achieve more by improving code, processes, flows, architecture, communication, and human resources.

Beyond my day-to-day work, I give back to the community with articles, videos, lectures, and free courses.


My most interesting projects

Architecture and Design

AI/DL/ML

chatflow's People

Contributors

apssouza22, giovannismokes, lucygrind, vporton


chatflow's Issues

No input after answering all questions

After answering the questions, there should be an input field for the user to continue the conversation on their own topic, but no such input is visible. This may cost us many users:

image

No proper namespace for Redis keys

The following demonstrates that data_vector:9051 is available in the top-level namespace of Redis.

> KEYS *
...
data_vector:9051
...
> HGETALL data_vector:9051
...

That is wrong, because on a real server Redis may be used simultaneously by other software, and the key prefix data_vector: is not unique.

Misled by the logic of the existing software, I also created a subroutine that creates keys like:

predict_cache:Show 10 of my GitHub repos.

which also lack a unique prefix.

It was created by:

from redis.commands.search.field import TextField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType

async def create_predict_cache():
    value_field = TextField("value")

    # Create an index over all hashes whose keys start with "predict_cache:"
    await redis_conn.ft(INDEX_NAME + "_predict_cache").create_index(
        fields=[value_field],
        definition=IndexDefinition(prefix=["predict_cache:"], index_type=IndexType.HASH)
    )

Here I was misled by my intuitive understanding of the API: .ft(...) looks as if it provides a namespace INDEX_NAME + "_predict_cache", but this intuition is contradicted by the fact that

HGETALL "predict_cache:Show 10 of my GitHub repos."

retrieves the data even though no unique prefix was ever specified.

The next step is to understand what the argument of .ft() actually does, whether it provides any useful value (it evidently does not provide a key prefix), and what to replace it with.
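One hedged way to fix this would be to route every key and index prefix through a single namespacing helper. This is only a sketch: the chatflow namespace and the helper name are my assumptions, not the project's actual convention.

```python
# Sketch: give every Redis key a unique application namespace.
# "chatflow" is an assumed namespace; the project may prefer another.
APP_NAMESPACE = "chatflow"

def ns_key(*parts: str) -> str:
    """Join key parts under the app namespace, colon-separated."""
    return ":".join((APP_NAMESPACE,) + parts)

# The index definition would then use the namespaced prefix as well:
#   IndexDefinition(prefix=[ns_key("predict_cache") + ":"], index_type=IndexType.HASH)
print(ns_key("predict_cache", "Show 10 of my GitHub repos."))
# -> chatflow:predict_cache:Show 10 of my GitHub repos.
```

With this in place, both data_vector and predict_cache keys would live under one prefix that other software on the same Redis server is unlikely to collide with.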

Add user and app information to the UI

Add the user and the active application information to the top bar. Currently we cannot really tell which app is active in the session.

Here is where the active app is set

Implement Upload Files From The Chat

Add the option to upload an image using natural language. The user should just write "upload file" and the file-upload option would appear.

Store apps in a DB

Currently the apps are stored in a file; we should change this to store them in the Postgres DB.

Simplify onboarding

The software asks new users a long series of questions.

Many users will abandon it in the middle. We need to let users skip these questions.

redis.exceptions.ResponseError: chat_docs: no such index

On the first run of the software, following the steps in README.md:

INFO:     127.0.0.1:43206 - "POST /api/v1/docs/search HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/porton/.local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/home/porton/.local/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    return await self.app(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/fastapi/applications.py", line 284, in __call__
    await super().__call__(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/cors.py", line 91, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/cors.py", line 146, in simple_response
    await self.app(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/home/porton/.local/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "/home/porton/.local/lib/python3.9/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "/home/porton/.local/lib/python3.9/site-packages/starlette/routing.py", line 66, in app
    response = await func(request)
  File "/home/porton/.local/lib/python3.9/site-packages/fastapi/routing.py", line 241, in app
    raw_response = await run_endpoint_function(
  File "/home/porton/.local/lib/python3.9/site-packages/fastapi/routing.py", line 167, in run_endpoint_function
    return await dependant.call(**values)
  File "/home/porton/Projects/chatflow/server/src/api/docs.py", line 31, in find_docs
    return await doc_search_service.full_search(search_req)
  File "/home/porton/Projects/chatflow/server/src/core/docs_search/doc_service.py", line 34, in full_search
    results_vector = await self.vector_search.search_vectors(search_req.tags, vector_bytes)
  File "/home/porton/Projects/chatflow/server/src/core/docs_search/vector_search.py", line 22, in search_vectors
    return await self.conn.ft(self.index).search(query, query_params={"vec_param": vector})
  File "/home/porton/.local/lib/python3.9/site-packages/redis/commands/search/commands.py", line 889, in search
    res = await self.execute_command(SEARCH_CMD, *args)
  File "/home/porton/.local/lib/python3.9/site-packages/redis/asyncio/client.py", line 518, in execute_command
    return await conn.retry.call_with_retry(
  File "/home/porton/.local/lib/python3.9/site-packages/redis/asyncio/retry.py", line 59, in call_with_retry
    return await do()
  File "/home/porton/.local/lib/python3.9/site-packages/redis/asyncio/client.py", line 492, in _send_command_parse_response
    return await self.parse_response(conn, command_name, **options)
  File "/home/porton/.local/lib/python3.9/site-packages/redis/asyncio/client.py", line 539, in parse_response
    response = await connection.read_response()
  File "/home/porton/.local/lib/python3.9/site-packages/redis/asyncio/connection.py", line 810, in read_response
    raise response from None
redis.exceptions.ResponseError: chat_docs: no such index
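A hedged sketch of one possible guard: create the index on startup if it does not exist yet. The client here is duck-typed so the logic can be shown without a live Redis; in the real code the info/create calls would be redis-py's ft(index).info() and ft(index).create_index(...), and the caught exception would be redis.exceptions.ResponseError.

```python
import asyncio

class MissingIndexError(Exception):
    """Stands in for redis.exceptions.ResponseError('... no such index')."""

async def ensure_index(ft_client, create_index) -> bool:
    """Create the search index if it does not exist yet.

    Returns True if the index had to be created, False if it already existed.
    """
    try:
        await ft_client.info()   # raises if the index is missing
        return False
    except MissingIndexError:
        await create_index()     # e.g. FT.CREATE chat_docs ... on first run
        return True
```

Calling such a guard during application startup, before the first /api/v1/docs/search request, would avoid this 500 error on a fresh Redis instance.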

Amend docker-compose to run the app entirely

Currently docker-compose.yml maps several ports that may conflict with software on the host (e.g. if the host runs PostgreSQL and/or Redis, the ports will collide). To run it, I had to shut down my PostgreSQL, which is at least inconvenient.

I tried to fix it by adding

networks:
  mynetwork:
    driver: bridge

at the end of docker-compose.yml and commenting out the port forwarding. But to run the app, I had to uncomment every commented-out port mapping, because the server software and the frontend do not run in Docker.

I recommend, for cleanliness, adding to docker-compose.yml the means to run all of the software entirely in Docker, and making that the recommended way to run it in the README file.
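A minimal sketch of what that could look like. Service names, build paths, images, and the exposed port below are assumptions for illustration, not the project's actual layout:

```yaml
# Sketch: run everything in Docker; only the frontend port reaches the host.
services:
  server:
    build: ./server        # assumed build context
    networks: [internal]   # no host port mapping, so no host collisions
  redis:
    image: redis/redis-stack-server:latest
    networks: [internal]
  postgres:
    image: postgres:15
    networks: [internal]
  frontend:
    build: ./chat-ui       # assumed build context
    networks: [internal]
    ports:
      - "3000:3000"        # the single port exposed to the host (assumption)

networks:
  internal:
    driver: bridge
```

Services reach each other by service name on the internal network, so Redis and Postgres need no host port mappings at all.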

"Something went wrong"

When running it on localhost as described in README.md, it produces errors among the otherwise useful output:
image

CONTEXT instead of CONTENT

It apparently should be f'CONTEXT\n----------------------------\n {context}' instead of f'CONTENT\n----------------------------\n {context}' in
server/src/core/llm/prompt_handler.py

A typo in AI reasonings

In the prompt "Given the contents below, please answer which content option it should be used to replay to the user input." the word replay should be reply.

server/src/core/llm/prompt_handler.py
