
2good4hisowngood / langchain

Implementing LangChain as a containerized web app, experimenting with the tools by following the quickstart doc.

Home Page: https://python.langchain.com/en/latest/getting_started/getting_started.html

Dockerfile 10.92% Python 53.92% JavaScript 12.99% CSS 4.11% HTML 18.07%

langchain's People

Contributors

2good4hisowngood

langchain's Issues

Template not applying to chat messages sent

Possible solution:

Update the /chat route in app.py to accept a template_id parameter in the request JSON:

python

@app.route('/chat', methods=['POST'])
def chat():
    user_input = request.json['input']
    # Optional: the ID of the prompt template to apply to this message
    template_id = request.json.get('template_id', None)
    if template_id:
        response = conversation.predict(input=user_input, template_id=template_id)
    else:
        response = conversation.predict(input=user_input)
    return jsonify({"response": response})

This modification checks if template_id is present in the request JSON and passes it to the conversation.predict method if it exists.
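
For reference, a client could exercise the updated route like this (a minimal sketch, assuming the Flask app is running locally on its default port 5000 and that a template with the placeholder ID below exists):

python

import requests

# Hypothetical request against the updated /chat route; the template_id value
# is a placeholder and must match an ID in your prompt_templates.json
resp = requests.post(
    "http://localhost:5000/chat",
    json={"input": "Summarize our last exchange.", "template_id": "friendly-assistant"},
)
print(resp.json()["response"])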

Modify the ConversationChain class in the langchain library to accept a template_id parameter in the predict method. Then, use this parameter to apply the selected template's context before sending the input to the LLM.

python

# Module-level imports these methods rely on
import json
from typing import Dict, List, Optional

# Inside the ConversationChain class

def predict(self, input: str, template_id: Optional[str] = None) -> str:
    if template_id:
        # Load the template by its ID and add its context to the input
        template = self.get_template_by_id(template_id)
        input_with_context = f"{template['context']} {input}"
    else:
        input_with_context = input

    # Send the input with context (if any) to the LLM; calling the LLM
    # directly returns the completion as a string
    response = self.llm(input_with_context)

    return response

def get_template_by_id(self, template_id: str) -> Dict[str, str]:
    # Look up a template by its ID among the loaded templates
    templates = self.load_templates()

    for template in templates:
        if template['id'] == template_id:
            return template

    raise ValueError(f"Template with ID '{template_id}' not found.")

def load_templates(self) -> List[Dict[str, str]]:
    # Load templates from a file or a function that returns the list of templates
    # Example: loading from a JSON file
    with open('prompt_templates.json', 'r') as f:
        templates = json.load(f)
    return templates
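
The template file format isn't defined anywhere in the repo yet; as a working assumption consistent with the code above, prompt_templates.json could be a list of objects with 'id' and 'context' keys. The IDs and contexts below are placeholders:

python

import json

# Hypothetical prompt_templates.json contents: each entry needs at least an
# 'id' (referenced by the UI) and a 'context' string prepended to user input
templates = [
    {"id": "friendly-assistant", "context": "You are a friendly, concise assistant."},
    {"id": "code-reviewer", "context": "You are a senior engineer reviewing code for bugs."},
]

with open("prompt_templates.json", "w") as f:
    json.dump(templates, f, indent=2)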

Update the JavaScript code in script.js to include the template_id in the POST request to the /chat endpoint:

javascript

// form, userInput, templateSelector and responseContainer are existing DOM
// references defined elsewhere in script.js
form.addEventListener("submit", async (e) => {
    e.preventDefault();
    const input = userInput.value;
    const selectedTemplateId = templateSelector.value;
    const response = await fetch("/chat", {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
        },
        body: JSON.stringify({ input, template_id: selectedTemplateId }),
    });

    const data = await response.json();
    responseContainer.textContent = data.response;
});

With these changes, the selected template's context will be applied to the user input before it's sent to the LLM. Make sure to adjust the file paths and template loading process according to your project's structure.

UI file upload for Chat Over Documents with Chat History

https://python.langchain.com/en/latest/modules/chains/index_examples/chat_vector_db.html#conversationalretrievalchain-with-question-answering-with-sources

Desired Features:

  • UI improvement: an upload box that lets a user browse for and upload their files, or simply drag them into the box to send them to the worker.
  • Local storage on the worker to manage file uploads.
  • A loader should load the files and index them into a vector database (a sketch of this flow follows the list).
  • The vector database should be initialized through variables.
  • The Dockerfile should accept variables to set up a vector database for longer-term storage.
  • Once indexed, the docs should be queryable, with the ability to ask the LLM questions about them. For example, for code repos dropped into the worker, it should be able to offer improvement suggestions for the existing files.
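
A rough sketch of how these pieces could fit together, assuming the legacy langchain APIs referenced by the linked doc (Chroma as the vector store, OpenAIEmbeddings, ConversationalRetrievalChain) and hypothetical /upload and /ask Flask routes; UPLOAD_DIR and VECTORSTORE_DIR are placeholder names:

python

import os

from flask import Flask, request, jsonify
from langchain.chains import ConversationalRetrievalChain
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import Chroma

app = Flask(__name__)

# Local storage on the worker for uploaded files (hypothetical path)
UPLOAD_DIR = "uploads"
os.makedirs(UPLOAD_DIR, exist_ok=True)

# Vector database configured through environment variables, which the
# Dockerfile could pass in for longer-term storage
persist_dir = os.environ.get("VECTORSTORE_DIR", "chroma_db")
vectorstore = Chroma(persist_directory=persist_dir, embedding_function=OpenAIEmbeddings())

@app.route("/upload", methods=["POST"])
def upload():
    # Save the uploaded file locally, then load, split, and index it
    uploaded = request.files["file"]
    path = os.path.join(UPLOAD_DIR, uploaded.filename)
    uploaded.save(path)
    docs = TextLoader(path).load()
    chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0).split_documents(docs)
    vectorstore.add_documents(chunks)
    return jsonify({"indexed_chunks": len(chunks)})

@app.route("/ask", methods=["POST"])
def ask():
    # Answer questions about the indexed documents; chat history is kept by
    # the client and sent as a list of [question, answer] pairs
    question = request.json["question"]
    history = [tuple(pair) for pair in request.json.get("chat_history", [])]
    chain = ConversationalRetrievalChain.from_llm(
        OpenAI(temperature=0), vectorstore.as_retriever()
    )
    result = chain({"question": question, "chat_history": history})
    return jsonify({"answer": result["answer"]})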
