
ausboss / local-llm-langchain

Load local LLMs effortlessly in a Jupyter notebook for testing purposes alongside Langchain or other agents. Contains oobabooga and KoboldAI versions of the Langchain notebooks with examples.

Languages: Jupyter Notebook 99.25%, Batchfile 0.75%
Topics: llama, agi, alpaca, autogpt, jupyter-notebook, langchain, language-model, llm, llms, koboldai

local-llm-langchain's Introduction

Notebook for Local LLMs

The goal of this project is to let users easily load their locally hosted language models in a notebook for testing with Langchain. There are currently three notebooks available. Two of them use an API to create a custom Langchain LLM wrapper—one for oobabooga's text-generation-webui and the other for KoboldAI. The third notebook loads models without an API by leveraging oobabooga's text-generation-webui virtual environment and its modules for model loading.

You will end up with an instance of the Custom LLM Wrapper that can be used to generate text:

llm("prompt goes here")

You can use this instead of the OpenAI LLM class that you see used in most of the guides and documentation.
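As a rough illustration of what such a wrapper does under the hood, here is a minimal callable sketch. It assumes a generic JSON endpoint and response shape; the actual notebooks target the oobabooga and KoboldAI API schemas, whose endpoint paths and field names differ, so treat the names below as placeholders:

```python
# Minimal sketch of the custom-LLM-wrapper pattern: POST the prompt to a
# locally hosted API and return the generated text. The endpoint path and
# JSON shapes are illustrative, not the exact oobabooga/KoboldAI schemas.
import json
import urllib.request


class LocalLLM:
    def __init__(self, url, max_new_tokens=200):
        self.url = url                      # e.g. "http://localhost:5000/api/v1/generate"
        self.max_new_tokens = max_new_tokens

    def __call__(self, prompt):
        payload = json.dumps({
            "prompt": prompt,
            "max_new_tokens": self.max_new_tokens,
        }).encode("utf-8")
        req = urllib.request.Request(
            self.url,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        # Assumed response shape: {"results": [{"text": "..."}]}
        return body["results"][0]["text"]
```

A real Langchain integration would subclass Langchain's base LLM class and implement its generation hook instead of `__call__`, but the request/response round trip is the same idea.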

Getting Started

Please follow the setup instructions for the APIs in their respective repositories. Then update the url variable with your API URL and run the cells to create an instance of the custom LLM wrapper.

Roadmap

Using the API is now my preferred method for loading models. I plan to improve the API classes and notebooks, but for now they work quite well. I'm leaving the non-API notebook up, but I won't be actively maintaining it, so things might break.

Non-API Notebook Instructions:

  1. Activate your Python or Conda environment.
  2. Install Jupyter Notebook by running pip install jupyter in your preferred command prompt or terminal.
  3. Restart your command prompt or terminal to ensure that the installation is properly configured.
  4. Activate your Python or Conda environment again and run jupyter notebook in the command prompt or terminal to launch the Jupyter interface.
  5. Navigate to the directory where Non-API-Notebook.ipynb is located (ooba users: put it in ./text-generation-webui/) and open the notebook in the Jupyter interface.
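Assuming a conda-based setup (the environment name below is a placeholder), the steps above boil down to something like:

```shell
# 1-2. Activate your environment and install Jupyter
conda activate textgen          # or: source venv/bin/activate
pip install jupyter

# 3-4. Restart your terminal, re-activate the environment, then launch
jupyter notebook

# 5. For ooba users, the notebook should live inside text-generation-webui/
# so that its imports of the webui's modules/ directory resolve
```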

local-llm-langchain's People

Contributors

ausboss, deadbranches


local-llm-langchain's Issues

modules for Local-LLM notebook

Hi,
in the Local-LLM notebook we need to import:

```python
import sys
sys.argv = [sys.argv[0]]
import importlib
import json
import math
import os
import re
import time
import traceback
from functools import partial
from pathlib import Path
from threading import Lock
sys.path.append(str(Path().resolve().parent / "modules"))

import modules.extensions as extensions_module
from modules import chat, presets, shared, training, ui, utils
from modules.extensions import apply_extensions
from modules.github import clone_or_pull_repository
from modules.html_generator import chat_html_wrapper
from modules.LoRA import add_lora_to_model
from modules.models import load_model, unload_model
from modules.text_generation import (generate_reply_wrapper,
                                     get_encoded_length, stop_everything_event)

import torch
torch.cuda.set_device(0)
```

What is "modules"? Would it be possible to have access to that code, please?
Thanks for sharing and all the hard work.
Cheers!
