This project is forked from sugarforever/chat-ollama.


License: MIT License


ChatOllama

ChatOllama is an open source chatbot based on LLMs. It supports a wide range of language models including:

  • Ollama served models
  • OpenAI
  • Azure OpenAI
  • Anthropic

ChatOllama supports multiple types of chat:

  • Free chat with LLMs
  • Chat with LLMs based on knowledge base

ChatOllama feature list:

  • Ollama models management
  • Knowledge bases management
  • Chat
  • Commercial LLMs API keys management

Join Our Community

If you are a user, contributor, or even just new to ChatOllama, you are more than welcome to join our community on Discord by clicking the invite link.

If you are a contributor, the technical-discussion channel is for you, where we discuss technical topics.

If you run into any issues using ChatOllama, please report them in the customer-support channel. We will help you out as soon as we can.

Quick Start

As a user of ChatOllama, please walk through the sections below to make sure all the components are up and running before you start using ChatOllama.

Supported Vector Databases

ChatOllama supports two types of vector databases: Milvus and Chroma.

Please refer to .env.example for the relevant settings:

# Supported values: chroma, milvus
VECTOR_STORE=chroma
CHROMADB_URL=http://localhost:8000
MILVUS_URL=http://localhost:19530

By default, ChatOllama uses Chroma. If you'd like to use Milvus, set VECTOR_STORE to milvus and specify the corresponding URL. This works in both the development server and Docker containers.
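As a sketch of what this switch means, the selection logic can be modeled as below. This is illustrative only, not ChatOllama's actual code; the function and parameter names are assumptions, while the variable names and default URLs come from .env.example above.

```typescript
// Illustrative sketch of the VECTOR_STORE switch; not ChatOllama's actual code.
type VectorStore = "chroma" | "milvus";

function resolveVectorStoreUrl(env: Record<string, string | undefined>): string {
  // Fall back to Chroma, mirroring the documented default.
  const store = (env.VECTOR_STORE ?? "chroma") as VectorStore;
  return store === "milvus"
    ? env.MILVUS_URL ?? "http://localhost:19530"
    : env.CHROMADB_URL ?? "http://localhost:8000";
}

console.log(resolveVectorStoreUrl({ VECTOR_STORE: "milvus" })); // http://localhost:19530
```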

Use with Nuxt 3 Development Server

If you'd like to run with the latest code base and apply changes as needed, you can clone this repository and follow the steps below.

  1. Install and run Ollama server

    You will need an Ollama server running. Follow the installation guide of Ollama. By default, it runs on http://localhost:11434.

  2. Install Chroma

    Please refer to https://docs.trychroma.com/getting-started for Chroma installation.

    We recommend running it in a Docker container:

    #https://hub.docker.com/r/chromadb/chroma/tags
    
    docker pull chromadb/chroma
    docker run -d -p 8000:8000 chromadb/chroma

    Now, ChromaDB is running on http://localhost:8000

  3. ChatOllama Setup

    Now, we can complete the necessary setup to run ChatOllama.

    3.1 Copy the .env.example file to .env file:

    cp .env.example .env

    3.2 Make sure to install the dependencies:

    # npm
    npm install
    
    # pnpm
    pnpm install
    
    # yarn
    yarn install
    
    # bun
    bun install

    3.3 Run a migration to create your database tables with Prisma Migrate

    # npm
    npm run prisma-migrate
    
    # pnpm
    pnpm prisma-migrate
    
    # yarn
    yarn prisma-migrate
    
    # bun
    bun run prisma-migrate
  4. Launch Development Server

    Make sure both Ollama Server and ChromaDB are running.

    Start the development server on http://localhost:3000:

    # npm
    npm run dev
    
    # pnpm
    pnpm dev
    
    # yarn
    yarn dev
    
    # bun
    bun run dev

Use with Docker

This is the easiest way to use ChatOllama.

The only thing you need is a copy of docker-compose.yaml. Please download it and execute the command below to launch ChatOllama.

$ docker compose up

As ChatOllama runs within a Docker container, you should set the Ollama server to http://host.docker.internal:11434 in the Settings section, assuming your Ollama server is running locally on the default port.
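The distinction matters because localhost inside a container refers to the container itself, not the machine running Ollama. The rule can be sketched as follows (the function name is illustrative; the URLs come from the instructions above):

```typescript
// Illustrative: pick the Ollama base URL depending on where ChatOllama runs.
function ollamaBaseUrl(runningInDocker: boolean): string {
  return runningInDocker
    ? "http://host.docker.internal:11434" // reaches the Docker host from inside the container
    : "http://localhost:11434";           // development server runs directly on the host
}

console.log(ollamaBaseUrl(true)); // http://host.docker.internal:11434
```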

If you are launching the dockerized ChatOllama for the first time, make sure to initialize the SQLite database as below:

$ docker compose exec chatollama npx prisma migrate dev

Prerequisites for knowledge bases

When using knowledge bases, a valid embedding model must be in place. It can be one of the models downloaded by Ollama, or one from a third-party service provider such as OpenAI.

Ollama Managed Embedding Model

We recommend you download the nomic-embed-text model for embedding purposes.

You can do so on the Models page (http://localhost:3000/models), or via the CLI as below if you are using Docker.

# In the folder of docker-compose.yaml

$ docker compose exec ollama ollama pull nomic-embed-text:latest

OpenAI Embedding Model

If you prefer to use OpenAI, please make sure you set a valid OpenAI API key in Settings, and fill in one of the OpenAI embedding models listed below:

  • text-embedding-3-large
  • text-embedding-3-small
  • text-embedding-ada-002
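For reference, the default output dimensions of these models, as published by OpenAI at the time of writing, can be sketched as a lookup table; treat the numbers as a snapshot that may change upstream:

```typescript
// Default output dimensions of OpenAI embedding models (per OpenAI's published docs).
const openAIEmbeddingDims: Record<string, number> = {
  "text-embedding-3-large": 3072,
  "text-embedding-3-small": 1536,
  "text-embedding-ada-002": 1536,
};

console.log(openAIEmbeddingDims["text-embedding-3-large"]); // 3072
```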

Data Storage with Docker Containers

There are two types of data storage: vector data and relational data. See the summary below; for more details, please refer to the settings in docker-compose.yaml.

Vector data

With docker-compose.yaml, a dockerized Chroma database runs side by side with ChatOllama. The data is persisted in a Docker volume.

Relational data

The relational data, including knowledge base records and their associated files, is stored in a SQLite database file persisted and mounted from ~/.chatollama/chatollama.sqlite.
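As an illustration, the mounted path expands per user as sketched below, using Node's standard library; this is not ChatOllama's actual code, just how the ~ shorthand resolves.

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Illustrative: expand ~/.chatollama/chatollama.sqlite for the current user.
const dbPath = join(homedir(), ".chatollama", "chatollama.sqlite");
console.log(dbPath);
```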

Developers Guide

As ChatOllama is still under active development, features, interfaces, and the database schema may change. Please follow the instructions below on every git pull to make sure your dependencies and database schema are always in sync.

  1. Install the latest dependencies
    • npm install OR
    • pnpm install
  2. Prisma migrate
    • pnpm run prisma-migrate OR
    • npm run prisma-migrate

Change Logs

Here we summarize what's done and released in our day-to-day development.

03/10/2024

  1. Instructions data will be stored in the SQLite database.
  2. vueuse is introduced for storage management.
