
Comments (5)

muelletm commented on September 18, 2024

Hi!

So first of all, the API is supposed to be used with a special binary: a stateless version of chat that speaks a JSON API. The binary can be built inside the cpp directory. If you use the Dockerfile, everything is done for you.
Otherwise:

cd cpp
make
cd ..
export ALPACA_CLI_PATH="$PWD/cpp/build/alpaca"

Now ALPACA_CLI_PATH points to the right binary.
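To make the "stateless binary with a JSON API" idea concrete, here is a minimal sketch of a JSON-over-stdio exchange with a CLI subprocess. `cat` stands in for the real alpaca binary (it simply echoes each request line back), and the request fields are taken from later in this thread; the actual protocol of the binary may differ.

```python
import json
import subprocess

# Stand-in for the alpaca binary: `cat` echoes each request line back,
# which lets us exercise the write-request / read-response round trip.
proc = subprocess.Popen(
    ["cat"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# One request per line, JSON-encoded, as a stateless JSON API would expect.
request = {"input_text": "Are alpacas afraid of snakes?", "n_predict": "128"}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

# Read exactly one response line and decode it.
response = json.loads(proc.stdout.readline())

proc.stdin.close()
proc.wait()
print(response["input_text"])
```

The key property is that each request/response pair is self-contained, so the Python wrapper only needs to serialize one line out and parse one line back.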

The locks are just there to make the client thread-safe so that we don't get into race conditions when we use it from e.g. FastAPI.
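The lock pattern described above can be sketched as follows. This is an illustrative stand-in, not the actual alpaca.py client: the class and method names are invented, and a string transform replaces the real subprocess call. The point is only that the lock guarantees at most one request talks to the binary at a time, even under concurrent callers such as FastAPI workers.

```python
import threading

class StatelessClient:
    """Toy client that serializes access to a shared backend with a lock."""

    def __init__(self):
        self._lock = threading.Lock()
        self._in_flight = 0
        self.max_in_flight = 0  # worst-case observed concurrency

    def run(self, request: str) -> dict:
        # Only one caller may exchange a request/response at a time.
        with self._lock:
            self._in_flight += 1
            self.max_in_flight = max(self.max_in_flight, self._in_flight)
            result = {"output": request.upper()}  # stand-in for the binary call
            self._in_flight -= 1
            return result

client = StatelessClient()
threads = [
    threading.Thread(target=client.run, args=(f"req{i}",)) for i in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(client.max_in_flight)  # 1: the lock serializes all calls
```

Without the `with self._lock:` block, two threads could interleave their writes and reads on the same pipe and receive each other's responses.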

from alpaca.py.

robinnarsinghranabhat commented on September 18, 2024

Thanks! It's working now. I thought I could run it using the original binary from alpaca.cpp.

I was curious: when new improvements/optimizations are made to the original alpaca.cpp, like faster loading, memory enhancements, etc., how difficult would it be to keep this repo in sync with them?

I am not a C/C++ person but would love to help out in any way I can.


rg089 commented on September 18, 2024

@muelletm I'm getting the same error even after building the new alpaca binary in cpp/build as described. I'm on macOS 13.3 Ventura (M1 chip).


muelletm commented on September 18, 2024

@rg089, can you share how you run the client and the output you are getting?


rg089 commented on September 18, 2024

@muelletm I first compiled the binary and ran it with multiple options. When I run it from Python as follows, I'm getting this error: json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

from alpaca import Alpaca, InferenceRequest
import os

alpaca_cli_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'cpp/build/alpaca')
model_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), '../alpaca_cpp/ggml-alpaca-7b-q4.bin')
alpaca = Alpaca(alpaca_cli_path, model_path)
try:
    output = alpaca.run_simple(InferenceRequest(input_text="Are alpacas afraid of snakes?"))["output"]
    print(output)
finally:
    alpaca.stop()

When I directly ran the compiled alpaca binary inside cpp/build using ./alpaca -m ../../../alpaca_cpp/ggml-alpaca-7b-q4.bin and gave it the input {"input_text": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n\nAre alpacas afraid of snakes?### Response:\n\n", "top_k": "40", "top_p": "0.95", "temp": "0.1", "repeat_penalty": "1.3", "repeat_last_n": "64", "n_predict": "128"}, there was a segmentation fault.

Please let me know if you need any further details. I had another query regarding this: if I wanted to continue the conversation with the model, would calling alpaca.run_simple in a loop load the model multiple times (since the generation params are also passed in each message), or would it just pass the prompt to the already-loaded model? Is there a better way to interact with the model in a multi-turn conversation?
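One possible multi-turn pattern, not confirmed anywhere in this thread and offered only as an assumption: since the binary is stateless, the caller can reuse a single client and carry the conversation history in input_text itself. In this sketch, fake_generate is a stand-in for the real alpaca.run_simple call, so the example only demonstrates the history-accumulation pattern, not the actual API behavior.

```python
def fake_generate(prompt: str) -> str:
    # Stand-in for a model call: echoes the last line of the prompt.
    return f"[reply to: {prompt.splitlines()[-1]}]"

# Accumulate the full conversation into the prompt on every turn,
# so the (stateless) model sees all prior context each time.
history = ""
for user_turn in ["Are alpacas afraid of snakes?", "What do they eat?"]:
    history += f"User: {user_turn}\n"
    reply = fake_generate(history)
    history += f"Assistant: {reply}\n"

print(history)
```

The trade-off is that the prompt grows with every turn, so real usage would need to truncate or summarize old turns once the model's context window fills up.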

