Use the new GPT-4 API to build a ChatGPT-style chatbot over multiple large PDF files.

The tech stack includes LangChain, Pinecone, TypeScript, OpenAI, and Next.js. LangChain is a framework that makes it easier to build scalable AI/LLM apps and chatbots. Pinecone is a vector store that holds the embeddings of your PDF text so that similar documents can be retrieved later.
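To illustrate the idea behind a vector store, here is a minimal TypeScript sketch of similarity search over embeddings. This is illustrative only: Pinecone performs the same ranking at scale using approximate nearest-neighbor search, and the type and function names here are hypothetical.

```typescript
// A stored document: its text plus its embedding vector.
type StoredDoc = { text: string; vector: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k stored documents most similar to the query vector.
function topK(query: number[], docs: StoredDoc[], k: number): StoredDoc[] {
  return [...docs]
    .sort((x, y) => cosineSimilarity(query, y.vector) - cosineSimilarity(query, x.vector))
    .slice(0, k);
}
```

When you ask a question later, the same idea applies: the question is embedded, and the chunks whose vectors score highest are retrieved as context.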
If you run into errors, please review the troubleshooting section further down this page.
Prerequisite: make sure you have Node.js installed on your system, version 18 or greater.
- Clone the repo or download the ZIP: `git clone [github https url]`
- Install packages with `npm install`. After installation, you should see a `node_modules` folder.
- Set up your `.env` file:
  - Copy `.env.example` into `.env`. Your `.env` file should look like this:

        OPENAI_API_KEY=
        PINECONE_API_KEY=
        PINECONE_ENVIRONMENT=
        PINECONE_INDEX_NAME=

  - Visit OpenAI to retrieve your API key and insert it into your `.env` file.
  - Visit Pinecone to create and retrieve your API key, and also retrieve your environment and index name from the dashboard.
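A missing key is a common source of setup errors, so it can help to fail fast on startup. Here is a small, hypothetical helper (not part of the repo) that reads a required variable from `process.env`:

```typescript
// Hypothetical helper: read a required environment variable and
// throw a clear error if it is missing or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// The four keys from the .env file above could be checked like so:
// const openAiKey = requireEnv("OPENAI_API_KEY");
// const pineconeKey = requireEnv("PINECONE_API_KEY");
```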
This repo can load multiple PDF files:

- Inside the `public/docs` folder, add your PDF files or folders that contain PDF files.
- Run the script `npm run ingest` to 'ingest' and embed your docs. If you run into errors, troubleshoot below.
- Check the Pinecone dashboard to verify your vectors have been added.
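Conceptually, the ingest step extracts each PDF's text, splits it into overlapping chunks, embeds each chunk with OpenAI, and upserts the vectors into Pinecone. The repo uses LangChain's text splitter for the splitting step; a simplified sketch of the same idea, with illustrative (not the repo's actual) parameters, looks like this:

```typescript
// Simplified character-based splitter with overlap, similar in spirit
// to LangChain's RecursiveCharacterTextSplitter. The overlap preserves
// context that would otherwise be cut off at chunk boundaries.
function splitText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}
```

Each chunk would then be embedded and stored in Pinecone along with its source text, so it can be retrieved and shown as context at question time.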
Once you've verified that the embeddings and content have been successfully added to your Pinecone index, run `npm run dev` to launch the local dev environment, and then type a question in the chat interface.
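At query time, the app embeds your question, retrieves the most similar chunks from Pinecone, and passes them to the chat model as context. A hedged sketch of the prompt-assembly idea (the function and wording here are illustrative, not the repo's actual code):

```typescript
// Illustrative prompt assembly: place the retrieved chunks into the
// prompt sent to the chat model, alongside the user's question.
function buildPrompt(question: string, contextChunks: string[]): string {
  const context = contextChunks.join("\n---\n");
  return [
    "Answer the question using only the context below.",
    "If the answer is not in the context, say you do not know.",
    "",
    `Context:\n${context}`,
    "",
    `Question: ${question}`,
  ].join("\n");
}
```

The model's answer is then streamed back to the chat interface, grounded in the retrieved PDF chunks rather than the model's general knowledge alone.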