
run-llama / create-llama


The easiest way to get started with LlamaIndex

License: MIT License

JavaScript 0.30% TypeScript 76.93% Python 21.85% CSS 0.59% Shell 0.11% Dockerfile 0.22%

create-llama's People

Contributors

anush008, arputikos, github-actions[bot], himself65, jac-zac, jerryjliu, jess-render, leehuwuj, logan-markewich, marcusschiesser, mohdamir, nirga, sagech, seldo, stmtk1, thucpn, yisding


create-llama's Issues

npx create-llama crashes with "Unexpected token ;"

Using `npm create llama@latest`, I get this message, which is a bit more comprehensive:
npx: installed 1 in 1.125s
C:\Users\Lars\AppData\Roaming\npm-cache_npx\31536\node_modules\create-llama\dist\index.js:102
function getLineColFromPtr(e,t){let r=e.slice(0,t).split(/\r\n|\n|\r/g);return[r.length,r.pop().length+1]}function makeCodeBlock(e,t,r){let s=e.split(/\r\n|\n|\r/g);let n="";let i=(Math.log10(t+1)|0)+1;for(let e=t-1;e<=t+1;e++){let o=s[e-1];if(!o)continue;n+=e.toString().padEnd(i," ");n+=": ";n+=o;n+="\n";if(e===t){n+=" ".repeat(i+r+2);n+="^\n"}}return n}class TomlError extends Error{line;column;codeblock;constructor(e,t){const[r,s]=getLineColFromPtr(t.toml,t.ptr);const n=makeCodeBlock(t.toml,r,s);super(`Invalid TOML document: ${e}\n\n${n}`,t);this.line=r;this.column=s;this.codeblock=n}}

SyntaxError: Unexpected token ;
at Module._compile (internal/modules/cjs/loader.js:723:23)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10)
at Module.load (internal/modules/cjs/loader.js:653:32)
at tryModuleLoad (internal/modules/cjs/loader.js:593:12)
at Function.Module._load (internal/modules/cjs/loader.js:585:3)
at Function.Module.runMain (internal/modules/cjs/loader.js:831:12)
at startup (internal/bootstrap/node.js:283:19)
at bootstrapNodeJSCore (internal/bootstrap/node.js:622:3)
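The stack frames (`internal/bootstrap/node.js`) suggest a fairly old Node.js runtime, which cannot parse modern syntax such as the class fields (`line;column;codeblock;`) in the bundled CLI. A minimal pre-flight version check would fail fast with a readable message instead of a syntax error; the helper names and the minimum major version (18) below are assumptions for illustration, not create-llama's actual code:

```javascript
// Hypothetical pre-flight check: refuse to run on a Node.js runtime too old
// to parse the CLI's modern JavaScript syntax.
function majorNodeVersion(versionString) {
  // versionString looks like "v10.15.3"; return the major version as a number
  const match = /^v?(\d+)\./.exec(versionString);
  return match ? Number(match[1]) : NaN;
}

function assertSupportedNode(versionString, minimumMajor = 18) {
  const major = majorNodeVersion(versionString);
  if (!(major >= minimumMajor)) {
    throw new Error(
      `create-llama needs Node.js >= ${minimumMajor}, found ${versionString}`,
    );
  }
}

// Example: checking the current runtime before doing any real work
assertSupportedNode(process.version);
```

Updating Node.js (e.g. to a current LTS release) and re-running `npx create-llama@latest` is the usual way past this class of error.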

"Operation not permitted" error on WSL.

(screenshot omitted)
The CLI advised that this should be reported as a bug. I'm using Ubuntu on WSL and I'm getting this "operation not permitted" error; it looks like the installer was trying to create a directory on the Windows filesystem.

Messages start to lag

Hey,

After a few user and assistant messages, the responses become very laggy while printing; the stream no longer prints fluently. Why does this happen, and is there anything that can be done about it?
Thanks for any help.

Error: Missing tiktoken_bg.wasm

For the last few days I have been trying to solve this issue.
I'm using create-llama as a starter.

(screenshot from 2024-07-15 omitted)

package.json
```json
{
  "version": "0.1.0",
  "scripts": {
    "format": "prettier --ignore-unknown --cache --check .",
    "format:write": "prettier --ignore-unknown --write .",
    "dev": "next dev",
    "build": "next build",
    "start": "next start",
    "lint": "next lint",
    "generate": "tsx app/api/chat/engine/generate.ts"
  },
  "dependencies": {
    "@apidevtools/swagger-parser": "^10.1.0",
    "@e2b/code-interpreter": "^0.0.5",
    "@llamaindex/pdf-viewer": "^1.1.1",
    "@radix-ui/react-collapsible": "^1.0.3",
    "@radix-ui/react-hover-card": "^1.0.7",
    "@radix-ui/react-slot": "^1.0.2",
    "ai": "^3.0.21",
    "ajv": "^8.12.0",
    "class-variance-authority": "^0.7.0",
    "clsx": "^2.1.1",
    "dotenv": "^16.3.1",
    "duck-duck-scrape": "^2.2.5",
    "formdata-node": "^6.0.3",
    "got": "^14.4.1",
    "llamaindex": "0.4.6",
    "lucide-react": "^0.294.0",
    "next": "^14.2.4",
    "pdf2json": "3.0.5",
    "react": "^18.2.0",
    "react-dom": "^18.2.0",
    "react-markdown": "^8.0.7",
    "react-syntax-highlighter": "^15.5.0",
    "rehype-katex": "^7.0.0",
    "remark": "^14.0.3",
    "remark-code-import": "^1.2.0",
    "remark-gfm": "^3.0.1",
    "remark-math": "^5.1.1",
    "supports-color": "^8.1.1",
    "tailwind-merge": "^2.1.0",
    "tiktoken": "^1.0.15",
    "uuid": "^9.0.1",
    "vaul": "^0.9.1"
  },
  "devDependencies": {
    "@types/node": "^20.10.3",
    "@types/react": "^18.2.42",
    "@types/react-dom": "^18.2.17",
    "@types/react-syntax-highlighter": "^15.5.11",
    "@types/uuid": "^9.0.8",
    "autoprefixer": "^10.4.16",
    "cross-env": "^7.0.3",
    "eslint": "^8.55.0",
    "eslint-config-next": "^14.2.4",
    "eslint-config-prettier": "^8.10.0",
    "postcss": "^8.4.32",
    "prettier": "^3.2.5",
    "prettier-plugin-organize-imports": "^3.2.4",
    "tailwindcss": "^3.3.6",
    "tsx": "^4.7.2",
    "typescript": "^5.3.2"
  }
}
```

app/chat/route:
```typescript
import { initObservability } from "@/app/observability";
import { Message, StreamData, StreamingTextResponse } from "ai";
import { ChatMessage, Settings } from "llamaindex";
import { NextRequest, NextResponse } from "next/server";
import { createChatEngine } from "./engine/chat";
import { initSettings } from "./engine/settings";
import { LlamaIndexStream, convertMessageContent } from "./llamaindex-stream";
import { createCallbackManager, createStreamTimeout } from "./stream-helper";

initObservability();
initSettings();

export const runtime = "nodejs";
export const dynamic = "force-dynamic";

export async function POST(request: NextRequest) {
  // Init Vercel AI StreamData and timeout
  const vercelStreamData = new StreamData();
  const streamTimeout = createStreamTimeout(vercelStreamData);

  try {
    const body = await request.json();
    const { messages }: { messages: Message[] } = body;
    const userMessage = messages.pop();
    if (!messages || !userMessage || userMessage.role !== "user") {
      return NextResponse.json(
        {
          error:
            "messages are required in the request body and the last message must be from the user",
        },
        { status: 400 },
      );
    }

    const chatEngine = await createChatEngine();

    let annotations = userMessage.annotations;
    if (!annotations) {
      // the user didn't send any new annotations with the last message
      // so use the annotations from the last user message that has annotations
      // REASON: GPT4 doesn't consider MessageContentDetail from previous messages, only strings
      annotations = messages
        .slice()
        .reverse()
        .find(
          (message) => message.role === "user" && message.annotations,
        )?.annotations;
    }

    // Convert message content from Vercel/AI format to LlamaIndex/OpenAI format
    const userMessageContent = convertMessageContent(
      userMessage.content,
      annotations,
    );

    // Setup callbacks
    const callbackManager = createCallbackManager(vercelStreamData);

    // Calling LlamaIndex's ChatEngine to get a streamed response
    const response = await Settings.withCallbackManager(callbackManager, () => {
      return chatEngine.chat({
        message: userMessageContent,
        chatHistory: messages as ChatMessage[],
        stream: true,
      });
    });

    // Transform LlamaIndex stream to Vercel/AI format
    const stream = LlamaIndexStream(response, vercelStreamData);

    // Return a StreamingTextResponse, which can be consumed by the Vercel/AI client
    return new StreamingTextResponse(stream, {}, vercelStreamData);
  } catch (error) {
    console.error("[LlamaIndex]", error);
    return NextResponse.json(
      {
        detail: (error as Error).message,
      },
      {
        status: 500,
      },
    );
  } finally {
    clearTimeout(streamTimeout);
  }
}
```
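One workaround that has helped in similar Next.js setups (an assumption, not an official fix for this template) is to keep `tiktoken` out of the server bundle so Node can resolve `tiktoken_bg.wasm` directly from `node_modules`, and to enable WebAssembly in webpack as a fallback:

```javascript
/** next.config.mjs — sketch, assuming the error comes from Next.js bundling
 *  the `tiktoken` package and losing its .wasm asset; verify the option
 *  names against the Next.js version you are on (14.x here). */
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // keep tiktoken (and its wasm file) out of the server bundle
    serverComponentsExternalPackages: ["tiktoken"],
  },
  webpack: (config) => {
    // allow async WebAssembly modules, in case they get bundled anyway
    config.experiments = { ...config.experiments, asyncWebAssembly: true };
    return config;
  },
};

export default nextConfig;
```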

Missing tools.json file for setting up code interpreter

I created and deployed a new app, but it wasn't accessing the code-execution tool until I manually added the configuration shown in the screenshot below in a new tools.json file. It would be nice to mention this step in the README.md, or to have the file created automatically.

Thank you!

(screenshot from 2024-05-25 omitted)

Not able to see citations in the UI

In the web chat application, we are not observing citations in the responses, which were present in the past but are missing in the most recent version.

Index creation process through backend

When filenames contain spaces, the index creation process through the Python backend fails with the following error:
llama_cloud.core.api_error.ApiError: status_code: 404, body: {'detail': 'Document not found'}
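A workaround sketch (hypothetical helper, not the project's code): normalize filenames before handing documents to the backend, since names containing spaces appear to break the LlamaCloud document lookup and trigger the 404 above.

```python
import re

def sanitize_filename(name: str) -> str:
    """Replace whitespace runs with underscores and drop other unsafe characters."""
    name = re.sub(r"\s+", "_", name.strip())
    return re.sub(r"[^A-Za-z0-9._-]", "", name)

print(sanitize_filename("annual report 2024.pdf"))  # annual_report_2024.pdf
```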

Unexpected error: EEXIST: file already exists when using community template sec-insights

Hey there, I am trying to set up a project using the sec-insights community template. Even though the folder was completely empty, whether I used `./` as the project name or created a new project altogether, the files were created but I got this error:

npx create-llama@latest
√ What is your project named? ... ./
√ Which template would you like to use? » Community template from https://github.com/run-llama/create_llama_projects
√ Select community template » sec-insights
Creating a new LlamaIndex app in C:\Users\test\Documents\Coding\llamaindex-bastion.


Installing community project: sec-insights
Adding .devcontainer

Aborting installation.
Unexpected error. Please report it as a bug:
 Error: EEXIST: file already exists, mkdir 'C:\Users\test\Documents\Coding\llamaindex-bastion\.devcontainer'
    at Object.mkdirSync (node:fs:1380:26)
    at writeDevcontainer (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:17421)
    at createApp (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:19819)
    at async run (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:62870) {
  errno: -4075,
  code: 'EEXIST',
  syscall: 'mkdir',
  path: 'C:\\Users\\test\\Documents\\Coding\\llamaindex-bastion\\.devcontainer'
}
npx create-llama@latest
√ What is your project named? ... llamaindex-bastion
√ Which template would you like to use? » Community template from https://github.com/run-llama/create_llama_projects
√ Select community template » sec-insights
Creating a new LlamaIndex app in C:\Users\test\Documents\Coding\llamaindex-bastion.


Installing community project: sec-insights
Adding .devcontainer

Aborting installation.
Unexpected error. Please report it as a bug:
 Error: EEXIST: file already exists, mkdir 'C:\Users\test\Documents\Coding\llamaindex-bastion\.devcontainer'
    at Object.mkdirSync (node:fs:1380:26)
    at writeDevcontainer (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:17421)
    at createApp (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:19819)
    at async run (C:\Users\test\AppData\Local\npm-cache\_npx\7bfc2205dda2d438\node_modules\create-llama\dist\index.js:298:62870) {
  errno: -4075,
  code: 'EEXIST',
  syscall: 'mkdir',
  path: 'C:\\Users\\test\\Documents\\Coding\\llamaindex-bastion\\.devcontainer'
}

3 high severity vulnerabilities

npm i llamaindex@latest
npm WARN deprecated @aws-sdk/[email protected]: This package has moved to @smithy/protocol-http
npm WARN deprecated @aws-sdk/[email protected]: This package has moved to @smithy/signature-v4

added 126 packages, changed 7 packages, and audited 1238 packages in 53s

3 high severity vulnerabilities

Some issues need review, and may require choosing
a different dependency.

Run npm audit for details.

LlamaPack not working with 0.1.0

npx create-llama@latest
✔ What is your project named? … my-app
✔ Which template would you like to use? › Example using a LlamaPack
✔ Select LlamaPack › rag-fusion-query-pipeline

Aborting installation.
Unexpected error. Please report it as a bug:
 TypeError: Cannot read properties of undefined (reading 'provider')
    at isModelConfigured (/Users/leegang/.npm/_npx/7bfc2205dda2d438/node_modules/create-llama/dist/index.js:298:55245)
    at askPostInstallAction (/Users/leegang/.npm/_npx/7bfc2205dda2d438/node_modules/create-llama/dist/index.js:298:16399)
    at askQuestions (/Users/leegang/.npm/_npx/7bfc2205dda2d438/node_modules/create-llama/dist/index.js:298:17907)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async run (/Users/leegang/.npm/_npx/7bfc2205dda2d438/node_modules/create-llama/dist/index.js:298:72553)
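A hedged sketch of the failing check (the function body below is a guess at what `isModelConfigured` does, based only on the stack trace): in the LlamaPack flow no model questions are asked, so the model config object is `undefined` and reading `.provider` from it throws. Optional chaining turns the crash into a plain `false`.

```javascript
// Hypothetical reconstruction; field names (`provider`, `apiKey`) are assumptions.
function isModelConfigured(config) {
  // `config` may be undefined when the LlamaPack template skips model setup
  return config?.provider !== undefined && config?.apiKey !== undefined;
}

console.log(isModelConfigured(undefined)); // false instead of a TypeError
console.log(isModelConfigured({ provider: "openai", apiKey: "sk-..." })); // true
```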

MultiModal does not work with Next.js Frontend and FastAPI backend

Hi,

I wanted to implement custom evaluation logic. Realizing that only the Python implementation of LlamaIndex supports QuestionGenerator, I thought it would be more reasonable to use the FastAPI backend + Next.js frontend setup.

I managed to pass the image data to the backend by extending the handleSubmit of useChat (see vercel/ai#725). However, I don't know how to duplicate the functionality of StreamData in the FastAPI backend.

Can you make this example work out of the box, or provide some further documentation on how to implement it? Currently, multi-modality does not work without multiple changes.

Thanks for taking the time to read my request.
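A sketch of one way to emulate StreamData from FastAPI. The framing below is my understanding of the Vercel AI SDK's stream protocol at the time (text parts prefixed `0:`, JSON data parts prefixed `2:`); it is an assumption, so verify it against the version of the `ai` package you are using before relying on it.

```python
import json

def text_part(chunk: str) -> str:
    """Frame a text delta the way the JS StreamData stream does (assumed)."""
    return f"0:{json.dumps(chunk)}\n"

def data_part(payload) -> str:
    """Frame a JSON data payload; StreamData sends an array of items (assumed)."""
    return f"2:{json.dumps([payload])}\n"

# A FastAPI route could yield these strings from a StreamingResponse generator.
print(text_part("Hello"))
print(data_part({"type": "sources", "nodes": []}))
```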

Question suggestions do not use N_QUESTION_TO_GENERATE variable

When using the Python backend, the generated project contains a bug in suggestion.py: the N_QUESTION_TO_GENERATE variable is not used at all when building NEXT_QUESTIONS_SUGGESTION_PROMPT. Even though it defaults to 3, five questions are generated, and in the logs we can see that the text sent to the LLM provider contains "...me $number_of_questions questions..." instead of the variable's value.

How to reproduce:
Create a project with the Python backend and Next.js frontend and observe that there are 5 question suggestions.
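A minimal sketch of the suspected fix (names mirror the issue but the template shape is hypothetical): the prompt must be rendered with the variable substituted, otherwise the literal placeholder reaches the LLM, which then picks its own number of questions.

```python
from string import Template

N_QUESTION_TO_GENERATE = 3

# Hypothetical prompt template using the $number_of_questions placeholder
# quoted in the logs above.
NEXT_QUESTIONS_SUGGESTION_TPL = Template(
    "Give me $number_of_questions questions about the document."
)

prompt = NEXT_QUESTIONS_SUGGESTION_TPL.substitute(
    number_of_questions=N_QUESTION_TO_GENERATE
)
print(prompt)  # Give me 3 questions about the document.
```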

poetry run generate not working

Hi Team

I generated the app with the FastAPI option.
It contains a backend folder, and inside that another folder, data, with a sample PDF file.

In my terminal:
poetry install
poetry shell

The commands above work.
However, `poetry run generate` is not working.

It responds: 'generate' is not recognized as an internal or external command, operable program or batch file.

Can you please help me figure out what's wrong?

I am able to run `python main.py`, and on localhost I can see Swagger and call the chat API, but it also says to run generate first.

Kindly help!

I come from front-end development but am developing UI for an AI team, so this domain is new to me.

Thanks
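`poetry run <name>` only works if `<name>` is declared as a script in pyproject.toml; if that entry is missing (or the environment was created before it was added), Poetry falls back to treating `generate` as a shell command, which produces exactly this Windows error. A sketch of the expected entry, where the module path and function name are assumptions about the generated layout, not verified against the template:

```toml
# pyproject.toml — hypothetical script entry
[tool.poetry.scripts]
generate = "app.engine.generate:generate_datasource"
```

As a fallback, running the module directly (e.g. `poetry run python app/engine/generate.py`, adjusting the path to wherever the generate script lives in your project) sidesteps the script registration entirely.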

Allow custom system prompts

I just created a new application with version 0.1.25, and it looks like the system prompt setting from the .env file is ignored:

// app/api/chat/engine/chat.ts

  return new ContextChatEngine({
    chatModel: Settings.llm,
    retriever,
    // disable as a custom system prompt disables the generated context
    // systemPrompt: process.env.SYSTEM_PROMPT,
  });

I don't really understand the comment: is there a bug in llamaindex, or is some code missing here? Is there an easy way to get this working?


✔ What is your project named? … my-app
✔ Which template would you like to use? › Agentic RAG (single agent)
✔ Which framework would you like to use? › NextJS
✔ Would you like to set up observability? › No
✔ Please provide your OpenAI API key (leave blank to skip): … 
✔ Which data source would you like to use? › Use an example PDF (you can add your own data files later)
✔ Would you like to add another data source? › No
✔ Would you like to use LlamaParse (improved parser for RAG - requires API key)? … no / yes › No
✔ Would you like to use a vector database? › No, just store the data in the file system
✔ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter › 
✔ How would you like to proceed? › Generate code and install dependencies (~2 min)

Whenever I try to use create-llama through the CLI, it throws the error: The data stream is hanging. Did you forget to close it with data.close()?

Here's a full copy of what I did on the CLI and the respective output:

Microsoft Windows [Version 10.0.22631.3593]
(c) Microsoft Corporation. All rights reserved.

C:\Users\lamak>npx create-llama@latest --ask-models
√ What is your project named? ... my-app
√ Which template would you like to use? » Chat
√ Which framework would you like to use? » NextJS
√ Would you like to set up observability? » No
√ Which model provider would you like to use » Gemini
√ Please provide your Google API key (or leave blank to use GOOGLE_API_KEY env variable): .
√ Which LLM model would you like to use? » gemini-1.5-pro-latest
√ Which embedding model would you like to use? » text-embedding-004
√ Which data source would you like to use? » Use an example PDF
√ Would you like to add another data source? » No
√ Would you like to use LlamaParse (improved parser for RAG - requires API key)? ... no / yes
√ Would you like to use a vector database? » No, just store the data in the file system
√ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter »
√ How would you like to proceed? » Generate code, install dependencies, and run the app (~2 min)
Creating a new LlamaIndex app in C:\Users\lamak\my-app.

Using npm.

Initializing project with template: streaming

Using vector DB: none

No tools selected - use optimized context chat engine

Installing dependencies:

@radix-ui/react-collapsible
@radix-ui/react-hover-card
@radix-ui/react-slot
ai
ajv
class-variance-authority
clsx
dotenv
llamaindex
lucide-react
next
pdf2json
react
react-dom
react-markdown
react-syntax-highlighter
remark
remark-code-import
remark-gfm
remark-math
rehype-katex
supports-color
tailwind-merge
vaul
@llamaindex/pdf-viewer
@e2b/code-interpreter
uuid
Installing devDependencies:

@types/node
@types/react
@types/react-dom
@types/react-syntax-highlighter
autoprefixer
cross-env
eslint
eslint-config-next
eslint-config-prettier
postcss
prettier
prettier-plugin-organize-imports
tailwindcss
tsx
typescript
@types/uuid
npm warn deprecated [email protected]: This module is not supported, and leaks memory. Do not use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
npm warn deprecated @humanwhocodes/[email protected]: Use @eslint/config-array instead
npm warn deprecated [email protected]: Rimraf versions prior to v4 are no longer supported
npm warn deprecated [email protected]: Glob versions prior to v9 are no longer supported
npm warn deprecated @humanwhocodes/[email protected]: Use @eslint/object-schema instead
npm warn deprecated [email protected]: Use mz or fs-extra^3.0 with Promise Support
npm warn deprecated [email protected]: dommatrix is no longer maintained. Please use @thednp/dommatrix.

added 986 packages, and audited 989 packages in 2m

291 packages are looking for funding
run npm fund for details

3 high severity vulnerabilities

Some issues need review, and may require choosing
a different dependency.

Run npm audit for details.
Created '.env' file. Please check the settings.

Generating context data...

Copying data from path: C:\Users\lamak\AppData\Local\npm-cache_npx\7bfc2205dda2d438\node_modules\create-llama\dist\templates\components\data\101.pdf
Running npm run generate to generate the context data.

[email protected] generate
tsx app\api\chat\engine\generate.ts

Using 'gemini' model provider
Generating storage context...
No valid data found at path: cache\doc_store.json starting new store.
No valid data found at path: cache\index_store.json starting new store.
No valid data found at path: cache\vector_store.json starting new store.
Storage context successfully generated in 10.826s.
Finished generating storage.
Initialized a git repository.

Success! Created my-app at C:\Users\lamak\my-app
Now have a look at the README.md (​file://C:\Users\lamak\my-app/README.md​) and learn how to get started.

Running app in C:\Users\lamak\my-app...

[email protected] dev
next dev

▲ Next.js 14.2.4

Local: http://localhost:3000/
Environments: .env
✓ Starting...
✓ Ready in 2.6s
○ Compiling / ...
✓ Compiled / in 10s (2917 modules)
GET / 200 in 11380ms
○ Compiling /favicon.ico ...
✓ Compiled /api/chat/config in 5.7s (2812 modules)
✓ Compiled in 0ms (1489 modules)
✓ Compiled in 0ms (1489 modules)
✓ Compiled /api/chat in 1ms (1489 modules)
GET /favicon.ico 200 in 4956ms
✓ Compiled (2363 modules)
GET /api/chat/config 200 in 6989ms
Using 'gemini' model provider
GET /api/chat/config 200 in 932ms
GET /api/chat/config 200 in 11ms
GET /api/chat/config 200 in 10ms
The data stream is hanging. Did you forget to close it with data.close()?
POST /api/chat 200 in 7783ms
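The warning comes from the `ai` package: it fires when a response finishes while the attached StreamData was never closed. The safe shape is to close the data stream in a `finally` block so it happens on both success and error paths. A sketch with a stand-in class (the real `StreamData` lives in the `ai` package; whether the generated Gemini route actually misses this close is an assumption):

```javascript
// Stand-in for ai's StreamData, just to demonstrate the lifecycle.
class FakeStreamData {
  constructor() {
    this.closed = false;
  }
  close() {
    this.closed = true;
  }
}

async function handleChat(data, doChat) {
  try {
    return await doChat();
  } finally {
    // runs on success *and* on error, so the stream can never hang
    data.close();
  }
}

const demo = new FakeStreamData();
handleChat(demo, async () => "ok").then(() => {
  console.log(demo.closed); // true
});
```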

Swagger support: works in FastAPI but not in Express

Hi

I tried to create different backends.
WORKS: with the FastAPI backend, the APIs are exposed via Swagger at http://0.0.0.0:8000/docs

NEED HELP: however, when I run the Express/Node.js backend on localhost, it does not show Swagger.
I tried this URL: http://localhost:8000/api/<name-in-backend-package.json>/api/swagger/

Also, I cannot find a swagger.json in the Express backend project, which is supposed to contain the APIs exposed by this backend.

Kindly help to resolve this.
3RD BACKEND: I am not sure how Swagger works in Next.js (backend/fullstack). Will it be supported in the Next.js backend too?

Azure OpenAI model provider issue

When using the Azure Open AI Model Provider option, I encountered a 404-Resource Not Found error. It seems additional configurations, such as 'azure_endpoint' and 'api_version', might need to be added to settings.py.
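A hedged sketch of what the missing settings might look like. The environment variable names and the default API version below follow common Azure OpenAI conventions but are assumptions, not the template's actual settings.py; verify them against your installed openai/llama-index versions.

```python
import os

def azure_openai_kwargs() -> dict:
    """Collect the Azure-specific settings a 404 usually means are missing."""
    return {
        "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        # api version is an assumption; pick the one your deployment supports
        "api_version": os.environ.get("AZURE_OPENAI_API_VERSION", "2024-02-01"),
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
    }
```

These kwargs would then be passed to the Azure OpenAI client/LLM constructor in settings.py.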

Listing Ollama models failed. Is 'ollama' running? TypeError: fetch failed

Ollama models throw TypeError in CLI.

(polenv) PS C:\UniCult\full-stack build check> npx create-llama@latest --ask-models
√ What is your project named? ... my-app
√ Which template would you like to use? » Chat
√ Which framework would you like to use? » NextJS
√ Would you like to set up observability? » No
√ Which model provider would you like to use » Ollama
√ Which LLM model would you like to use? » llama3:8b
Listing Ollama models failed. Is 'ollama' running? TypeError: fetch failed
(polenv) PS C:\UniCult\full-stack build check>

Cannot upload PDF files: Provide Uint8Array rather than Buffer

After creating a new application with version 0.1.25 I cannot upload PDF files with the web interface.

The POST to http://localhost:3000/api/chat/upload responds with 500:

{"error":"Please provide binary data as `Uint8Array`, rather than `Buffer`."}

Related log:

Processing uploaded document of type: application/pdf
[Upload API] Error: Please provide binary data as `Uint8Array`, rather than `Buffer`.
    at /app/.next/server/chunks/602.js:9:144687
    at tH (/app/.next/server/chunks/602.js:9:145105)
    at i (/app/.next/server/chunks/804.js:1:153)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async l (/app/.next/server/chunks/804.js:1:780)
    at async u (/app/.next/server/chunks/804.js:1:1174)
    at async T (/app/.next/server/app/api/chat/upload/route.js:4:322)
    at async w.loadDataAsContent (/app/.next/server/app/api/chat/upload/route.js:4:146)
    at async S (/app/.next/server/app/api/chat/upload/route.js:4:976)
    at async R (/app/.next/server/app/api/chat/upload/route.js:5:793)
    at async L (/app/.next/server/app/api/chat/upload/route.js:5:1071)
    at async /app/node_modules/next/dist/compiled/next-server/app-route.runtime.prod.js:6:36258
    at async eR.execute (/app/node_modules/next/dist/compiled/next-server/app-route.runtime.prod.js:6:26874)
    at async eR.handle (/app/node_modules/next/dist/compiled/next-server/app-route.runtime.prod.js:6:37512)
    at async doRender (/app/node_modules/next/dist/server/base-server.js:1377:42)
    at async cacheEntry.responseCache.get.routeKind (/app/node_modules/next/dist/server/base-server.js:1599:28)

This fails when the application runs locally on my Ubuntu 20.04 machine as well as inside a docker container with image nextjs-docker (started as described in your generated README.md). It happens in Chrome and Firefox.

I use azure-openai as model provider but I don't think that this is related. Indexing PDFs with npm run generate works just fine.


✔ What is your project named? … my-app
✔ Which template would you like to use? › Agentic RAG (single agent)
✔ Which framework would you like to use? › NextJS
✔ Would you like to set up observability? › No
✔ Please provide your OpenAI API key (leave blank to skip): … 
✔ Which data source would you like to use? › Use an example PDF (you can add your own data files later)
✔ Would you like to add another data source? › No
✔ Would you like to use LlamaParse (improved parser for RAG - requires API key)? … no / yes › No
✔ Would you like to use a vector database? › No, just store the data in the file system
✔ Would you like to build an agent using tools? If so, select the tools here, otherwise just press enter › 
✔ How would you like to proceed? › Generate code and install dependencies (~2 min)
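A sketch of the conversion the error message asks for (hypothetical helper, not the template's code): a Node `Buffer` is technically a `Uint8Array` subclass, but some wasm-backed and web-standard APIs reject it outright, and re-wrapping the same memory as a plain `Uint8Array` satisfies the check without copying.

```javascript
function toUint8Array(data) {
  if (Buffer.isBuffer(data)) {
    // zero-copy view over the Buffer's underlying ArrayBuffer
    return new Uint8Array(data.buffer, data.byteOffset, data.byteLength);
  }
  return data;
}

const buf = Buffer.from("pdf bytes");
const arr = toUint8Array(buf);
console.log(arr instanceof Uint8Array && !Buffer.isBuffer(arr)); // true
```

Applying such a conversion to the uploaded file bytes before they reach the PDF reader is one plausible fix; where exactly the Buffer enters the upload route would need to be confirmed in the generated code.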

Incorrect References in settings.ts not caught by e2e

I generated the app with the combination of Next.js + Express + MongoDB.

The code below is found in ".\backend\src\controllers\engine\generate.ts" and ".\backend\src\controllers\engine\index.ts":

import { MongoDBAtlasVectorSearch } from "llamaindex/storage/vectorStore/MongoDBAtlasVectorSearch";

However, I had to resolve the reference error with the code below:

import { MongoDBAtlasVectorSearch } from "llamaindex";

Please check whether this update is needed.

Supabase as Vector DB FATAL: Max client connections reached

The project was generated with create-llama. It works most of the time, but after a while it throws this error:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in call
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/applications.py", line 123, in call
await self.middleware_stack(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in call
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in call
await self.app(scope, receive, _send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in call
await self.app(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 758, in call
await self.middleware_stack(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
await route.handle(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
await self.app(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
raise e
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/aSearch/backend-asearch/app/api/routers/chat.py", line 157, in chat_request
response = await chat_engine.achat(last_message_content, messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 56, in async_wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 326, in achat
chat_messages, context_source, context_nodes = await self._arun_c3(
^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 249, in _arun_c3
context_str, context_nodes = await self._aretrieve_context(condensed_question)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 181, in _aretrieve_context
nodes = await self._retriever.aretrieve(message)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/base/base_retriever.py", line 249, in aretrieve
nodes = await self._aretrieve(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 105, in _aretrieve
return await self._aget_nodes_with_embeddings(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 177, in _aget_nodes_with_embeddings
query_result = await self._vector_store.aquery(query, **self._kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 638, in aquery
self._initialize()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 295, in _initialize
self._create_extension()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 288, in _create_extension
session.execute(statement)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2306, in execute
return self._execute_internal(
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2181, in _execute_internal
conn = self._connection_for_bind(bind)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2050, in _connection_for_bind
return trans._connection_for_bind(engine, execution_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "", line 2, in _connection_for_bind
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/state_changes.py", line 139, in _go
ret_value = fn(self, *arg, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1144, in _connection_for_bind
conn = bind.connect()
^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 3280, in connect
return self._connection_cls(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 148, in init
Connection._handle_dbapi_exception_noconnection(
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2444, in _handle_dbapi_exception_noconnection
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 146, in init
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 3304, in raw_connection
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 449, in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 1263, in _checkout
fairy = _ConnectionRecord.checkout(pool)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 712, in checkout
rec = pool._do_get()
^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/impl.py", line 179, in _do_get
with util.safe_reraise():
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/util/langhelpers.py", line 146, in exit
raise exc_value.with_traceback(exc_tb)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/impl.py", line 177, in _do_get
return self._create_connection()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 390, in _create_connection
return _ConnectionRecord(self)
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 674, in init
self.__connect()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 900, in __connect
with util.safe_reraise():
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/util/langhelpers.py", line 146, in exit
raise exc_value.with_traceback(exc_tb)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 896, in __connect
self.dbapi_connection = connection = pool._invoke_creator(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/create.py", line 643, in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 617, in connect
return self.loaded_dbapi.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/psycopg2/init.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in call
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in call
await super().call(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/applications.py", line 123, in call
await self.middleware_stack(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in call
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in call
await self.app(scope, receive, _send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in call
await self.app(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in call
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 758, in call
await self.middleware_stack(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
await route.handle(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
await self.app(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
raise exc
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
response = await func(request)
^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
raise e
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
raw_response = await run_endpoint_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
return await dependant.call(**values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/aSearch/backend-asearch/app/api/routers/chat.py", line 157, in chat_request
response = await chat_engine.achat(last_message_content, messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/callbacks/utils.py", line 56, in async_wrapper
return await func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 326, in achat
chat_messages, context_source, context_nodes = await self._arun_c3(
^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 249, in _arun_c3
context_str, context_nodes = await self._aretrieve_context(condensed_question)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/chat_engine/condense_plus_context.py", line 181, in _aretrieve_context
nodes = await self._retriever.aretrieve(message)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/base/base_retriever.py", line 249, in aretrieve
nodes = await self._aretrieve(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 105, in _aretrieve
return await self._aget_nodes_with_embeddings(query_bundle)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 177, in _aget_nodes_with_embeddings
query_result = await self._vector_store.aquery(query, **self._kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 638, in aquery
self._initialize()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 295, in _initialize
self._create_extension()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/llama_index/vector_stores/postgres/base.py", line 288, in _create_extension
session.execute(statement)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2306, in execute
return self._execute_internal(
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2181, in _execute_internal
conn = self._connection_for_bind(bind)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 2050, in _connection_for_bind
return trans._connection_for_bind(engine, execution_options)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "", line 2, in _connection_for_bind
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/state_changes.py", line 139, in _go
ret_value = fn(self, *arg, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/orm/session.py", line 1144, in _connection_for_bind
conn = bind.connect()
^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 3280, in connect
return self._connection_cls(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 148, in init
Connection._handle_dbapi_exception_noconnection(
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2444, in _handle_dbapi_exception_noconnection
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 146, in init
self._dbapi_connection = engine.raw_connection()
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 3304, in raw_connection
return self.pool.connect()
^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 449, in connect
return _ConnectionFairy._checkout(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 1263, in _checkout
fairy = _ConnectionRecord.checkout(pool)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 712, in checkout
rec = pool._do_get()
^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/impl.py", line 179, in _do_get
with util.safe_reraise():
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/util/langhelpers.py", line 146, in exit
raise exc_value.with_traceback(exc_tb)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/impl.py", line 177, in _do_get
return self._create_connection()
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 390, in _create_connection
return _ConnectionRecord(self)
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 674, in init
self.__connect()
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 900, in __connect
with util.safe_reraise():
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/util/langhelpers.py", line 146, in exit
raise exc_value.with_traceback(exc_tb)
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/pool/base.py", line 896, in __connect
self.dbapi_connection = connection = pool._invoke_creator(self)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/create.py", line 643, in connect
return dialect.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 617, in connect
return self.loaded_dbapi.connect(*cargs, **cparams)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/Me/Library/Caches/pypoetry/virtualenvs/app-BCQ_KSqF-py3.11/lib/python3.11/site-packages/psycopg2/init.py", line 122, in connect
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) connection to server at "aws-0-us-westxx.pooler.supabase.com" (), port 5432 failed: FATAL: Max client connections reached
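For anyone hitting the same "Max client connections reached" error: Supabase's session pooler enforces a hard cap on concurrent clients, and every SQLAlchemy engine the app creates holds its own pool of open connections by default, so a few workers can exhaust the limit. One workaround is to disable SQLAlchemy's client-side pooling with NullPool so connections are closed as soon as they are released. This is only a sketch, not the project's official fix, and the sqlite URL below is a stand-in for your actual Postgres DSN:

```python
# Sketch: avoid holding idle pooled connections against a capped pooler.
from sqlalchemy import create_engine
from sqlalchemy.pool import NullPool

# NullPool opens a fresh connection per checkout and closes it on release,
# so the app never holds more connections than it is actively using.
engine = create_engine(
    "sqlite://",  # stand-in for the demo; use your Postgres DSN in practice
    poolclass=NullPool,
)

with engine.connect() as conn:
    pass  # the connection is closed (not returned to a pool) on exit
```

Alternatively, keeping the default pool but lowering its `pool_size` and `max_overflow` engine arguments may also keep the total connection count under the pooler's limit.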

Installation error in Ubuntu 22.04.4 LTS

(screenshot)

Hello there. After running the first command in Ubuntu:

npx create-llama@latest

I ran into the error shown in the screenshot above. I'm adding it here so that anyone who has the same problem, or comes up with a solution, will be notified. Thanks.

Line break on chat item

I found it useful to have line breaks when you ask for a list of something. The way I got it working was to add the following style in global.css:

p.break-words {
  white-space: pre-wrap;
}

That presents the response formatted as a list (see screenshot).

References are not shown properly in UI

I ran the package from the main branch and created a bot with PDF documents. There are three nodes shown as references for the answers, but they are empty and do not show the part of the PDF used as the source/reference (screenshot attached).
Screenshot 2024-04-23 at 1 48 55 PM

missing `generate` script

The readme has the following:

Before you can use your data, you need to index it. If you're using the Next.js or Express apps, run:

npm run generate
Then restart your app. Remember you'll need to re-run `generate` if you add new files to your data folder.

If you're using the Python backend, you can trigger indexing of your data by calling:

poetry run generate

This script seems to be missing from the repo.
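Until the script lands in the repo, the generate step is conceptually just "load the data folder, embed it, persist the index". A hypothetical stand-in using llama-index's core API — the function name, directories, and wiring are assumptions, not the project's actual implementation:

```python
# Hypothetical stand-in for the missing `generate` script.
# Assumes llama-index is installed and an embedding provider is configured
# (e.g. OPENAI_API_KEY set in the environment).

def generate(data_dir: str = "data", persist_dir: str = "storage") -> None:
    """Load documents from `data_dir`, embed them, and persist the index."""
    # Imports are inside the function so the sketch can be pasted anywhere
    # without requiring llama-index at module import time.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    documents = SimpleDirectoryReader(data_dir).load_data()
    index = VectorStoreIndex.from_documents(documents)  # calls the embedding API
    index.storage_context.persist(persist_dir=persist_dir)
```

If something like this were exposed via a `[tool.poetry.scripts]` entry named `generate` in pyproject.toml, `poetry run generate` would work as the README describes.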
