mekb-turtle / discord-ai-bot
Discord AI chatbot using Ollama
Home Page: https://github.com/jmorganca/ollama
Can we please have a Docker image for this? I have no idea how to build one and I can't seem to find the appropriate resources. Docker would be helpful since the bot could autostart on reboot and you could use your desktop without keeping a terminal window open.
Sorry for the question; I know it's probably a simple solution, but I really have no clue where to start with the Discord API.
But I would like to run it in a Discord channel freely, so it can reply to every message sent there.
Is this possible?
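For reference, even a bot that replies freely usually still filters which messages it answers. A minimal sketch of that filtering logic, with plain objects standing in for discord.js messages (the `REPLY_CHANNELS` allow-list and the field names are hypothetical, not this repo's actual API):

```javascript
// Channels where the bot may reply to every message (hypothetical list).
const REPLY_CHANNELS = new Set(["study-room3"]);

// Decide whether a message deserves a reply.
function shouldReply(message) {
	// Never answer other bots (including ourselves), or we risk reply loops.
	if (message.authorIsBot) return false;
	// Only reply freely in channels that are explicitly allow-listed.
	return REPLY_CHANNELS.has(message.channelName);
}
```

In a real discord.js handler the same checks would run at the top of the `messageCreate` listener before the prompt is forwarded to Ollama.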
It seems that Ollama's responses will hit Discord's 2,000-character message limit fairly regularly. It could be worth looking into chunking responses across multiple messages.
[Shard #0] [DEBUG] #study-room3 - boneitis-dev: tell me a dirty joke.
[Shard #0] [DEBUG] Response: Why did the chicken cross the road? To prove to the world that chickens can be just as smart as pigs.
[Shard #0] [DEBUG] #study-room3 - boneitis-dev: tell me another, but much more NSFW.
[Shard #0] [DEBUG] Response: Once upon a time there was a little boy who <..snip..original message was about 2800 characters..>
[Shard #0] [ERROR] DiscordAPIError[50035]: Invalid Form Body
content[BASE_TYPE_MAX_LENGTH]: Must be 2000 or fewer in length.
at handleErrors (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:687:13)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async SequentialHandler.runRequest (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:1072:23)
at async SequentialHandler.queueRequest (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:913:14)
at async _REST.request (/home/llamabeast/discord-ai-bot/node_modules/@discordjs/rest/dist/index.js:1218:22)
at async TextChannel.send (/home/llamabeast/discord-ai-bot/node_modules/discord.js/src/structures/interfaces/TextBasedChannel.js:162:15)
at async handleMessage (file:///home/llamabeast/discord-ai-bot/src/bot.js:300:24)
at async Timeout._onTimeout (file:///home/llamabeast/discord-ai-bot/src/bot.js:107:16)
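The failure above can be avoided by splitting the response before sending. A minimal sketch, doing a hard cut every 2,000 characters (a nicer version would prefer newline or sentence boundaries so code blocks and paragraphs stay intact):

```javascript
// Discord's documented per-message content limit.
const MAX_LEN = 2000;

// Split a long model response into Discord-sized chunks.
function chunkResponse(text, maxLen = MAX_LEN) {
	const chunks = [];
	for (let i = 0; i < text.length; i += maxLen) {
		chunks.push(text.slice(i, i + maxLen));
	}
	// Always return at least one (possibly empty) chunk.
	return chunks.length ? chunks : [""];
}
```

Each chunk can then be sent in order, e.g. `for (const chunk of chunkResponse(reply)) await channel.send(chunk);` (the `reply` and `channel` names here are placeholders). The ~2,800-character response in the log above would become two messages of 2,000 and 800 characters.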
If messages are still pending, the reset/clear command will not clear them, but it should clear those as well.
start
node src/index.js
[Shard Manager] [INFO] Loading
/root/discord-ai-bot/node_modules/discord.js/src/util/Util.js:69
if (!token) throw new DiscordjsError(ErrorCodes.TokenMissing);
^
Error [TokenMissing]: Request to use token, but token was unavailable to the client.
at fetchRecommendedShardCount (/root/discord-ai-bot/node_modules/discord.js/src/util/Util.js:69:21)
at ShardingManager.spawn (/root/discord-ai-bot/node_modules/discord.js/src/sharding/ShardingManager.js:197:22)
at file:///root/discord-ai-bot/src/index.js:29:9
at ModuleJob.run (node:internal/modules/esm/module_job:195:25)
at async ModuleLoader.import (node:internal/modules/esm/loader:336:24)
at async loadESM (node:internal/process/esm_loader:34:7)
at async handleMainPromise (node:internal/modules/run_main:106:12) {
code: 'TokenMissing'
}
Node.js v18.19.0
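`TokenMissing` means discord.js was started without a bot token. A fail-fast sketch, assuming the token is read from an environment variable named `TOKEN` (that name is an assumption; check this repo's .env.example for the one it actually expects):

```javascript
// Refuse to spawn shards when no bot token is configured.
function requireToken(env) {
	const token = env.TOKEN;
	if (!token || token.trim() === "") {
		throw new Error(
			"Discord bot token is missing; set it in your .env before `npm start`"
		);
	}
	return token;
}

// Usage (commented out so the sketch stays side-effect free):
// const token = requireToken(process.env);
```

Failing with a plain-language message before `ShardingManager.spawn` runs is friendlier than the stack trace above.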
Hi, sorry for the naive question. I got this working and it's really cool. I'm trying to figure out what the benefit is of using a local LLM (Ollama) for a (non-local) Discord channel. Can you shed some light on this?
Followed the setup instructions here.
Ran npm i, then npm start.
Got this error:
[Shard Manager] [INFO] Loading
[Shard #0] [INFO] Created shard
/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1132
error: new Error("Used disallowed intents")
^
Error: Used disallowed intents
at WebSocketShard.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1132:18)
at connection.onclose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:676:17)
at callListener (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/event-target.js:290:14)
at WebSocket.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/event-target.js:220:9)
at WebSocket.emit (node:events:514:28)
at WebSocket.emitClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/websocket.js:260:10)
at TLSSocket.socketOnClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/ws/lib/websocket.js:1272:15)
at TLSSocket.emit (node:events:526:35)
at node:net:337:12
at TCP.done (node:_tls_wrap:657:7)
Node.js v20.9.0
/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:178
reject(new DiscordjsError(ErrorCodes.ShardingReadyDied, this.id));
^
Error [ShardingReadyDied]: Shard 0's process exited before its Client became ready.
at Shard.onDeath (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:178:16)
at Object.onceWrapper (node:events:629:26)
at Shard.emit (node:events:514:28)
at Shard._handleExit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/Shard.js:439:10)
at ChildProcess.emit (node:events:514:28)
at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
code: 'ShardingReadyDied'
}
Node.js v20.9.0
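`Used disallowed intents` almost always means the bot requests a privileged gateway intent (Message Content and/or Server Members) that has not been toggled on under Bot → Privileged Gateway Intents in the Discord Developer Portal. The intents themselves are bit flags; a sketch using the values from Discord's gateway documentation:

```javascript
// Discord gateway intents are bit flags (values from Discord's gateway docs).
// MESSAGE_CONTENT and GUILD_MEMBERS are "privileged": requesting them while
// they are disabled in the Developer Portal closes the gateway connection
// with exactly this "Used disallowed intents" error.
const Intents = {
	GUILDS: 1 << 0,
	GUILD_MEMBERS: 1 << 1, // privileged
	GUILD_MESSAGES: 1 << 9,
	MESSAGE_CONTENT: 1 << 15, // privileged
};

// The bitmask a message-reading bot would send in its IDENTIFY payload:
const intentsBitfield =
	Intents.GUILDS | Intents.GUILD_MESSAGES | Intents.MESSAGE_CONTENT;
```

When the gateway rejects the IDENTIFY, the shard process dies before the client becomes ready, which is why the `ShardingReadyDied` error above follows immediately. Enabling the privileged intents in the portal (or dropping them from the client options) resolves both.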
ollama/discord bot/discord-ai-bot via v20.9.0 on ☁️ [user]@gmail.com
❯ node:events:492
throw er; // Unhandled 'error' event
^
Error [ERR_IPC_CHANNEL_CLOSED]: Channel closed
at new NodeError (node:internal/errors:406:5)
at target.send (node:internal/child_process:754:16)
at Client.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/sharding/ShardClientUtil.js:43:19)
at Client.emit (node:events:514:28)
at WebSocketManager.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/discord.js/src/client/websocket/WebSocketManager.js:271:19)
at WebSocketManager.emit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@vladfrangu/async_event_emitter/dist/index.cjs:282:31)
at WebSocketShard.<anonymous> (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1173:51)
at WebSocketShard.emit (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@vladfrangu/async_event_emitter/dist/index.cjs:290:37)
at WebSocketShard.onClose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:1056:10)
at connection.onclose (/Users/[user]/Documents/Mozilla/AI/localai/ollama/discord bot/discord-ai-bot/node_modules/@discordjs/ws/dist/index.js:676:17)
Emitted 'error' event on process instance at:
at node:internal/child_process:758:35
at process.processTicksAndRejections (node:internal/process/task_queues:77:11) {
code: 'ERR_IPC_CHANNEL_CLOSED'
}
npm start
start
node src/index.js
/home/user/ollama_discord_bot/node_modules/discord.js/src/client/BaseClient.js:29
userAgentAppendix: options.rest?.userAgentAppendix
^
SyntaxError: Unexpected token '.'
at wrapSafe (internal/modules/cjs/loader.js:915:16)
at Module._compile (internal/modules/cjs/loader.js:963:27)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
at Module.load (internal/modules/cjs/loader.js:863:32)
at Function.Module._load (internal/modules/cjs/loader.js:708:14)
at Module.require (internal/modules/cjs/loader.js:887:19)
at require (internal/modules/cjs/helpers.js:74:18)
at Object.<anonymous> (/home/lee/ollama_discord_bot/node_modules/discord.js/src/index.js:6:22)
at Module._compile (internal/modules/cjs/loader.js:999:30)
at Object.Module._extensions..js (internal/modules/cjs/loader.js:1027:10)
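`SyntaxError: Unexpected token '.'` on `options.rest?.userAgentAppendix` means the Node runtime is too old to parse optional chaining (added in Node 14), and discord.js v14 states Node 16.11.0 or newer as its minimum, so the fix is upgrading Node. A preflight sketch (the 16.11 floor is an assumption taken from discord.js v14's README; later discord.js releases require Node 18+, so check the `engines` field in its package.json):

```javascript
// Compare a Node version string like "v12.22.0" against a required floor.
function nodeAtLeast(version, major, minor = 0) {
	const [maj, min] = version.replace(/^v/, "").split(".").map(Number);
	return maj > major || (maj === major && min >= minor);
}

// Refuse to start on a Node version too old for discord.js v14.
if (!nodeAtLeast(process.version, 16, 11)) {
	console.error(`Node ${process.version} is too old; upgrade to >= 16.11`);
	// process.exit(1);
}
```

An explicit version check surfaces the real problem instead of the cryptic parse error deep inside `node_modules`.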