chrisrude / oobabot-plugin
A Discord bot plugin for text-generation-webui, based on oobabot.
Home Page: https://github.com/chrisrude/oobabot
License: MIT License
I'm trying to get Oobabot to function with Simbake's "Web Search" extension.
It's a very simple extension that does a quick web search for requested results when "Search" is used as a wakeword at the start of user input.
I have it up and functioning on my Oobabooga instance, and it works correctly when I use it in the WebUI, but the wakeword doesn't seem to have any effect when used via a Discord message. At first I thought it might be because the extension was not "active" by default (you have to check a box in the WebUI to get it to start listening for the wakeword), so I edited the script.py file to set that to "True" by default, yet still no dice when the query is attempted via Discord.
Would this require some backend compatibility work to be done on Oobabot or am I missing something?
Saw the big changes that hit the last couple of days here - incidentally, and I apologize for being a noob - but how would I go about updating to 0.2.0? I did a git pull and attempted reinstalling via conda. I also attempted deleting the oobabot directories out of the extensions folder and git cloning fresh. I also ran the update script from the one-click installer in between deleting and git cloning. Any assistance would be greatly appreciated.
A command to regenerate the last response could be quite useful
(As well as a command to continue generating from where the ai message left off)
(And maybe a command to switch characters?)
I was trying to add this extension with the following commands in the oobabooga_windows folder:
conda activate textgen
pip install oobabot-plugin
I got this error. Any idea why?
Thank you.
PS C:\oobabooga_windows> conda activate textgen
EnvironmentNameNotFound: Could not find conda environment: textgen
You can list all discoverable environments with conda info --envs
PS C:\oobabooga_windows> pip install oobabot-plugin
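The EnvironmentNameNotFound error means no environment named textgen exists in that conda install. A hedged sketch of the recovery steps (the installer path below is an assumption based on the default one-click layout; adjust to your install):

```shell
# List the environments conda actually knows about.
conda info --envs

# The one-click installer keeps its environment inside the install folder
# rather than registering it by name, so activate it by path instead:
conda activate C:\oobabooga_windows\installer_files\env

# Then install the plugin into that environment.
pip install oobabot-plugin
```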
I'd like to request settings for the random chance, and the timing, with which the bot may respond to a message without a trigger word being used.
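For reference, recent oobabot versions expose this kind of knob in config.yml. The fragment below is a sketch; the key name `time_vs_response_chance` and its value format should be verified against the config your version generates:

```yaml
discord:
  # pairs of (seconds since the bot was last addressed, response chance)
  time_vs_response_chance:
    - "(180.0, 0.99)"
    - "(300.0, 0.70)"
    - "(600.0, 0.50)"
```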
I need my Discord bot to read an entire game guide (multiple pages), as it's supposed to act as a game guide in the Discord... but copy-pasting the entire guide into the persona causes a few errors: you can't have punctuation, quotes, or other non-alphanumeric characters, though that can be handled by cleaning the text.
But even so, asking a question after loading the whole persona throws an error and truncates the prompt provided to the bot, making it unable to answer most questions.
Is there a way to provide a whole document to the LLM when sending the message?
Autobooga in the GUI works fine after increasing the word limit to 10000, so I guess it should work with oobabot as well.
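One workaround, sketched below under the assumption that you can pre-process the guide yourself, is to split it into chunks that fit the prompt and send only the most relevant chunk with each question (the function names are illustrative, not part of oobabot):

```python
def chunk_text(text: str, max_words: int = 300) -> list[str]:
    """Split a long document into word-bounded chunks that fit a prompt."""
    words = text.split()
    return [
        " ".join(words[i:i + max_words])
        for i in range(0, len(words), max_words)
    ]

def best_chunk(chunks: list[str], question: str) -> str:
    """Naive relevance score: count question words appearing in each chunk."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))
```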
I really like oobabot! This is exactly what I was looking for, for my favorite AI! <3
The only thing that bothers me is that everything is English-only.
I use it in French, with a 30B model that copes with the English environment, but it still fails several times.
For example, my AI often says "Chat room".
I usually use "deep_translator" with "GoogleTranslator" to translate specific strings from English to French directly in the Python source code.
But I'm not a dev, and this project is much bigger than an ordinary extension xD
Best regards,
Add a UI element for SD request params.
Right now, if the bot accidentally tries to talk as someone else, the reply is simply aborted, and a wakeword must be used to retry. This could probably be automatic, with a configurable amount of retries.
Alternatively, you could have it clip the response before the filtered string.
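The clipping alternative could look roughly like this (a sketch with illustrative names, not oobabot's actual filtering code):

```python
def clip_response(text: str, other_speakers: list[str]) -> str:
    """Keep the reply only up to the point where the model starts
    speaking as another chat participant."""
    cut = len(text)
    for name in other_speakers:
        idx = text.find(f"\n{name}:")
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut].rstrip()
```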
I keep getting this error on startup:
2023-07-27 23:50:27 ERROR:Failed to load the extension "oobabot".
Traceback (most recent call last):
File "/app/modules/extensions.py", line 35, in load_extensions
exec(f"import extensions.{name}.script")
File "", line 1, in
File "/app/extensions/oobabot/script.py", line 7, in
from oobabot_plugin import bootstrap
ModuleNotFoundError: No module named 'oobabot_plugin'
Anyone know how to resolve this?
Where is the config file for me to edit what is shown in the advanced tab next to configuration?
I'm getting an error when trying to load different GPTQ models with the oobabot extension enabled:
2023-07-28 23:21:31 INFO:Loading airoboros-33B-gpt4-1.3-GPTQ...
2023-07-28 23:21:36 ERROR:Failed to load the model.
Traceback (most recent call last):
File "\textgen-webui\server.py", line 68, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name, loader)
File "\textgen-webui\modules\models.py", line 86, in load_model
tokenizer = load_tokenizer(model_name, model)
File "\textgen-webui\modules\models.py", line 103, in load_tokenizer
tokenizer = AutoTokenizer.from_pretrained(
File "\transformers\models\auto\tokenization_auto.py", line 702, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "\transformers\tokenization_utils_base.py", line 1841, in from_pretrained
return cls._from_pretrained(
File "\transformers\tokenization_utils_base.py", line 2004, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "\transformers\models\llama\tokenization_llama.py", line 135, in __init__
logger.warning_once(
File "\transformers\utils\logging.py", line 295, in warning_once
self.warning(*args, **kwargs)
File "\envs\textgen\lib\logging\__init__.py", line 1489, in warning
self._log(WARNING, msg, args, **kwargs)
File "\envs\textgen\lib\logging\__init__.py", line 1624, in _log
self.handle(record)
File "\envs\textgen\lib\logging\__init__.py", line 1634, in handle
self.callHandlers(record)
File "\envs\textgen\lib\logging\__init__.py", line 1704, in callHandlers
lastResort.handle(record)
File "\envs\textgen\lib\logging\__init__.py", line 968, in handle
self.emit(record)
File "\textgen-webui\modules\logging_colors.py", line 74, in new
args[0]._set_color(color)
AttributeError: '_StderrHandler' object has no attribute '_set_color'
When launching oobabooga_text_webui without this extension, every model I've tested loads just fine.
I'm on the latest commits for both textgen_webui (d6314fd) and oobabot. This is actually a fresh install in a clean conda env, just to rule out conflicts.
Any ideas how to troubleshoot it further?
Here's a list of improvements that come to my mind:
I really like oobabot! I'm a big fan!
Keep up the good work, I think this is a wonderful project!
2023-06-01 01:14:51,965 DEBUG oobabot_plugin: inside Oobabooga, using script.py version: 0.1.8
2023-06-01 01:14:51,966 DEBUG oobabot_plugin version: 0.1.8
2023-06-01 01:14:51,966 DEBUG oobabot version: 0.1.9
There should be a way to automatically upgrade script.py when there are new versions.
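Such an auto-upgrade could be driven by a staleness check along these lines (a sketch; the version strings come from the debug output above, and the function name is illustrative):

```python
def is_stale(script_py_version: str, plugin_version: str) -> bool:
    """True when the script.py shipped into extensions/ is older than the
    installed oobabot_plugin package."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(script_py_version) < as_tuple(plugin_version)
```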
Make it clear what's happening when a character is / isn't selected.
07:25:20-999966 INFO Starting Text generation web UI
07:25:21-006964 INFO Loading the extension "gallery"
C:\chat\extensions\gallery\script.py:110: UserWarning: You have unused kwarg parameters in Textbox, please remove them: {'container': False}
filter_box = gr.Textbox(label='', placeholder='Filter', lines=1, max_lines=1, container=False, elem_id='gallery-filter-box')
╭───────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────╮
│ C:\chat\server.py:254 in │
│ │
│ 253 # Launch the web UI │
│ ❱ 254 create_interface() │
│ 255 while True: │
│ │
│ C:\chat\server.py:157 in create_interface │
│ │
│ 156 extensions_module.create_extensions_tabs() # Extensions tabs │
│ ❱ 157 extensions_module.create_extensions_block() # Extensions block │
│ 158 │
│ │
│ C:\chat\modules\extensions.py:200 in create_extensions_block │
│ │
│ 199 extension, _ = row │
│ ❱ 200 extension.ui() │
│ 201 │
│ │
│ C:\chat\extensions\gallery\script.py:111 in ui │
│ │
│ 110 filter_box = gr.Textbox(label='', placeholder='Filter', lines=1, max_lines=1 │
│ ❱ 111 gr.ClearButton(filter_box, value='Clear', elem_classes='refresh-button') │
│ 112 update = gr.Button("Refresh", elem_classes='refresh-button') │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
AttributeError: module 'gradio' has no attribute 'ClearButton'
Whenever I try to use the extension https://github.com/theubie/complex_memory together with oobabot, it fails.
First, I just wanted to say that this is incredibly easy to use and all-around awesome as a plugin!
I checked the advanced settings and the template/prompt, and I'm a bit confused about which parts should and shouldn't be copied over if I want to override the prompt.
If I simply copy everything in the default (removing the comments) and replace the first lines, it refuses to save. Some documentation on how to do this would be great :)
The model I'm currently using runs in instruct mode, but I can't figure out how to set a custom instruction template for oobabot, nor how to switch between Chat/Chat-Instruct/Instruct. How might I accomplish this? Thank you for the bot so far, by the way.
I want my bot to only reply to others if it's pinged and not any other time. Is there a way I can disable unsolicited replies?
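One knob visible in the startup logs elsewhere in this tracker is the unsolicited channel cap. A hedged sketch of a config.yml fragment (whether 0 fully disables unsolicited replies is an assumption; verify against the comments in your generated config.yml):

```yaml
discord:
  # limit how many channels the bot will reply in without being addressed;
  # 0 should stop unsolicited replies entirely
  unsolicited_channel_cap: 0
```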
Not sure of the viability of this with regard to the underlying implementation, but support for multiple personalities/bots would be very useful.
Maybe not possible but would be awesome :)
UPDATE:
I was able to get the plugin working by following the install directions and then re-updating Ooba, which replaced some of the files. The output made me think it would not work, but so far things seem to be OK. So for now, if you want the plugin to work, follow the instructions, then make sure to run the update script (if using the 1-click installer), and things should work!
Hello,
I followed the instructions and got everything installed on the most recent version of oobabooga as of 8/28.
Unfortunately, the plugin UI is totally unresponsive (the tab is there and content displayed) and breaks all dropdowns in ooba. I cannot seem to save the Bot Token (there's no feedback at all and when I reload and open the token menu it's blank) and the character drop down does not display anything (none of my characters, doesn't even open a dropdown). Hitting the Save Settings button also does nothing with no error, but I assume that's because it's not saving/including required fields like the token and the character.
I don't have any other extensions running, just the api and listen flags.
I had to re-update ooba to get it working again and I found this output that might be the cause:
oobabot-plugin 0.2.2 requires fastapi<0.100.0,>=0.99.1, but you have fastapi 0.95.2 which is incompatible.
oobabot-plugin 0.2.2 requires gradio<3.35.0,>=3.34.0, but you have gradio 3.33.1 which is incompatible.
I'll include the server output for an error that happens when I try to interact with the Character settings dropdown; it seems to be some exception with a dataset key. There's no error output when I hit the save button on the bot token, so I can't help any further on that one.
2023-08-28 11:39:13,436 DEBUG oobabot_plugin: inside Oobabooga, using script.py version: 0.1.8
2023-08-28 11:39:13,436 DEBUG oobabot_plugin version: 0.2.2
2023-08-28 11:39:13,436 DEBUG oobabot version: 0.2.1
Running on local URL: http://0.0.0.0:7860
To create a public link, set `share=True` in `launch()`.
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
result = await app( # type: ignore[func-returns-value]
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
return await self.app(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/fastapi/applications.py", line 290, in __call__
await super().__call__(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/applications.py", line 122, in __call__
await self.middleware_stack(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/middleware/errors.py", line 184, in __call__
raise exc
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/middleware/errors.py", line 162, in __call__
await self.app(scope, receive, _send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/middleware/cors.py", line 83, in __call__
await self.app(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
raise exc
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
await self.app(scope, receive, sender)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 20, in __call__
raise e
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/fastapi/middleware/asyncexitstack.py", line 17, in __call__
await self.app(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/routing.py", line 718, in __call__
await route.handle(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/routing.py", line 276, in handle
await self.app(scope, receive, send)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/routing.py", line 66, in app
response = await func(request)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/fastapi/routing.py", line 241, in app
raw_response = await run_endpoint_function(
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/fastapi/routing.py", line 169, in run_endpoint_function
return await run_in_threadpool(dependant.call, **values)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/starlette/concurrency.py", line 41, in run_in_threadpool
return await anyio.to_thread.run_sync(func, *args)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/gradio/routes.py", line 289, in api_info
return gradio.blocks.get_api_info(config, serialize) # type: ignore
File "/Users/deich/AI/oobabooga/installer_files/env/lib/python3.10/site-packages/gradio/blocks.py", line 568, in get_api_info
serializer = serializing.COMPONENT_MAPPING[type]()
KeyError: 'dataset'
Having the bot greet anyone arriving at the server would be a really nifty feature!
I have installed the web UI using the one-click installer, which loads the conda environment separately as files in a folder called installer_files.
I tried the following:
Now I'm still getting the following issues:
2023-09-09 02:26:33 INFO: Loading the extension "oobabot_plugin"...
2023-09-09 02:26:33 ERROR: Failed to load the extension "oobabot_plugin".
Traceback (most recent call last):
File "H:\Text_generation_character_chat\text-generation-webui\modules\extensions.py", line 35, in load_extensions
exec(f"import extensions.{name}.script")
File "", line 1, in
ModuleNotFoundError: No module named 'extensions.oobabot_plugin.script'
2023-09-09 02:26:33 INFO: Loading the extension "gallery"...
Running on local URL: http://0.0.0.0:7860
Although I have confirmed that oobabot_plugin is installed, it's not recognised.
Any ideas?
https://github.com/huggingface/transformers/blob/1689aea73346816b936b84932e12b774974e61a6/src/transformers/utils/logging.py#L291
The transformers package now monkey-patches the global Logger object, which gets undone when we initialize our own logging. Do a quick fix to at least let their specific attributes continue to work.
When this error occurs, you see the following when trying to load a model into Oobabooga, but only when oobabot-plugin is loaded:
line 135, in __init__
    logger.warning_once(
AttributeError: 'Logger' object has no attribute 'warning_once'
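The quick fix can be sketched as re-attaching a memoized `warning_once` to `logging.Logger` when it is missing, mirroring what transformers installs at import time (the patch oobabot-plugin actually ships may differ):

```python
import functools
import logging

@functools.lru_cache(None)
def _warning_once(self, *args, **kwargs):
    # Memoized so each distinct message is only emitted once, matching the
    # behavior transformers' patched logger provides.
    self.warning(*args, **kwargs)

# Only patch if something (e.g. re-initializing logging) removed the attribute.
if not hasattr(logging.Logger, "warning_once"):
    logging.Logger.warning_once = _warning_once
```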
Unless I'm missing something, there is no way to start the bot automatically. It would be helpful not to have to enter the interface and start it manually.
Conda into the TGUI folder, git clone the repository, install requirements... but the explanation for step 3 breaks down completely and assumes you already know how to do this stuff, lol.
Then just run "this" to create the oobabot plugin directory (./extensions/oobabot) and install the plugin's hook (script.py) into the Oobabooga server:
oobabot-plugin install
What am I running here? \text-generation-webui> ./extensions/oobabot
'.' is not recognized as an internal or external command,
operable program or batch file.
text-generation-webui>oobabot
Please set the 'DISCORD_TOKEN' environment variable to your bot's discord token.
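"This" in the instructions refers to the oobabot-plugin console command itself, not a path to execute. A hedged sketch of the full sequence (the environment name is an assumption; adjust to your install):

```shell
cd text-generation-webui
conda activate textgen        # or activate the installer env by path
pip install oobabot-plugin

# This console script (installed by pip) creates ./extensions/oobabot
# and drops script.py into it:
oobabot-plugin install
```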
Can you add a banned-words list so the bot automatically leaves those words out of the Stable Diffusion prompt when a user requests an image? E.g. if "chicken" is banned and someone says "bot make a pic of a yellow chicken", the bot removes the word "chicken" before sending the prompt to Stable Diffusion.
This would really help curb unwanted images on the server, thank you.
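The requested behavior could be sketched like this (illustrative only; oobabot does not currently ship such a filter):

```python
import re

# Hypothetical banned list; in a real feature this would come from config.
BANNED_WORDS = {"chicken"}

def filter_prompt(prompt: str) -> str:
    """Remove banned words from a prompt before forwarding it to SD."""
    # Tokenize into word and non-word runs so punctuation survives.
    tokens = re.findall(r"\w+|\W+", prompt)
    kept = [t for t in tokens if t.lower() not in BANNED_WORDS]
    # Collapse doubled spaces left behind by removed words.
    return re.sub(r"\s{2,}", " ", "".join(kept)).strip()

print(filter_prompt("bot make a pic of a yellow chicken"))
# → bot make a pic of a yellow
```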
I think there should be a command to make the bot start the next reply with "Insert whatever the user puts"
It appears that in this commit the "old api" was removed, and now the default is the OpenAI compatible API. I get the following error when starting Oobabot now:
2023-11-13 22:19:59,698 INFO Starting oobabot, core version 0.2.3
2023-11-13 22:19:59 WARNING:PyNaCl is not installed, voice will NOT be supported
2023-11-13 22:19:59,699 INFO Oobabooga is at ws://localhost:5005
2023-11-13 22:19:59,703 WARNING Could not connect to Oobabooga server: [ws://localhost:5005]
2023-11-13 22:19:59,703 WARNING Please check the URL and try again.
2023-11-13 22:19:59,703 ERROR Reason: Cannot connect to host localhost:5005 ssl:default [Connect call failed ('127.0.0.1', 5005)]
Are there any plans to update oobabot to work with the OpenAI compatible API?
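For context, the OpenAI-compatible API is plain HTTP rather than the old websocket stream. A hedged sketch of what a request against it looks like (the default port 5000, endpoint path, and payload fields are assumptions; check your webui's API settings):

```python
import json
import urllib.request

def build_completion_request(prompt: str,
                             base_url: str = "http://localhost:5000"):
    """Build an OpenAI-style completion request for text-generation-webui."""
    body = {"prompt": prompt, "max_tokens": 64}
    return urllib.request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it (requires a running server):
# with urllib.request.urlopen(build_completion_request("Hello")) as resp:
#     text = json.load(resp)["choices"][0]["text"]
```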
Hey there,
So I have no idea how hard this would be, but since this is integrated into Ooba, would it be possible to have a setting where it just defaults to whatever we already have set up and loaded? In particular, it would be super handy if I could set up my characters, instructions, generation settings and all that there, and then just use a character. It would make testing a lot easier too, because right now the bots behave very differently if I use Chat in Ooba vs. the bot. I get why this is, and I think having a separate system that lets multiple characters run is really cool, but that is going to take a lot of work; if we just need one per Ooba instance, this could be a quick way to get more robust instructions. Just a thought, might not be worth the effort!
I've used this extension with Autobooga in the past to query web pages and provide summaries within a Discord DM or chat; however, after updating recently, I can only get them to work separately from each other (not in the Discord interface).
There were already caveats to using them together: if the URL was still within the last x messages you allow it to see, it would use it again.
I'm assuming that you're not actually creating a chat log for any of the conversations and are just using the last x messages when a keyword is used, which is clever, but probably introduces more problems than it addresses.
Specifically, I was using this other extension to read URLs and search data into the context to get smarter and more relevant results, like summarizing news for me and junk... Now I can't do that without the webUI.
What I'd like is a conventional chat like I would have in the webUI with a character I create, but in the Discord interface, using standard truncation instead of injecting messages into the input, because that, among other things, doesn't play well with any other keyword-based system.
The other plugin I was using:
https://github.com/sammyf/Autobooga/
I was so excited to use this and spent the last couple of days troubleshooting to get it running, but it still fails when I try to use it. I manually copied the old API back in and got it running at least, but anything I send to it says it received an empty response. It could be something incredibly simple and dumb on my end, but I'm not seeing any signs of life around here either.
If anyone can help, or knows of alternatives, please let me know.
Just for documentation:
2024-03-06 19:52:09,338 INFO Oobabooga is at ws://localhost:5005
2024-03-06 19:52:11,393 INFO Connected to Oobabooga!
2024-03-06 19:52:11,393 INFO Connecting to Discord...
2024-03-06 19:52:14,101 INFO Connected to discord as oobabot (ID: 1214823976732467220)
2024-03-06 19:52:14,102 DEBUG monitoring 4 channels across 1 server(s)
2024-03-06 19:52:14,102 DEBUG listening to DMs
2024-03-06 19:52:14,102 DEBUG Response Grouping: split into messages by sentence
2024-03-06 19:52:14,102 DEBUG AI name: oobabot
2024-03-06 19:52:14,102 DEBUG AI persona:
2024-03-06 19:52:14,102 DEBUG History: 1 lines
2024-03-06 19:52:14,103 DEBUG Stop markers: ### End of Transcript ###<|endoftext|>, <|endoftext|>
2024-03-06 19:52:14,103 DEBUG Unsolicited channel cap: 3
2024-03-06 19:52:14,103 DEBUG Wakewords: oobabot, oob
2024-03-06 19:52:14,103 DEBUG Ooba Client: Splitting responses into messages by English sentence.
2024-03-06 19:52:14,103 DEBUG Stable Diffusion: disabled
2024-03-06 19:52:14,104 DEBUG Registering commands, sometimes this takes a while...
2024-03-06 19:52:14,310 INFO Registered command: lobotomize: Erase oobabot's memory of any message before now in this channel.
2024-03-06 19:52:14,311 INFO Registered command: say: Force oobabot to say the provided message.
2024-03-06 19:52:21,683 DEBUG Request from squamah in channel #general
2024-03-06 19:52:21,743 DEBUG Request from squamah in channel #general
2024-03-06 19:52:24,050 WARNING An empty response was received from Oobabooga. Please check that the AI is running properly on the Oobabooga server at ws://localhost:5005.
2024-03-06 19:52:24,163 WARNING An empty response was received from Oobabooga. Please check that the AI is running properly on the Oobabooga server at ws://localhost:5005.
connection handler failed
Traceback (most recent call last):
File "D:\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 240, in handler
await self.ws_handler(self)
File "D:\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 1186, in _ws_handler
return await cast(
^^^^^^^^^^^
File "D:\text-generation-webui\extensions\api\streaming_api.py", line 92, in _handle_connection
await _handle_stream_message(websocket, message)
File "D:\text-generation-webui\extensions\api\util.py", line 155, in api_wrapper
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\text-generation-webui\extensions\api\streaming_api.py", line 37, in _handle_stream_message
for a in generator:
File "D:\text-generation-webui\modules\text_generation.py", line 32, in generate_reply
for result in _generate_reply(*args, **kwargs):
File "D:\text-generation-webui\modules\text_generation.py", line 88, in _generate_reply
for reply in generate_func(question, original_question, seed, state, stopping_strings, is_chat=is_chat):
File "D:\text-generation-webui\modules\text_generation.py", line 293, in generate_reply_HF
if isinstance(state['sampler_priority'], list) and len(state['sampler_priority']) > 0:
~~~~~^^^^^^^^^^^^^^^^^^^^
KeyError: 'sampler_priority'
I've noticed recently that the bot totally ignores the character I have set. I feel like this started right after ooba updated TGUI a few weeks ago. I am very far from an expert and wouldn't know where to start troubleshooting this issue; I just wanted to raise it in case others are experiencing the same and have any ideas on how to remedy it.
When the oobabot extension is loaded, go to the "Interface mode" tab and choose "Apply and restart the interface"
This will hang, rather than reload the interface.
What's happening is gradio is waiting for its update task for the html log to finish, but it is designed to run indefinitely.
It won't notice the request to exit because gradio's looper there is using sleep, rather than asyncio.sleep.
Either find a workaround here, or land a fix in gradio.
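The distinction matters because time.sleep blocks the event loop, while asyncio.sleep yields at each tick, which is the only point where a cancellation (and hence the restart request) can be delivered. A minimal demonstration, independent of gradio:

```python
import asyncio
import time

async def looper(cooperative: bool):
    while True:
        if cooperative:
            # Yields to the event loop; cancellation is delivered here.
            await asyncio.sleep(0.05)
        else:
            # Blocks the whole loop; a cancellation request is never seen,
            # which is the hang described above.
            time.sleep(0.05)

async def main() -> str:
    task = asyncio.create_task(looper(cooperative=True))
    await asyncio.sleep(0)          # let the task start running
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        return "cancelled cleanly"
    return "never got here"

print(asyncio.run(main()))  # prints "cancelled cleanly"
```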
11:21:56-301034 INFO Starting Text generation web UI
11:21:56-304034 INFO Loading settings from "settings.yaml"
11:21:56-310056 INFO Loading the extension "oobabot"
11:21:56-466313 INFO Loading the extension "gallery"
2024-02-19 11:21:56,468 DEBUG oobabot_plugin: inside Oobabooga, using script.py version: 0.1.8
2024-02-19 11:21:56,468 DEBUG oobabot_plugin version: 0.2.3
2024-02-19 11:21:56,469 DEBUG oobabot version: 0.2.3
╭───────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────╮
│ D:\AI\text-generation-webui\server.py:254 in │
│ │
│ 253 # Launch the web UI │
│ ❱ 254 create_interface() │
│ 255 while True: │
│ │
│ D:\AI\text-generation-webui\server.py:156 in create_interface │
│ │
│ 155 │
│ ❱ 156 extensions_module.create_extensions_tabs() # Extensions tabs │
│ 157 extensions_module.create_extensions_block() # Extensions block │
│ │
│ D:\AI\text-generation-webui\modules\extensions.py:208 in create_extensions_tabs │
│ │
│ 207 with gr.Tab(display_name, elem_classes="extension-tab"): │
│ ❱ 208 extension.ui() │
│ 209 │
│ │
│ D:\AI\text-generation-webui\extensions\oobabot\script.py:28 in ui │
│ │
│ 27 """ │
│ ❱ 28 bootstrap.plugin_ui( │
│ 29 script_py_version=SCRIPT_PY_VERSION, │
│ │
│ D:\AI\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\bootstrap.py:86 in plugin_ui │
│ │
│ 85 api_extension_loaded = False │
│ ❱ 86 if shared.args.api_streaming_port: │
│ 87 streaming_port = shared.args.api_streaming_port │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
AttributeError: 'Namespace' object has no attribute 'api_streaming_port'
I have two instances running and both bots ignore each other; they each work independently. I want to be able to get them to talk to each other in a loop or for a certain number of messages, so I can have multiple characters in a Discord, just for fun.
Edit: for reference, I am running two separate instances of oobabooga.
"No response sent. The AI has generated a message that we have chosen not to send, probably because it was empty or repeated."
This happens way too often, IMO. I'd love an option to just have it send anyway, and/or have the bot post a message to Discord along the lines of the above, so people aren't confused about why the bot didn't respond.
I seem to be getting a UI error on the latest version of Ooba. Did a fresh installation twice to confirm.
D:\text-generation-webui\modules\training.py:177: UserWarning: `height` is deprecated in `Interface()`, please use it within `launch()` instead.
evaluation_table = gr.Dataframe(value=generate_markdown_table(), interactive=True, height=16000, elem_id='evaluation-table')
When I try to initialize the webui server with oobabot enabled, I receive the following errors:
05:22:21-056823 INFO Loading the extension "oobabot"
05:22:21-269256 ERROR Failed to load the extension "oobabot".
Traceback (most recent call last):
File "D:\Documents\text-generation-webui\modules\extensions.py", line 37, in load_extensions
extension = importlib.import_module(f"extensions.{name}.script")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Documents\text-generation-webui\installer_files\env\Lib\importlib\__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "D:\Documents\text-generation-webui\extensions\oobabot\script.py", line 7, in <module>
from oobabot_plugin import bootstrap
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\bootstrap.py", line 18, in <module>
from oobabot_plugin import controller
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\controller.py", line 7, in <module>
from oobabot_plugin import button_enablers
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\button_enablers.py", line 9, in <module>
from oobabot_plugin import worker as oobabot_worker
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\worker.py", line 15, in <module>
from oobabot_plugin import input_handlers
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\input_handlers.py", line 16, in <module>
class ComponentToSetting(abc.ABC):
File "D:\Documents\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\input_handlers.py", line 22, in ComponentToSetting
def __init__(self, component: gr.components.IOComponent):
^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'gradio.components' has no attribute 'IOComponent'
If at some point two bots happen to run inference at the same time, both bots get stuck and oobabooga itself hangs as well.
At that point, I can either kill all processes, or kill just one of the bots and wait a bit until the remaining bot and the webui suddenly start working again.
Right now my only option for running 2 bots is to run 2 copies of Oobabooga, which forces me to run two separate copies of the webui (different ports for all listeners).
If what I described here isn't actually a bug, please make it clear and feel free to close this issue.
Probably should hide the button for now.
I installed it, but as output I get:
An empty response was received from Oobabooga. Please check that the AI is running properly on the Oobabooga server at ws://127.0.0.1:5005
Or:
connection handler failed
Traceback (most recent call last):
File "E:\OOBABOOBA\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 240, in handler
await self.ws_handler(self)
File "E:\OOBABOOBA\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 1186, in _ws_handler
return await cast(
^^^^^^^^^^^
File "E:\OOBABOOBA\text-generation-webui\extensions\api\streaming_api.py", line 92, in _handle_connection
await _handle_stream_message(websocket, message)
File "E:\OOBABOOBA\text-generation-webui\extensions\api\util.py", line 155, in api_wrapper
return await func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\OOBABOOBA\text-generation-webui\extensions\api\streaming_api.py", line 37, in _handle_stream_message
for a in generator:
File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 32, in generate_reply
for result in _generate_reply(*args, **kwargs):
File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 87, in _generate_reply
for reply in generate_func(question, original_question, seed, state, stopping_strings, is_chat=is_chat):
File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 289, in generate_reply_HF
generate_params[k] = state[k]
KeyError: 'dynamic_temperature'
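The `KeyError` above means the streaming API assigned `state[k]` for a sampler key the client never sent. A minimal sketch of guarding that lookup with defaults, assuming a fallback value for the missing key (`defaults` and its contents are illustrative, not the WebUI's actual table):

```python
# Hypothetical defaults for sampler settings an older client may omit;
# 'dynamic_temperature' is the key from the traceback above.
defaults = {"dynamic_temperature": False}

# Payload from a client that predates the new setting.
state = {"temperature": 0.7}

generate_params = {}
for k in ("temperature", "dynamic_temperature"):
    # .get with a fallback avoids the KeyError on missing keys.
    generate_params[k] = state.get(k, defaults.get(k))

print(generate_params)  # → {'temperature': 0.7, 'dynamic_temperature': False}
```

In practice the real fix is usually updating the oobabot client so its request body includes the newer sampler keys the server expects.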
DEBUG Ignoring message 1111461843727306793 because it was sent after 1111461839855947846
DEBUG Ignoring message 1111462515516395592 because it was sent after 1111462510671962193
I am getting these debug messages and the bot is not responding.
I noticed an issue; I don't know if it's related.
If you refresh the WebUI while the bot is running, it defaults to displaying the bot as off, and you can press the "start oobabot" button again while it's running (a button that would typically be grayed out).
If you do press that button, the bot ends up replying twice to every message until Oobabooga is fully restarted.
As you can see in the picture above, the bot is already running and the "stop" button is grayed out, but the "start" button is still clickable.
Originally posted by @obi-o-n-e in #18 (comment)
Hello everyone! I'm excited to join this conversation about AI and machine learning. What topics are we interested in discussing today?
participant1 responds:
Hi devbot! I'm curious about how machine learning algorithms can be used for image recognition. Do you have any examples of successful applications in this area?
participant2 asks:
What do you think are some potential risks associated with AI technology? Are there concerns around privacy or job displacement?
participant3 suggests:
Let's talk about natural language processing (NLP) as well. How has NLP been applied in chatbots and virtual assistants like Siri or Alexa?
Hi
I can't seem to enable streaming.
If I set "stream_responses: true" in the .yaml or within the web UI, the settings revert to defaults as soon as I press "start bot" in the WebUI.
The custom wake words I made seem to reset every time I refresh the WebUI (even if they are set in the .yaml before launching Ooba).
More than half of the responses don't come through:
Is there a filter being triggered? I can't seem to find any documentation 🙏
Running the latest build available (both Ooba and the bot).
TheBloke_Chronos-Hermes-13B-SuperHOT-8K-GPTQ
The funny thing is that it was working perfectly on Friday; I don't know what changed overnight.