blarc / ai-commits-intellij-plugin
AI Commits for IntelliJ based IDEs/Android Studio.
Home Page: https://plugins.jetbrains.com/plugin/21335-ai-commits
License: MIT License
plugin version: 1.6.0
If the Language Pack plugin is installed in IDEA, the AI Commits entry cannot be found in the settings. Even after uninstalling the Language Pack plugin, configuring AI Commits properly, and then reinstalling the Language Pack plugin, it still cannot be used normally.
I got an error after generating an API key and trying to generate a commit message:
You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.
{local} seems to have incorrect values
I get the following message and cannot install, is there a solution?
Not compatible with the version of your running IDE (PhpStorm 2022.2.3)
I'm South Korean.
Even though I changed the locale to Korean during setup, the generated commit messages come out only in English.
I opened IntelliJ IDEA 2023.3.1 (Community Edition), chose Get from VCS, pasted the clone URL, and when the project opens I get this:
`A problem occurred configuring root project 'ai-commits-intellij-plugin'.
Could not resolve all files for configuration ':classpath'.
Could not resolve org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1.
Required by:
project : > org.jetbrains.intellij:org.jetbrains.intellij.gradle.plugin:1.16.1
> No matching variant of org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 was found. The consumer was configured to find a library for use during runtime, compatible with Java 8, packaged as a jar, and its dependencies declared externally, as well as attribute 'org.gradle.plugin.api-version' with value '8.0.2' but:
- Variant 'apiElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'javadocElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'runtimeElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'sourcesElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'testFixturesApiElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin-test-fixtures:1.16.1 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'testFixturesRuntimeElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin-test-fixtures:1.16.1 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
Could not resolve org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0.
Required by:
project : > org.jetbrains.changelog:org.jetbrains.changelog.gradle.plugin:2.2.0
> No matching variant of org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 was found. The consumer was configured to find a library for use during runtime, compatible with Java 8, packaged as a jar, and its dependencies declared externally, as well as attribute 'org.gradle.plugin.api-version' with value '8.0.2' but:
- Variant 'apiElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'javadocElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'runtimeElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'sourcesElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.
`
https://platform.openai.com/docs/assistants/overview/agents
I think the new Assistants feature could work very well, especially for code analysis and commit message generation. It would be great if this were supported.
I noticed that the diff in the "Edit Prompt" window in the plugin settings is generated as if "before" and "after" were swapped.
It looks like this affects only the "Edit Prompt" window, not the diffs that are sent to the AI model.
Version: 1.4.0
OpenAI also has many more models available; would you mind adding them?
Here is the list of them all:
MODEL = Literal[
"gpt-3.5-turbo",
"gpt-3.5-turbo-16k",
"gpt-3.5-turbo-0613",
"gpt-3.5-turbo-16k-0613",
"gpt-4-0613",
"gpt-4",
"gpt-4-32k",
"gpt-4-32k-0613",
"gpt-3.5-turbo-instruct",
"gpt-4-0314",
"gpt-3.5-turbo-0301",
"gpt-4-32k-0314"
]
Sometimes the diff will be very long; I think the -32k models could handle it well.
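Falling back to a larger-context model when the diff is long could be sketched like this (a hypothetical helper; the ~4-characters-per-token estimate and the cutoffs are assumptions, not plugin behavior):

```python
def pick_model(diff: str) -> str:
    """Pick a chat model based on a rough token estimate of the diff.

    Assumes ~4 characters per token; the model names and thresholds
    below are illustrative, not what the plugin ships with.
    """
    estimated_tokens = len(diff) // 4
    if estimated_tokens <= 3_000:
        return "gpt-3.5-turbo"
    if estimated_tokens <= 14_000:
        return "gpt-3.5-turbo-16k"
    return "gpt-4-32k"
```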
This morning OpenAI seemed to respond slowly.
Without any feedback, I clicked the AI commit button again; still nothing, so I clicked a third time.
And then my commit message changed about three times in a row, quickly.
I would like some feedback on where the request stands, and/or to not be able to send a new request while the first one is not done. 🙏
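The requested guard against overlapping requests could be sketched as follows (a hypothetical helper, not plugin code): the button handler tries to claim the in-flight slot, bails out if a generation is already running, and releases the slot when the response arrives.

```python
import threading

class SingleFlight:
    """Allow only one in-flight generation request at a time."""

    def __init__(self):
        self._busy = threading.Lock()

    def try_start(self) -> bool:
        # Non-blocking acquire: returns False if a request is in flight.
        return self._busy.acquire(blocking=False)

    def finish(self):
        # Called when the response (or an error) comes back.
        self._busy.release()
```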
Thank you for making this plugin and open sourcing it!
Have you considered supporting the --unified flag?
-U<n>
--unified=<n>
Generate diffs with <n> lines of context instead of the usual three. Implies --patch.
To begin with, making this an option in the plugin settings would be great.
(To be even more sophisticated, the plugin could set --unified dynamically based on the diff, for example to the largest value that will still allow the diff+prompt to fit in the context.)
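The dynamic idea could be sketched like this (everything here is hypothetical: `diff_builder(n)` stands in for running `git diff --staged -U<n>`, and tokens are estimated at ~4 characters each):

```python
def largest_unified(diff_builder, prompt_tokens, context_limit, max_u=10):
    """Return the largest --unified=<n> whose diff still fits the context.

    diff_builder(n) is assumed to return the diff text produced with
    n lines of context; larger n yields a longer diff.
    """
    best = 0
    for n in range(max_u + 1):
        tokens = prompt_tokens + len(diff_builder(n)) // 4
        if tokens <= context_limit:
            best = n
        else:
            break  # diffs only grow with n, so stop at the first overflow
    return best
```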
Task:
Write a clean and comprehensive GitHub commit message in the conventional commit convention. I'll send you an output of the 'git diff --staged' command, and you convert it into a commit message. Lines must not be longer than 74 characters. Use {locale} language to answer.
Do NOT add any descriptions to the commit, only the commit message. No Co-authored.
Raw:
{diff}
CommitMessageFormat:
[emoji] <type>[optional scope]: <description>
[optional body]
[optional footer(s)]
TaskResult:
In order to improve the standardization of the output generated by the LLM, I propose we implement an advanced configuration that uses simulated interactions. This would allow for a more refined, context-aware generation of commit messages based on specific interaction patterns.
The intended operational flow is as follows:
system: [basic prompt]
user: [sample diff]
assistant: [sample commit message]
user: [real diff]
In this scenario, the LLM would then generate a commit message that is more accurately formatted according to the provided sample. This could potentially lead to increased efficiency in crafting commit messages that adhere to the preferred style of the individual or team, thereby improving overall project workflow.
Looking forward to your feedback.
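The operational flow above maps directly onto the chat-completions message format; a minimal sketch (contents are placeholders, and the helper itself is hypothetical):

```python
def build_messages(system_prompt, sample_diff, sample_commit, real_diff):
    """Assemble the simulated-interaction (few-shot) message list:
    system prompt, one sample diff/commit exchange, then the real diff."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": sample_diff},
        {"role": "assistant", "content": sample_commit},
        {"role": "user", "content": real_diff},
    ]
```

The sample assistant turn is what anchors the model to the preferred commit style; the final user turn carries the staged diff to convert.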
Why can't I use this plugin in PyCharm?
The plugin thinks for about 30 seconds and then outputs:
Error occurred: Rate limit reached for default-gpt-3.5-turbo in organization org-qhdxxxxslxlioFX7inZ on tokens per min. Limit: 90000 / min. Current: 0 / min. Contact us through our help center at help.openai.com if you continue to have issues.
Tried different prompts - same result.
My OpenAI account uses pay-as-you-go billing and works fine with any other integration.
After restarting WebStorm (2023), the locale is reset to the default instead of the previously selected one.
Add a {branch} variable that resolves to the current branch, for prompt customisation.
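Substitution for such a variable could be sketched like this (the variable names mirror the plugin's existing {diff}/{locale} template style; the helper itself is hypothetical):

```python
def render_prompt(template: str, variables: dict) -> str:
    """Fill {branch}-style placeholders in a prompt template."""
    for name, value in variables.items():
        template = template.replace("{" + name + "}", value)
    return template
```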
Add an exclusion list to settings, to which you can add globs describing the paths of files that should not be used when generating the commit message.
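The filtering step could be sketched with standard glob matching (a hypothetical helper, assuming the exclusion list holds `fnmatch`-style patterns):

```python
from fnmatch import fnmatch

def filter_excluded(paths, exclusion_globs):
    """Drop files matching any exclusion glob before building the diff."""
    return [
        p for p in paths
        if not any(fnmatch(p, g) for g in exclusion_globs)
    ]
```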
The proxy setup needs some hints, because it is now common to use a self-deployed reverse proxy server as an HTTP proxy, while the application actually expects an HTTP/SOCKS proxy address.
Is it possible to use a proxy setting located mainly within the editor (Settings > Appearance & Behavior > System Settings > HTTP Proxy) to make requests to OpenAI?
Azure OpenAI is compatible and should just work.
It's a more dedicated way of using privately owned models.
Thank you for great work on this plugin.
I connected the plugin to a local OpenAI-like server, and generating a response sometimes takes longer than the currently configured socket timeout of 30000 ms.
Can you add a field to customize this timeout?
Thanks!
version 0.6.0
java.lang.NullPointerException: null cannot be cast to non-null type com.intellij.vcs.commit.AbstractCommitWorkflowHandler<*, *>
at com.github.blarc.ai.commits.intellij.plugin.AICommitAction.actionPerformed(AICommitAction.kt:31)
at com.intellij.openapi.actionSystem.ex.ActionUtil.doPerformActionOrShowPopup(ActionUtil.java:333)
at com.intellij.openapi.keymap.impl.ActionProcessor.performAction(ActionProcessor.java:47)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher$1.performAction(IdeKeyEventDispatcher.java:585)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.lambda$doPerformActionInner$9(IdeKeyEventDispatcher.java:707)
at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:105)
at com.intellij.openapi.application.TransactionGuardImpl.performUserActivity(TransactionGuardImpl.java:94)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.lambda$doPerformActionInner$10(IdeKeyEventDispatcher.java:707)
at com.intellij.openapi.actionSystem.ex.ActionUtil.performDumbAwareWithCallbacks(ActionUtil.java:356)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.doPerformActionInner(IdeKeyEventDispatcher.java:704)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processAction(IdeKeyEventDispatcher.java:648)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processAction(IdeKeyEventDispatcher.java:596)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processActionOrWaitSecondStroke(IdeKeyEventDispatcher.java:480)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.inInitState(IdeKeyEventDispatcher.java:469)
at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.dispatchKeyEvent(IdeKeyEventDispatcher.java:225)
at com.intellij.ide.IdeEventQueue.dispatchKeyEvent(IdeEventQueue.kt:598)
at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.kt:568)
at com.intellij.ide.IdeEventQueue.access$_dispatchEvent(IdeEventQueue.kt:68)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:349)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:348)
at com.intellij.openapi.progress.impl.CoreProgressManager.computePrioritized(CoreProgressManager.java:787)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:348)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:343)
at com.intellij.ide.IdeEventQueueKt.performActivity$lambda$1(IdeEventQueue.kt:994)
at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:113)
at com.intellij.ide.IdeEventQueueKt.performActivity(IdeEventQueue.kt:994)
at com.intellij.ide.IdeEventQueue.dispatchEvent$lambda$4(IdeEventQueue.kt:343)
at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.kt:385)
at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:207)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:117)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:113)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:105)
at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:92)
The gpt-4-1106-preview model is much cheaper to use than gpt-4 (and supposedly better); it would be nice to be able to select this model in the plugin settings.
Hello,
I've confirmed my API key works. I can see many more models in my playground, and have confirmed the same via Postman (below).
But I don't see any additional models in the model dropdown. Is anyone using a 16k model?
What I've done:
Am I missing something?
Any key won't work
See #35 (comment)
EDIT: Added original comment:
I am missing the "deployment name" configuration.
https://learn.microsoft.com/en-us/rest/api/cognitiveservices/azureopenaistable/deployments/create?tabs=HTTP
https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#completions
I guess you already know. But now you know it even more.
We currently send all changed files to OpenAI to generate the commit message. We should send only the files that are selected in the commit dialog.
at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:280)
at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:85)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:59)
at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:38)
at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
at com.intellij.idea.Main.mainImpl(Main.kt:63)
at com.intellij.idea.Main.mainImpl$default(Main.kt:49)
at com.intellij.idea.Main.main(Main.kt:46)
Suppressed: com.intellij.diagnostic.PluginException: com.github.blarc.ai.commits.intellij.plugin.settings.AppSettings [Plugin: com.github.blarc.ai-commits-intellij-plugin]
at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2(ComponentManagerImpl.kt:854)
at com.intellij.serviceContainer.ComponentManagerImpl.registerServices(ComponentManagerImpl.kt:806)
at com.intellij.serviceContainer.ComponentManagerImpl.registerComponents(ComponentManagerImpl.kt:403)
at com.intellij.openapi.client.ClientAwareComponentManager.registerComponents(ClientAwareComponentManager.kt:58)
at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invokeSuspend(ApplicationLoader.kt:197)
at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invoke(ApplicationLoader.kt)
at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invoke(ApplicationLoader.kt)
at kotlinx.coroutines.intrinsics.UndispatchedKt.startUndispatchedOrReturn(Undispatched.kt:78)
at kotlinx.coroutines.BuildersKt__Builders_commonKt.withContext(Builders.common.kt:167)
at kotlinx.coroutines.BuildersKt.withContext(Unknown Source)
at com.intellij.platform.diagnostic.telemetry.impl.TracerKt.span(tracer.kt:56)
at com.intellij.platform.diagnostic.telemetry.impl.TracerKt.span$default(tracer.kt:49)
at com.intellij.platform.ide.bootstrap.ApplicationLoader.initServiceContainer(ApplicationLoader.kt:196)
at com.intellij.platform.ide.bootstrap.ApplicationLoader.access$initServiceContainer(ApplicationLoader.kt:1)
at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$1.invokeSuspend(ApplicationLoader.kt)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.UndispatchedCoroutine.afterResume(CoroutineContext.kt:270)
at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:102)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
Caused by: com.intellij.platform.instanceContainer.internal.InstanceAlreadyRegisteredException: com.github.blarc.ai.commits.intellij.plugin.settings.AppSettings
at com.intellij.platform.instanceContainer.internal.InstanceRegistrarImpl.registerInitializer(InstanceRegistrarImpl.kt:32)
at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2Inner(ComponentManagerImpl.kt:910)
at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2(ComponentManagerImpl.kt:847)
... 23 more
PyCharm 2023.3 - AI Commits 1.5.0
Bug description
Toggling the model for gpt-4, or 3, or any variant thereof in PyCharm gives the error:
"the model <> does not exist"
aicommits version
2.2.1
Environment
Windows 10 2023.1.2
Pycharm: PC-231.9011.38
Version 2023.1.2
PyCharm 2023.1.2 (Community Edition)
Build #PC-231.9011.38, built on May 16, 2023
Runtime version: 17.0.6+10-b829.9 amd64
VM: OpenJDK 64-Bit Server VM by JetBrains s.r.o.
Windows 10.0
GC: G1 Young Generation, G1 Old Generation
Memory: 2048M
Cores: 4
Registry:
ide.balloon.shadow.size=0
Support for Ollama
I have a custom prompt that I am using. When I modify and update it again, and then save the settings, I find that the prompt selection has reset to Basic.
Need to test what happens when non-text files are used for computing the diff and consequently the commit message.
Even with only one file with barely any diff it still says "You exceeded your current quota, please check your plan and billing details"
I have ChatGPT free, but there should be an option.
I've set the OpenAI token and verified it. I can use it just fine. But whenever I restart PyCharm, the token is empty and I have to add it again. How can I fix this, please?
Hey 👋
I think commit messages should stay concise and short, cf. https://cbea.ms/git-commit/
The change in my code:
const joinClanButton = new ActionRowBuilder<ButtonBuilder>()
.addComponents(
new ButtonBuilder()
+ .setLabel(`Rejoindre ${clan.name}`)
- .setLabel(`Rejoindre clan name`)
.setStyle(ButtonStyle.Link),
);
Here are multiple results that I feel are too long:
The only change in this code is the label of the joinClanButton. Instead of a fixed label "Rejoindre le Clan", it now uses a dynamic label that includes the name of the clan:
Rejoindre ${clan.name}
. This makes the button more informative for the user and helps to differentiate between multiple clans. No other changes were made in this file.
344 CHARACTERS
The change in this diff is a simple update to a label in a Discord embed message. The label for a button that allows users to join a Clash of Clans clan is updated to include the name of the clan. This change makes it clearer to users which clan they are joining when they click the button.
290 CHARACTERS
The change in this commit modifies the label of the "Join Clan" button in the embed message sent by the BrulerieBot. Instead of just displaying "Rejoindre le Clan," it now displays "Rejoindre" followed by the name of the clan. This change makes it clearer to the user which clan they are joining when they click the button. No other changes were made in this commit.
366 CHARACTERS
None of these commit messages would fit in a git history, the GitHub commits page, or anything that lists commits.
Would you like to make some changes to this? 🙏
Is there a way to downgrade to 0.4 on build 231? 🤔
I'm using the ai-commits-intellij-plugin, but I cannot reach OpenAI from mainland China due to internet restrictions. Therefore, I would like to request a new feature that allows users to set a network proxy in the plugin's settings, so that users in China can access the OpenAI API.
We can't request OpenAI in China.
Adding support for network proxies would greatly benefit users in China who want to use the ai-commits-intellij-plugin to access the OpenAI API.
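Routing the API calls through a user-configured proxy could be sketched with the standard library (a hypothetical helper; the plugin's actual HTTP stack is Ktor, so this only illustrates the shape of the setting):

```python
import urllib.request

def opener_with_proxy(proxy_url: str):
    """Build a urllib opener that routes HTTP and HTTPS requests
    through a user-configured proxy address."""
    handler = urllib.request.ProxyHandler({
        "http": proxy_url,
        "https": proxy_url,
    })
    return urllib.request.build_opener(handler)
```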
Awesome plugin - thanks for this!
I've run into a small bug where my selected prompt gets reset to Basic.
Here you should see the selected prompt is Basic
Can you support an Access Token?
Unhandled exception in [StandaloneCoroutine{Cancelling}@65296840, EDT]
com.aallam.openai.api.exception.OpenAIAPIException: This model's maximum context length is 4097 tokens. However, your messages resulted in 747057 tokens. Please reduce the length of the messages.
at com.aallam.openai.client.internal.http.HttpTransport.handleException(HttpTransport.kt:43)
at com.aallam.openai.client.internal.http.HttpTransport.perform(HttpTransport.kt:25)
at com.aallam.openai.client.internal.http.HttpTransport$perform$1.invokeSuspend(HttpTransport.kt)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
at com.intellij.openapi.application.impl.DispatchedRunnable.run(DispatchedRunnable.kt:35)
at com.intellij.openapi.application.TransactionGuardImpl.runWithWritingAllowed(TransactionGuardImpl.java:209)
at com.intellij.openapi.application.TransactionGuardImpl.access$100(TransactionGuardImpl.java:21)
at com.intellij.openapi.application.TransactionGuardImpl$1.run(TransactionGuardImpl.java:191)
at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
at com.intellij.openapi.application.impl.ApplicationImpl$3.run(ApplicationImpl.java:456)
at com.intellij.openapi.application.impl.FlushQueue.doRun(FlushQueue.java:79)
at com.intellij.openapi.application.impl.FlushQueue.runNextEvent(FlushQueue.java:122)
at com.intellij.openapi.application.impl.FlushQueue.flushNow(FlushQueue.java:41)
at java.desktop/java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:318)
at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:788)
at java.desktop/java.awt.EventQueue$3.run(EventQueue.java:739)
at java.desktop/java.awt.EventQueue$3.run(EventQueue.java:731)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:86)
at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:758)
at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.kt:666)
at com.intellij.ide.IdeEventQueue._dispatchEvent$lambda$7(IdeEventQueue.kt:570)
at com.intellij.openapi.application.impl.ApplicationImpl.withoutImplicitRead(ApplicationImpl.java:1446)
at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.kt:570)
at com.intellij.ide.IdeEventQueue.access$_dispatchEvent(IdeEventQueue.kt:68)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:349)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:348)
at com.intellij.openapi.progress.impl.CoreProgressManager.computePrioritized(CoreProgressManager.java:787)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:348)
at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:343)
at com.intellij.ide.IdeEventQueueKt.performActivity$lambda$1(IdeEventQueue.kt:994)
at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:105)
at com.intellij.ide.IdeEventQueueKt.performActivity(IdeEventQueue.kt:994)
at com.intellij.ide.IdeEventQueue.dispatchEvent$lambda$4(IdeEventQueue.kt:343)
at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.kt:385)
at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:207)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:117)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:113)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:105)
at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:92)
Suppressed: kotlinx.coroutines.DiagnosticCoroutineContextException: [StandaloneCoroutine{Cancelled}@65296840, EDT]
Caused by: io.ktor.client.plugins.ClientRequestException: Client request(POST https://api.openai.com/v1/chat/completions) invalid: 400 Bad Request. Text: "{
"error": {
"message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 747057 tokens. Please reduce the length of the messages.",
"type": "invalid_request_error",
"param": "messages",
"code": "context_length_exceeded"
}
}
"
at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invokeSuspend(DefaultResponseValidation.kt:54)
at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invoke(DefaultResponseValidation.kt)
at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invoke(DefaultResponseValidation.kt)
at io.ktor.client.plugins.HttpCallValidator.validateResponse(HttpCallValidator.kt:51)
at io.ktor.client.plugins.HttpCallValidator.access$validateResponse(HttpCallValidator.kt:43)
at io.ktor.client.plugins.HttpCallValidator$Companion$install$3.invokeSuspend(HttpCallValidator.kt:152)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
... 38 more