
blarc / ai-commits-intellij-plugin


AI Commits for IntelliJ-based IDEs/Android Studio.

Home Page: https://plugins.jetbrains.com/plugin/21335-ai-commits

License: MIT License

Kotlin 100.00%
ai chatgpt commit intellij intellij-plugin commit-message generation jetbrains

ai-commits-intellij-plugin's People

Contributors

actions-user, aronbraun, blarc, dependabot[bot], dylandelobel



ai-commits-intellij-plugin's Issues

Locale setup problems

I'm South Korean.
Even though I changed the locale to Korean during setup, the generated messages still come out only in English.

Fails to build

I opened IntelliJ IDEA 2023.3.1 (Community Edition), chose Get from VCS, pasted the clone URL, and when the project opened I got this:

`A problem occurred configuring root project 'ai-commits-intellij-plugin'.

Could not resolve all files for configuration ':classpath'.
Could not resolve org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1.
Required by:
project : > org.jetbrains.intellij:org.jetbrains.intellij.gradle.plugin:1.16.1
> No matching variant of org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 was found. The consumer was configured to find a library for use during runtime, compatible with Java 8, packaged as a jar, and its dependencies declared externally, as well as attribute 'org.gradle.plugin.api-version' with value '8.0.2' but:
- Variant 'apiElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'javadocElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'runtimeElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'sourcesElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin:1.16.1 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'testFixturesApiElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin-test-fixtures:1.16.1 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'testFixturesRuntimeElements' capability org.jetbrains.intellij.plugins:gradle-intellij-plugin-test-fixtures:1.16.1 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
Could not resolve org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0.
Required by:
project : > org.jetbrains.changelog:org.jetbrains.changelog.gradle.plugin:2.2.0
> No matching variant of org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 was found. The consumer was configured to find a library for use during runtime, compatible with Java 8, packaged as a jar, and its dependencies declared externally, as well as attribute 'org.gradle.plugin.api-version' with value '8.0.2' but:
- Variant 'apiElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a library, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component for use during compile-time, compatible with Java 11 and the consumer needed a component for use during runtime, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'javadocElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'runtimeElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a library for use during runtime, packaged as a jar, and its dependencies declared externally:
- Incompatible because this component declares a component, compatible with Java 11 and the consumer needed a component, compatible with Java 8
- Other compatible attribute:
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')
- Variant 'sourcesElements' capability org.jetbrains.intellij.plugins:gradle-changelog-plugin:2.2.0 declares a component for use during runtime, and its dependencies declared externally:
- Incompatible because this component declares documentation and the consumer needed a library
- Other compatible attributes:
- Doesn't say anything about its target Java version (required compatibility with Java 8)
- Doesn't say anything about its elements (required them packaged as a jar)
- Doesn't say anything about org.gradle.plugin.api-version (required '8.0.2')

  • Try:

Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

`

Support more models?

OpenAI also has many more models available; would you mind adding them?

Here is the list of them all:

MODEL = Literal[
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-16k",
    "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k-0613",
    "gpt-4-0613",
    "gpt-4",
    "gpt-4-32k",
    "gpt-4-32k-0613",
     "gpt-3.5-turbo-instruct",
    "gpt-4-0314",
    "gpt-3.5-turbo-0301",
    "gpt-4-32k-0314"
]

Sometimes the diff will be very long; I think the -32k models could handle it well.
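As a rough illustration of why the larger windows would help, here is a hypothetical model picker; the context-window sizes and the ~4 characters-per-token estimate are assumptions for the sketch, not plugin behavior:

```python
from typing import Optional

# Assumed context-window sizes for the models mentioned in this issue.
CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 4_096,
    "gpt-4": 8_192,
    "gpt-3.5-turbo-16k": 16_384,
    "gpt-4-32k": 32_768,
}

def pick_model(diff: str, reserve_tokens: int = 500) -> Optional[str]:
    """Return the smallest model whose window fits the diff plus a prompt reserve.

    Uses a crude ~4 characters-per-token estimate instead of a real tokenizer.
    """
    estimated = len(diff) // 4 + reserve_tokens
    for model, window in sorted(CONTEXT_WINDOWS.items(), key=lambda kv: kv[1]):
        if estimated <= window:
            return model
    return None  # even the -32k window is too small
```

A small diff would stay on the cheap default, while a very large staged diff would route to a -32k variant instead of failing outright.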

Add feedback on the progress of the request

This morning OpenAI was replying slowly, I suppose.

Without any feedback, I clicked the AI commit button again; still nothing, so I clicked a third time.
Then my commit message changed about three times in quick succession.

I would like some feedback on where the request stands, and/or to be prevented from sending a new request while the first one is still in progress. 🙏

Customize number of lines of context in diff

Thank you for making this plugin and open sourcing it!

Have you considered supporting the --unified flag?

-U<n>
--unified=<n>
Generate diffs with <n> lines of context instead of the usual three. Implies --patch.

To begin with, making this an option in the plugin settings would be great.

(To be even more sophisticated, the plugin could set --unified dynamically based on the diff, for example to the largest value that will still allow the diff+prompt to fit in the context)
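A minimal sketch of how the requested setting could be wired up; the function names are hypothetical, but `--unified=<n>` is a standard git flag:

```python
import subprocess

def diff_command(context_lines: int = 3) -> list:
    """Build the git command; 3 mirrors git's default amount of context."""
    return ["git", "diff", "--staged", f"--unified={context_lines}"]

def staged_diff(context_lines: int = 3) -> str:
    """Run the command in the current repository and return the diff text."""
    result = subprocess.run(diff_command(context_lines),
                            capture_output=True, text=True, check=True)
    return result.stdout
```

The dynamic variant mentioned above would retry `diff_command` with decreasing `context_lines` until the resulting diff fits the model's context.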

Share Template Prompt:Standardized

Task:
Write a clean and comprehensive GitHub commit message in the conventional commit convention. I'll send you the output of the `git diff --staged` command, and you convert it into a commit message. Lines must not be longer than 74 characters. Use {locale} language to answer.
Do NOT add any descriptions to the commit, only the commit message. No Co-authored-by.
Raw:
{diff}
CommitMessageFormat:
   [emoji] <type>[optional scope]: <description>
   [optional body]
   [optional footer(s)]
TaskResult:

Advanced Configuration for Better Standardized Outputs through Simulated Interactions

In order to improve the standardization of the output generated by the LLM, I propose we implement an advanced configuration that uses simulated interactions. This would allow for a more refined, context-aware generation of commit messages based on specific interaction patterns.

The intended operational flow is as follows:

system: [basic prompt]
user: [sample diff]
assistant: [sample commit message]
user: [real diff]

In this scenario, the LLM would then generate a commit message that is more accurately formatted according to the provided sample. This could potentially lead to increased efficiency in crafting commit messages that adhere to the preferred style of the individual or team, thereby improving overall project workflow.

Looking forward to your feedback.
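The flow above can be sketched in the Chat Completions message format; the helper name and sample values are placeholders:

```python
def build_messages(system_prompt: str, sample_diff: str,
                   sample_commit: str, real_diff: str) -> list:
    """Few-shot layout: one worked diff/commit pair before the real diff."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": sample_diff},
        {"role": "assistant", "content": sample_commit},
        {"role": "user", "content": real_diff},
    ]
```

The sample pair anchors the model to the team's preferred style, so the final user message (the real diff) is answered in the same shape.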

"Rate Limit reached" on valid OpenAI account

The plugin thinks for about 30 seconds and then outputs:

Error occurred: Rate limit reached for default-gpt-3.5-turbo in organization org-qhdxxxxslxlioFX7inZ on tokens per min. Limit: 90000 / min. Current: 0 / min. Contact us through our help center at help.openai.com if you continue to have issues.

Tried different prompts; same result.
The OpenAI account uses pay-as-you-go billing and works fine with every other integration.

locales

After restarting WebStorm (2023), the locale is reset to the default instead of the previously selected one.

Exclusion list

Add an exclusion list to the settings, to which you can add globs describing file paths that should not be used when generating the commit message.
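A minimal sketch of such a filter, assuming simple `fnmatch`-style globs (the plugin might prefer full gitignore semantics):

```python
from fnmatch import fnmatch

def filter_paths(paths, exclusion_globs):
    """Drop any staged path that matches one of the exclusion globs."""
    return [p for p in paths
            if not any(fnmatch(p, g) for g in exclusion_globs)]
```

Lock files and build output are the typical candidates: they bloat the diff without saying anything useful about the change.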

Proxy settings need hints

The proxy settings field needs some hints: it is now common to use a self-deployed reverse proxy server as an HTTP proxy, while the application actually expects an HTTP/SOCKS proxy address.

Replace token not working


Steps:

  1. Delete the previous token from OpenAI
  2. Generate new token
  3. Add new token to plugin and verify
  4. Run generate commit

Option to set a custom timeout

Thank you for great work on this plugin.
I connected the plugin to a local OpenAI-like server, and generating a response sometimes takes longer than the currently set socket-timeout: 30000 ms.
Could you add a field to customize this timeout?

Thanks!
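The fallback logic such a setting would need is simple; the names and validation rule here are assumptions:

```python
def effective_timeout_ms(user_setting_ms=None, default_ms=30_000):
    """Use the user's timeout when set; fall back to the current 30 s default."""
    ms = default_ms if user_setting_ms is None else user_setting_ms
    if ms <= 0:
        raise ValueError("timeout must be positive")
    return ms
```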

java.lang.NullPointerException: null cannot be cast to non-null type

version 0.6.0

java.lang.NullPointerException: null cannot be cast to non-null type com.intellij.vcs.commit.AbstractCommitWorkflowHandler<*, *>
	at com.github.blarc.ai.commits.intellij.plugin.AICommitAction.actionPerformed(AICommitAction.kt:31)
	at com.intellij.openapi.actionSystem.ex.ActionUtil.doPerformActionOrShowPopup(ActionUtil.java:333)
	at com.intellij.openapi.keymap.impl.ActionProcessor.performAction(ActionProcessor.java:47)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher$1.performAction(IdeKeyEventDispatcher.java:585)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.lambda$doPerformActionInner$9(IdeKeyEventDispatcher.java:707)
	at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:105)
	at com.intellij.openapi.application.TransactionGuardImpl.performUserActivity(TransactionGuardImpl.java:94)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.lambda$doPerformActionInner$10(IdeKeyEventDispatcher.java:707)
	at com.intellij.openapi.actionSystem.ex.ActionUtil.performDumbAwareWithCallbacks(ActionUtil.java:356)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.doPerformActionInner(IdeKeyEventDispatcher.java:704)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processAction(IdeKeyEventDispatcher.java:648)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processAction(IdeKeyEventDispatcher.java:596)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.processActionOrWaitSecondStroke(IdeKeyEventDispatcher.java:480)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.inInitState(IdeKeyEventDispatcher.java:469)
	at com.intellij.openapi.keymap.impl.IdeKeyEventDispatcher.dispatchKeyEvent(IdeKeyEventDispatcher.java:225)
	at com.intellij.ide.IdeEventQueue.dispatchKeyEvent(IdeEventQueue.kt:598)
	at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.kt:568)
	at com.intellij.ide.IdeEventQueue.access$_dispatchEvent(IdeEventQueue.kt:68)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:349)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:348)
	at com.intellij.openapi.progress.impl.CoreProgressManager.computePrioritized(CoreProgressManager.java:787)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:348)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:343)
	at com.intellij.ide.IdeEventQueueKt.performActivity$lambda$1(IdeEventQueue.kt:994)
	at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:113)
	at com.intellij.ide.IdeEventQueueKt.performActivity(IdeEventQueue.kt:994)
	at com.intellij.ide.IdeEventQueue.dispatchEvent$lambda$4(IdeEventQueue.kt:343)
	at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
	at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.kt:385)
	at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:207)
	at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
	at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:117)
	at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:113)
	at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:105)
	at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:92)

Add gpt-4-turbo to models list

The gpt-4-1106-preview model is much cheaper to use than gpt-4 (and supposedly better), it would be nice to be able to select this model in the plugin settings.

Unable to see additional models

Hello,

I've confirmed my API key works. I can see many more models in my playground, and have confirmed the same via Postman (below).

But I don't see any additional models in the model dropdown. Is anyone using a 16k model?

What I've done:

  • Added the API Key
  • Confirmed it works by generating a commit message
  • Restarted PhpStorm
  • Pressed Refresh

Am I missing something?

(Screenshots attached showing the refresh in action, the model list, and the model confirmation.)

Crash when startup

    at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:280)
    at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:85)
    at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:59)
    at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source)
    at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:38)
    at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source)
    at com.intellij.idea.Main.mainImpl(Main.kt:63)
    at com.intellij.idea.Main.mainImpl$default(Main.kt:49)
    at com.intellij.idea.Main.main(Main.kt:46)
    Suppressed: com.intellij.diagnostic.PluginException: com.github.blarc.ai.commits.intellij.plugin.settings.AppSettings [Plugin: com.github.blarc.ai-commits-intellij-plugin]
        at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2(ComponentManagerImpl.kt:854)
        at com.intellij.serviceContainer.ComponentManagerImpl.registerServices(ComponentManagerImpl.kt:806)
        at com.intellij.serviceContainer.ComponentManagerImpl.registerComponents(ComponentManagerImpl.kt:403)
        at com.intellij.openapi.client.ClientAwareComponentManager.registerComponents(ClientAwareComponentManager.kt:58)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invokeSuspend(ApplicationLoader.kt:197)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invoke(ApplicationLoader.kt)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$2.invoke(ApplicationLoader.kt)
        at kotlinx.coroutines.intrinsics.UndispatchedKt.startUndispatchedOrReturn(Undispatched.kt:78)
        at kotlinx.coroutines.BuildersKt__Builders_commonKt.withContext(Builders.common.kt:167)
        at kotlinx.coroutines.BuildersKt.withContext(Unknown Source)
        at com.intellij.platform.diagnostic.telemetry.impl.TracerKt.span(tracer.kt:56)
        at com.intellij.platform.diagnostic.telemetry.impl.TracerKt.span$default(tracer.kt:49)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader.initServiceContainer(ApplicationLoader.kt:196)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader.access$initServiceContainer(ApplicationLoader.kt:1)
        at com.intellij.platform.ide.bootstrap.ApplicationLoader$initServiceContainer$1.invokeSuspend(ApplicationLoader.kt)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.UndispatchedCoroutine.afterResume(CoroutineContext.kt:270)
        at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:102)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
    Caused by: com.intellij.platform.instanceContainer.internal.InstanceAlreadyRegisteredException: com.github.blarc.ai.commits.intellij.plugin.settings.AppSettings
        at com.intellij.platform.instanceContainer.internal.InstanceRegistrarImpl.registerInitializer(InstanceRegistrarImpl.kt:32)
        at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2Inner(ComponentManagerImpl.kt:910)
        at com.intellij.serviceContainer.ComponentManagerImpl.registerServices2(ComponentManagerImpl.kt:847)
        ... 23 more

PyCharm 2023.3 - AI Commits 1.5.0

Models other than gpt-3.5-turbo do not work

Bug description

Selecting gpt-4, gpt-3, or any variant thereof in PyCharm gives the error:
"the model <> does not exist"

aicommits version

2.2.1

Environment

Windows 10 2023.1.2
Pycharm: PC-231.9011.38
Version 2023.1.2
PyCharm 2023.1.2 (Community Edition)
Build #PC-231.9011.38, built on May 16, 2023
Runtime version: 17.0.6+10-b829.9 amd64
VM: OpenJDK 64-Bit Server VM by JetBrains s.r.o.
Windows 10.0
GC: G1 Young Generation, G1 Old Generation
Memory: 2048M
Cores: 4
Registry:
ide.balloon.shadow.size=0

How to handle non-text files

Need to test what happens when non-text files are used for computing the diff and consequently the commit message.
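One common heuristic, similar to git's own binary detection, is to look for a NUL byte near the start of the file and skip such files when building the diff; this is a sketch of the idea, not what the plugin currently does:

```python
def looks_binary(data: bytes, sniff_bytes: int = 8_000) -> bool:
    """Treat content with a NUL byte in the first chunk as binary."""
    return b"\x00" in data[:sniff_bytes]
```

Files flagged as binary could then be summarized ("binary file changed") rather than fed to the model as raw bytes.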

OpenAI token is reset on new IDE launch

I've set the OpenAI token and verified it. I can use it just fine. But whenever I restart PyCharm, the token is empty and I have to add it again. How can I fix this, please?

Commits are too verbose since the latest update

Hey 👋
I think commit messages should stay concise and short, cf. https://cbea.ms/git-commit/

Change in my code

const joinClanButton = new ActionRowBuilder<ButtonBuilder>()
    .addComponents(
        new ButtonBuilder()
+            .setLabel(`Rejoindre ${clan.name}`)
-            .setLabel(`Rejoindre clan name`)
            .setStyle(ButtonStyle.Link),
     );

Here are multiple results that I feel are too much:

The only change in this code is the label of the joinClanButton. Instead of a fixed label "Rejoindre le Clan", it now uses a dynamic label that includes the name of the clan: Rejoindre ${clan.name}. This makes the button more informative for the user and helps to differentiate between multiple clans. No other changes were made in this file.

344 CHARACTERS

The change in this diff is a simple update to a label in a Discord embed message. The label for a button that allows users to join a Clash of Clans clan is updated to include the name of the clan. This change makes it clearer to users which clan they are joining when they click the button.

290 CHARACTERS

The change in this commit modifies the label of the "Join Clan" button in the embed message sent by the BrulerieBot. Instead of just displaying "Rejoindre le Clan," it now displays "Rejoindre" followed by the name of the clan. This change makes it clearer to the user which clan they are joining when they click the button. No other changes were made in this commit.

366 CHARACTERS

None of these commit messages would fit in a git history, the GitHub commits page, or anything else that lists commits.
Would you like to make some changes to this? 🙏
Is there a way to downgrade to 0.4 on build 231? 🤔

New feature request: Allow users to set a network proxy

Description:

I'm using the ai-commits-intellij-plugin, but I cannot reach the OpenAI API from mainland China due to internet restrictions. Therefore, I would like to request a new feature that allows users to set a network proxy in the plugin's settings, so that users in China can access the OpenAI API.

Expected behavior:

  1. The plugin should allow users to set a network proxy in its settings.

Additional information:

We can't reach OpenAI from China.
Adding support for network proxies would greatly benefit users in China who want to use the ai-commits-intellij-plugin to access the OpenAI API.
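For shape only, here is what honoring a proxy setting involves, illustrated with Python's standard library (the plugin itself is Kotlin, and the URL is a placeholder):

```python
import urllib.request
from typing import Optional

def build_opener(proxy_url: Optional[str] = None) -> urllib.request.OpenerDirector:
    """proxy_url like 'http://127.0.0.1:7890'; None means a direct connection."""
    if proxy_url is None:
        return urllib.request.build_opener()
    proxy = urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(proxy)
```

The key point is that every API request must go through the configured opener; a setting that is stored but not applied to the HTTP client would not help.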

Selected prompt resets after editing custom prompt

Awesome plugin - thanks for this!

I've run into a small bug where my selected prompt gets reset to Basic.

Steps to reproduce

  1. Open the plugin settings and create a custom prompt
  2. Select the custom prompt to be the one used for message generation
  3. Hit Apply then OK to apply the settings change
  4. Open the settings back up
  5. Edit and update your custom prompt
  6. Hit Apply then OK again to apply the settings change
  7. Open the settings back up

Here you should see the selected prompt is Basic

Error if "This model's maximum context length is 4097"

Unhandled exception in [StandaloneCoroutine{Cancelling}@65296840, EDT]

com.aallam.openai.api.exception.OpenAIAPIException: This model's maximum context length is 4097 tokens. However, your messages resulted in 747057 tokens. Please reduce the length of the messages.
	at com.aallam.openai.client.internal.http.HttpTransport.handleException(HttpTransport.kt:43)
	at com.aallam.openai.client.internal.http.HttpTransport.perform(HttpTransport.kt:25)
	at com.aallam.openai.client.internal.http.HttpTransport$perform$1.invokeSuspend(HttpTransport.kt)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:104)
	at com.intellij.openapi.application.impl.DispatchedRunnable.run(DispatchedRunnable.kt:35)
	at com.intellij.openapi.application.TransactionGuardImpl.runWithWritingAllowed(TransactionGuardImpl.java:209)
	at com.intellij.openapi.application.TransactionGuardImpl.access$100(TransactionGuardImpl.java:21)
	at com.intellij.openapi.application.TransactionGuardImpl$1.run(TransactionGuardImpl.java:191)
	at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
	at com.intellij.openapi.application.impl.ApplicationImpl$3.run(ApplicationImpl.java:456)
	at com.intellij.openapi.application.impl.FlushQueue.doRun(FlushQueue.java:79)
	at com.intellij.openapi.application.impl.FlushQueue.runNextEvent(FlushQueue.java:122)
	at com.intellij.openapi.application.impl.FlushQueue.flushNow(FlushQueue.java:41)
	at java.desktop/java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:318)
	at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:788)
	at java.desktop/java.awt.EventQueue$3.run(EventQueue.java:739)
	at java.desktop/java.awt.EventQueue$3.run(EventQueue.java:731)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
	at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:86)
	at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:758)
	at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.kt:666)
	at com.intellij.ide.IdeEventQueue._dispatchEvent$lambda$7(IdeEventQueue.kt:570)
	at com.intellij.openapi.application.impl.ApplicationImpl.withoutImplicitRead(ApplicationImpl.java:1446)
	at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.kt:570)
	at com.intellij.ide.IdeEventQueue.access$_dispatchEvent(IdeEventQueue.kt:68)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:349)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1$1.compute(IdeEventQueue.kt:348)
	at com.intellij.openapi.progress.impl.CoreProgressManager.computePrioritized(CoreProgressManager.java:787)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:348)
	at com.intellij.ide.IdeEventQueue$dispatchEvent$processEventRunnable$1$1.invoke(IdeEventQueue.kt:343)
	at com.intellij.ide.IdeEventQueueKt.performActivity$lambda$1(IdeEventQueue.kt:994)
	at com.intellij.openapi.application.TransactionGuardImpl.performActivity(TransactionGuardImpl.java:105)
	at com.intellij.ide.IdeEventQueueKt.performActivity(IdeEventQueue.kt:994)
	at com.intellij.ide.IdeEventQueue.dispatchEvent$lambda$4(IdeEventQueue.kt:343)
	at com.intellij.openapi.application.impl.ApplicationImpl.runIntendedWriteActionOnCurrentThread(ApplicationImpl.java:831)
	at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.kt:385)
	at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:207)
	at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:128)
	at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:117)
	at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:113)
	at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:105)
	at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:92)
	Suppressed: kotlinx.coroutines.DiagnosticCoroutineContextException: [StandaloneCoroutine{Cancelled}@65296840, EDT]
Caused by: io.ktor.client.plugins.ClientRequestException: Client request(POST https://api.openai.com/v1/chat/completions) invalid: 400 Bad Request. Text: "{
  "error": {
    "message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 747057 tokens. Please reduce the length of the messages.",
    "type": "invalid_request_error",
    "param": "messages",
    "code": "context_length_exceeded"
  }
}
"
	at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invokeSuspend(DefaultResponseValidation.kt:54)
	at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invoke(DefaultResponseValidation.kt)
	at io.ktor.client.plugins.DefaultResponseValidationKt$addDefaultResponseValidation$1$1.invoke(DefaultResponseValidation.kt)
	at io.ktor.client.plugins.HttpCallValidator.validateResponse(HttpCallValidator.kt:51)
	at io.ktor.client.plugins.HttpCallValidator.access$validateResponse(HttpCallValidator.kt:43)
	at io.ktor.client.plugins.HttpCallValidator$Companion$install$3.invokeSuspend(HttpCallValidator.kt:152)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
	... 38 more
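Until the plugin handles this itself, one crude guard is to cap the diff at an estimated character budget before sending it; the 4 characters-per-token ratio and the reserve are rough assumptions, not an exact tokenizer:

```python
def truncate_diff(diff: str, max_context_tokens: int = 4_097,
                  reserve_tokens: int = 1_000, chars_per_token: int = 4) -> str:
    """Cap the diff so the estimated request size stays under the context window.

    reserve_tokens leaves room for the prompt template and the response.
    """
    budget = (max_context_tokens - reserve_tokens) * chars_per_token
    return diff[:budget]
```

A 747,057-token request like the one in this report would be cut down before it ever reaches the API, turning a hard failure into a (possibly incomplete) commit message.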
