
Comments (7)

navidkpr avatar navidkpr commented on July 17, 2024 2

Hi,

Cursor's Interpreter mode is custom-built and does not use the OpenAI public APIs. So unfortunately we're not able to support it for API key users.

As for the model, we send the request with whatever model you have selected. If your requests are being routed to GPT-3.5 while you've chosen GPT-4, there might be a bug in the system.

You can double check that your request is being sent to GPT-4 by asking the model "who are you and which model are you powered by?"
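For API key users, the same question can also be sent straight to the API with your own key, to compare against what the Cursor chat reports. A minimal sketch (assuming the openai Python SDK v1.x and an OPENAI_API_KEY environment variable, neither of which this thread specifies):

```python
# Send the identity question straight to the API with your own key and compare
# the answer (and the served model name) with what Cursor's chat reports.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user",
               "content": "Who are you and which model are you powered by?"}],
)

# response.model records which model actually served the request,
# independent of whatever the assistant claims in its answer text.
print(response.model)
print(response.choices[0].message.content)
```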


pryh4ck avatar pryh4ck commented on July 17, 2024 2

[Screenshot 2024-03-08 at 7 55 26 PM]

and i just looked and I have $25 of credits in my OpenAI Plus account (I also pay for the API usage, which includes gpt-4, which includes interpreter mode) and.. what the heck. Why don't you guys let us use GPT-4 interpreter? I guess I will just build my own assistant with code interpreter enabled and see if I can run that in the terminal, or create my own extension.. idk, but that's kinda ridiculous to be honest.

And like I said, if you offered open source models, I wouldn't need to pay for OpenAI Plus, and thus I would be willing to pay you guys. But making people pay for something they are already paying to be able to do.. idk, is that even legal? Like, you are letting us use a service we are paying for, but restricting parts of that service we are already paying for, and forcing us to pay you? Very Apple-like of you... Oh wait, Apple has made an open source LLM production SDK and an open source MLLM, "Ferret".. wow, you guys are, like, worse than Apple in this regard. Love your IDE.. but I do not like the path you guys are heading down.. Just saying, and I guarantee I'm not the only one who feels this way...
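For reference, building "my own assistant with code interpreter enabled" against the public API is only a few lines. A rough sketch (assuming the openai Python SDK v1.x, which exposes the beta Assistants API, and an OPENAI_API_KEY environment variable; the assistant name is made up):

```python
# Create an assistant backed by GPT-4 with OpenAI's hosted code_interpreter
# tool enabled, paid for by your own API key.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

assistant = client.beta.assistants.create(
    name="my-code-interpreter",            # hypothetical name, pick your own
    instructions="You are a coding assistant. Run code whenever it helps.",
    model="gpt-4",
    tools=[{"type": "code_interpreter"}],
)

# Conversations then happen on a thread; the tool runs server-side at OpenAI.
thread = client.beta.threads.create()
print(assistant.id, thread.id)
```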


LionSR avatar LionSR commented on July 17, 2024 1

Wouldn't it make sense to automatically switch to the subscription GPT-4 model, if it exists, when on an API key?


LionSR avatar LionSR commented on July 17, 2024 1

[Screenshot 2024-03-08 at 7 55 26 PM]

and i just looked and I have $25 of credits in my OpenAI Plus account and.. what the heck. Why don't you guys let us use GPT-4 interpreter? [...]

and I honestly think your logic is messed up. The hardworking Cursor developers do not deserve this. Read this from above: "Cursor's Interpreter mode is custom-built and does not use the OpenAI public APIs". It is already amazing that Cursor managed to even have this when there are no public APIs.


pryh4ck avatar pryh4ck commented on July 17, 2024

Uhhh, I agree Cursor is amazing? And what the hell are you talking about, no public APIs? I use my gpt-4 API all the freaking time in other applications. And I THOUGHT I was using gpt-4 with Cursor and my API key, since, you know, they offer that functionality; however, I didn't know they were going to LIMIT the functionality of the API key itself. According to OpenAI (from their own documentation):
[Screenshot 2024-03-11 at 4 33 19 PM]

So I don't understand this:

Wouldn't it make sense to automatically switch to the subscription GPT-4 model, if it exists, when on an API key?

and I do not understand this:

"Cursor's Interpreter mode is custom-built and does not use the OpenAI public APIs". It is already amazing that Cursor managed to even have this when there are no public APIs.

What the hell are you talking about, no public APIs? Please refer to the screenshot showing how OpenAI's gpt-4 model is the one MADE to work with their interpreter mode, PUBLICLY. So by custom-building their own interpreter mode, they purposely limit the API key that you already pay for (and which includes interpreter mode), so that you have to spend 20 MORE dollars a month to use THEIR interpreter mode? Like I already said, Cursor is great, but why would I pay for something that I am already paying for, JUST to use it in THEIR IDE? I may as well go back to VSCodium and OpenAI assistants. Literally anyone could fork VSCode and offer a TRUE "use your own API key" and then allow you to use ALL functionality of said API key. That, OR, allow the use of open source models, so that I am not paying for an API key AND paying for interpreter mode (which is kinda sus; I would trust OpenAI and the assistant I made myself to actually execute things on my computer more than I would this small company that limits the API key you are already buying, yet allows you to use it, but not all of it, and won't allow you to use any open source models either).

And to all of you who say, "well then go fork it and make it yourself"..

That is a GREAT idea. It would be an excellent project, as NO one seems to be able to do it RIGHT. Even if it doesn't take off, I will have my own IDE with my OWN API KEY and its FULL use (including interpreter mode). This option looks especially appealing now that, after asking this question, it seems I AM NOT ABLE TO USE GPT-4 AT ALL WITH ANY OF MY API KEYS (which all have gpt-4 access, obviously; I pay for OpenAI Plus, so..). Normal chat or interpreter mode, it doesn't even matter, because this is the response I now get from Cursor every time:
[Screenshot 2024-03-11 at 4 12 29 PM]

And that wasn't always the case. Just yesterday I had a very long conversation with what I THOUGHT was gpt-4 (because Cursor said it was), AND I was using my API key... which makes it even stranger when it suddenly starts telling me that it's not even AWARE of a GPT-4... until I had to show it a screenshot of its own user input where GPT-4 is shown as the chosen model I am chatting with. That has never happened before. And now, I can't even use my OWN API. To make matters worse, I checked my API billing usage to make sure I had credit.. I do, I still have about $23 left..
[Screenshot 2024-03-11 at 4 19 33 PM]
[Screenshot 2024-03-11 at 4 19 52 PM]

and Cursor's OWN WEBSITE settings say:
[Screenshot 2024-03-11 at 4 21 46 PM]

Hmm, I have used zero of my allotted 50 gpt-4 requests, slow or not. I figured this was because I was using my API key....

Now it is starting to look like Cursor never even had gpt-4 functionality. At least on my end it looks like that, as I have not changed API keys, but now, even though I was chatting with what Cursor SAID was gpt-4, apparently it wasn't:
[Screenshot 2024-03-08 at 5 06 35 PM]

And now I can't even have a NORMAL chat with gpt-4 after posting this and trying interpreter mode. Now I ONLY EVER get this:
[Screenshot 2024-03-11 at 4 12 29 PM]

Strange, huh? Should I upload a video showing my API key working for GPT-4 in other places? But now suddenly it doesn't work in Cursor, literally right after putting this issue online.

Would someone explain to me where my logic is flawed, please? Because I am a pretty appreciative guy. I love(d) Cursor and turned as many of my friends to it as I could. It was so freaking helpful.. and now, after asking a simple question, I am being berated and told I'm illogical? WHY IS IT ILLOGICAL to be paying for a service, and to use another service that (reportedly) lets you use the service you are already paying for? (Or that lets you buy the pro version of theirs, or even gives you 50 "gpt-4" uses and 100 or whatever of gpt-3.)

My PUBLICLY AVAILABLE OpenAI key includes gpt-4. I can even import openai into a Jupyter notebook or Google Colab right now, give it my API key as a variable, and ask it for the list of model names I am able to use; it will show gpt-4, gpt-4-32k, and all the way up to the vision, dalle, and instruct models as available to me. That should again show you, @LionSR, that there most definitely are publicly available interpreter-mode APIs from OpenAI themselves:

[Screenshot 2024-03-11 at 4 33 19 PM]

According to their own docs, that was made to work with gpt-4. BUT, according to the Cursor chat (as I am talking to GPT-4, as I always am), it has never even HEARD of gpt-4; it repeatedly says that it is gpt-3, and every time I tried to tell it otherwise, it was "sorry for the confusion", it is an LLM by OpenAI called gpt-3, and there is no such thing as a gpt-4..

I think I am going to send all this to OpenAI themselves, dig up my system logs, do some forensics, and see if my Cursor usage called out to OpenAI, what it came back with, and which model actually did what: my API key, or the 10 or whatever gpt-3.5 requests that Cursor's settings apparently say I have made (along with 0 gpt-4 requests, out of 50). But NOW, suddenly, after using Cursor for months and always keeping my OpenAI key up to date, it won't even answer a question using normal chat OR interpreter mode, and it says:

[Screenshot 2024-03-11 at 4 12 29 PM]

But that is even stranger, as I should still have 50 slow requests according to the settings in Cursor, again:
[Screenshot 2024-03-11 at 4 21 46 PM]

So, strangely, literally the day after asking this question, my API key simply doesn't work for gpt-4 (in Cursor only; remember that I can use gpt-4 through the API or in the playground, and in fact I have a few GPTs I made myself that I could send you a link to, since GPT Plus users get to do that, as well as use gpt-4, whose API includes interpreter mode).
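The model-listing check mentioned above takes only a few lines in a notebook. A minimal sketch (assuming the openai Python SDK v1.x; the placeholder key is obviously not real):

```python
# List the model IDs this API key can actually use, e.g. from a Jupyter
# notebook or Google Colab, to confirm that gpt-4 variants are available.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # or set OPENAI_API_KEY and call OpenAI()

model_ids = sorted(model.id for model in client.models.list())
print([m for m in model_ids if m.startswith("gpt-4")])
```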

So yes, good job, Cursor, for creating LLM automation before OpenAI streamlined it so that non-programmers could even do it. Amazing.. helpful, but if you know Python and LangChain, or even PyTorch or TensorFlow, especially augmenting it with an LLM like gpt-3, I mean, it's not REALLY that amazing? For one, it's been in beta mode for as long as I can remember. Two, they said I could buy the pro version, or continue to use my own API key. I wish I would have known that the API key itself was going to be limited. Is that not flawed logic? To tell someone: sure, you can keep using our product, but it costs money, so you have to put in your own API key because we aren't going to be paying for all that. So you say, ok, got it, that makes sense, cool, and you put in your own API key. Well, if they are already using your API key for chat because they don't want to pay for it, then isn't it flawed logic, and also a little shady/foreshadowing of dark patterns, to be like,

"except for THIS part of the API, we dont want to pay for it, but we dont want you to pay someone esle for it, even though you already are, and even though we said you could use that API, we kinda lied, we want you to pay us for this part of it. even though you are paying for it yourself, and thats the whole reason we let you put your own API key in, is so you pay for it, and not us, we refuse to even LET you pay for this part of your API key, even though, you already are and we gave you that option."

But now I CAN'T EVEN HAVE A NORMAL CHAT with gpt-4, because suddenly my API key isn't set up for that (suddenly, even though it has been up until now, and I still have API credit to use their PUBLICLY AVAILABLE API, @LionSR, for interpreter mode)... which, like I said, was always, like, possible. If you know any sort of automation with ML, especially Python, or even just using LangChain, agents, and an LLM (LangChain came out like 3 years ago), then creating your own interpreter mode is not that AMAZING.

But now that my API key supports interpreter mode, why wouldn't you just allow me to pay for the cost of using interpreter mode as well, just like you do with chat mode? (Or do you? As I said, I thought I had been using gpt-4, but when I asked it about gpt-4, it had only ever heard of 3, even though it said I was talking to gpt-4. Or was I? So what were they even using (mining) my API key for, if this whole time I was only chatting with gpt-3? 🤔 That offset in LLM processing power could (very cleverly) and rather easily be redirected to fuel a certain... activity.) It would be interesting to work with OpenAI and maybe watch Wireshark. It's a long shot, but I've seen crazier things. Nonetheless, it would be interesting to hear OpenAI's opinion on this. And if they say that Cursor has every right to restrict use of my own paid-for API, even after giving me the option to use it, then I will concede, AND I will have a new project to work on!
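For what it's worth, the bare-bones version of such a loop (the model writes code, your machine runs it) is not much code either. A very rough sketch, assuming the openai Python SDK v1.x; the helper names are made up, and exec() on model output is shown only to illustrate the idea, not as something safe to run as-is:

```python
# Toy "interpreter mode": ask the model for a Python snippet, run it locally,
# and print what it produced. No sandboxing -- illustration only.
import contextlib
import io

from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment


def run_snippet(code: str) -> str:
    """Execute a Python snippet and capture anything it prints."""
    code = code.strip()
    if code.startswith("```"):
        # Strip markdown fences the model may wrap the snippet in.
        code = code.strip("`").removeprefix("python")
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(code, {})  # DANGEROUS on untrusted, model-written code
    return buffer.getvalue()


def interpreter_turn(task: str) -> str:
    # Ask the model for a runnable snippet, then execute it and return the output.
    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Reply with only a runnable Python snippet, no prose."},
            {"role": "user", "content": task},
        ],
    )
    return run_snippet(reply.choices[0].message.content)


print(interpreter_turn("Print the first 10 Fibonacci numbers."))
```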

But seriously, @LionSR, if you read ALL of what I wrote, where exactly is my logic flawed?


pryh4ck avatar pryh4ck commented on July 17, 2024

Wouldn't it make sense to automatically switch to the subscription GPT-4 model, if it exists, when on an API key?

I... AM on that, and have been on the gpt-4 API literally since it was available, months ago. I don't understand your question.


pryh4ck avatar pryh4ck commented on July 17, 2024

And again, "Interepreter mode" from cursor has been in beta for like ever. If it even works. I would not trust an already weirdly answering AI, that is supposed be be using my API key that includes gpt-r and an interpreter mode that is NOT in beta, but instead, denies that gpt-4 even exists, therefor, I have to imagine their interpreter mode is eithier using gpt-4, Which is the same model I AM supposed to be using, since you know, it ALLOWS you to use your OWN paid for gpt-4 access... You dont think its a LITTLE bit shady or strange that they expect you to HAVE a paid for API, (as who would be able to develop anything on a free API?) yet they make you pay for something that your PAID API that they already give you access too can do? Fuck your custom model. Either drop ALL use of people being able to use their API that they are paying for, if youre going to offer that as an option anyway. Then either offer it as an option, or dont offer it as an option. Don't pull some dark patterned shady stuff like this.. so they are either USING gpt-4 for interpreter mode, AND since the only model they offer IS openAI, they SAY its a custom built interpreter mode... what using gpt-4 and langchain? otherwise i would not trust an unnamed, non open source model to attempt interpreter mode on my computer that is.. asinine. Am I THE ONLY ONE HERE? So, its "CUSTOM" interpreter mode (which is still in beta) is actually just gpt-4 but they just want more money, THEN DONT OFFER THE OPTION for poeple to use thier api key at all and save people the trouble.... or IT ITSELF is using an open source model and asking money for it, (in which case it must disclose that license and what it is using it for, custom built or not) OR, it is using its own foundation model for code interpretation mode, not disclosing what that model is or looks like, and then FORCING people to use their unnamed, unreleased, beta version, interpreter mode (WHICH DIRECTLY MAKES CHANGES TO YOUR FILESYSTEM AND COMPUTER)... then I would MUCH more trust gpt-4 (who i should have been talking to this entire time, when i learn apparently I wasnt.. according to the chatbot itself. I am now asking OpenAI to look into the matter for me, and thank god we already have a repport because it is ridiculously hard to get ahold of them. so we will get to the bottom of this). and Do i think the Cursor developers deserve this?

If I am right about any of this, then yes, if only because my paid-for API, which they gave us the option to use in the first place, is being denied its full functionality in favor of their "custom built" BETA interpreter mode.

So, to whoever downvoted me for saying it looks like it is time to just build an assistant with the APIs I already pay for through OpenAI, which has interpreter mode: you are more than welcome to pay for their unknown AI that changes, creates, or deletes files on your computer with no transparency at all. I, for one, like to know EXACTLY what is happening when I run something through an LLM, and which LLM or foundation model I am running it through, AND I DO NOT APPRECIATE being offered the ability to use my own API key, but not ALL of it, especially when the only alternative (through Cursor) is to pay for their beta "custom built" interpreter mode. Like I said, at that point I might as well build my own.

Shit, it wouldn't even be hard to create an Atom-like text editor with a built-in LLM chat AND an assistant with interpreter mode, put it on GitHub as open source, and let people ACTUALLY use their own API keys... In fact, if you are reading this and this sounds like a project you would want to pursue (because the LAST THING WE NEED is a CLOSED-source IDE with a built-in AI), get in touch. Plus, I already have a local-LLM-powered co-developer (forked, but I have added some cool changes; I've just been waiting for a good opportunity to show them.. maybe this is it?). Anyway, my LLM code-helper thing IS able to use open source models, and that could EASILY be integrated into an IDE or advanced text editor. Think DevGPT.. it's very similar.

As far as interpreter mode goes, that would be the only thing people would need to pay us for, if they were not using their own API: say they were using this new text editor with only open source models, so it's totally free. Unless there is an open source model that is as good as GPT-4's interpreter mode, I only want to use what we know works, and gpt-4 interpreter mode is the only one I trust. If people are using their own paid-for OpenAI API key, then of course they can still use it for free. If they are using our product (mine and whoever else would like to join said project; again, get in touch) with open source models, but they want to use interpreter mode, then I would only want them to use gpt-4.. but I'm thinking no subscriptions. Just: if you want to use interpreter mode but don't want to pay for GPT-4 API use, then you pay US as you go, either wholesale to keep it truly open source, or FOSS it and charge just a SMALL fee if you use OUR product; but if it is FOSS, then people can BUY it if they want to change any of the code inside it. Free and open source is different from open source, and ironically "free and open source" doesn't always mean the product is free, while "open source" means the product is actually free. And if you don't understand that, then PLEASE don't even bother responding to this thread, much less try to tell me I am wrong, and MUCH, MUCH less, talk to me at all. Thanks.

I am a developer, noobish, but not dumb, and a quick learner. For example, I don't ask dumb things like "wouldn't it make sense to automatically switch to the paid version of woasrkf aqhgwoinkg;asd f....????" Like, what does that even mean? Am I the asshole here? Please tell me, after reading all this, that I'm not crazy and am not the only one who thinks this way... Do they deserve this? YES they do; what the hell else is the ISSUES section of GitHub for? I didn't call them names, or tell them they were amazing devs, or tell them I hated their product.
I told them I don't like where they are headed, and it's the mindset of @LionSR that allowed Microsoft to almost monopolize the PC market, and why so many people implicitly trust that awful, unethical, most security-vulnerable company in the history of tech, and still do, and sadly will continue to.

