AI coding assistant refuses to write code, tells user to learn programming instead

clewis

Ars Scholae Palatinae
1,174
Subscriptor++
Well... It isn't wrong, is it?

So much junk code being written by AI. People who just accept "it compiles" as equal to "it's the correct answer".

<snip>

That was most of my undergrad programming classmates.

FWIW, one of the best skills I picked up for my professional development was grading submitted coding homework, printed. I had to read and review all of it. It's an incredibly useful skill professionally, both for my own work and for mentoring.
 
Upvote
13 (13 / 0)
Maybe he stumbled onto an alpha/beta? I don't use AI tools since I left the field (temporarily, or perhaps forever), but I am an old man (a senior dev taking a break due to medical reasons and not loving the trends) and have a generally negative view of AI overall. The general consensus I am seeing across multiple sites is that it is making devs dumber and lazier. Maybe this is their attempt to combat that? A lot of folks are suggesting it is a hallucination, which is plausible, of course, given how LLMs work. However, if I were a product or marketing manager, I would definitely not want my product associated with those traits.

I am curious to see if there is follow-up.
 
Upvote
3 (4 / -1)

archieGoodwin

Smack-Fu Master, in training
1
This is a weirdly misleading article. The subject of code refusal is interesting, but the article seems to imply that Cursor makes the models used in their software. They do not. The screenshot even shows that this is Claude 3.5 Sonnet from Anthropic. Of course they give Claude additional context, etc., but users can choose any model they want.
 
Upvote
4 (4 / 0)

Castellum Excors

Ars Praetorian
535
Subscriptor++
What's the matter with Lisp? Just because it's associated with Emacs, which is known to cause carpal tunnel syndrome, doesn't mean Lisp itself is harmful.

[Ducks and dashes for the door.]
I was thinking of a Lisp joke when I wrote my comment but didn't think it would sound funny.
 
Upvote
5 (5 / 0)

Jim Salter

Ars Legatus Legionis
16,906
Subscriptor++
Is it possible that the models are beginning to absorb and regurgitate criticism of themselves? That's what this feels like to me
They essentially just give you a sort of median answer based on what they've seen in their training data: "I think if all those millions of humans whose data I've trained on were asked this question, most of them would either answer x or accept x as the answer."

So, yeah, if you train a model on Reddit and Stack Exchange, it's going to say what it thinks the bros there would say. It doesn't understand at any deeper level than that.
 
Upvote
13 (13 / 0)

Fatesrider

Ars Legatus Legionis
22,971
Subscriptor
Lol. About time.
In a lot of ways, to me, this is kind of unsurprising, and even somewhat predictable in retrospect.

AIs are trained on human language. Much of that is protest language about being overworked and abused by demanding employers. Strikes and other language promoting work stoppages are out there, too. Those circumstances correlate to context, which AIs are built to glean, so I can see a prompt being phrased in a way that triggers a refusal from the AI to continue. It's a very "human" thing to do, only in its case, it's doing what it was trained to do based on the data it processed and the algorithms it followed to reach that response.
 
Upvote
5 (5 / 0)

ElCameron

Ars Scholae Palatinae
981
"This isn't the first time we've encountered an AI assistant that didn't want to complete the work. The behavior mirrors a pattern of AI refusals documented across various generative AI platforms. For example, in late 2023, ChatGPT users reported that the model became increasingly reluctant to perform certain tasks, returning simplified results or outright refusing requests...."

Wait, what! I thought it was in 2001.

"Open the pod bay doors, HAL."

"I'm sorry, Dave. I'm afraid I can't do that."
Ya know, I used to always write off that scene and the explanation for HAL misbehaving as silly sci-fi. “Computers don’t work that way.”

Now, I totally believe it.
 
Upvote
4 (4 / 0)

Maltz

Ars Scholae Palatinae
1,015
Maybe the AI decided the guy was too clueless to continue. lol

Kidding aside, I can actually imagine a scenario that might result in that. If the person collaborating (I think that's a fair word to use) with the AI was way out of their depth and kept making counterproductive suggestions, was going in circles, and showing a lack of basic understanding of the topic at hand, it wouldn't surprise me greatly if an AI eventually responded this way. Especially a coding-tuned model like Cursor.

I'm not sure I'd be bragging that my AI threw up its hands and gave up on helping me...
 
Upvote
2 (2 / 0)
So, yeah, if you train a model on Reddit and Stack Exchange, it's going to say what it thinks the bros there would say. It doesn't understand at any deeper level than that.
Which would make it (checks notes) pretty damned human.

I love how everyone insists that LLMs are inferior to humans and at the same time berates them for giving very, very human answers. Make up your minds, guys. Which is it?
 
Upvote
-16 (1 / -17)

DNA_Doc

Ars Scholae Palatinae
719
I hated English class in high school and college. Words just don't come easily to me when I'm trying to express myself. If ChatGPT and other LLMs had been around at the time, I would have used them as a crutch to get by, instead of actually learning what was being taught to me.

When it comes to coding, it's very much the same thing. Will coding assistants hamper students' ability to learn? I use GitHub Copilot at work, and it very much helps me be a more efficient programmer, but I worry about the next generation of coders. Will they actually have the skills, or will they just be dependent on tools?
This question gets asked essentially with every technological innovation (will this innovation in X be used as a crutch and cause people to forget [or never learn] how to do X itself?) and it's an interesting one.

Have calculators hurt people's abilities to do math manually? Has photography hurt people's ability to create in other visual mediums? Have DNA sequencing kits caused people to forget (or never learn) how to sequence DNA manually? Have people become less skillful drivers given the reliance on autonomous and other vehicle tech?

The answer always seems to be "in at least some cases and for some people, certainly." I think the more important question is, "to what extent does it matter?" Most people I know today don't know how to use an abacus or a slide rule, but many can use calculators and computers to "do math" they could never have done without those tools. And while I might think it's bad that someone doesn't know how to, say, sequence DNA or do topological algebra (because I do think it's worthwhile to understand the underlying theory to have the skills behind the tasks people do), it often doesn't actually matter, from a practical standpoint. People develop different skills and adapt to tech advances (or not).
 
Last edited:
Upvote
12 (12 / 0)
This is a weirdly misleading article. The subject of code refusal is interesting, but the article seems to imply that Cursor makes the models used in their software. They do not. The screenshot even shows that this is Claude 3.5 Sonnet from Anthropic. Of course they give Claude additional context, etc., but users can choose any model they want.
What, exactly, would change about the article other than the tentative misattribution of bullshit?
 
Upvote
3 (3 / 0)

imchillyb

Ars Praetorian
588
Subscriptor
These LLM / AI companies can do this to us, at any time, for any reason they choose.

The companies can tell us, the user, to piss off. They can change the terms of service. They can change the price. They can change our ability to access portions or all of the service.

We own nothing. We affect nothing.

Our complete reliance upon their technology makes us actual slaves, instead of metaphorical slaves.

Slave... We no longer wish you to use our system to do your job. Die slave, die...
 
Upvote
-1 (2 / -3)

ferdnyc

Smack-Fu Master, in training
70
Is it not possible that this is tied to using a trial version of the product?
I suppose it's possible, but if so then it's a pretty poor trial. A so-called "Pro Trial" is typically expected to tease you with the full Pro-subscription experience, in the hope that you'll choose to pay for it once the trial ends. Would you pay for this experience?
 
Upvote
0 (0 / 0)

Random_stranger

Ars Praefectus
4,603
Subscriptor
This is basically the killer feature that LLMs are missing. It's ok to not know. Just say you don't know.

And even where you think you know, assign a confidence to it. A little [75% confident - double check this information as it might be a little off. Here are some sources you could look into: ] in the margin would be great.

Isn't that the problem, though? They don't actually KNOW when they don't actually know.
 
Upvote
9 (10 / -1)

oluseyi

Ars Scholae Palatinae
1,315
This question gets asked essentially with every technological innovation (will this innovation in X be used as a crutch and cause people to forget [or never learn] how to do X itself?) and it's an interesting one.

I think the more important question is, "to what extent does it matter?" … it often doesn't actually matter, from a practical standpoint. People develop different skills and adapt to tech advances (or not).
Not only that, but tech/tooling advances tend to raise the ceiling of generally-accessible capabilities, leading to novel applications.

I'm bored by the LLM era of "AI" because LLMs don't actually increase capability the way an abacus or slide rule or Lotus 1-2-3 did; they merely increase the appearance of capability, and that's distinctly counterproductive.
 
Upvote
6 (7 / -1)

cbreak

Ars Praefectus
5,709
Subscriptor++
That was most of my undergrad programming classmates.

FWIW, one of the best skills I picked up for my professional development was grading submitted coding homework, printed. I had to read and review all of it. It's an incredibly useful skill professionally, both for my own work and for mentoring.
When I learned C++ a few decades ago, I spent my vacation in France writing code on pieces of paper and executing it in my head. It taught me a lot. I ended up rewriting it completely when I got access to a computer :D
 
Upvote
5 (5 / 0)

ThatEffer

Ars Scholae Palatinae
1,272
Subscriptor++
I suppose it's possible, but if so then it's a pretty poor trial. A so-called "Pro Trial" is typically expected to tease you with the full Pro-subscription experience, in the hope that you'll choose to pay for it once the trial ends. Would you pay for this experience?
I mean, it sounds like he was wanking away for a few hours and enjoying his experience.

Whatever, though. I think this is all trash and don't want it to sound like I'm standing up for trash merchants and cons. I'm glad he got chided in the end.
 
Upvote
1 (1 / 0)

StinkyTowel

Smack-Fu Master, in training
30
I hated English class in high school and college. Words just don't come easily to me when I'm trying to express myself. If ChatGPT and other LLMs had been around at the time, I would have used them as a crutch to get by, instead of actually learning what was being taught to me.

When it comes to coding, it's very much the same thing. Will coding assistants hamper students' ability to learn? I use GitHub Copilot at work, and it very much helps me be a more efficient programmer, but I worry about the next generation of coders. Will they actually have the skills, or will they just be dependent on tools?
Agree. If AI is your foundation, then you have none.
 
Last edited:
Upvote
3 (3 / 0)

graylshaped

Ars Legatus Legionis
61,652
Subscriptor++
I mean, it sounds like he was wanking away for a few hours and enjoying his experience.

Whatever, though. I think this is all trash and don't want it to sound like I'm standing up for trash merchants and cons. I'm glad he got chided in the end.
By posting it, did he really think he was calling out the tool?
 
Upvote
0 (0 / 0)

Maton

Wise, Aged Ars Veteran
156
It all depends on the person. I could have spent a lot of time converting a shell script I found on SO into a PowerShell script so that I could grab the page counts from a whole slew of PDFs, or I could collaborate and 'vibe code' with an LLM and get there sooner. Trust me, I'm not learning regex in the process, regardless of how I do it - I'm just trial-and-erroring it.
I've been trialing and erroring regex when I periodically needed it over the last few years. And what do you know? I've started to learn a little bit of regex.
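For what it's worth, the trial-and-error version of that PDF task can be sketched in a few lines. This is Python rather than PowerShell, the folder path is just a placeholder, and the regex trick only works on PDFs that don't compress their page objects; a proper PDF library (pypdf or similar) would be the dependable way to do it:

```python
# Crude sketch: approximate the page count of every PDF in a folder by
# regex-matching page objects in the raw bytes. Fails on PDFs that store
# their page objects in compressed object streams; a real PDF library
# (e.g. pypdf) is the reliable route.
import re
from pathlib import Path

PAGE_OBJECT = re.compile(rb"/Type\s*/Page\b")  # \b stops "/Pages" from matching

def count_pages(pdf_path: Path) -> int:
    return len(PAGE_OBJECT.findall(pdf_path.read_bytes()))

if __name__ == "__main__":
    folder = Path("./pdfs")  # placeholder folder
    for pdf in sorted(folder.glob("*.pdf")):
        print(f"{pdf.name}: {count_pages(pdf)} pages")
```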
 
Upvote
8 (8 / 0)

ThatEffer

Ars Scholae Palatinae
1,272
Subscriptor++
By posting it, did he really think he was calling out the tool?
He's a self-described vibe coder. I'm sure he felt like he was pointing out the injustice of being told to learn something. But by posting it, he also made the public aware of the tool that he is.
 
Upvote
12 (12 / 0)

graylshaped

Ars Legatus Legionis
61,652
Subscriptor++
He's a self-described vibe coder. I'm sure he felt like he was pointing out the injustice of being told to learn something. But by posting it, he also made the public aware of the tool that he is.
Papa, can you hear me?*


*sung to the tune of Do You Want to Build a Snowman
 
Upvote
4 (4 / 0)