24

Why do otherwise intelligent people think ChatGPT code is a good idea if they don't know what the code does?

I am a bit in shock over this. I asked about some lines of code that were using some templates I hadn't used before. The response was "I dunno, ChatGPT." This person is really, really smart, yet they're deploying code they don't completely understand. That seems dangerous and irresponsible. I ended up rewriting the function I had questions about. It was significantly shorter and didn't do a fuckton of copying strings around.

WTF is wrong with people? Are people afraid to think? Now I want to get out before this kind of shit becomes the norm.

Comments
  • 3
    “now I want to get out before this shit becomes a norm” - too late…
  • 4
    What happens to me is that I'm thinking about the bigger picture of what I'm working on, and if the generated code looks at all like it could be correct, I jump to the next section because that's what I'm already thinking about. I'm supposed to go through the generated code later, but it's easy to forget that part. The urge to try out the code is strong.

    In any case, the response should've been "ah shit, I forgot to polish that part". Luckily, it won't get past code review if the reviewer doesn't understand the code.
  • 3
    I wrote down a trick question for most variations on jackoff's algorithm, where I have them try to perform OpenGL batched draw calls in assembly. If it passes this test I'll consider it a reliable programmer, else no fucking way. A Turing test of sorts, for the modern age; none have passed it yet.

    But to the point, the ones that don't outright refuse to answer can merely produce something that 'looks' right, but very much isn't. If you just glance at the output you won't spot the subtle but downright dogshit register allocation, meaning the algorithm has done its job alright.

    It's the exact same story at every higher level: you will only get whatever fools the eye. It *may* be correct and it may not -- that is not a factor in bullshit generation, by definition.

    The core of artificial intelligence is about this illusion, not intelligence per se. Those who know this have known it since the 50s, and those who don't have merely lined themselves up to get fucked in the ass. Smart? Nope.
  • 10
    Pasting in some code from ChatGPT without reading it is no worse than adding a dependency on some random package without thinking about where it's come from, and people have been doing that for years and patting themselves on the back for not reinventing the wheel.
  • 2
    @donkulator you're talking about npm, aren't you?...
  • 2
    I can detects bad code. I never gets bad code from green internet mind. Green internet mind writes code. Me runs code. Code works. Me goes to next task.
  • 4
    That thing is really mediocre...
    I only really use it for bigger regexes when my brain shuts off.
  • 9
    Pasting in some code from ChatGPT without reading it is no worse than pasting some code from StackOverflow without reading.
  • 2
    Ah yes, what happened to the good old days when the answer to that question was "dunno, copied from Stack Overflow".

    Edit: see I was late to the party...
  • 1
    I got exactly the same question and was blamed for copy-pasting. My excuse was that it contained a regex. Regex is just complicated. I tested the GPT code. It worked.
  • 2
    @bad-practice I find regexes that are close to what I want. Then I learn what everything means and test with an online regex tester. Once I'm satisfied it works, I implement it (see the first sketch after the comments). Then I promptly forget everything I learned, so the next time I need a regex I relearn it all over again. lol
  • 2
    I blame the corporate incentive to do it as fast as possible rather than as well as possible.
  • 0
    I started using Grimoire recently; it gives me decent code for very simple stuff. I use it to automate simple tasks and it really helps.

    Regexes and simple functions are fine. I often ask it to also give me the output for some example inputs - sort of testing the code in advance (see the second sketch after the comments). It has worked OK so far.

    I do not use it for code that handles sensitive data.

    However, I haven't really tested it on more complex stuff, for which I usually use my brain.
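
A minimal sketch of the "test the regex before trusting it" workflow a couple of the commenters describe, in Python. The pattern, the looks_like_iso_date helper, and the test cases are all hypothetical; the point is only the shape of the check: a few strings the regex must accept and a few it must reject, run before the pattern goes anywhere near real code.

```python
import re

# Hypothetical example: a regex (found online or generated) that is
# supposed to match ISO-style dates like "2024-01-31". Before relying
# on it, pin its behaviour down with cases it must accept and reject.
ISO_DATE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def looks_like_iso_date(s: str) -> bool:
    # fullmatch means the whole string must match, not just a prefix.
    return ISO_DATE.fullmatch(s) is not None

if __name__ == "__main__":
    # Cases the regex should accept.
    assert looks_like_iso_date("2024-01-31")
    assert looks_like_iso_date("1999-12-01")
    # Cases it should reject.
    assert not looks_like_iso_date("31-01-2024")
    assert not looks_like_iso_date("2024-1-31")
    assert not looks_like_iso_date("2024-01-31T00:00")
    print("regex behaves as expected on these cases")
```

The same handful of assertions also doubles as a note to your future self about what the pattern was supposed to match, for the next time you've forgotten it.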
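
And a sketch of the "ask it for the output on some inputs" idea from the last comment, under the same caveat: slugify and the claimed input/output pairs are invented stand-ins for whatever the model actually produced. The check is simply to run the model's claimed pairs locally instead of taking them on faith.

```python
import re

def slugify(title: str) -> str:
    """Stand-in for a model-generated helper: lowercase the title,
    keep alphanumeric runs, join them with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

# Input/output pairs the model claimed when asked "what does this
# return for these inputs?" (made up here for illustration).
CLAIMED_CASES = [
    ("Hello, World!", "hello-world"),
    ("  Already   spaced  ", "already-spaced"),
    ("C++ in 2024", "c-in-2024"),
]

if __name__ == "__main__":
    for given, claimed in CLAIMED_CASES:
        got = slugify(given)
        status = "ok" if got == claimed else "MISMATCH"
        print(f"{status}: slugify({given!r}) -> {got!r} (claimed {claimed!r})")
```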