• RedFrank24@lemmy.world · 20 hours ago

    I have certainly found that to be the case with developers working with me. They run into a small problem, so they instantly go to Copilot and paste in whatever it says the answer is. Then, because they don’t know what they’re asking or fully understand the problem, they can’t comprehend the answer either. Later, they come to me having installed a library they don’t understand, along with code hallucinated by Copilot, and ask me why it’s not working.

    A little bit of stepping back and asking “What do I hope to achieve with this?” and “Why do I have to do it this way?” goes a long way. It stops you from going down rabbit holes.

    Then again, isn’t that what people used to do with StackOverflow?

    • WhatsTheHoldup@lemmy.ml · 20 hours ago

      Then again, isn’t that what people used to do with StackOverflow?

      Yes, one of the major issues with StackOverflow that answerers complained about a lot was the “XY problem”.

      https://meta.stackexchange.com/questions/66377/what-is-the-xy-problem

      That’s where you’re trying to do X, but because you’re inexperienced you erroneously decide Y must be the solution, even though it’s a dead end, and then ask people how to do Y instead of X.
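
      A classic illustration (hypothetical, and just a sketch of the pattern, not taken from the linked post): the real goal X is “get a file’s extension”, but the asker fixates on Y, “how do I get the last three characters of a string?”, and only ever asks about Y.

      ```python
      import os

      # Y: the question as asked. A dead end -- it happens to work for
      # ".jpg" but breaks on ".jpeg", ".c", or files with no extension.
      def last_three(filename: str) -> str:
          return filename[-3:]

      # X: what the asker actually wanted all along.
      def extension(filename: str) -> str:
          return os.path.splitext(filename)[1]  # e.g. ".jpeg", or "" if none

      print(last_three("photo.jpeg"))  # "peg" -- a correct answer to the wrong question
      print(extension("photo.jpeg"))   # ".jpeg"
      ```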

      ChatGPT drives that problem up to 11 because it has no problem enabling you to focus on Y far longer than you should.

      • superglue@lemmy.dbzer0.com · 20 hours ago

        I find that interesting because sometimes AI actually does the opposite for me: it suggests approaches to a problem that I hadn’t even considered. But yes, if you push it in a certain direction, it will certainly lead you along.