• root
    15 months ago

    I’m confused, were you expecting a specific answer?

      • slazer2au
        25 months ago

        Ah, ambiguous user input leading to incorrect output. Aka shit in, shit out.

        • @nieceandtows@lemmy.worldOP
          25 months ago

          I mean, regular OK Google used to get this right. Understanding context is the most basic thing digital assistants used to have. If I tell you to come meet me on the 13th, would you assume Feb or April?
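
          (For what it's worth, the rule being described is simple to state: a bare day-of-month means the nearest upcoming occurrence of that day. A minimal sketch in Python; the function name and the month-rollover logic are my own illustration, not whatever Google actually ships:)

          ```python
          from datetime import date

          def resolve_day_of_month(day: int, today: date) -> date:
              """Map a bare day-of-month ("meet me on the 13th") to the
              nearest upcoming occurrence of that day."""
              year, month = today.year, today.month
              if day < today.day:  # that day already passed this month, roll over
                  month += 1
                  if month > 12:
                      month, year = 1, year + 1
              return date(year, month, day)

          # Asked on March 20th, "the 13th" should mean April 13, not February 13.
          print(resolve_day_of_month(13, date(2024, 3, 20)))  # 2024-04-13
          ```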

        • Drew
          15 months ago

          Idk why people jump through hoops to defend bad software engineering only when it comes to LLMs

          • @MadhuGururajan@programming.dev
            12 months ago

            Because that other stuff used to work, before the company fired all of QA and the local devs. The bugs that followed were then used to justify LLMs replacing the “shitty coders” that outsourcing to a sweatshop usually entails. At least with the sweatshops there was some argument to be made that the people working there either had no other choice, or that there was a slim chance they actually cared about their output and made sure it did at least the bare minimum.

            Now that corporate wants to justify its hype and investment in AI to attract the moneyed entities, they will go to any lengths to show it actually works. Even if the Emperor has no clothes!