• @CarbonatedPastaSauce@lemmy.world
    8 • 2 months ago

    You missed my point. The hammers you’re using aren’t ‘wrong’, i.e. smacking you in the face 15% of the time.

    Said another way, if other tools were as unreliable as ChatGPT, nobody would use them.

    • @essell@lemmy.world
      9 • 2 months ago

      You’ve missed my point.

      ChatGPT can be wrong, but it can’t hurt you unless you assume it’s always right.

          • That lady, presumably, is one lady.

            But imagine, for a moment, if satnavs had directed hundreds of thousands of people into this lake or that. Don’t you think that would be a problem? Like, we put glass guards around whirring saws for a reason.

            • @essell@lemmy.world
              2 • 2 months ago

              Yeah, I’m all for proper regulations on these things. Definitely a lack of guard rails.

              My point was more about the reactions and rhetoric that emerged around that lady, but you make a good point too.

    • xor
      6 • 2 months ago

      Hammers are unreliable.

      You can hit your thumb if you use the tool wrong, and it can do damage if it breaks, e.g. because it was not stored properly. When you use a hammer, you accept these risks, and you can choose to mitigate them by storing it properly, taking care when using it, and checking the head isn’t loose before swinging it.

      In the same regard, if you use LLMs for what they’re good at, and verify their outputs, they can be useful tools.

      Saying “LLMs are pointless because I can write a shopping list myself” is like saying “hammers are pointless because I can just use this plank instead”. Sure, you can do that, but there are other scenarios where a hammer would be kinda handy.

      • @CarbonatedPastaSauce@lemmy.world
        6 • 2 months ago

        > if you use LLMs for what they’re good at, and verify their outputs

        This is the part the general public is not prepared for, and why the whole house of cards falls apart.

        • xor
          5 • 2 months ago

          I agree, but that’s user error, not a bad tool.