• Tar_Alcaran
    2 months ago

    It takes a lot of skill and knowledge to recognise a wrong answer that is phrased like a correct answer. Humans are absolutely terrible at this; it’s why con artists are so successful.

    And that skill and knowledge are not formed by using LLMs.

    • @essell@lemmy.world
      2 months ago

      Absolutely.

      And you can’t learn to build a fence by looking at a hammer.

      My point all over really. Tools and skills develop together and need to be seen in context.

      People, whether for or against, who describe AI or any other tool in isolation, ignoring detail and nuance, are not being helpful or informative.