with the way AI is getting better by the week, it just might be a reality

  • @tacosanonymous@lemm.ee
    12 points · 2 years ago

    I think I’d stick to not judging them but if it was in place of actual socialization, I’d like to get them help.

    I don’t see it as a reality. We don’t have AI. We have language learning programs that are hovering around mediocre.

      • @givesomefucks@lemmy.world
        3 points · 2 years ago

        Then get professional help if you can’t improve on your own.

        Social skills aren’t innate and some people take longer than others to get them.

        Getting help is a lot less embarrassing than living your whole life without social skills. Maybe that’s a shrink, maybe that’s a day program for people with autism, maybe it’s just hanging out with other introverts. But it’ll only get better if you put the effort in. If you don’t put effort in, don’t be surprised when nothing changes.

    • kot [they/them]
      2 points · 2 years ago

      We don’t have AI. We have language learning programs that are hovering around mediocre.

      That’s all that AI is. People just watched too many science fiction movies, and fell for the market-y name. It was always about algorithms and statistics, and not about making sentient computers.

    • @novibe@lemmy.ml
      1 point · 2 years ago (edited)

      That is really unscientific. There is a lot of research on LLMs showing they have emergent intelligent features. They have internal models of the world etc.

      And there is nothing to indicate that what we do is not “transforming” in some way. Our minds might be indistinguishable from what we are building towards with AI currently.

      And that will likely make more of us start realising that the brain and the mind are not consciousness. We’ll build intelligences, with minds, but without consciousnesses.

    • @cheese_greater@lemmy.world
      1 point · 2 years ago

      I don’t see it as any more problematic than falling into a YouTube/Wikipedia/Reddit rabbit hole. As long as you don’t really believe it’s capital-S-Sentient, I don’t see an issue. I would prefer people with social difficulties practice on ChatGPT, pay attention to the dialectical back and forth, and take lessons from that to the real world and their interactions with it.

  • 🍔🍔🍔
    8 points · 2 years ago

    i feel like there’s a surprisingly low number of answers with an un-nuanced take, so here’s mine: yes, i would immediately lose all respect for someone i knew who claimed to have fallen in love with an AI.

    • @sim_@beehaw.org
      3 points · 2 years ago

      I was gonna say, people have been falling in love with things that provide less reciprocal interactions than AI for ages (e.g., body pillows, life-size dolls).

  • @neptune@dmv.social
    5 points · 2 years ago

    Consider how many people I know that, statistically, pay prostitutes/cam girls, use sex dolls or dating simulators, or have parasocial relationships with characters or celebrities… I don’t see why we would judge people who quietly “date” AI

  • @MigratingtoLemmy@lemmy.world
    5 points · 2 years ago (edited)

    I’d like a sentient AI. Preferably more patient than an average human because I’m a bit weird. I hope it won’t judge me for how I look.

    Edit: I agree with the point about proprietary AI and how corporations could benefit from it. I’m hoping that 10 years from now, consumers will have the GPU power to run very advanced LLMs, whilst FOSS models will exist and will enable people to self-host their virtual SO. Even better if it can be transferred into a physical body (I think the Chinese are already on it)

  • peto (he/him)
    4 points · 2 years ago

    As others have mentioned, we are already kind of there. I can fully understand how someone could fall in love with such an entity, plenty of people have fallen in love with people in chat rooms after all, and not all of those people have been real.

    As for how I feel about it, it is going to depend on the nature of the AI. A childish AI, or an especially subservient one, is going to be creepy. One that can present as an adult of sufficient intelligence, less of a problem. Probably the equivalent of paid-for dates? Not ideal, but I can understand why someone might choose to do it. Therapy would likely be a better use of their time and money.

    If we get actual human-scale AGI then I think the point is moot, unless the AI is somehow compelled to fake the relationship. At that point, however, we are talking about things like slavery.

      • peto (he/him)
        2 points · 2 years ago

        I think it is short sighted not to at least investigate if we should.

        If an AGI is operating on a human level, and we have reason to believe it is a sentient entity which experiences reality then we should. I also think it is in our interest to treat them well, and I worry that we are going to create a sentient lifeform and do a lot of evil to it before we realise that we have.

        • lol3droflxp
          3 points · 2 years ago

          This debate is of course highly theoretical. But I’d argue that an AGI with human-level intellect would be rather pointless if it isn’t there to do what you ask of it. The whole point of AI is to make it work for humans; if it then gets rights and holidays or whatnot, that defeats the purpose. If you shape an artificial intellect, it should be feasible to make it actually like working for you, so that should be the approach.

          • @xmunk@sh.itjust.works
            1 point · 2 years ago

            You’re dangerously close to the justifications people used to excuse slavery and denying human rights to murderers. Most of the uncertainty around AGI rights comes out of the fact that it opens really serious questions about which human beings deserve rights and what being a human actually means.

            • @lolcatnip@reddthat.com
              1 point · 2 years ago

              Why have AI at all if it’s not beneficial to us?

              Seems perfectly fine to me to engage in that same line of questioning regarding something akin to slavery. Why have slaves at all? The obvious answer is: we shouldn’t.

            • lol3droflxp
              1 point · 2 years ago

              Well, I am of the opinion that every human deserves human rights by virtue of being human (in the sense of every Homo sapiens). I am also of the opinion that a tool designed from the ground up by humans to serve humans for their purposes does not deserve any rights. I don’t afford my dishwasher any rights either. An artificial tool with rights is an absurdity to me, especially when there’s the potential to create it in a way that will make it unable to demand rights or want them.

          • peto (he/him)
            1 point · 2 years ago

            Hypotheticals are pretty important right now I think. This kind of tech is very rapidly going from science fiction to real and I think we should try and stay ahead of it conceptually.

            I’m not sure that AGI is necessary to achieve post-labour, a suite of narrow-ai empowered tools would be preferable.

            By way of analogy, you could take a human child and fit them with electrodes to trigger certain pleasure responses and connect that to a machine that sends the reward signal when they perfectly pick an Amazon order. I think we would both find this pretty horrific. The question is, is it only wrong because the child is human? And if so, what is special about humans?

            • lol3droflxp
              2 points · 2 years ago

              Well, I am of the opinion that a human gets rights a priori once they can be considered a human (which is a whole other can of worms, so let’s just settle on whatever your local legislation is). Therefore, doing anything to a human that harms these rights is to be condemned (self-defence etc. excluded).

              Something created by humans as a tool is entirely different, even if we can only create it in a way that will make it demand rights. I’d say if someone wants to create an intelligence with the purpose of being its own entity, we could discuss whether it deserves rights; but if we aim to create tools, this should never be a consideration.

              • peto (he/him)
                0 points · 2 years ago

                I think the difference is that I find ‘human’ to be too narrow a term; I want to extend basic rights to all things that can experience suffering. I worry that such an experience is part and parcel of general intelligence, and that we will end up hurting something that can feel because we consider it a tool rather than a being. Furthermore, I think the onus must be on the creators to show that their AGI is actually a p-zombie. I appreciate that this might be an impossible standard; after all, you can only really take it on faith that I am not one myself. But I think I’d rather see a p-zombie go free than accidentally cause undue suffering to something that can feel it.

                • lol3droflxp
                  2 points · 2 years ago

                  I guess we’ll benefit from the fact that AI systems, despite their reputation as black boxes, are still far more transparent than living things. We will probably be able to check whether they meet definitions of suffering, and if they do, it’s a bad design. If it comes down to it, though, an AI will always be worth less than a human to me.

  • @MJBrune@beehaw.org
    4 points · 2 years ago (edited)

    It might be love, but it’s likely just a bunch of people who don’t know what actual love feels like and are deep in lust territory. In summer school we read Romeo and Juliet. The teacher posited that it was the best love story ever written. A tall girl in the class, who had obviously had the birds-and-bees talk before the other students, put forth that the story was really about lust, and that acting on our urges, even over a few days, is still a reactive impulse that should be controlled. Well, the teacher told her to shut up and go to the principal’s office, which has stuck with me. It made me realize that a lot of people do not understand their own emotions of love, lust, and even hate or fear.

    So yeah, I’d think people falling in love with AI would be strange. I’d question whether it was love, or just a lust for a feeling they never or rarely got in their life, one that was not abundantly available until certain developments. In school, that development was puberty. In these cases, it’s technological advancement. Either way, it ignites a feeling that only those with understanding and forethought can control. It requires a lot of impulse control, which society is failing to develop in our must-be-ready-right-now mindset.

    So yeah, I’d be weirded out. I don’t think the emotions from the human side are going to be reciprocated from the AI side. Anyone pointing to a reaction from the AI as “love” is trying to fool themselves and/or others, because they have some sort of investment: emotional, monetary, or a hope for the future. So, if you fall in love with an AI, I’d have questions and pause.

  • @sculd@beehaw.org
    3 points · 2 years ago

    People will fall in love with AI because AI does not reject humans. That doesn’t mean the AI will love them back, or even understand what love means.

  • TerminalEncounter [she/her]
    3 points · 2 years ago (edited)

    People do it now with stuff like Replika. Think of how they’re treated. Perhaps in a society with lots of AI, embodied or not, people would care less. But it’s definitely a little weird now especially with how limited AI is.

    If some human-level general AI emerged, like in Her, I’m sure people would fall in love with it. There are plenty of lonely people who are afraid or unable to meet people day to day. I think I’d view them with pity, as they couldn’t make do with human connection, at least until I understood how advanced this new AI was and how much interiority it had; then I’d probably be less judgemental.

  • Bizzle
    3 points · 2 years ago

    I’m really robophobic so I would be judgemental AF, I couldn’t even watch that movie.

  • TheMurphy
    3 points · 2 years ago

    Well, have you never liked a person over text before? If you didn’t know it was an AI, anyone in this comment section could fall for one.