• @Xaphanos@lemmy.world
    English
    2517 days ago

    A major bottleneck is power capacity. It is very difficult to find 50 MW+ (sometimes hundreds) of capacity available at any site. It has to be built out. That involves a lot of red tape, government contracts, large transformers, contractors, etc. The current backlog on new transformers at that scale is years. Even Google and Microsoft can’t build, so they come to my company for infrastructure - as we already have 400 MW in use and triple that already on contract. Further, Nvidia only makes so many chips a month. You can’t install them faster than they make them.

      • moonking
        2517 days ago

        Humans don’t actually think either; we’re just electricity jumping across nearby neural connections that formed through repeated association. Add to that that there’s no free will, and you start to see how “thinking” is an immeasurable metric.

      • @themurphy@lemmy.ml
        617 days ago

        And it’s pretty great at it.

        AI’s greatest use case is not LLMs, but people treat it that way because LLMs are the only form of AI we can relate to.

        AI is so much better at many other tasks.

      • Caveman
        316 days ago

        How closely do you need to model a thought before it becomes the real thing?

        • justOnePersistentKbinPlease
          316 days ago

          It needs to not degrade exponentially when AI-generated content is fed back in.

          Creativity needs to be more than random deviations from the statistical average of a mostly stolen dataset taken from actual humans.

      • @daniskarma@lemmy.dbzer0.com
        116 days ago

        Maybe we are statistical engines too.

        When I hear people talk, they’re also just repeating the most common sentences they’ve heard elsewhere anyway.