• @FooBarrington@lemmy.world · 9 points · edited · 2 years ago

    While you're right that baseload is better served by nuclear, you're wrong that it's in any way important for AI model training. This is one of the best uses for solar energy: you train while you have lots of energy, and you pause training while you don't. Baseload matters for things that absolutely need to keep running (e.g. powering machines in hospitals) or for things with a high startup cost (e.g. furnaces). AI model training is neither, so baseload isn't relevant at all.

    • @eestileib@sh.itjust.works · 3 points · 2 years ago

      It's not life-critical, but it is financially critical to the company. You aren't going to build a project on the scale of a data center that is capable of running 24/7 and then not run it as much as possible.

      That equipment is expensive, and has a relatively short useful lifespan even if not running.

      This is why tire factories and refineries run three shifts; this isn't a phenomenon unique to data centers.

    • @guacupado@lemmy.world · 2 points · 2 years ago

      “And you pause training while you don’t.” Lmao, I don’t know why people keep giving advice in spaces they’ve never worked in.

      • @FooBarrington@lemmy.world · 2 points · 2 years ago

        What are you trying to imply? That training Transformer models necessarily needs to be a continuous process? You know it’s pretty easy to stop and continue training, right?
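
        (A minimal sketch of the stop-and-resume point, in plain Python with a toy "training" loop minimizing (x − 3)² by gradient descent. The checkpoint file, state dict, and loop are illustrative assumptions, not any real framework's API; real training jobs checkpoint model weights and optimizer state to disk in exactly this save/restore pattern, so pausing when power is scarce loses nothing but wall-clock time.)

```python
import json
import os
import tempfile

def train(steps, state=None):
    # Toy "model": one parameter x, minimizing (x - 3)^2 by gradient descent.
    if state is None:
        state = {"step": 0, "x": 0.0}
    lr = 0.1
    for _ in range(steps):
        grad = 2 * (state["x"] - 3.0)   # d/dx of (x - 3)^2
        state["x"] -= lr * grad
        state["step"] += 1
    return state

def save_checkpoint(state, path):
    # Persist everything needed to resume exactly where we stopped.
    with open(path, "w") as f:
        json.dump(state, f)

def load_checkpoint(path):
    with open(path) as f:
        return json.load(f)

# Run 100 steps straight through (cheap power all day).
full = train(100)

# Run 40 steps, "pause" when power gets expensive, checkpoint, resume for 60.
path = os.path.join(tempfile.mkdtemp(), "ckpt.json")
save_checkpoint(train(40), path)
resumed = train(60, state=load_checkpoint(path))

# The interrupted run ends in exactly the same state as the uninterrupted one.
assert resumed == full
```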

        I don’t know why people keep commenting in spaces they’ve never worked in.

        • @guacupado@lemmy.world · 1 point · 2 years ago

          No datacenter is shutting off a leg, hall, row, or rack because “We have enough data, guys.” Maybe at your university server room where CS majors are interning.