One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear power plant produces only about 1 gigawatt.
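A quick sanity check of that arithmetic, sketched in Python. The per-GPU figure of 1,875 W is implied by the article's own numbers, not a published spec:

```python
# Back-of-the-envelope check: one million GPUs at the implied per-GPU draw.
GPU_COUNT = 1_000_000
WATTS_PER_GPU = 1_875  # implied by 1.875 GW / 1M GPUs; an assumption, not an official TDP

total_gw = GPU_COUNT * WATTS_PER_GPU / 1e9
print(f"Total draw: {total_gw:.3f} GW")  # 1.875 GW

NUCLEAR_PLANT_GW = 1.0  # typical single-plant output, per the article
print(f"Nuclear plants needed: {total_gw / NUCLEAR_PLANT_GW:.2f}")  # ~1.88
```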

Fossil fuel-burning plants, whether natural gas, coal, or oil, produce even less. There’s no way to ramp up nuclear capacity in the time it will take to supply these millions of chips, so much, if not all, of that extra demand is going to be met by carbon-emitting sources.

      • bizarroland@fedia.io · 2 months ago

        I’m sure that if you asked any AI, it would give you recipes for room-temperature superconductors.

        I asked an AI what lava would feel like if you took the heat out of it, and it told me, but then it asked me if I would be interested in some delicious lava recipes.

        So I said yes.

    • Fermion@feddit.nl · 2 months ago

      Resistive heating is not the dominant energy loss mechanism in modern computing. Since the advent of field-effect transistors, switching losses have dominated. Room-temperature superconductors could be relevant in power generation, distribution, and manufacturing, but they would not radically alter the power requirements for computing (see the sketch at the end of this comment).

      I personally don’t think any possible room-temperature superconductor would be economical to produce at a large enough scale to make a large difference in energy demands. Researchers have pretty thoroughly investigated the classes of materials that are easy to manufacture, which suggests a room-temperature superconductor would be prohibitively expensive to produce.
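      To make the switching-loss point concrete, here is a minimal sketch of the standard CMOS dynamic power relation, P ≈ α·C·V²·f. All constants are order-of-magnitude assumptions for a large accelerator die, not measured values:

      ```python
      # CMOS dynamic (switching) power: P = alpha * C * V^2 * f.
      # The energy goes into charging and discharging transistor and wire
      # capacitance on every clock edge, so zero-resistance interconnect
      # would not remove this term.

      ALPHA = 0.1        # activity factor: fraction of capacitance switched per cycle (assumed)
      C_SWITCHED = 4e-6  # effective switched capacitance in farads (order-of-magnitude guess)
      V_DD = 0.8         # supply voltage in volts (typical modern process, assumed)
      FREQ = 2e9         # clock frequency in hertz (assumed)

      p_dynamic = ALPHA * C_SWITCHED * V_DD**2 * FREQ
      print(f"Dynamic power: {p_dynamic:.0f} W")  # ~512 W: the right ballpark for a flagship GPU
      ```

      Note that wire resistance never appears in that expression; a superconductor only removes resistive terms.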

      • Chocrates@lemmy.world · 2 months ago

        The one last summer broke me. I have a healthy skepticism of any announcement, but that one seemed so credible I bought in.