• iii@mander.xyz · 4 hours ago

    Would love to see a source for that claim. How many 9’s of uptime do they target? 90%? 99%?

      • iii@mander.xyz · 4 hours ago

        Source (1)

        Later this month the LA Board of Water and Power Commissioners is expected to approve a 25-year contract that will serve 7 percent of the city’s electricity demand at 1.997¢/kwh for solar energy and 1.3¢ for power from batteries.

        The project is 1 GW of solar and 500 MW of storage. They don’t specify the storage capacity (MWh). The source makes two contradictory statements about its ability to provide a stable supply: (a)

        “The solar is inherently variable, and the battery is able to take a portion of that solar from that facility, the portion that’s variable, which is usually the top end of it, take all of that, strip that off and then store it into the battery, so the facility can provide a constant output to the grid”

        And (b)

        The Eland Project will not rid Los Angeles of natural gas, however. The city will still depend on gas and hydro to supply its overnight power.

        Source (2) examines “levelized cost of energy” (LCOE), a term they define as

        Comparative LCOE analysis for various generation technologies on a $/MWh basis, including sensitivities for U.S. federal tax subsidies, fuel prices, carbon pricing and cost of capital

        It looks at the cost of power generation. Nowhere does it state the cost of reaching 90% uptime with renewables + battery, or 99% uptime with renewables + battery. The document doesn’t mention uptime at all, only generation, independent of demand.

        To the best of my understanding, these sources don’t support the claim that renewables + battery storage are cost-effective technologies for a balanced electric grid.

        • Blue_Morpho@lemmy.world · 24 minutes ago

          It looks at the cost of power generation

          Yes.

          But then you added the requirement of 90% uptime, which isn’t how a grid works. For example, a coal generator only has 85% uptime, yet your power isn’t out 4 hours a day, every day.

          Nuclear reactors are out of service every 18-24 months for refueling. Yet you don’t lose power for days, because the plant typically has two reactors and the grid is designed for those outages.

          So the only issue is cost per megawatt. You need 2 reactors for nuclear to be reliable; that’s part of the cost. You need extra BESS to be reliable; that’s part of the cost.

    • mosiacmango@lemm.ee · 4 hours ago

      Uptime is calculated by kWh, i.e. how many kilowatts of power you can produce for how many hours.

      So it’s flexible. If you have 4kw of battery, you can produce 1kw for 4hrs, 2kw for 2hrs, or 4kw for 1hr, etc.

      Nuclear is steady state. If the reactor can generate 1gw, it can only generate 1gw, but for 24hrs.

      So to match a 1gw nuclear plant, you need around 12gw of storage, and 13gw of production.

      This has come up before. See this comment where I break down the most recent utility-scale nuclear and solar deployments in the US. The commenter above is right, and that doesn’t take into account the huge strides in solar and battery tech we are currently making.

      The 2 most recent reactors built in the US, the Vogtle reactors 3 and 4 in Georgia, took 14 years at 34 billion dollars. They produce 2.4GW of power together.

      For comparison, a 1 GW solar/battery plant opened in Nevada this year. It took 2 years from funding to finished construction, and cost 2 billion dollars.

      So each 1.2GW reactor works out to 17bil. Time to build still looks like 14 years, as both were started on the same time frame and only one is fully online now, but we’ll give it a pass. You could argue it took 18 years, since that’s when the first proposals for the plants were formally submitted, but I only counted financing/build time, so let’s stick with 14.

      For 17bil in nuclear, you get 1.2GW production and 1.2GW “storage” for 24hrs.

      So for 17bil in solar/battery, you get 4.8GW of production and 2.85GW of storage for 4hrs. Having that huge storage in batteries is more flexible than nuclear, so you can provide that 2.85GW for 4hrs, 1.425GW for 8hrs, or 712MW for 16hrs. If we are kind to solar and say the sun is down for 12hrs out of every 24, that means the storage lines up with nuclear.
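
      A rough back-of-envelope of that flexibility point, using only the round numbers above (2.85GW of battery rated for 4hrs); nothing here is sourced beyond this comment:

      ```python
      # Rough sketch only: the stored energy is fixed, so discharge power and
      # duration trade off against each other. Figures are this thread's round numbers.

      battery_power_gw = 2.85                      # max discharge rate, GW
      rated_hours = 4                              # hours at full power
      energy_gwh = battery_power_gw * rated_hours  # ~11.4 GWh of stored energy

      # The same stored energy delivered more slowly lasts proportionally longer:
      for power_gw in (2.85, 1.425, 0.7125):
          print(f"{power_gw * 1000:.0f} MW for {energy_gwh / power_gw:.0f} h")
      # prints roughly: 2850 MW for 4 h, 1425 MW for 8 h, 712 MW for 16 h
      ```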

      The solar also goes up much, much faster. I don’t think a 7.5x larger solar array will take 7.5x longer to build, as it’s mostly parallel action. I would expect maybe 6 years instead of 2.

      So, worst case, instead of nuclear, for the same cost you can build solar+battery farms that produce 4x the power, provide the same steady baseline power as nuclear, and take half as long to build.

      • iii@mander.xyz · 3 hours ago

        Uptime is calculated by kWh, i.e. how many kilowatts of power you can produce for how many hours.

        That’s stored energy. For example: a 5 MWh battery can provide 5 hours of power at 1MW. It can provide 2 hours of power, at 2.5MW. It can provide 1 hour of power, at 5MW.

        The max amount of power a battery can deliver (MW) and the max amount of storage (MWh) are independent characteristics. The first is usually limited by cooling and transformer physics, the latter usually by the amount of lithium/zinc/redox chemistry of choice.

        What uptime refers to is: out of all the hours in a year, in how many does supply match or outperform demand.
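
        A minimal sketch of the distinction, reusing the example numbers above; the uptime function is just that definition written out, applied to whatever hourly supply/demand series you have:

        ```python
        # Energy (MWh) vs power (MW): the same stored energy drains faster or slower.
        capacity_mwh = 5.0    # stored energy, set by the amount of chemistry
        max_power_mw = 5.0    # max discharge rate, set by cooling/power electronics

        for power_mw in (1.0, 2.5, 5.0):
            assert power_mw <= max_power_mw
            print(f"{power_mw} MW for {capacity_mwh / power_mw} h")  # 5 h, 2 h, 1 h

        # Uptime: fraction of hours in the year where supply meets or exceeds demand.
        def uptime(hourly_supply_mw, hourly_demand_mw):
            hours_met = sum(s >= d for s, d in zip(hourly_supply_mw, hourly_demand_mw))
            return hours_met / len(hourly_demand_mw)
        ```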

        So to match a 1gw nuclear plant, you need around 12gw of storage, and 13gw of production.

        This is incorrect, even under the assumption that nuclear plants are steady state (which they aren’t).

        To match a 1GW nuclear plant for one day, you need a fully charged battery that can deliver 1GW, with a capacity of 24GWh.
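
        Spelled out, assuming the plant really does hold a constant 1GW for the whole day:

        ```python
        # Both requirements have to be met separately: enough power (GW) to match the
        # plant's output, and enough energy (GWh) to sustain it for the full period.
        plant_output_gw = 1.0
        hours_to_cover = 24

        required_discharge_power_gw = plant_output_gw            # 1 GW
        required_energy_gwh = plant_output_gw * hours_to_cover   # 24 GWh
        print(required_discharge_power_gw, "GW /", required_energy_gwh, "GWh")
        ```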

        Are you sure you understand the difference between W and Wh?