
In the US, PV solar capacity factor is about 0.24 (partly because lots of people live near very sunny regions like California and Texas). The National Renewable Energy Laboratory claims that in some parts of the US utility-scale solar capacity factors are 0.33 [0]. Wind is about 0.36, Hydro also 0.36, Nuclear about 0.93 [1]. If you apply these capacity factors to the nameplate capacities (ignoring batteries), you get 58.5% solar, 19.7% wind, 15.1% natural gas, and 6.6% nuclear.

[0]: https://atb.nrel.gov/electricity/2021/utility-scale_pv

[1]: https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...
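A sketch of that arithmetic in Python. The nameplate additions below are placeholder values, not the article's figures, and the natural-gas capacity factor is an assumption; only the other capacity factors come from the sources above:

```python
# Capacity-factor weighting: multiply each nameplate capacity by its
# capacity factor to get expected average output, then normalize.
# Nameplate figures are made up for illustration; the natural-gas
# capacity factor is also an assumption.
nameplate_gw = {"solar": 32.5, "wind": 7.3, "natural_gas": 4.3, "nuclear": 1.1}
capacity_factor = {"solar": 0.24, "wind": 0.36, "natural_gas": 0.55, "nuclear": 0.93}

# Expected average output per source (GW)
avg_gw = {k: nameplate_gw[k] * capacity_factor[k] for k in nameplate_gw}

# Share of total expected generation, in percent
total = sum(avg_gw.values())
share_pct = {k: 100 * v / total for k, v in avg_gw.items()}
```

The shares always sum to 100%, which is also a quick sanity check on the method.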



Of course those capacity factors hide one important bit: dispatchability. While wind and hydro share the same capacity factor in this list, wind and solar generation are capped by what nature provides at any moment. When demand exceeds their natural availability, you need to dispatch some extra generation. Hydro and natural gas are well suited for this; coal can do it too, even if dirtily. Nuclear generation could be made dispatchable, but because its marginal cost is essentially zero when run all the time instead of limiting generation, it is seldom used in this fashion.

Electricity demand could be elastic too, but there's a limit to that. It's not always feasible to limit or shut down industrial processes for a few hours of expensive power. And home consumers are loath to skip using their stoves when they want to cook. There's potential in elastic demand though.


The problem with running nuclear in a dispatchable fashion (apart from technical problems) is not the marginal cost, but the upfront capital cost. Dispatchable capacity needs to have a relatively low upfront cost, because it's seldom used. The marginal cost is less important.

Nuclear has a high capital cost; it's a really complex technology.


Take a close look at the title again, second word after "solar", you have to pass by "and" to get there.


I guess I earned the snark for not mentioning the batteries.

Weirdly, the news item doesn't say what sort of energy storage capacity these battery installations have, only the peak power output of 14.3 GW for this year's addition. While that is certainly a gigantic addition no matter what the energy capacity of these battery installations turns out to be, running the whole grid on batteries over periods of low renewable generation is still going to require orders of magnitude more batteries. I suppose there's enough lithium in the ground for this, but elastic demand and dispatchable generation are probably going to be part of the equation for economic reasons.


A misconception I often read on YouTube, Reddit, and Hacker News is that everyone equates batteries with lithium-ion batteries. A battery is a chemical storage device for energy, and there are already many different types.

1. There are also functioning batteries without lithium, for example salt-based ones, which are already being tested in Swiss and German households and bring some advantages compared to lithium batteries, not least the price. One should always remember that the lower energy density is a problem for an electric vehicle, but it doesn't matter if we install a battery in a cellar: there, energy density plays a subordinate role because there is enough space.

2. Would it make more sense to talk about *energy storage* in general instead of just batteries (which are by definition chemical energy storage)? Kinetic, chemical, thermal, and so on. Lithium-ion batteries should not be the only option considered for backup. We definitely need more choices, and we have them, mostly with today's technology, and they are definitely easier and faster to develop and install than any new nuclear reactor technology.

3. You need different types of storage: short-term, medium-term, and long-term. There are different concepts for each use. Batteries, compressed-air storage, pumped storage, thermal storage, as well as power-to-X systems are able to absorb the surplus summer solar power, autumn wind, etc., and make that energy available again in the short term, medium term, or seasonally shifted. Examples:

    1. https://www.research-collection.ethz.ch/handle/20.500.11850/445597

    2. https://tu-dresden.de/tu-dresden/newsportal/news/meilenstein-in-der-energy-transition-scientists-at-the-tu-dresden-build-unique-energy-storage (German)

    3. https://www.siemensgamesa.com/products-and-services/hybrid-and-storage/thermal-energy-storage-with-etes-switch

4. The best approach, however, is to build a decentralised grid that is also intercontinentally connected. This is the perfect way to compensate for any "dark lulls". There is research on this at universities around the world that has already moved beyond laboratory status.


Very good points. I do think the grid battery capacity being added this year is going to be mostly li-ion.


That's (sadly) true! I think the transition to a diverse heat- and electricity-storage landscape will take time. Citing from "Handbook Energy Storage SCCER" https://doi.org/10.3929/ethz-b-000445597:

"The SCCER has proven in numerous demonstrations that storage technologies are essentially available and usable. Now it is necessary, above all, for political decisions to be taken in the interests of a coherent energy policy in order to reduce the regulatory obstacles that currently impede or make impossible the economical use of energy storage. This can guide business models and investment decisions necessary to advance the technologies developed in the SCCER and bring them from the laboratory into the ultimate energy system of the Energy Strategy 2050."


In industry here in Europe I usually see it written as ESS/BESS (energy storage system or battery energy storage system). For new plants we usually simulate each of five or six technologies; however, in a lot of cases, yes, lithium-ion has many advantages.


is [2] offline?

maybe you meant this one?: https://tu-dresden.de/tu-dresden/newsportal/news/meilenstein...

... did you use a tool to translate your text from German to English, and the tool also translated the URL?

However, somebody tried to access this URL in Nov 2021, and it did not exist back then either: https://web.archive.org/web/20240216101723/https://tu-dresde...


Yes this one: https://tu-dresden.de/tu-dresden/newsportal/news/meilenstein...

Many of my texts and links are from the years 2019-2022 when I was researching for various publications on renewable energy. Most of them, unfortunately, without DOI links. I didn't check them before I uploaded them here. And I would have so much more that it would be enough for an entire book.


> no matter what the energy capacity of these battery installations is going to be, running the whole grid on batteries over the periods of low renewable generation is going to require still orders of magnitude more batteries.

I live in CA (Bay Area); the solar people just wandered up to my front door to "sell me" the other day. I do want to go solar in light of my PG&E bill.

The sales guy was super sharp and addressed one of the concerns I had (I have an unusual roof) and we got very nerdy.

Because PG&E has time-dependent pricing, their model is to use the battery not only to power the house during these windows but also to dump back to the grid during them (and charge off solar when power is cheap).

So an independent installer is pitching a system to me (the end consumer) in response to the market price conditions that are going to push "more battery" for "peak demand" into the market.

Now, do the economics of that system and their sales pitch make sense? I don't know, I'm still crunching those numbers (and they are some hard numbers to figure out), but at first blush I'm inclined to say "yes" because fuck giving money to PG&E.


At least a rooftop PV installation breaks even* here in Finland, despite scarce sun and cheap power. Batteries are not there yet. If you don't need the battery as a UPS, I would still wait a few years before going for a home battery.

*: Easily half of the cost is the installation labor. If you can DIY at least parts of it, you can get decent ROI.


Here (Bay Area), the power prices are so obscene that it almost makes sense to install batteries first.

Those with solar on their homes are going to end up saturating the grid (duck curve) to the point where renters can buy batteries and "coast" through the peaks of power cost.


Are spot electricity contracts available, so that you can do it like that? It makes sense then. And of course the installation cost of a battery is way smaller than for a PV plant, as no roof work is required.


AFAIK, there is no residential spot market with PG&E. Just different seasonal and hour-by-hour rate schedules with published rates.

There is an "electric home" rate plan that has three periods per day: off-peak, partial-peak, and peak with three rates. This can apply to a home with batteries, where you can shift your load to different periods. The spread can be up to $0.22 per kWh in summer and up to $0.04 per kWh in winter.

There is another rate plan with two periods per day: off-peak and peak. The spread here can be up to $0.09 per kWh in summer and up to $0.03 per kWh in winter. This is a typical plan for homes without solar nor batteries and with moderate consumption. This plan has two pricing tiers. A lower rate for consumption up to a "baseline allowance" and then a higher price after that. This allowance is summed over a whole billing period, in contrast to the time of use variations each day.

The above discussion does not include any net metering, so you never sell power back to the grid; you just optimize your load during different hours of the day. With a currently available net-metering plan, PG&E will pay only around $0.02 to $0.04 per kWh for excess power.

Also, it seems PG&E distinguishes a "paired storage" net metering system, and requires special metering to track the solar generation that goes into the battery versus recharging from the grid. They will only credit solar production delivered back to the grid, and not off-peak grid energy reflected back during peak hours. So, I'm not sure why some posters seem to be talking about this arbitrage scenario.

For context, the actual per kWh rates are around $0.36 to $0.65 in the different seasons and rate plans. So these peak price differences may range around 5% to 25%. There isn't any of the wild fluctuation or negative numbers we've heard from other energy markets.
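Using the rate figures above, here is a rough sketch of the daily load-shifting value for a home battery. The battery size and round-trip efficiency are assumed values; the $0.22/kWh summer spread and the ~$0.36/kWh off-peak rate come from the discussion above:

```python
# Rough daily value of charging off-peak and discharging on-peak.
# usable_kwh and round_trip_eff are assumptions, not from the thread.
usable_kwh = 10.0               # usable battery capacity, kWh (assumed)
round_trip_eff = 0.90           # round-trip efficiency (assumed)
off_peak = 0.36                 # $/kWh off-peak rate (from above)
peak_summer = off_peak + 0.22   # $/kWh with the summer peak spread

# Every kWh delivered on-peak requires buying 1/eff kWh off-peak
daily_savings = usable_kwh * (peak_summer - off_peak / round_trip_eff)
```

With these assumptions, that works out to under $2 a day in summer, which shows why the winter spread of a few cents barely moves the needle.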


> Weirdly the news item doesn't tell what sort of energy storage capacity these battery installation have, only the peak power output of 14.3 GW for this year's addition.

As a rule of thumb, the capacity will be a few hours worth. So if the power rating is 14 GW, maybe that will be 60 GWh of capacity.

That's almost enough to smooth over the most regular fluctuations in solar power: the day-night cycle (especially when you remember that demand drops at night). Not close to being economical for storing power from summer through to winter.

A source [0]: > The most common grid-scale battery solutions today are rated to provide either 2, 4, or 6 hours of electricity at their rated capacity

[0] https://www.energysage.com/business-solutions/utility-scale-...
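Duration in hours is just the ratio of energy capacity to power rating, so the 2/4/6-hour figures from [0] map onto the article's 14.3 GW like this (a sketch, not reported data):

```python
power_gw = 14.3  # this year's battery power addition, from the article

# Energy capacity implied by each common duration rating
for hours in (2, 4, 6):
    energy_gwh = power_gw * hours
    print(f"{hours} h -> {energy_gwh:.1f} GWh")
```

The 4-hour case lands right around the ~60 GWh rule of thumb mentioned above.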


Plenty of batteries in the US are built out of water. They pump the water up during the day and it generates electricity at night. No lithium involved.

> Pumped storage is by far the largest-capacity form of grid energy storage available, and, as of 2020, the United States Department of Energy Global Energy Storage Database reports that PSH accounts for around 95% of all active tracked storage installations worldwide, with a total installed throughput capacity of over 181 GW, of which about 29 GW are in the United States, and a total installed storage capacity of over 1.6 TWh, of which about 250 GWh are in the United States.

I recently visited this one https://en.m.wikipedia.org/wiki/Gianelli_Power_Plant
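For a sense of scale, the stored energy of a pumped-hydro reservoir is just gravitational potential energy times round-trip efficiency. The volume, head, and efficiency below are illustrative assumptions, not figures for the Gianelli plant:

```python
# Back-of-envelope stored energy: E = rho * V * g * h * efficiency
rho = 1000.0        # kg/m^3, density of water
g = 9.81            # m/s^2, gravitational acceleration
volume_m3 = 10e6    # 10 million m^3 of water (assumed)
head_m = 100.0      # height difference between reservoirs (assumed)
efficiency = 0.8    # typical pumped-hydro round-trip efficiency

energy_j = rho * volume_m3 * g * head_m * efficiency
energy_gwh = energy_j / 3.6e12   # joules -> GWh
```

Even a large reservoir with a 100 m head stores only a couple of GWh, which is why the technology needs favorable terrain to scale.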


Pumped storage is a type of grid storage, but I don't think TFA includes it in battery storage. Pumped storage is a fine technology, but is there a lot of build potential left for it?


> Pumped storage is a fine technology, but is there a lot of build potential left for it?

"A lot" is probably subjective, but the two best-known global estimates are Hunt et al. (2020) [1] from IIASA and Stocks et al. [2] from ANU RE100 (who incidentally also have some interactive maps [3]), which with different cost targets put the potential at 17.3 and 23 PWh respectively; that works out to about 2 MWh per person. For comparison, for the past decade the US has consumed about 13 MWh of electricity per person per year, down from a peak of slightly under 14 in 2000 and 2005. With very high levels of electrification, that could potentially rise to 24 to 28 MWh per person per year, or 8 or 9 PWh/yr for the whole country. Total primary energy use is a lot higher, around 90 MWh per person-year or 30 PWh total; this is because not everything can be practically electrified, and the things that can easily be electrified tend to be much more efficient when done electrically. Energy efficiency is also usually assumed to increase slightly in general.

The US specifically is actually above the world average, at about 4.5 MWh per capita according to the ANU team's estimates. That's roughly 1.5 PWh per year. In any case, I would expect that there is likely sufficient potential in most (if not all) grids for pumped hydro to be a significant part of medium-duration energy storage (if not all of it), though whether it actually will depends on the costs of other technologies as well.

[1]: https://doi.org/10.1038/s41467-020-14555-y

[2]: https://doi.org/10.1016/j.joule.2020.11.015

[3]: https://re100.eng.anu.edu.au/
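The per-person figures above are easy to cross-check. World and US population values below are rough assumptions for the arithmetic:

```python
# Global: ~17-23 PWh of potential spread over ~8 billion people
world_potential_pwh = 17.3           # lower of the two global estimates
world_pop = 8.0e9                    # rough world population (assumed)
mwh_per_person = world_potential_pwh * 1e9 / world_pop  # 1 PWh = 1e9 MWh

# US: ~4.5 MWh per capita over ~330 million people
us_mwh_per_capita = 4.5
us_pop = 3.3e8                       # rough US population (assumed)
us_potential_pwh = us_mwh_per_capita * us_pop / 1e9
```

Both come out consistent with the quoted ~2 MWh/person globally and ~1.5 PWh for the US.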


There have been some concepts proposed for gravity-based storage that do not rely on existing terrain.

I find the concept compelling as it is incredibly simple but it hasn't yet been proven at scale.

https://gravity-storage.com


Please say what you mean.


Technically, solar and wind are dispatchable as well, and much better at it than even natural gas. Grids where both are regulated on the sub-second scale are much more even than is otherwise possible. The thing is just that we consider curtailment as lost energy, even though it is simply the same as not running any other power plant at full power. If wind were three times cheaper than natural gas and we hence built three times as much of it, the achievable capacity factor would not look that bad.
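The overbuild argument in numbers (all figures hypothetical): if delivered energy is capped by demand rather than by the wind resource, the achieved capacity factor drops below the natural one, but each delivered MWh can still be cheap.

```python
# Overbuilt wind with curtailment; all numbers are made up.
nameplate_gw = 3.0        # build 3x as much wind as "needed"
natural_cf = 0.36         # resource-limited capacity factor
demand_avg_gw = 0.8       # average demand the grid can absorb (assumed)

potential_avg_gw = nameplate_gw * natural_cf             # available on average
delivered_avg_gw = min(potential_avg_gw, demand_avg_gw)  # rest is curtailed
achieved_cf = delivered_avg_gw / nameplate_gw            # below natural_cf
```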


More on-site batteries can fix the need for dispatchable power on the grid side, and the cheaper batteries get, the more attractive such solutions start to look.


It is also resilient to grid failures. Even better if you add on-site generation.


The German energy market, and therefore I as a consumer, had to pay billions for redispatch operations.

The energy market gained plenty of inefficiency through the fast build-out of renewables, so battery projects calculating with this excess are getting built, but not fast enough.

There are already a few projects at old coal plants where they have grid connectivity.

And from EV batteries alone, used but still very good batteries will hit this market very soon, equivalent to the whole pumped-hydro storage of Germany.


I've been out of the industry for a few years, but it used to take 9 hours to turn a coal generator on.


US nuclear averages below 0.9; individual plants can break 0.93 for a single year, but long-term maintenance and refueling cycles lower the average. E.g.: 95.73% (2017), 80.25% (lifetime) https://en.wikipedia.org/wiki/Beaver_Valley_Nuclear_Power_St...

94.43% (2017) 75.20% (lifetime) https://en.wikipedia.org/wiki/Brunswick_Nuclear_Generating_S...

96.04% (2017) 78.07% (lifetime) https://en.wikipedia.org/wiki/Browns_Ferry_Nuclear_Plant

So you can cherry pick a few numbers from the best years, but the lifetime averages tell a different story.
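Planned refueling outages alone put a ceiling on the long-run number. With an 18-month fuel cycle and a ~30-day outage (typical figures, assumed here), the best achievable lifetime average is already around 95%, before any unplanned downtime:

```python
# Upper bound on long-run capacity factor from refueling alone.
# Cycle length and outage duration are assumed typical values.
cycle_days = 18 * 30.44   # ~18-month fuel cycle, in days
outage_days = 30.0        # planned refueling outage (assumed)

cf_ceiling = (cycle_days - outage_days) / cycle_days
```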


Nuclear had much lower capacity factors in the past, which brings the lifetime capacity factor down. But the relevant point is that the capacity factor of nuclear plants running today is >90%


US Nuclear had a capacity factor of 86.1% as recently as 2012, it just varies through time and over a 40+ year lifetime you don’t have outstanding results every single year. So sure you can argue 71.1% in 1997 no longer applies, but it was almost exactly the same fleet of reactors in use back then.

Yet the person I was replying to said “about 93%” when averaging over 93% has been achieved exactly once in 2019. That kind of nonsense is actively harmful when people hear something and then later realize it’s simply incorrect.

Effective advocacy requires accuracy including technology specific issues and how to mitigate them.


Planned or unplanned is the question.


The larger question is what capacity factors would look like if you tried to double the amount of nuclear as many advocates wish. And what that would do to profitability / the need for subsidies.

Or as the industry has been concerned with for a decade, what happens when renewable energy is regularly sending wholesale prices near zero for hours a day.


Is a 15% difference that much of a different story?


Look at your paycheck, and imagine the number is 15% lower. Should answer the question.


If the alternative pays 90% less, then 15% is not that big of a difference. I.e., if I lose my job, I would prefer a new one paying 85% of my old salary over one paying 10%.


It does. And no, it's not that significant.

I'm not a nuclear fanboy (well, I wish it made sense, since I like the idea of free energy, but renewables are just as good), but I think it's important to make accurate arguments.


An accurate argument would include the financial aspects of +/- 15% in generated output and the impact that has on grid capacity....


With such a small difference, the burden of proof in this case is probably on the person arguing that it is a big difference, since you'd generally expect a roughly linear relationship between these.

In fact, I'd suspect that downtime is less relevant for nuclear, since I believe most of it is schedulable, as opposed to renewables, where downtime is random and based on conditions. Since it's schedulable, it could be done in the off season, or different plants could be planned to be out at different periods.


You actually have no idea about electricity transmission. For long-term problems, look no further than France. Short term, look up how electricity generation, transmission, markets, and grid stability are linked. Wikipedia is a good start, followed by studies about 100% renewable grids; those explain why baseload is much less of an issue than people think. And too much inflexible capacity, aka baseload, can actually be a problem itself.


While I'm not a practicing economist, I have a bachelor's degree in economics and took multiple classes on the economics of electricity markets.

While all of those are certainly relevant if you're comparing nuclear and other sources of power, I fail to see how that's relevant to the question of whether there's a significant difference between 80 and 95% capacity factor over a year.


Now imagine you are working in controlling at a power plant operator, and your yearly generated output is 5% lower than what was planned. What do you think would happen?

- investors, management and board saying "no biggy, 5% is not relevant"

- something else

Companies are doing restructuring and mass lay-offs to save less than 5% on bottom line costs, they are incredibly happy when the top line grows by 5% and worried, if not in crisis mode, if the top line declines by 5%. And for the financing part of a new power plant, those 5% are the difference between the investment being a good or a bad one...


Hold on, you are an economist and are telling me the difference between a capacity factor of 95% and 85% is not relevant? What do you think the ROIs and margins on nuclear investments are? Considering that nuclear cost is almost completely dominated by the upfront investment, I'd be surprised if that difference is not the difference between comfortably making a profit on your investment and losing a large amount of money.
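A rough sketch of why: for a plant dominated by upfront capital, the capital component of levelized cost scales as 1/capacity factor, so going from 95% to 80% raises cost per MWh by about 19%. The cost figures below are assumptions, not sourced:

```python
# Capital-only LCOE as a function of capacity factor.
capex_per_kw = 8000.0     # overnight cost, $/kW (assumed)
fixed_charge_rate = 0.08  # annualized capital recovery fraction (assumed)
hours_per_year = 8760

def capital_lcoe(cf):
    """Capital component of levelized cost, in $/MWh."""
    annual_cost = capex_per_kw * fixed_charge_rate   # $/kW-year
    annual_mwh_per_kw = cf * hours_per_year / 1000   # MWh generated per kW
    return annual_cost / annual_mwh_per_kw

# Relative cost increase from a 95% to an 80% capacity factor
increase = capital_lcoe(0.80) / capital_lcoe(0.95) - 1
```

Note the ratio depends only on the two capacity factors, not on the assumed cost inputs, which cancel out.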


This discussion makes me realize we could have a 100% renewable energy grid and discussion will still be pushed with the same old “arguments” against renewable being feasible. Ridiculous.


I do not understand your math. How did you go from .36 for wind, apply it to nameplate capacity and get 15.1%? And percent of what? I thought applying the capacity factor to nameplate would result in some measure of energy produced, not a percentage.



