Of course it's cheaper and easier to deliver services in one language, but the United States is not a monolingual nation, nor has it ever been. It would be a disservice to significant populations to assume otherwise.
Well then, you'll be OK with having Spanish as the official language, because the majority of American citizens in my area speak Spanish. Or would you prefer German? My family has spoken German as American citizens for more generations than it has spoken English.
Benefit to whom? I only see a benefit to the companies and government here (saving money), but it seems to me that you've forgotten to consider the people.
It would be nice if government letters were written only in English. Getting an 8-page letter with only one page in English is annoying. I pay for that with my taxes and would wholeheartedly support a bill to stop doing that and either reduce taxes or increase spending somewhere else that matters more, like road maintenance.
Well, I'm sure when your neighbor gets their 8-page letter in Spanish, Vietnamese, Russian, or whatever their language might be, they too find it annoying that the government finds it necessary to send them a letter in English. I mean, what a waste, right?
The logic of how you get from there to road maintenance... my god, the mental gymnastics it must take.
I think the context you’re deliberately omitting is that we’re talking about the US, not Vietnam, Spain, Mexico, or Russia.
If you moved to one of those countries, you’re saying you would demand official government programs be in English? That seems arrogant to me personally.
I actually think it is easier overall for companies and government to provide services in multiple languages than for every individual to have to learn fluent English including businessese and legalese. It is also cheaper to hire some translators and interpreters than to offer free and extensive courses to everybody.
And to refuse to provide services in other languages and then also refuse to offer courses, as in, “you're on your own now, good luck!”, is a real dick move.
Tesla tried battery swapping in 2015 and abandoned it due to a lack of customer interest (and also due to various problems that made the process less straightforward than you'd think). Both the Model S and Model X were designed from the outset to have swappable batteries.
The battery swap feature was implemented only to maximize California clean energy credits. Only enough infrastructure was built to claim the credits.
“In 2013, California revised its Zero Emissions Vehicle credit system so that long-range ZEVs that were able to charge 80% in under 15 minutes earned almost twice as many credits as those that didn’t. Overnight, Tesla’s 85 kWh Model S went from earning four credits per vehicle to seven. Moreover, to earn this dramatic increase in credits, Tesla needed to prove to CARB that such rapid refueling events were possible. By demonstrating battery swap on just one vehicle, Tesla nearly doubled the ZEV credits earned by its entire fleet even if none of them actually used the swap capability.”
Premature rejection. It's working in China, and it would work in trucking. "Tesla tried" doesn't mean jack. No car manufacturer wants to do this because it means losing a point of innovation. It has to be regulated to happen at scale. It won't happen, but not because it can't work. After all... look at 12V car batteries.
Horrid-quality video, and not about the kind of trucking you mean (I think), but these [0] electric dump trucks are a very welcome sight everywhere in Shenzhen, China.
Battery swapping for trucks is far different from cars, though. Trucks are reasonably standardised, they commonly run predetermined routes, and they operate from purpose-built depots. If you're, say, Pepsi and you've got a fleet of trucks going between your warehouses, you can build the infrastructure around your route.
If you look at old mobile phones with removable batteries, you'll notice that there is usually a lot of space taken up by the plastic around the battery which is designed to allow a user to replace it repeatedly. A car battery that's rapidly replaceable would need a large, strong structure around it to allow it to be replaced but also to hold together in the event of a crash. If batteries had to be swapped out, you would lose more cabin space and structural rigidity. Then you get to standardised connectors and mountings, data protocols, the list goes on. And that's before you think of the automated equipment to actually swap the batteries.
In a world where we can charge a car today from 10-80% in 10 minutes, it doesn't seem like a worthwhile engineering challenge.
Cheapest source is accurate. It may not eliminate the need for expensive LPG, but reducing the amount needed is still a massive benefit to the economy.
It's misleading, and those who use it generally know it's misleading. It's used to persuade the layperson that electricity generated by wind turbines is cheaper to the consumer or industry than electricity generated by thermal sources or hydro turbines, when in fact that is only true if you disregard the requirement for said electricity to be available on demand - a fairly important omission.
So by this argument all baseload generation prices are a lie too?
The coal or nuclear plant that commits to generating a set amount consistently but never even attempts to meet the actual demand is some kind of hoax?
In reality we're moving from baseload and peaker gas plants to follow demand to renewables and firming (the same gas plants just running at different times). It's a holistic system with parts working together.
The main difference is that renewables are cheaper and cleaner which gets them built faster and displaces more and more coal and gas from the market. With batteries eating the market from the other direction (starting with daily peaks and expanding out from there).
You can see this in carbon intensity of electricity production and the ever increasing share of renewables around the world.
Of course that plan falters a bit if you ban cheap onshore wind across an entire nation for a decade.
Most electricity use is not required to be available on demand; there is a high, constant level of demand, and the occasions where there is enough wind to exceed demand are rare enough to generate headlines. World Cup final kettles are the exception, not the rule.
There are also interconnections between the UK and Belgium, Denmark, France, Ireland, Netherlands and Norway. Excess supply can be exported, assuming there's demand.
I often wonder how many people would accept automatic blackouts when buying green energy. You can, for example, buy purely wind-powered electricity here, but somehow it does not stop being delivered when there is no wind.
Instead, there would be an electric relay connected to the mains in your home, and when there is less supply than demand, you would get a blackout. That would be the comparable arrangement.
Assuming you also got prices closer to wind generation prices, not just overall wholesale prices, it would be really popular.
People would install a battery at home the way most of the world installs solar, and they'd see massive reductions in their electricity bills, more than paying for the battery. The UK is an absolutely terrible location for solar, and it's still installed because the UK's electricity prices are so high.
That said, wind going to absolutely zero nationwide is extremely unlikely, but the more people signed up for such a system, the less that little residual power would be enough for all of them. So there'd be an economic feedback loop.
There is no reason for anyone with a wind park to sell cheap if there are customers willing to pay a high price.
That said, with a battery and dynamic prices, there are many days where you can charge the battery when prices are low and use the battery when prices are high.
Constant pricing means they make more when wholesale prices tank and less when wholesale prices rise. "I'll sell you X% of my output for $Y/kWh" is a perfectly valid strategy, and batteries can absorb output spikes just as they absorb blackouts.
Hedges like this are a useful risk mitigation strategy as going bankrupt is a much larger downside than making slightly more money.
This is fine. But from the fact that prices are sometimes high, we can conclude that overall there is a shortage of wind power, which will factor into the prices. The owner of the wind park can take the prices for each hour, compute a production-weighted average, and set that as the constant price.
Curtailment means there's a difference between what wind farms can produce and what the grid is willing to buy.
Wholesale prices are really just one aspect of grid management, and paying them doesn't mean you're getting the equivalent of a percentage of wind farm productivity.
Large users like the supermarkets have agreements with electricity providers that they will reduce their use at certain times in return for a discount.
Tesco doesn't care if the freezers in store run at 04:00-04:15, or 04:30-04:45, and will pick whichever is cheaper.
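For illustration, a minimal sketch of that kind of price-driven slot picking. The slot times are from the comment above; the prices and the tariff structure are made-up assumptions, not real agreement terms:

```python
# Hypothetical tariff data; a real agreement would come from the
# electricity provider, not hard-coded values.
slot_prices_gbp_per_kwh = {
    "04:00-04:15": 0.082,  # assumed price
    "04:30-04:45": 0.079,  # assumed price
}

# The freezer boost cycle is deferrable, so simply run it in the cheapest slot.
cheapest = min(slot_prices_gbp_per_kwh, key=slot_prices_gbp_per_kwh.get)
print(f"Run freezer cycle at {cheapest}: {slot_prices_gbp_per_kwh[cheapest]} GBP/kWh")
```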
Which completely ignores the differences between investment costs and running costs across different technologies.
Open-cycle gas turbines are extremely cheap to build and expensive to run. For a green future these can be run on biofuels, hydrogen, or hydrogen derivatives.
Therefore they perfectly complement renewables.
Nuclear power on the other hand is an awful companion due to having extremely large fixed costs and acceptable marginal running costs.
This may be true, on a watt basis. It also ignores the physics of a synchronous grid. You need to produce exactly what you use at any given moment. If you fail to do that by a little bit bad things happen. If you fail by a lot, the grid fails. You need to be able to get power when you need it, not when it's convenient to your generation plant. If you want to compare solar or wind to something more dispatchable you should really be using numbers from either a pretty massive distributed overbuild or including storage or both. Otherwise it simply isn't apples to apples. I am an electrical engineer specializing in the design of control systems for renewable generation.
That's what overbuilding is. It also requires massive investment in transmission infrastructure, because your core assumption is that power in one place can get to load in another. It turns out that transmission costs many, many times what the generation does.
Check out "The Price is Wrong" by Brett Christophers[1]. It explains at length that what matters is not price, but how profitable an investment is. And how wind and PV don't look great without subsidies in various guises.
You distribute pv and wind over large areas and they get destroyed by weather, get dirty, require significant maintenance. If individuals want to have wind turbines or pv installations that's great - but these things are a giant mess at grid scale - absolutely awful.
We get anything from storms to hail a few times a year here. My patio roof got holes in it from the ice balls, but the panels are fine. Are you missing some qualifiers on that one?
> get dirty
You clean them every few months or monitor for issues per group of panels.
> require significant maintenance
Just like every other device out in the real world. Coal, gas, wind, solar, nuclear, thermal generators require maintenance.
What I think the GP is blowing completely out of proportion is:
> they get destroyed by weather
A few of them, every year. It makes a visible dent in their average longevity.
But I don't think distributing them has any impact on this. They just create a risk situation that nobody seems to be insuring, and that large farms will self-insure without problems. (Anyway, with the price going down the way it is, that will soon become irrelevant.)
> get dirty
Each person stopping to clean their own panels is much less efficient than professionals cleaning centralized panels. It does increase your electricity costs.
> require significant maintenance
Home maintenance is an entire other level of inefficiency. That extends to any kind of equipment in your home.
But again, none of those is a big deal. Solar is mostly operation-free, so distribution mostly doesn't matter.
We were originally discussing offshore wind. These things have to function in some of the harshest conditions imaginable. We don't really fully understand how weather patterns will change over time with climate change. The idea that these factors won't represent serious risks to output over 50-year lifespans is delusional. We should be building modern nuclear reactors. Small scale distributed solar in sunny environments is fine - the rest of this stuff is just a massive waste.
That's not a significant issue. O&M costs are a given and not wildly out of step with traditional generation. If you want to talk about cost-effectiveness, the thing that matters is either (a) transmission capacity and interconnects for distributed generation or (b) storage for centralized generation. As long as you're OK investing in one of the two, distributed generation is great.
Yeah of course distributed infrastructure is ... Bad???
Oh no we have no single point of failure, empower people to invest into the grid and have huge redundancies in the grid...
Batteries literally solve most of the problems
Nickel-Iron batteries are very good for this purpose: practically unlimited charge-discharge cycles, and overcharging/overdischarging won't damage them. They should be dirt-cheap too, but there are very few manufacturers, so there's not much competition.
Yes it is true, study after study has shown that LCOE for renewables (in particular wind) is the lowest. It is also quite obvious from the fact that wind and solar installations are what investors are actually investing in, in contrast to nuclear which nobody wants to invest in even with large government guarantees.
LCOE omits delivery issues. Energy isn't just about the cost to produce an electron. It's the cost of getting that electron to people when they need it. For things like wind or solar to ever become a major player you need to deal with intermittency and dispatchability.
In other words you need to deal with times when the wind isn't blowing, or when people need more (or less) power than you're producing. The way you'd do this is through excessive production during good times, and then storing the surplus in batteries, artificial hydroelectric, or other such means - and then delivering from those sources as necessary. But doing this sends the real cost per unit much higher. The storage process also entails some energy loss (a significant amount, depending on the type of storage), so you end up needing to produce more than 1 unit of electricity to get 1 unit out.
FWIW I'm a huge advocate for solar, so this isn't some random smear on renewables - it's something that needs to be accounted for and which LCOE fails to do.
I find them much more appealing than the power plant near me that dumps columns of soot into my skyline.
> and remarkably bad for wildlife
This talking point is really exaggerated. It's effectively fossil fuel propaganda. The effect on wildlife is downright cuddly compared to the effects of burning fossil fuels. You might have an argument if you're comparing wind to solar.
I never heard this term before so I Googled it. Gemini (Google AI) defined it as:
> "Low capacity factor power" refers to a power source that generates electricity at a significantly lower average output compared to its maximum potential, meaning it doesn't operate at full capacity for a large portion of the time, typically due to factors like weather dependence or intermittent availability; examples include solar and wind power, which experience fluctuations based on sunlight and wind speed respectively, resulting in a lower capacity factor compared to more consistent sources like nuclear power.
Ok, sure, makes sense, but what are the alternatives to wind and solar for carbon neutral power sources? (Yes, we all know that nuclear power can do it, but almost no highly developed countries are interested in making large nuclear power investments at this point.) Our power supply structure will need to fundamentally change over the next 30 years. Probably, home- and utility-scale batteries will play a much bigger role.
Another point: Isn't the purpose of building wind turbines on the open ocean to capture more regular winds (compared to land-based wind turbines)? Wouldn't that improve the capacity factor?
About "soul-crushingly ugly": I never once saw a chemical refinery, nor a large-scale, modern hospital, that was anything other than "soul-crushingly ugly", but we need them in a modern society. So we try to carefully plan where/when/how they are built.
This is why the Netherlands builds them out in the sea. Just far enough so that you cannot see them.
Even the biggest reactionary NIMBY has no complaints. As for wildlife, come on man, who the fuck cares?
Minus the last sentence, this is a great point. Do any downvoters have any issue with everything but the last sentence? If anything, the Netherlands is probably Europe's most intensely developed country. There is hardly a square meter that hasn't been carefully planned out over the last 500 years.
No, I don't have an issue with anything but the last sentence, but the down arrow is all or nothing. We're facing a biodiversity crisis of massive scale (call it the 6th mass extinction if you like). "Fuck wildlife" isn't an appropriate policy position.
The Netherlands still produces 64% more CO2 per capita than France, despite having only 45% higher GDP per capita. And France has way more dirty industry; if we looked purely at power generation, France would look even better.
This is what significant investment in nuclear does. Even the countries that invested the most in renewables can't beat it, yet. I'm very curious to see how long it will take the renewable-only countries to catch up, especially considering that emissions accumulate.
Many great points, thank you for the reply. I have a few times seen the UK called the "Saudi Arabia of Wind". I must say: they have a metric ton of great sites, both onshore and offshore, for wind. That said, great sites don't automatically become energy production sites unless they get funded, approved, and built.
> France has way more dirty industry
I'm not here to nitpick, but do you have a source? I know this one is very hard to debate. On a per capita basis, the ports of Rotterdam and IJmuiden must have a staggering amount of polluting industry (steel, chemical, etc.). And, while the port of Antwerp (Belgium) isn't in the Netherlands, it is literally right on the border. From Google Maps sky view, you can see many, many chemical plants in the area. To be clear for all readers: I'm not here to point the finger specifically at NL/BE as being any worse than other highly developed nations.
Unfortunately no, and maybe I should have phrased that with a little more doubt. Still, I think it's very likely true, given the sheer size difference between the two (geographically), and the fact that France has a massive auto industry, while the Netherlands does not.
Capacity factor is calculated into LCOE; what's your point? Moreover, downtime for wind turbines is much less of an issue for a grid than for large power plants (even ones with a significantly higher capacity factor), because you run into much bigger issues if your GW-scale plant is down, compared to a couple of MW (and no, the probability of your entire renewables mix going down at the same time is very low, unless you're Luxembourg).
Capacity factor is calculated in. But intermittency is not. The issue is that once demand is saturated during periods of peak production, the excess energy is wasted, so the effective capacity factor drops as adoption grows. E.g. once you saturate daytime energy demand, further investment in solar energy yields no more usable energy.
Intermittent sources are a good way to supplement dispatchable sources of energy like gas plants or hydroelectricity. But as a primary source of energy, they're not feasible without a massive breakthrough in energy storage.
The effect of overcapacity is zero or negative prices, which makes more storage viable (who cares if it only gives back 25% when the input is free or very cheap), so I'd say overcapacity in intermittent sources is an enabler of on-grid storage.
E.g. today in Germany you could buy a MWh at €14 at 13:00 and sell it back at €180 at 18:00. I didn't look at all of Europe, but it looked like the biggest spread today... You can make money with crappy storage under those conditions...
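As a rough sanity check of that spread (the €14/€180 prices are from the comment; the 25% round-trip efficiency is the parent comment's deliberately pessimistic figure):

```python
# Arbitrage margin for storage that only returns 25% of what goes in.
buy_eur_per_mwh = 14.0    # 13:00 price from the comment
sell_eur_per_mwh = 180.0  # 18:00 price from the comment
round_trip_efficiency = 0.25

# MWh that must be bought for every MWh sold back:
mwh_in_per_mwh_out = 1.0 / round_trip_efficiency  # 4 MWh in per MWh out
margin = sell_eur_per_mwh - buy_eur_per_mwh * mwh_in_per_mwh_out
print(f"Gross margin: {margin:.0f} EUR per MWh sold")  # ~124 EUR/MWh
```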
This is precisely why intermittent sources aren't viable without a breakthrough in energy storage. Existing storage mechanisms aren't capable of delivering at the tens to hundreds of terawatt hour scale required to make intermittent sources viable.
Remember, 66.8 TWh of electricity is used daily. Intermittent sources don't just experience daily fluctuations, but seasonal fluctuations lasting days or weeks. Even 12 hours of storage would still leave us with periods of insufficient production multiple orders of magnitude more frequent than the status quo: https://www.nature.com/articles/s41467-021-26355-z
> Capacity factor is calculated in. But intermittency is not. The issue is that once demand is saturated during periods of peak production, the excess energy is wasted, so the effective capacity factor drops as adoption grows. E.g. once you saturate daytime energy demand, further investment in solar energy yields no more usable energy.
>
> Intermittent sources are a good way to supplement dispatchable sources of energy like gas plants or hydroelectricity. But as a primary source of energy, they're not feasible without a massive breakthrough in energy storage.
Intermittent sources are baseload; your argument applies to any baseload system, i.e. you always need some additional dispatchable energy source (unless you overbuild by large amounts). Again, if your main energy source were e.g. nuclear, you would need an even higher amount of dispatchable power, because if your nuclear plant goes down (planned or unplanned) you need to compensate for a lot of power.
This statement is about as incorrect as it is possible to be, as even a cursory attempt to check this before posting would show.
It is difficult to understand why anyone makes claims such as this, unless they are consciously or unconsciously attempting to redefine a word that already has a well-understood meaning.
"Base load" refers to electricity demand, not sources of electricity. Things that consume electricity are a "load". The "base load" is the level of energy demand that is always present in the grid. E.g. if a grid consumes 5 GW of electricity at peak demand, and 4 GW at minimum demand, then 4 GW is the base load.
For residential uses, heating, cooling, and refrigeration are the main uses.
For commercial electricity use: computing, refrigeration, cooling, and ventilation.
For industrial electricity use: machine drive (lathes, mills, etc.), process and boiler heating, facility heating and cooling, electrochemical process.
The only categories that I guess could be easily shifted are process and boiler heating. But some industrial processes need to run uninterrupted for weeks. Machine drive, perhaps, but then workers would not be able to work a regular schedule. Not to mention, industrial applications in total account for less than 25% of electricity use.
Demand shifting is a lot easier said than done. I see it proposed very frequently, but I've yet to see a detailed plan for what electricity uses will be shifted, and how.
> For residential uses, heating, cooling, and refrigeration are the main uses.
Heating and cooling can be offloaded into grid peak availability hours relatively easily with the price serving as a reliable trigger. This assumes proper insulation for the most part, but is viable and using the price as an indicator automatically sets up the right incentives. As for refrigeration, the energy use for that in a private household seems to be overstated.
> For commercial electricity use: computing, refrigeration, cooling, and ventilation
For cooling, the same applies as for private households, maybe to a lesser extent. The other loads remain pretty static in their demand, but once a commercial operation reaches a certain scale, building out its own battery storage to optimize for purchasing price (assuming a flexible price that reflects spot pricing) may be a viable strategy.
> For industrial electricity use: machine drive (lathes, mills, etc.), process and boiler heating, facility heating and cooling, electrochemical process.
For boiler heating and facility heating and cooling the same applies as for commercial and residential uses. For other energy intense workloads, demand shift is already frequently happening because the ROI is fairly quick. It’s not easy to assess from the outside because you do need an in depth process understanding that you just cannot provide as an outsider. But I have personally witnessed plenty of examples that demonstrate it is well within the realm of possibility
Heating and cooling cannot be easily load shifted. Daily fluctuations in energy production aren't the only forms of fluctuations. Seasonal fluctuations are large, too. And the seasonal variation has the unfortunate tendency to line up with periods of high energy demand. "Just don't heat your house in the winter" is not a viable form of demand shifting.
> For boiler heating and facility heating and cooling the same applies as for commercial and residential uses.
Note that this refers to "process and boiler heating". There's plenty of industrial processes that need to be kept at temperature for long periods of time, otherwise the batch is ruined. Titanium smelting is one example. I've yet to see a breakdown of what specific industrial processes can be shifted.
Heating and cooling can only be offloaded in extremely well-insulated houses. A lot of the ones in the UK do not make the cut. Even some new EU ones do not.
If you try to offload it otherwise you just waste power heating/cooling yourself at wrong hours.
A boiler in this setup is a thermal battery. These are good, but space consuming and relatively failure prone and expensive to maintain.
Inefficient compared to central too.
Nuclear power is indeed a silver bullet. France supplied >85% of its electricity demand with nuclear (with the rest filled by pre-existing hydroelectricity). The same goes for hydroelectricity and geothermal power in countries with the appropriate geography. E.g. Norway produces 100% of its electricity through hydro. Non-intermittent sources of energy don't need to be supplemented by alternative sources of energy.
They could always build more nuclear plants to fill additional demand. Again, non intermittent sources don't need supplemental sources of energy, as long as there's sufficient supply. By comparison, a country cannot possibly run their grid entirely with solar on account of intermittency.
I'd suggest reading people's comments in greater detail, before accusing people of lying.
So now you suggest that we should build peaking nuclear plants in an attempt at covering your previous blunder with pure insanity.
Lazard expects peakers to run at 10-15% capacity factor because you know, how often do we have cold spells in France or whatever other reason causes them to run? A couple of weeks a year at most. Lets say 15%.
Let's calculate what Hinkley Point C costs when running as a peaker. It has a CFD at $170/MWh for 30 years. Let's assume it runs at an 85% capacity factor and that $20/MWh of that is O&M costs.
$153/0.15 + $20 = $1,040/MWh
You want to solve the problem by forcing electricity costs on the consumers at double of the peak of the energy crisis.
All because you view the world in nuclear fanclub fantasy land glasses.
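For reference, a minimal sketch of the capacity-factor scaling behind these figures, using the CFD, O&M, and capacity-factor assumptions stated above. The quoted ~$1,040 here (and the ~$355 that appears further down the thread) imply slightly different input rounding than the straightforward version below:

```python
# Inputs are the commenter's assumptions: Hinkley Point C CFD ~$170/MWh
# earned at an 85% capacity factor, of which ~$20/MWh is O&M.
cfd = 170.0     # $/MWh, assumed strike price
om = 20.0       # $/MWh, assumed O&M share
base_cf = 0.85  # capacity factor the CFD price assumes

# Fixed-cost recovery, expressed per MWh of nameplate capacity:
fixed_per_mwh_capacity = (cfd - om) * base_cf  # ~$127.5

for cf in (0.15, 0.45):  # peaker duty vs. fleet-wide load following
    price = fixed_per_mwh_capacity / cf + om
    print(f"CF {cf:.0%}: ~${price:,.0f}/MWh")
# CF 15%: ~$870/MWh; CF 45%: ~$303/MWh. The thread's ~$1,040 and ~$355
# figures follow the same logic with slightly different input rounding.
```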
If you've already provisioned enough nuclear plants to meet peak energy demand, producing less energy has no marginal cost. Alternatively, you can just keep operating at full capacity, and give energy away for free and use it for energy-intensive tasks like desalination or arc furnaces. The idea that we'd build nuclear plants that only operate a few weeks per year is a strawman of your own construction.
You're right that nuclear is more expensive than continuing to burn fossil fuels. And the reality is nobody has a plan to build fossil fuel free grid based on wind and solar. Absent a miraculous breakthrough in energy storage, solar and wind will always have to be deployed in tandem with fossil fuels. If we're looking at actually eliminating carbon emissions, nuclear is the only viable option besides geographically limited sources like hydropower.
> They could always build more nuclear plants to fill additional demand.
And then
> If you've already provisioned enough nuclear plants to meet peak energy demand, producing less energy has no marginal cost.
If the magic tooth fairy comes with free nuclear plants... Nuclear cult member fantasy land.
So at what capacity factor will the entire fleet run when built out to manage both outages and the cold spells that currently require 30 GW of fossil fuels to handle?
France currently runs its fleet of 63 GW at a ~70% capacity factor. Add another 30 GW (let's call it 100% reliable when a cold spell hits) and the fleet-wide capacity factor drops sharply, due to the extremely low utilization of the last 30 GW.
You can spread the lower capacity factor across the entire fleet or just let the peakers bear it.
But in the end the results are the same, because you still need to finance your fleet, now delivering a measly 45% capacity factor.
Let's translate a 45% capacity factor to Hinkley Point C numbers:
Now you are forcing the consumers to pay $355/MWh or 35.5 cents per kWh for all electricity delivered the whole year.
All you have done is take the ~$1000/MWh cost from 15% of the time and spread it out over the whole year.
Do you see the pure insanity of what you keep proposing now?
For the third time, I never said nuclear was cheaper than continuing to burn natural gas. It has the distinction of being the only non-intermittent source of carbon-free electricity besides geographically constrained sources like hydroelectricity and geothermal power. It is the only viable path to decarbonization for most countries.
What's the alternative to nuclear power for reaching a carbon-free grid? No doubt, your plan will assume a breakthrough in energy storage that delivers orders-of-magnitude more scale than existing solutions.
Why do you keep trying to alter what you said? Can't you stick to the truth?
> It is the only viable path to decarbonization for most countries.
The research disagrees with you.
See the recent study on Denmark which found that nuclear power needs to come down 85% in cost to be competitive with renewables when looking into total system costs for a fully decarbonized grid, due to both options requiring flexibility to meet the grid load.
> Focusing on the case of Denmark, this article investigates a future fully sector-coupled energy system in a carbon-neutral society and compares the operation and costs of renewables and nuclear-based energy systems.
> The study finds that investments in flexibility in the electricity supply are needed in both systems due to the constant production pattern of nuclear and the variability of renewable energy sources.
> However, the scenario with high nuclear implementation is 1.2 billion EUR more expensive annually compared to a scenario only based on renewables, with all systems completely balancing supply and demand across all energy sectors in every hour.
> For nuclear power to be cost competitive with renewables an investment cost of 1.55 MEUR/MW must be achieved, which is substantially below any cost projection for nuclear power.
Or the same for Australia, if you want a sunnier locale, finding that renewables end up with a grid costing less than half that of "best case nth-of-a-kind nuclear power":
You are being purposefully aggravating here, because your argument is weak but has been socially supported for some time now. Nuclear power lagged behind renewables primarily due to proliferation fears and subsequent over-regulation in most of the world, not technical flaws, missing out on innovations like modular reactors. China's pushing ahead with 150 GW by 2030, leveraging nuclear's advantages: it's compact (1-4 sq mi/GW vs. solar's 10-20), reliable, and resilient to extreme (and simply changing) weather, without reliance on rare earths or massive storage (which carry their own host of externalities and supply risks). Costs can drop to $50-100/MWh with new tech and long lifespans, rivaling renewables when accounting for their hidden expenses (storage, grid upgrades). Proliferation risks exist but can be managed with oversight. Nuclear remains the best bet for scalable, clean energy.
Nuclear power has famously had negative learning by doing throughout its entire life.
There was a first large scale attempt at scaling nuclear power culminating 40 years ago. Nuclear power peaked at ~20% of the global electricity mix in the 1990s. It was all negative learning by doing.
Then we tried again 20 years ago. There was a massive subsidy push. The end result was Virgil C. Summer, Vogtle, Olkiluoto and Flamanville. We needed the known quantity of nuclear power since no one believed renewables would cut it.
How many trillions in subsidies should we spend to try one more time? All the while the competition in renewables are already delivering beyond our wildest imaginations.
China is barely investing in nuclear power. At their current buildout, which has averaged five construction starts per year since 2020, they will at saturation reach 2-3% nuclear power in their electricity mix.
China is all in on renewables [1] and storage [2].
Then rounding off with the typical "SMRs" nonsense!
SMRs have been complete vaporware for the past 70 years.
Again, why are you talking about cost, when the real question is viability? How does the study you linked plan to accommodate intermittency? The answer is just a vague statement about storage mechanisms:
> Storage of energy is an important element of 100% RE systems, especially when using large shares of variable sources like solar and wind [14], [40]–[42], and it can take various forms [43]–[45]. Batteries can supply efficient short term storage, while e-fuels can provide long-term storage solutions. Other examples are mechanical storage in pumped hydro energy storage [46], [47] and compressed air energy storage [48], [49], and thermal energy in a range of storage media at various temperature levels [43], [50].
Nowhere do they actually outline how much storage of each system they will provision. How many TWh of batteries? How many TWh of pumped hydro? Totally unanswered. They just mention the existence of storage, and avoid any tangible discussion of scale. Like I said, there's no realistic plans for a grid primarily powered by intermittent sources. The storage required for such a grid is orders of magnitude larger than what can be feasibly provisioned.
This isn't a tiny, insignificant detail. It's a foundational part of a primarily renewable grid. And nobody has a plan to solve it that doesn't amount to "assume some different system, which has never been deployed at scale, can provide tens of terawatt-hours of storage".
Love that you try to avoid the issue of cost. Yeah, in the land of infinite money and resources you can do anything.
In the real world the energy crisis was a cost crisis. But you seem not to care in the slightest about massively increasing ratepayers' bills and thereby creating a new self-made energy crisis, this time fueled by nuclear subsidies.
So you skipped the first two studies. I suppose because you found nothing to complain about in them. Good to know.
Then you go after a meta-analysis of the entire field and demand it produce a TWh figure for some energy system you can't even specify.
You truly are grasping at straws.
Here's the quote you missed:
> Much of the resistance towards 100% RE systems in the literature seems to come from the a-priori assumption that an energy system based on solar and wind is impossible since these energy sources are variable. Critics of 100% RE systems like to contrast solar and wind with ’firm’ energy sources like nuclear and fossil fuels (often combined with CCS) that bring their own storage. This is the key point made in some already mentioned reactions, such as those by Clack et al. [225], Trainer [226], Heard et al. [227] Jenkins et al. [228], and Caldeira et al. [275], [276]. However, while it is true that keeping a system with variable sources stable is more complex, a range of strategies can be employed that are often ignored or underutilized in critical studies: oversizing solar and wind capacities; strengthening interconnections [68], [82], [132], [143], [277], [278]; demand response [279], [172], e.g. smart electric vehicles charging using delayed charging or delivering energy back to the electricity grid via vehicle-to-grid [181], [280]– [282]; storage [40]– [43], [46], [83], [140], [142], such as stationary batteries; sector coupling [16], [39], [90]– [92], [97], [132], [216], e.g. optimizing the interaction between electricity, heat, transport, and industry; power-to-X [39], [106], [134], [176], e.g. producing hydrogen at moments when there is abundant energy; et cetera. Using all these strategies effectively to mitigate variability is where much of the cutting-edge development of 100% RE scenarios takes place.
> With every iteration in the research and with every technological breakthrough in these areas, 100% RE systems become increasingly viable. Even former critics must admit that adding e-fuels through PtX makes 100% RE possible at costs similar to fossil fuels. These critics are still questioning whether 100% RE is the cheapest solution but no longer claim it would be unfeasible or prohibitively expensive. Variability, especially short term, has many mitigation options, and energy system studies are increasingly capturing these in their 100% RE scenarios.
With the conclusion based on the meta-analysis:
> The main conclusion of the vast majority of 100% renewable energy systems studies is that such systems can power all energy in all regions of the world at low cost. As such, we do not need to rely on fossil fuels in the future. In the early 2020s, the consensus has increasingly become that solar PV and wind power will dominate the future energy system and new research increasingly shows that 100% renewable energy systems are not only feasible but also cost effective. This gives us the key to a sustainable civilization and the long-lasting prosperity of humankind.
Since the study was released in mid-2022, has it become easier or harder to create 100% renewable energy systems? Easier.
Labour have pledged to bring it back down to 2030, but when they begin the talks with the motor industry to try to achieve this they will fold like they have done several times so far in this government.
Batteries aren't produced at remotely enough scale to be viable for grid storage. To put this in perspective, the world uses 60 TWh of electricity per day. By comparison global lithium ion battery production was 1.1 TWh [1]. Remember, production capacity is distinct from the actual production figures. It's typical for actual production figures to be ~50% of production capacity.
Intermittent sources don't just experience daily fluctuation, but also seasonal fluctuation. Even just 3 days of storage amounts to an impossible amount of batteries to provision, even assuming growing battery production capacity. Not to mention, even modest amounts of battery grid storage would severely hamper EV adoption, which would increase emissions.
There's a reason why most plans for a primarily wind and solar grid assume that there will be some technological breakthrough that solves storage: hydrogen, compressed air, alternative battery chemistries, etc. are really common to see in plans for a primarily renewable grid.
Modelling in Australia with simulations that multiply up current wind and solar (using real-time data of actual renewable generation) [1] showed that well over 99% of demand can be delivered with only about five hours of storage, so we're not really talking about days.
There will likely always be some gas peaking but we're talking less than a percent per year (maybe a few single digit percent in some places where solar isn't as good but still not much).
Australia is a hot country, with lots of sunshine, lots of windy coast and not a lot of people per square mile. And peak demand (summer days for AC) corresponds for peak solar irradiation.
Europe, for example, is the other way around for most of the above.
> Modelling in Australia with simulations that multiply up current wind and solar (using real-time data of actual renewable generation) [1] showed that well over 99% of demand can be delivered with only about five hours of storage, so we're not really talking about days.
That's still a gap two orders of magnitude larger than existing standards:
> Meanwhile, reliability standards in industrialized countries are typically very high (e.g., targeting <2–3 h of unplanned outages per year, or ~99.97%17). Resource adequacy planning standards for “1-in-10” are also high: in North America (BAL-502-RF-03)18, generating resources must be adequate to provide no more than 1 day of unmet electricity demand—or in some cases 1 loss of load event—in 10 years (i.e., 99.97% or 99.99%, respectively)19.
Even leaving 1% of demand unfulfilled amounts to multiple orders of magnitude more frequent electricity production shortfalls. Figures like "fulfill 99% of electricity demand" might sound promising, until you compare against the standards of reliability modern society expects of the electrical grid.
And that's in Australia, quite literally the best-case scenario for renewables. By comparison, in Germany even 12 hours of storage would only satisfy 80-90% of demand.
The thing is, you can calculate when there will be shortfalls and plan for that.
We now know the weather pretty much a day before it happens. Covering that 1% does, of course, mean maintaining natural gas facilities, but it's really not that big of a deal.
Using natural gas means climate change still progresses. Not to mention you'll be paying all of the overhead cost of maintaining natural gas plants, but only use them for a fraction of the time. So net cost per watt hour will be very high.
> "Batteries aren't produced at remotely enough scale to be viable for grid storage."
The idea that if we can't have a renewable grid identical to the fossil fuel grid, then we may as well stick with the fossil fuel grid even if it means the end of the world is a bit weird.
The UK's biggest energy need is heating, but the housing stock in the UK is famously shitty, 38% of homes were built before 1946 and it's the worst value for money of any developed country[1]. It isn't well insulated, triple-glazed, heat-pump fitted, using local industrial waste heat for home heating.
Heat is harder to move than electricity, but easy and cheap to store - this 2019 pilot project can store 130MWh of heat for up to a week[2], something we couldn't reasonably or cheaply do with 130MWh of electricity.
It's possible that energy and electricity requirements could be reduced meaningfully without dropping quality of life, and that meaningful amounts of energy could be stored in heat and synthetic gas[3] rather than in more expensive electric charge storage.
(Is this used Nissan Leaf at 30kWh for £2,000 the cheapest battery storage I could buy in the UK right now?[4])
Nobody, really nobody asks for the whole world consumption to be stored in batteries.
This is such a bullshit argument it really paints the rest of your comment in a bad light.
You can get so far by just storing up to 12 hours OF NIGHT TIME on a local level. Who cares about the 5% of times where we have to burn natural gas to stabilise the grid.
95% renewable is orders of magnitude better than today. Anyone saying different is literally a grifter.
Also, battery storage costs are projected to drop 50% in 5-6 years. Batteries are already cost-effective, and there is a lot of grid storage being built right now.
If we are heating with electricity and cannot use gas storage to make up for the seasons, chemical batteries are nowhere near cost-effective, or even physically available now or in any projection.
You can find out how much energy Europe stores in gas fields to get through the winter. You can divide that number by what you think is a reasonable sCOP for your heat pump. That number you can put next to the total battery capacity ever produced, and I'll even let you add any other convertible energy storage capacity. You will find a gap of orders of magnitude, and powerwalls in every home are not going to cover it. As you said, they'll cover a day, maybe a few, which leaves us three months short in the season where we have virtually no home solar.
Of course burning that gas to generate electricity for use in heat pumps would get you the same heat for half the gas even before you include any wind power (or hydro etc.)
That then doubles your storage for gas that you may or may not need to burn depending on the weather.
So really the path to fully renewable just goes through a series of win-wins on the journey to fully phase out fossil fuels.
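A back-of-envelope check of the "same heat for half the gas" claim; the efficiency and COP figures below are typical assumed values, not numbers from the thread:

```python
# All efficiencies are assumed round numbers for illustration.
gas_boiler_efficiency = 0.90  # condensing boiler: heat out per unit of gas in
ccgt_efficiency = 0.55        # combined-cycle gas turbine: elec out per gas in
heat_pump_scop = 3.0          # seasonal COP: heat out per unit of electricity

heat_per_gas_boiler = gas_boiler_efficiency               # 0.90 heat/gas
heat_per_gas_heatpump = ccgt_efficiency * heat_pump_scop  # 1.65 heat/gas

saving = 1 - heat_per_gas_boiler / heat_per_gas_heatpump
print(f"Gas saved for the same heat: {saving:.0%}")
# ~45%, i.e. roughly half, before adding any wind/hydro to the mix
```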
Sure, switching to heat pumps doubles (probably triples) your storage, or cuts the energy stored in the form of gas by 66%. But I thought we were discussing decarbonizing the storage too. Looking at that amount of energy, (chemical) electrical storage isn't even in the right ballpark yet. And other forms aren't either.
Because fuel costs only matter so much. You will still need exactly the same number of natural gas (or similar) plants with 12 hours of battery vs. 1 hour to cover seasonal events in most places on the planet. Until your battery backup can supply a week or more worth of power you have simply created an inevitable disaster in the making.
No one is building a natural gas plant to staff it and let it sit idle for 95% of the time. The natural gas burned is only a fraction of its input costs.
Battery storage is headed in the right direction, but the fact that almost all articles on the subject can't even get the units correct (which would betray how ridiculously small the deployments actually are) is quite telling in itself.
The grifters are those pretending magic natural gas backup plants are going to pop up out of nowhere, and not including that capital or maintenance expense when quoting intermittent power source costs.
Right now those sources have been able to cherry pick the cheap and easy problems to solve since they’ve been using someone else’s power when they can’t meet demand. Eventually you run out of it though.
Cheap intermittent sources have their place, and should be used maximally wherever possible. For example every watt of hydro production should have a watt of solar or wind built on top of it. Store the water for when the intermittent sources can’t keep up with demand.
> No one is building a natural gas plant to staff it and let it sit idle for 95% of the time. The natural gas burned is only a fraction of its input costs.
Serious question: why do you think that’s true? If it costs X per year to run it 5% of the year and you save more than X with this strategy, then the maths is simple and someone will build it. Several energy companies could probably be convinced to each pay a share so no one is left footing the whole bill but everyone benefits from the existence of the facility. If the maths works, potentially even some of the cost could be passed on to the taxpayer.
In the UK we already have a couple of facilities that operate exactly like this, Cruachan for example (it’s not gas, it’s water). Over the years, ways to improve its utilisation have been found, but it’s still sitting there at a relatively low portion of its capacity so that it can black start the grid if it’s ever needed.
Because it has been, at least thus far. Perhaps in the hazy future this will change, and some regulatory/capacity/energy market will evolve into making such things profitable by paying someone to build underutilized power plants. I know of no such market currently.
I'm only somewhat familiar with the US market, not the UK. But a single plant is really not interesting for the discussion at hand. It can be considered a cost of doing business to have such a plant be useful for "black starts" - but that's all a single plant will ever be useful for. If it's ever being used for such a purpose you've already lost the game.
The scale is what matters. A single power plant that is 1% of your grid capacity being utilized 5% of the time is an expense that can probably be justified. Hundreds of power plants that match 100% (or close to it) of your grid capacity used 5% of the time would be an economically unjustifiable expense as you've effectively built your entire generation capacity twice.
Right now that's what we would be talking about building since every regional grid seems to experience week (or longer) periods where intermittent power generation is extremely unreliable due to weather events. It's not 100%, but it's close to it. You need to plan for the 1000 year event for something as critical as a national grid or folks literally start dying and the economic impact is astronomical.
I don't know the exact capacity factor you'd need for a reasonable intermittent:dispatchable ratio, but it's certainly quite a lot higher than most would seemingly believe. Once batteries get to the point of backing the entire grid for a single night while the wind doesn't blow there might be signs of change. In most markets in the US where batteries are considered huge successes they have only recently (in the past year or two) transitioned from providing ancillary services to actual energy production for regular daily usage during the duck curve.
This can all be solved in time and in theory with a number of technologies and additional grid interconnection. But the trends simply are not as positive as one would like to see when you start delving into primary sources.
> But a single plant is really not interesting for the discussion at hand.
Maybe, but the UK has 4, and is building another 5 in the next 5 years.
Average grid consumption is somewhere around 30GW, the existing facilities have around 30GWh of storage. The additional 5 should bring around another 100GWh.
So we’re already at 1 hour’s worth of grid capacity stored, by 2030 we’ll be at 4 hours, and that’s assuming absolutely zero energy from other sources (wind, solar, nuclear, fossil fuel, biomass, other countries), although to be fair the existing facilities can only deliver at around 3GW, and the new facilities will only bring that up to 6GW.
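Checking that storage-hours arithmetic with the comment's own figures:

```python
# All figures as assumed in the comment above.
avg_demand_gw = 30.0          # average UK grid consumption
existing_storage_gwh = 30.0   # current pumped-storage capacity
planned_storage_gwh = 100.0   # additional capacity from the 5 new facilities

hours_now = existing_storage_gwh / avg_demand_gw
hours_2030 = (existing_storage_gwh + planned_storage_gwh) / avg_demand_gw
print(f"~{hours_now:.0f} hour of average demand now, ~{hours_2030:.1f} hours by 2030")
```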
I’m not sure if you’ve ever been to the UK, but a whole week without either sun or wind seems a bit unlikely, especially when half of the UKs wind comes from offshore wind farms.
Stick in a few more of these, and keep a couple of the existing fossil fuel plants around in case of emergencies, and I can definitely see how this continues to be just a “cost of doing business”.
I appreciate the situation in the US may be worse.
> This is such a bullshit argument it really paints the rest of your comment in a bad light.
I responded to a comment stating that excess energy can be stored in batteries: https://news.ycombinator.com/item?id=43249008 I'd suggest reading the comments people are responding to before calling them bullshit.
> You can get so far by just storing up to 12 hours OF NIGHT TIME on a local level.
"only" 12 hours of storage is 30 TWh of storage, at the world's current electricity consumption rates. This is an immense amount of storage, amounting to decades worth of global battery production. And that's ignoring the fact that the vast majority of batteries are going to electric vehicles, not grid storage. It's true that battery production is growing, but electricity demand will similarly grow as fossil fuel use in transportation and industrial processes are electrified. Out of all of our fossil fuel use, electricity production is only ~40%. Not to mention, poorer countries are developing and will eventually start deploying refrigeration and air conditioning on similar scales as developed countries.
Let's say it is 30 TWh a day in 2030 - you can assume 50% less energy usage during the night, making it 20 TWh during daytime and 10 TWh during nighttime.
This excludes large wind farms that add to base load if you average over the world. There is always wind somewhere around you.
Realistically, if we reach 5 TWh of storage we are able to be >90% renewable.
Having 5 TWh of storage is of course not an easy feat. If estimates are correct, we will have 6.5 TWh of battery production in 2030. If we allot 10% of that to grid storage, we would need a decade for a 90%+ renewable grid.
There is no faster method. It is realistic.
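Checking the arithmetic behind that "decade" estimate (every input below is the commenter's assumption):

```python
target_grid_storage_twh = 5.0       # assumed storage needed for >90% renewable
production_2030_twh_per_year = 6.5  # assumed global battery production in 2030
grid_share = 0.10                   # assumed fraction diverted to grid storage

years = target_grid_storage_twh / (production_2030_twh_per_year * grid_share)
print(f"~{years:.1f} years")  # ~7.7 years, i.e. roughly a decade
```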
> 12 hours of night time is not 30 TWh. The world is currently at 26 TWh A DAY
From your link:
> The global electricity consumption in 2022 was 24,398 terawatt-hour (TWh)
24,398 / 365 is 66.8 TWh of electricity used per day. And again, that's current electricity consumption. Before industrial processes are electrified. Before poor countries adopt air conditioning at the same rates as rich ones. Before transport is fully switched to EVs.
> Let's say it is 30 TWh a day in 2030 - you can assume 50% less energy usage during the night, making it 20 TWh during daytime and 10 TWh during nighttime
That's not how the consumption curve works. Even in the summer, the ratio of daytime to nighttime energy use isn't that high. And in the winter it's inverted, with nighttime energy use exceeding daytime use.
Why should I? I have no idea how fast adoption rates are; we can calculate what it would cost for the current grid to be feasible and work based off of that.
Ah yes, the magical 107 TWh of hydrogen capacity. How far off is that? That's a pipe dream.
This is a plan for anything past 2050. I'm talking right now.
Here are high-ball numbers for going off the grid; 2000 sf house in California:
- 30 panels ~ 10 kW: $20K
- batteries ~ 10 kWh: $8K
- permits + labor: $20K (California...)
- 100+ kWh EV with V2H bidirectional charging: $50K
- comparable ICE car (offset): -$40K
- heat pump water heater $1.5K
- heat pump furnace: $15K
- induction range: $2K
That adds to: $76.5K. Typical PG&E bills are $500-1000 per month. Budget $200 / month for gas. (Again, California prices.). That’s 63-110 months till break even, which is less than the expected lifetime of the panels + battery.
For another $10-20K, you can add propane backup, but I assume extended storms are rare enough to just charge the car and drive the electrons home a few times a year. A fireplace is about $5k installed.
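A quick break-even sketch with the numbers above (all of them the commenter's high-ball, California-specific assumptions):

```python
# Itemized capital costs from the comment, in USD.
capital = (20_000      # panels
           + 8_000     # batteries
           + 20_000    # permits + labor
           + 50_000    # EV with V2H
           - 40_000    # offset: comparable ICE car
           + 1_500     # heat pump water heater
           + 15_000    # heat pump furnace
           + 2_000)    # induction range
# capital == 76_500

for pge_bill in (500, 1000):          # monthly PG&E range from the comment
    monthly_savings = pge_bill + 200  # plus ~$200/month of gas
    months = capital / monthly_savings
    print(f"${pge_bill}/mo bill: break-even in ~{months:.0f} months")
# ~109 months at $500/mo, ~64 months at $1000/mo
```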
Not going full off-grid is cheaper. So is scaling up to beyond one house.
LLMs should not be used as a reliable source of numbers for research like that. You keep saying how obvious this is and trivial to research. Maybe just post a quality research link instead in that case?
I am suggesting it as a way to do a back of the envelope calculation that can be thoroughly checked manually. It's very easy to check the numbers yourself.
## Upfront Capital Cost
- *Nuclear*: Very high (£4,000-6,000/kW), with 10+ year construction time
- *Natural Gas (CCGT)*: Low to moderate (£500-1,000/kW), with 2-3 year construction time
- *Wind + Battery*: Moderate for turbines (£1,000-1,500/kW) plus substantial battery costs
- *Solar + Battery*: Moderate for panels (£800-1,200/kW) plus large battery costs, especially for winter
## Plant Lifespan
- *Nuclear*: Typically 60 years, with possible extensions; 2+ builds over 100 years
- *Natural Gas*: 25-30 years; requires 3-4 rebuilds over 100 years
- *Wind + Battery*: 25 years for turbines, 10-15 years for batteries; multiple replacements needed
- *Solar + Battery*: 25-30 years for panels (with declining output), 10-15 years for batteries
## Fuel & Operating Costs
- *Nuclear*: Low fuel cost, high operating cost (staffing, maintenance, safety)
- *Natural Gas*: Major cost is fuel (price volatility), plus potential carbon costs
- *Wind + Battery*: No fuel cost, moderate turbine O&M, plus battery replacement costs
- *Solar + Battery*: No fuel cost, low panel O&M, plus battery replacement costs
## Levelized Cost (No subsidies)
- *Nuclear*: £90-120/MWh
- *Natural Gas*: £50-60/MWh (without carbon cost), £100+/MWh with high carbon prices
- *Wind + Battery*: Base wind £40-50/MWh, potentially exceeding £100-150/MWh with storage for 90% CF
- *Solar + Battery*: Base solar £40-50/MWh, potentially exceeding £150-200/MWh with storage
## Reliability / Capacity Factor
- *Nuclear*: ~90% capacity factor, suited for baseload
- *Natural Gas*: 80-90% if run as baseload, highly flexible
- *Wind + Battery*: 35-50% raw CF for wind alone, requires battery + overbuild for 90% CF
- *Solar + Battery*: 10-15% raw CF in UK, requires massive overbuild and storage for 90% CF
Spotting bubbles and predicting they will burst at some point is not a particularly useful skill. Housing in Amsterdam was in a bubble for 37 years in the 1700s; identifying the bubble early on would have been completely pointless.
"I'll say I agree with the Deep Research criticisms. These products are very underwhelming."
I haven't shelled out the $200/month for OpenAI's Deep Research offering, but similar products from Google and Perplexity are extremely useful (at least for my use case). I would never present the results unchecked / unedited, but the Deep Research products will dig much deeper for information than Perplexity could be persuaded to previously. The results can then be fed into another part of the process.
Ed occasionally makes good points, but he's very very angry at Big Tech, and his anger often gets in the way of his message.
Reading his latest rant reminds me of Karl Denninger railing against Google around the time of their IPO, claiming they would never make enough money to justify an $85 share price (a $1000 investment then would be worth around $375,000 today).
I think there's the same logic flaw of looking at how things are at the start - so-so - versus how they may be in 20 years - Google getting an advertising cut of most of the world's commerce, AI replacing/doubling the ~$100tn/yr labour market.
Google solved a real problem. They indexed the web and made search work, and they did it very cheaply. So cheaply, in fact, that they could give their service away to users and monetize it with ads. LLMs are not like this. They're both extremely expensive to run and they don't do anything truly valuable--there's no killer app. So how exactly is OpenAI or their ilk (or for that matter the rest of us) supposed to use these things to make money?
This is the only question, and the fact it's still an open question just screams "hype bubble". My bet is this AI stuff goes the way of the NFT.
I'd take that bet. Google offers a very expensive service for free, but is able to monetize it with ads. Sometimes connecting users to companies is what users actually want. But Google has this problem that since the service is free, its users feel entitled to everything for free. They can't just go and charge people what it costs to run a Google search.
OpenAI doesn't have this problem. ChatGPT has a free level to get you hooked, but it's restricted. So a lot of users pay them $20 or $200 or some other amount per month to use their service. So how OpenAI makes money is by selling access to their service. What you do with it is up to you, but their value proposition is simple. Pay us to get more/better access to our service.
How much it costs them to operate the service is a secret known only to them. There are a lot of very very educated guesses, but they're just guesses. After the VC money runs out they'll have to charge more than it costs to provide the service to stay afloat, and then we'll see. $20/month for ChatGPT plus is the $1 Uber that got people hooked. There's already a $200/month tier.
Whether OpenAI, specifically, will be standing in 20 years, only time will tell. But by this point it should be obvious that there's something to this LLM thing. Even if the product doesn't get any better than it is today, it'll still take 5-10 years for its effects to reverberate through society.
The killer app is LLM-accelerated programming. Sure, it doesn't work for all domains and it can't do everything, but even if the only thing it's good for is creating JavaScript react CRUD apps, well, there are a lot of those out there, and they're not actually limited to that. And since tool use means they can generate code and compile it and test that it works, it's possible to generate datasets for other languages and libraries, the only question is which ones is it worth it for.
It might not help at all in your line of work, but a friend who does contracting is able to use LLMs to cut the time it takes him to do a specific kind of job in half, if not more, enabling him to take on twice as many clients and make more money. For him it would still be worth it even at 100x the current price. Thankfully, competition means it'll take a while before it's that expensive.
I haven’t tried the OpenAI version yet, as I’m on their peasant-level $20 plan, but the Google equivalent is way superior to Perplexity (I use both extensively). The web search Perplexity carries out is superficial compared to the Google product; it misses a large percentage of what Gemini Deep Research finds, and for a particular task in my business this makes a huge difference.
It absolutely can replace the research done by one person, for my use case at least. It’s also available on their $20/month subscription, unlike OpenAI’s $200/month.
Do you think it is a) easier and cheaper or b) harder and more expensive, to deliver services in a vast range of languages or a single language?