
WI: Nuclear Fusion Achieved

History Learner

Let's say Mike McCormack wins re-election in 1980 and solidifies the pro-fusion lobbying bloc in Congress he was building IOTL, which had already passed the Magnetic Fusion Energy Engineering Act of 1980. With a strong force pushing for fusion, you would likely see the Center for Fusion Engineering created, while projects like Princeton's Tokamak Fusion Test Reactor (TFTR) and Lawrence Livermore Laboratory's Mirror Fusion Test Facility get the chance to actually operate; the latter was mothballed on the very day it was completed IOTL. Others, like Oak Ridge's preliminary Elmo Bumpy Torus design for a 1,200 MW magnetic fusion power plant, would actually get built, and the same goes for Princeton's Compact Ignition Tokamak (CIT).

It's important to note that the Reagan Administration wasn't opposed to nuclear fusion at all; indeed, it set high funding levels during its first term. But that funding began to languish without a unified bloc in Congress to support the costs, especially as oil prices declined in the second half of the decade. If McCormack had been able to stay and continue his work, combined with Reagan's openness to it, I think there would have been more than sufficient political will to sustain the spending and thus keep the United States on track for a working reactor by 2000. That target was stipulated by the previously mentioned legislation, which used 1976 ERDA projections to establish its time frame. With a decade of design refinement and further testing, commercialization starts around ATL 2010, and by ATL 2021 the adoption of fusion power plants into power grids and other facets of energy usage is well underway.

What likely happens from here? Carbon emissions in the ATL 2010-2021 period are likely significantly lower, probably putting us on a trajectory to avoid the worst of global warming by hitting the 1.5 degree target. Russia and the Middle East will probably be more unstable, however, as oil revenues decline.
 
Fusion power has never even reached the energy breakeven point and the first demonstration reactor that might be able to do so (ITER) is scheduled for completion sometime in the 2020s. It will be a pure research reactor that can't use the heat to generate any electricity. The first demonstration power generating reactor (DEMO) hasn't even started construction yet and isn't scheduled to come online until the 2040s or 2050s. Even if those plants work it is unclear if they will be economically competitive with other forms of energy, including nuclear fission.

The money would be better spent on the breeder reactor program. Some of the earliest prototype nuclear reactors were breeder reactors and by 1980 large demonstration plants were already in operation in several countries. The United States and Soviet Union both had plans for demonstration commercial breeder reactors but the United States ended up cancelling Clinch River and BN-800 was suspended for decades after Chernobyl. Russia is now working on making the BN-1200 as economical to operate as the VVER-1200, a pressurized water reactor design. If that can be achieved there wouldn't be as much reason to build conventional nuclear power plants. Even if breeder reactors are never able to compete with conventional reactors they will still be a key technology in the coming decades for supporting the conventional plants in producing sustainable clean power.
 
The problem with this framing is essentially that it presupposes that if politicians had just pumped in more money into research in the United States, then nuclear fusion would have been achieved by now. That's frankly impossible to tell, given that we don't know how one would go about achieving commercially successful fusion power, so we cannot say whether or not such a scenario is possible.

It's like people in the 16th century speculating on how long it will take before they have steam engines, while having no idea at all about how to build one.
 

The issue isn't that people don't know how to make nuclear fusion power commercially viable, the issue is that no one has proven that nuclear fusion power is even possible. No one has ever managed to achieve energy breakeven with any fusion reactor.

It seems very unlikely that the high levels of funding required for fusion power research could be consistently maintained by the United States government for decades with little to nothing to show for it. The current cost estimate for ITER alone is more than what the United States government has spent on all fission energy research and might even be more than what it has spent on all energy research in general.

ITER isn't even a commercial fusion power demonstration plant, as it won't actually have any means of generating electricity. Even more money will have to be spent on DEMO or another prototype plant, and that won't be until the 2040s or 2050s.
 

I linked to it, but it might help to directly post it since it might have been missed:

[Image: sjH5r.jpeg — ERDA fusion funding projections chart]


These are projections, taken directly from ERDA (in 2012 dollars), of the time and money needed to achieve an ITER-style demonstration reactor. The reason fusion has always been delayed and seems so difficult is that we have literally never funded it at the recommended level. To put these figures into context, the U.S. gives between $5 billion and $62 billion per year in direct subsidies to the fossil fuel industry.
 

What data is that graph based on?

Like, how did they conclude in 1976 that with a certain amount of funding they'd have a functioning fusion reactor by 1990?
 
I think focusing on the possible impossibility of fusion is ignoring the various ramifications of this PoD.

It really comes down to whether we conceive of this discussion as something in the realm of fiction, or as a discussion of what is realistically to be expected.

Like, I can totally engage in the former kind of discussions on the basis of "Let's assume that cryonics can be made to work, and that we achieve that technology by the 1940s, what would be the implications on world history if people (the super rich or world leaders, etc.) can get cryogenically frozen and then come back after 30-40 years à la Austin Powers?"

If we're going to have the latter kind of discussion, we kind of have to conclude that, well, we cannot really. We don't know how much money/time would be necessary to get commercially viable fusion reactors by 1990, or 2000, or 2010, or 2020, on the basis of a PoD in 1980. We don't really know what scientific discoveries and breakthroughs need to be made to get there.
 

There are some fundamental economic issues with any new energy technology after 2000, the main one being that electricity demand has been almost stagnant in the United States since that year. That means that it would only be able to grow at the expense of other forms of electricity production: 1 MW of capacity of something else would be replaced by 1 MW of the new technology.

Besides the constituency networks around existing energy sources, there is also the fact that those sources already exist and have a stickiness to them, a form of technological and infrastructural lock-in. Those plants have already been paid for, or are still going to have to be paid for. For someone to want to replace existing, functional generation capacity, the new capacity will have to be even cheaper. Once a facility is up and running, the only additional costs an operator incurs are the variable costs of actual operation; the capital costs are essentially a fixed expense. There are coal and fission plants operating in countries where the economics would never justify new construction, because the operators only have to cover those variable costs. Some nuclear power plants have continued operating after capital costs bankrupted their original owner, because the plant became a very competitive asset once those costs were written off. Another example is the survival of petroleum producers using hydraulic fracturing rather than conventional methods: while conventional producers could overproduce and drive the price of petroleum down, they couldn't do much to affect the output of wells that were already paid for, since those operators would at least lose less money by continuing to produce as long as the ongoing cost of production stayed below the selling price.

However, there is one key factor that threatens existing coal and fission plants and makes the construction of major new capacity difficult: negative power prices. Subsidies can make energy economics even stranger than coal plants continuing to run while paying carbon taxes. With subsidies, the cost of production no longer has to be competitive with the market rate, only with an artificial, subsidized price. All a producer has to do is cover its variable costs, and it can still potentially profit even while paying consumers to consume its power. This is bad for coal and fission plants because they are large-capacity, difficult to ramp up and down, and usually don't receive the subsidies; they don't get back any of the money they pay people to consume their power.
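The arithmetic behind that point can be sketched in a few lines. All the numbers here are hypothetical illustrations, not actual subsidy levels or market prices:

```python
def net_margin_per_mwh(market_price, variable_cost, subsidy=0.0):
    """Operating margin per MWh sold; capital costs are treated as sunk."""
    return market_price - variable_cost + subsidy

# Hypothetical figures for illustration only:
price = -5.0  # negative market price: the producer pays $5 per MWh to offload power

# A producer with a $23/MWh production subsidy still clears a profit per MWh:
subsidized = net_margin_per_mwh(price, variable_cost=2.0, subsidy=23.0)   # 16.0

# An unsubsidized plant with higher variable costs loses money on every MWh:
unsubsidized = net_margin_per_mwh(price, variable_cost=20.0)              # -25.0
```

So at a negative price the subsidized producer has every incentive to keep generating, while the unsubsidized baseload plant either eats the loss or shuts down output it cannot easily ramp.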

There are some other issues that would make nuclear fusion undesirable even if these factors weren't in play: it's a new technology, it's probably a large and expensive plant with hundreds or thousands of MW of capacity in a time of little/no demand growth, and it has nuclear in the name.
 
What data is that graph based on?

Like, how did they conclude in 1976 that with a certain amount of funding they'd have a functioning fusion reactor by 1990?

The original report is in the citations, but basically the same way the government estimates a lot of major scientific/military/economic projects. With fusion, much of the theoretical work has been done for a long time; basically, we just need to throw money at solving the material constraints, which are the main remaining obstacle given the high temperatures involved and the difficulty of confining the plasma. To show this in the real world, private sources have stepped in to make up for the lack of public investment, to the tune of $2 billion. That infusion is why we are hearing about so many recent breakthroughs in fusion research in the media.
 

Well, I mean, "the Government" calculates the effects of economic projects in a whole variety of different ways, and it is certainly not unheard of for those calculations to be way off the mark. I'm afraid I need a bit more information than assurances that there is a government report from the 1970s.
 
Fusion power has never even reached the energy breakeven point and the first demonstration reactor that might be able to do so (ITER) is scheduled for completion sometime in the 2020s. It will be a pure research reactor that can't use the heat to generate any electricity. The first demonstration power generating reactor (DEMO) hasn't even started construction yet and isn't scheduled to come online until the 2040s or 2050s. Even if those plants work it is unclear if they will be economically competitive with other forms of energy, including nuclear fission.

The money would be better spent on the breeder reactor program. Some of the earliest prototype nuclear reactors were breeder reactors and by 1980 large demonstration plants were already in operation in several countries. The United States and Soviet Union both had plans for demonstration commercial breeder reactors but the United States ended up cancelling Clinch River and BN-800 was suspended for decades after Chernobyl. Russia is now working on making the BN-1200 as economical to operate as the VVER-1200, a pressurized water reactor design. If that can be achieved there wouldn't be as much reason to build conventional nuclear power plants. Even if breeder reactors are never able to compete with conventional reactors they will still be a key technology in the coming decades for supporting the conventional plants in producing sustainable clean power.

Also, as an aside, are you talking about MSRs or LFTRs in particular? Thorium is definitely something we should've been invested in.
 

Nuclear energy in general is a field that should have seen continued focus after the 1970s. There is an element of technological lock-in around water-cooled power reactors and sodium-cooled breeder reactors simply because they were focused on earlier, but that doesn't necessarily mean they are the best technologies. Molten salt reactors had the disadvantage of being proposed after the Manhattan Project, and of being proposed by staff at Oak Ridge instead of Argonne.
 