Saturday, July 31, 2010

New Oil Production and Discoveries Pushing Peak Oil Back

As predicted here near the first of this year, Russia is ramping up its energy production, with no peak in sight.
"We are ordering more advanced, heavier and mobile rigs. Now operators are willing to pay more for better efficiency," said Kim Kruschwitz, marketing manager at Russia's top drilling firm, Eurasia Drilling.

He said West Siberia - an oil Eldorado that helped the Soviet Union fund the arms race with the United States - could see output up 10 percent in coming years despite wide industry belief its reservoirs were ruined by speedy Soviet exploration.

"The gain in efficiency and output that results from improved technology will only become more pronounced in the next few years, and I expect Russian production to hold around the level of 10 million barrels per day for years to come," said Kruschwitz. _Source

OPEC nations are also likely to continue increasing output -- led by improved production from Iraq. Mexico is finally beginning to invest in new technologies to allow it to produce more out of its Gulf of Mexico fields and older giant fields, which have suffered declining production. Canada is beginning to do more than just think about exploiting its vast arctic reserves of oil and gas -- and is steadily and incrementally building its oil sands production, thanks to investments from China. North Dakota is setting new oil production records.

New oil finds in Egypt and several countries in Africa are stressing the ability of oil field companies to keep up with the demand. At least there will be a lot of free offshore rigs -- thanks to Obama and Salazar driving a stake through the heart of the Gulf of Mexico economies with a hasty and ill-advised oil moratorium.

New natural gas finds are breaking out all over -- from Eastern Europe to Lebanon to Southeast and East Asia. Technologies for extracting and exploiting oil sands, oil shale, and heavy oils are only going to get better and more economical -- with steadily decreasing water and energy demands. Better methods of converting coal to liquids, gas to liquids, biomass to liquids are being invented and adapted weekly.

If we really get desperate for hydrocarbons (for materials production) in a hundred years or so, we may have to learn to develop methane clathrates safely and economically. Automated robotic harvesters make the most sense for such deep sea and arctic production.

But the real heart-breaker for true-believing peak oil disciples will be safer and more abundant forms of nuclear energy -- fission first, then fusion. All of that electricity and heat can be put to an almost limitless range of uses: producing other forms of energy, or substituting for hydrocarbon combustion in ships, locomotives, industrial process heat, and other significant forms of energy use.

I am not describing a cornucopian future here. But if you wish, I certainly could. The main obstacle to a better future is the basic underlying dullness of those barely evolved apes, known as humans.


Life Without the Sun? Yes, If We Learn to Think

Ring of Fire

We like to say that life on Earth could not survive without the sun, but that isn't actually true. Without the sun the surface of the planet would freeze to a great depth, but the amount of energy contained in the molten planetary core could provide ample heat and electric power to maintain human civilisation for hundreds of thousands of years or longer.

From the western US to Australia (Queensland and Victoria), Indonesia, the Philippines, Kamchatka, Alaska, and New Zealand, the Pacific Ring of Fire makes thermal energy available to much of the world's population. Similar "rings of fire" over the rest of the Earth extend this ample fund of energy to Europe, western and central Asia, and parts of Africa.
Jefferson Tester: The figure for the whole world is on the order of 100 million exajoules or quads [a quad is one quadrillion BTUs]. This is the part that would be usable. We now use worldwide just over 400 exajoules per year. So you do the math, and you know you've got a very big source of energy.

How much of that massive resource base could we usefully extract? Imagine that only a fraction of a percent comes out. It's still big. A tenth of a percent is 100,000 quads. You have access to a tremendous amount of stored energy. And assessment studies have shown that this is thousands of times in excess of the amount of energy we consume per year in the country. The trick is to get it out of the ground economically and efficiently and to do it in an environmentally sustainable manner. That's what a lot of the field efforts have focused on. _TechReview
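Tester's arithmetic is easy to check. A quick back-of-envelope sketch (his figures, my labels; a quad and an exajoule differ by only about 5%, so they are treated as interchangeable here):

```python
# Back-of-envelope check of Tester's geothermal figures (a sketch, not a model).

usable_resource_quads = 100e6   # "on the order of 100 million" quads worldwide
world_use_per_year = 400        # quads consumed worldwide per year, per the quote

# Suppose only a tenth of a percent of the resource is ever extracted:
extractable = usable_resource_quads / 1000
years_of_supply = extractable / world_use_per_year

print(f"Extractable at 0.1%: {extractable:,.0f} quads")   # 100,000 quads
print(f"Years of world supply: {years_of_supply:,.0f}")   # 250 years
```

Even one part in a thousand of the resource covers a quarter-millennium of total world energy use at current rates, which is the point Tester is making.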

If you drill far enough down into the Earth, you will find hot rock. Circulating a heat exchange fluid into the rock allows you to utilise the heat to drive a heat engine to generate electric power. Using both the heat and the electricity obtained from below ground, life on planet Earth could be maintained under large domes for millions of years, even without the sun. Add the energy you can get from nuclear fission and fusion, and human civilisation could go on even longer.
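How much of that heat can become electricity? A rough Carnot sketch (the 200 degC rock temperature and 25 degC rejection temperature here are illustrative assumptions, not figures from the article):

```python
# Upper bound on heat-engine efficiency for hot-rock geothermal (illustrative sketch).

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Carnot limit: 1 - T_cold / T_hot, with temperatures in kelvin."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

eta = carnot_efficiency(200.0, 25.0)
print(f"Carnot limit at 200 degC rock: {eta:.1%}")  # roughly 37%
```

Real binary-cycle geothermal plants achieve well under this theoretical limit, which is one reason geothermal counts as a low-grade form of energy: plentiful heat, but modest conversion efficiency.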

Since the sun is likely to go on for another billion years or longer, what we are talking about is two things: 1. geothermal energy as supplementary, baseload energy for parts of the world where nuclear power is not practical or safe, and 2. human settlements on molten-core planets which are too far away from a star for conventional star-powered photosynthetic life cycles and atmospheric heating.

Geothermal is expensive, and is a low-grade form of energy. But it is 24 hour a day baseload energy which can also be load-following power. Enhanced geothermal requires deep drilling and regular maintenance. And there is the fear of earthquakes:
Using EGS, producers drill deeply into hot rocks and pump surface water to them. The heat is transferred to the water and it is pumped back to the surface with geothermal energy that is used in a standard geothermal power plant. Much heralded until recently, EGS began to garner controversy when its deep fracturing of geologic structures seemed to be associated with increased seismic activity, first in Basel, Switzerland, and later in California.

..."The thing is, you've got to address it," Gawell said. "If you've got major slip faults in the area, you don't do a project there. You simply stay away. When somebody permits a geothermal project in Basel, Switzerland, the site of the biggest earthquake in European history, you have to wonder whether they did any screening or thinking" beforehand. _GreentechMedia
But think about it: Earthquakes come from fault lines where plates are pushing and sliding against each other. The best way to prevent a large earthquake is to trigger multiple small earthquakes to relieve the pressure that builds over time.

The hysteria over EGS-caused mini-quakes is misplaced. The danger comes from not relieving the pressure.

And so massive quantities of baseload energy go untapped, because for now it is cheaper to use other forms of energy -- such as coal, gas, oil, and hydro. How does geothermal compare to wind?
According to Tantoco, the company may spend as much as $3.5 million to produce a megawatt of geothermal power through its greenfield facilities, and about $2.5 million per megawatt for its wind power project. [EDC Philippines] _BusinessInquirer
But geothermal is 24 hour baseload and potentially load-following power. Wind power is intermittent, with a capacity factor of 0.3 or less -- and essentially unpredictable! Wind machines often break down within 5 to 10 years, whereas geothermal can last for several decades or longer.
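The nameplate-cost comparison above actually understates geothermal's advantage. Compare capital cost per average delivered megawatt instead (the $3.5M and $2.5M per MW figures are from the EDC quote; the capacity factors of roughly 0.9 for geothermal and 0.3 for wind are assumed typical values, not from the article):

```python
# Capital cost per *delivered* (average) megawatt, not per nameplate megawatt.

def cost_per_average_mw(cost_per_nameplate_mw: float, capacity_factor: float) -> float:
    """Divide nameplate cost by capacity factor to get cost per average MW."""
    return cost_per_nameplate_mw / capacity_factor

geo = cost_per_average_mw(3.5e6, 0.9)    # geothermal, high capacity factor
wind = cost_per_average_mw(2.5e6, 0.3)   # wind, low capacity factor

print(f"Geothermal: ${geo / 1e6:.1f}M per average MW")   # about $3.9M
print(f"Wind:       ${wind / 1e6:.1f}M per average MW")  # about $8.3M
```

On a delivered-energy basis, under these assumptions, wind capital costs more than twice as much per megawatt as geothermal, before even counting wind's unpredictability.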

No one is saying that geothermal is better than small modular nuclear reactors in terms of portability, versatility, efficiency, or affordability. But for demographic reasons, some parts of the world are simply not safe places to put nuclear plants -- even SMRs. If geothermal energy is available, it represents a better alternative, for those particular places.


Friday, July 30, 2010

LS9 Transplants Bacterial Genes to Produce Diesel from Sugars

South San Francisco biotech company LS9 has genetically programmed E. coli with genes from cyanobacteria to produce long chain hydrocarbons.
The LS9 researchers discovered the genes involved by comparing the genomes of 10 strains of cyanobacteria (also called blue-green algae) that naturally produce alkanes with a very similar strain that produces no alkanes. They identified 20 genes that the alkane-producing strains had but that the non-alkane-producing strain lacked. From there, the researchers narrowed down the possibilities until they identified the genes and enzymes necessary for alkane production. They confirmed their discovery by incorporating the genes into E. coli and measuring the alkanes that the bacteria subsequently made. The bacteria secrete the alkanes, which can then be easily collected and used as a fuel.

Organisms make alkanes via a complex process that produces fatty acids from carbon dioxide or sugars. The fatty acids are then converted by the organisms to an aldehyde that includes a carbon atom bonded to an oxygen atom (together they create what's called a carbonyl group). The enzyme aldehyde decarbonylase helps remove this group to form a chain of hydrogen and carbon atoms--the hydrocarbon. The natural process produces a collection of hydrocarbons of various lengths that are comparable to the hydrocarbon molecules in diesel. _TechnologyReview
Such artificially programmed bacteria (and algae) will provide the most efficient means of converting CO2 directly to hydrocarbons. Humans will find it difficult to create economical processes to make long chain alkanes from CO2, using the low levels of CO2 found in the atmosphere (0.04%).
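That difficulty is easy to quantify with a carbon mass balance. A rough sketch (pentadecane, C15H32, stands in for a diesel-range alkane; 0.04% CO2 by volume and ideal-gas behaviour at 25 degC are assumed):

```python
# How much air holds the carbon needed for 1 kg of diesel-range alkane?
# Rough sketch: pentadecane (C15H32) as a stand-in, 400 ppmv CO2, ideal gas.

MOLAR_MASS_C, MOLAR_MASS_H = 12.011, 1.008   # g/mol
MOLAR_VOLUME_L = 24.45                        # L/mol, ideal gas at 25 degC, 1 atm
CO2_FRACTION = 400e-6                         # 0.04% of air by volume

alkane_molar_mass = 15 * MOLAR_MASS_C + 32 * MOLAR_MASS_H   # ~212 g/mol
mol_carbon_per_kg = 1000.0 / alkane_molar_mass * 15         # moles of C in 1 kg
co2_volume_l = mol_carbon_per_kg * MOLAR_VOLUME_L           # one CO2 per carbon atom
air_volume_m3 = co2_volume_l / CO2_FRACTION / 1000.0

print(f"Moles of carbon per kg alkane: {mol_carbon_per_kg:.0f}")   # ~71 mol
print(f"Air required: {air_volume_m3:,.0f} m^3 per kg of fuel")    # ~4,300 m^3
```

Thousands of cubic meters of air must be processed per kilogram of fuel, which is why concentrated CO2 streams (flue gas, fermentation off-gas) are so much more attractive feedstocks than the open atmosphere.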
More on LS9 from GCC:
Researchers at LS9 have discovered an alkane biosynthesis pathway in cyanobacteria; i.e., a metabolic pathway that produces alkanes—the major hydrocarbon constituents of gasoline, diesel and jet fuel—in a direct, simple conversion from sugar.

When the newly identified alkane operon is expressed in E.coli, the bacteria produce and secrete C13 to C17 mixtures of alkanes and alkenes. This discovery is the first description of the genes responsible for alkane biosynthesis and the first example of a single step conversion of sugar to fuel-grade alkanes by an engineered microorganism. A paper on the work was published in the 30 July issue of the journal Science.

Alkanes are naturally produced by diverse species, but the genetics and biochemistry behind this biology have not been generally well understood. The LS9 team looked into the genomes of cyanobacteria that produce alkanes in nature, evaluating many and identifying one that was not capable of producing alkanes, said Andreas Schirmer, Associate Director of Metabolic Engineering at LS9, and lead author on the paper. By comparing the genome sequences of the alkane producing and non-producing organisms, LS9 was able to identify the responsible genes. _GCC
By taking the genes from one species of bacteria and transplanting them into E. coli, the scientists are making use of a bacterial production platform (E. coli) which is quite familiar to biotechnologists, pharmacologists, and industrial microbiologists. Expect much more of this genetic "mixing and matching" to create optimal microbial production platforms.


Gevo Turning Cellulose into Fuels, Plastics, Clothing, etc.


Renewable chemicals company Gevo is now producing a range of products from biomass cellulose-derived sugars. These products include fuels usable as gasoline and diesel replacements, as well as jet fuel. Other high value products from biomass include plastics and clothing fiber materials. Gevo first ferments isobutanol from biomass (cellulose-derived sugars), then converts the isobutanol to other high value materials.
Gevo uses synthetic biology and metabolic engineering to develop biocatalysts (fermentation organisms) to make only isobutanol via fermentation at high concentrations—i.e., without the typical expression co-products. The initial generation biocatalyst operates on fermentable sugars from grain crops, sugar cane and sugar beets. Gevo has already produced renewable gasoline and jet fuel that meet or exceed all ASTM specifications.

The company is now developing a new generation of biocatalysts that can use the mixed sugars from biomass to produce cellulosic isobutanol.

To operate its fermentation at optimum conditions for the organism, and within the process conditions found in ethanol plants, Gevo developed a novel separation technology. The solution uses a process innovation for continuous separation of the isobutanol—which in high concentrations inhibits the growth of microorganisms—as it is produced.

...What’s new is the cost-effective production and purification of isobutanol from biomass. Gevo projects that the cash operating cost for its hydrocarbon fuel is competitive with $65 per barrel crude oil (without incentives).

Isobutanol can also be used directly as a gasoline blendstock and as a building block in the production of hydrocarbons found in petroleum-derived gasoline, jet and diesel fuels. _GCC

As Gevo improves its yields at all stages of its processes, the economics of Gevo's potential operations should become clearer. Biomass is not energy-dense, which means that pre-processing stages will need to be made cheap, light, and portable so as to go where the biomass is -- for purposes of densification of the energy resource. That will involve many small, local scale pre-processing plants -- some of them portable by truck or rail.

Preliminary products will then be shipped or piped to several regional processing plants for conversion into isobutanol and higher value products. A few more central plants may exist for more complex conversions and synthesis -- or chemical intermediates would be sold in bulk to other companies for final conversion in some special cases.

It is unclear how far into industrial production Gevo wants to go. The market exists for very large numbers of local pre-processing plants, and a significant number of regional processing plants.

Biomass can be derived from agricultural and forestry waste, and from dedicated energy crops and fast growing trees. But the best prospect for high yield biomass at this time is micro-algae. Most of the problems that prevent economic production of biodiesel from algae do not apply to Gevo's process, which uses whole algal biomass rather than algal oils.


Labels: , , , ,

Thursday, July 29, 2010

Matt Simmons Has Had Quite a Ride

Oil and gas industry investment banker Matt Simmons has achieved success and acclaim in a second career as a "peak oil prophet." Mr. Simmons is famous for claiming that Saudi oil fields are in rapid decline, and that world oil production peaked in 2005.

During the recent BP Gulf of Mexico oil spill, Mr. Simmons was in great demand as an oil industry insider and expert. But storm clouds are beginning to form over the reputation of Simmons as someone whose judgment and veracity can be counted on. Strangely enough, some of the mumblings of mutiny are arising from the peak oil ranks themselves.

Peak oiler, energy blogger, and chemical engineer Robert Rapier has published a series of critical evaluations of Simmons' work, the most recent of which appeared -- of all places (!) -- on The Oil Drum website. The TOD article takes a close look at some of Simmons' recent statements about the oil spill in the Gulf. A careful reader will come away with the clear conclusion that Simmons' credibility is in tatters, blowing in the breeze. Only the truest of Simmons' true believers in the TOD comments persist in defending the incredible and fantastic statements Simmons has recently made to the press.

But is this a new phenomenon, or has Simmons always been this way? Some of the questions relating to Simmons' credibility go all the way back to his magnum opus, Twilight in the Desert.
Matt Simmons has been wrong about virtually every important trend he has tried to call.

He was wrong in his shrill predictions about US gas "going over a cliff". He predicted a catastrophic drop in US natural gas production by summer 2005. That never transpired, and in summer 2008 US gas production is *rising* at a rapid clip. Details

He also predicted a near term collapse in Saudi oil production (Twilight in the Desert: The Coming Saudi Oil Shock) which never transpired. The book was released in June 2005, and in the last 18 months, Saudi crude+condensate production has steadily risen, reaching a high of 9.7 million barrels per day in July 2008 (EIA stats, Table 1.1c). Historically, that's an extremely high level. The last time Saudi crude+condensate production was that high was almost 30 years ago, in October 1981 (see EIA, 2008 Monthly Energy Review, Table 11.1a Link).

Simmons staked his reputation on the claim that Saudi Production was going to collapse, and it did exactly the opposite. No wonder he's having a nervous breakdown and promoting bizarre schemes like mowing the bottom of the ocean with "underwater lawnmowers":
“Call it seaweed, if you want,” Simmons said. Whatever you call it, Simmons said the world must start harvesting this micro algae using what he called “underwater lawnmowers.”

Simmons acknowledged that any plan for large scale harvesting of micro algae likely would be strongly opposed by environmentalists. His blunt message to them: "Get over it. We’ve already destroyed the fish stock." _Source
Simmons' consistent view has always been that high prices will not temper demand, that demand will continue to follow optimistic IEA forecasts even if supply massively undershoots that level (economic gobbledygook like "In seventeen years the world’s demand for oil may well be more than 50 percent greater than it is today, while production capacity may well sink to 1985 levels." _Source), and that prices are going to go through the roof. Which, of course, is completely at odds with the actual situation of falling demand and prices. The man is overwrought and out of touch: on Sept. 22, 2008, as the price of oil was nosediving to $33, his comment was: "There really is no roof on oil prices at this point." _Source

Petroleum is a finite resource. But if you want to build a career (even a second career) out of claims that "it is all downhill from here," you need to substantiate your claims much better than Mr. Simmons has been able to do. His recent spouting off about the Gulf Macondo oil spill tells us plainly that his judgment is not trustworthy. Which means that Mr. Simmons is even more obligated to provide factual substantiation to even the least of his claims and predictions.

Readers and administrators of The Oil Drum website generally seem to feel that Simmons should be referred to with respect in every instance -- as if Simmons had recently died, deserving the respect of the dead. But perhaps it is only Mr. Simmons' reputation that is dying, misstatement by misstatement, failed prognostication by failed prognostication.

Update: 31July2010 -- I just remembered a pertinent post by Robert Rapier from early January of this year:
The Saudis had production back above 9 million bpd by December 2007, and by July 2008 they had production at 9.7 million bpd – the highest level in almost 30 years (and without the aid of some of the major new projects that were expected to bump production a little). Their production then pulled back after prices collapsed. Just the fact that production flat-lined for 7 months with no new major projects coming on says without a doubt they were sitting on spare production when I was arguing that they were. If they hadn’t been, they would have declined a bit each month and could have only reversed that by bringing new projects online.

One argument that many people made for a permanent decline was that if Saudi had spare production they would have brought it online in 2006-2007 as prices climbed. As I replied at the time “Not if inventories are full.” (Of course Saudi production rose with the price of oil in 2008, and hit 9.7 million bpd in the same month that oil prices hit $147).
Robert provides a great deal more of interest to observers of peak oil in that brief article. Smart people will read it and take heed. Dull-witted disciples of the peak oil gurus will avoid Robert's reasoning for the head-pounding cognitive dissonance it would provoke.


Wednesday, July 28, 2010

Big Solar: Not Ready for Prime Time

OECD via Rod Adams

JournoLists from the mainstream media have been credulous cheerleaders for the wind and solar industries -- and for the faux environmental movement in general. Rod Adams takes the New York Times to task for a typical example of atrocious journoLism on solar energy.
...a normally credible news source, the New York Times, apparently did not bother to more fully investigate the credibility of the "study" to find out that it is just a paper that was commissioned by an organization that is dedicated to a well publicized agenda. On July 26, 2010, on the front page of the New York Times business section, there was a Special Report: Energy titled Nuclear Energy Loses Cost Advantage written by Diana S. Powers whose conclusions about electricity cost comparisons between nuclear and solar were entirely based on the Blackburn and Cunningham paper and its sources. The writer did not check on the academic credentials of the paper's authors, check to see if it had been peer reviewed, or question whether or not it was backed up by independent work by anyone else. She quite possibly did not even read the entire paper to understand the calculations used to draw the pretty graph. The editor allotted a good deal of valuable space for this poorly researched work.
The paper is seductively titled Solar and Nuclear Costs — The Historic Crossover: Solar Energy is Now the Better Buy. The paper's cover has a dramatic and colorful graph that shows ever increasing costs for nuclear and ever decreasing costs for solar. The lines cross in 2010. (I will explain my use of the word "seductive".)

For their nuclear power cost projections, the professor emeritus and his grad student relied on a 2009 cost projection paper written by a lone researcher named Mark Cooper, whose current employment is described as "Senior Research Fellow for Economic Analysis" for the Vermont Law School Institute for Energy and the Environment. His brief biography states that he has a "PhD from Yale" but it does not specify his field of study. It indicates he is an "activist/advocate" with a rather wide range of interest areas including telecommunications regulations and energy consumer issues.

The paper ignores all other cost projections for nuclear. Some of the previous work on this topic that the professor and his graduate student ignored includes the following:

Depleted Cranium takes a careful look at a new solar thermal plant in Sicily, and decides that the plant provides only "piddling" energy for the enormous cost of investment, land, and resources.
The actual plant is enormous. It contains over three miles of primary loop pipeline, occupies over thirty thousand square meters and costs sixty million euro just to build (never mind the operating cost of keeping those mirrors shiny and the pipes free of leaks and clogs.) For this enormous price the plant generates five megawatts.

Of course, that’s not why the plant is there. It makes a perfect window dressing for the much much smaller, yet much much more powerful gas-fired power plant that is located at the same site. Officials like to talk about how the solar collectors are “integrated” into the gas fired power system and allow for less gas usage, as some kind of transition. They don’t mention that the gas fired power plant is 752 megawatts, a lot more than the five megawatts of the solar power station. But hey, with all the glare of those solar collectors, you might not even notice the big gas burner next-door, right?

One other thing that should be noted: the plant is rated at five megawatts, but that doesn’t mean it actually generates that much power.

The “five megawatt” number is a reference to the peak capacity of the power station, which is much much different than its true output. During the mid day, on a perfectly clear sunny summer day, when all systems are functioning at their optimal performance, it may get up to five megawatts. However, it will be lower much more often.

To get a better idea of what this plant is actually supposed to produce, or at least what the estimates of the builder are, we need to figure out what the average power output is. According to one site, the plant is supposed to produce “9 million kilowatt hours a year.” That’s nine thousand megawatt hours. There are 8,760 hours in a year (non leap-year), so as it turns out, the actual output of the plant averages just a little over one megawatt. This is, of course, assuming the estimates are correct and not more rosy than the reality of things. _DepletedCranium
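Depleted Cranium's arithmetic can be spelled out explicitly (all inputs come from the quote above; only the capacity-factor step is added):

```python
# The Sicilian solar-thermal plant's real output, from the figures quoted above.

nameplate_mw = 5.0                 # rated peak capacity
annual_output_mwh = 9_000          # "9 million kilowatt hours a year"
hours_per_year = 8_760             # non leap-year
build_cost_eur = 60e6              # "sixty million euro just to build"

average_mw = annual_output_mwh / hours_per_year     # just over 1 MW
capacity_factor = average_mw / nameplate_mw         # about one-fifth
cost_per_average_mw = build_cost_eur / average_mw   # euro per delivered MW

print(f"Average output:  {average_mw:.2f} MW")
print(f"Capacity factor: {capacity_factor:.0%}")
print(f"Cost per avg MW: {cost_per_average_mw / 1e6:.0f}M euro")
```

A roughly 21% capacity factor and nearly 60 million euro per average delivered megawatt; that is the gap between a nameplate rating and real-world output.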

There are plenty of situations where off-grid solar energy makes a great deal of sense. Solar is more predictable than wind, but as an expensive and intermittent source it will not win any prizes from utility grid managers.

The gullibility and ignorance with which solar (and wind) energy projects are pursued by journoLists, public officials, political activists, and faux environmental lobbies is a sad testimony to modern education and child-raising practices. But the world is what it is and we must do with it what we can.

Labels: , ,

Tuesday, July 27, 2010

Dian Chu Predicts a New Oil Peak due to Shale Oil

Economic forecaster Dian Chu gives a presentation on the future of shale energy in the US.
Large deposits of gas and oil in US onshore shale formations are forming the foundation for a new fossil fuel energy rush inside the continental US.
Thanks to the Obama moratorium on new drilling in the Gulf of Mexico, oil producers and investors are looking toward fossil fuel deposits in shale formations from North Dakota to Texas to Pennsylvania.
Technological advances in exploration, recovery, production, and refinement of unconventional fossil fuels will only get better and more economical with time.
New timeline for Peak Oil? Sometime after the year 2100.

Of course by then, advanced fission and fusion will provide most of the world's electricity, and microbial producers will provide the bulk of the world's transportation fuels, high value chemicals, plastics, and animal feeds -- using everything from biomass to solid waste to liquid waste as feedstocks.

Labels: , ,

Monday, July 26, 2010

More Scientific Interest in Catching and Using CO2

Brian Westenhaus takes a look at a University of Cincinnati group that aims to turn CO2 into methanol (CH3OH) using a low temperature catalytic process.
The new paper published June 14th in the American Chemical Society journal Energy & Fuels reports a highly efficient nickel system for the catalytic hydroboration of CO2 to methoxyboryl species using a simple borane. The reactions operate at room temperature with turnover frequencies [495 h-1 based on B-H] at least 1 order of magnitude higher than those of the related reactions.
The improvement comes from the recent development of frustrated Lewis acid-base pair chemistry, which has led to alternative strategies for the reduction of CO2 to the methoxide level given either H2 or H3NBH3 (ammonia borane) as a hydrogen source.

The mechanism involves a nickel formate, formaldehyde, and a nickel methoxide as different reduced stages for the CO2. The reaction may also be catalyzed by an air-stable nickel formate.

An enzymatic approach to capturing CO2 for re-use comes from Codexis, and involves genetically modified enzymes made especially to withstand the higher temperatures involved. The new enzymes are said to be 100 times as efficient at CO2 capture as the standard solvent approach.

This recent scientific groundswell of interest in capturing CO2 and turning it back into fuels was spurred by this Sandia project of turning CO2 plus sunlight into fuels. It sounds almost poetic, even though we know that nuclear energy is far more reliable and potentially plentiful in concentrated form than solar energy.

Regardless, the human imagination has been unleashed in an attempt to solve a perceived problem -- rather than to merely whine about the perception. It will be fascinating to watch and see what problem-solving human minds can devise.

Update 27July2010: DOE to Award $106M to Six CO2 Conversion Projects; $156M in Matching Private Funding
Phycal, LLC (Highland Heights, OH) Phycal will complete development of an integrated system designed to produce liquid biocrude fuel from microalgae cultivated with captured CO2. The algal biocrude can be blended with other fuels for power generation or processed into a variety of renewable drop-in replacement fuels such as jet fuel and biodiesel. Phycal will design, build, and operate a CO2-to-algae-to-biofuels facility at a nominal thirty acre site in Central O’ahu (near Wahiawa and Kapolei), Hawaii. Hawaii Electric Company will qualify the biocrude for boiler use, and Tesoro will supply CO2 and evaluate fuel products. (DOE Share: $24,243,509)

Touchstone Research Laboratory Ltd. (Triadelphia, WV) This project will pilot-test an open-pond algae production technology that can capture at least 60% of flue gas CO2 from an industrial coal-fired source to produce biofuel and other high value co-products. A novel phase change material incorporated in Touchstone’s technology will cover the algae pond surface to regulate daily temperature, reduce evaporation, and control the infiltration of invasive species. Lipids extracted from harvested algae will be converted to a bio-fuel, and an anaerobic digestion process will be developed and tested for converting residual biomass into methane. The host site for the pilot project is Cedar Lane Farms in Wooster, Ohio. (DOE Share: $6,239,542)

Skyonic Corporation (Austin, TX) Skyonic Corporation will continue the development of SkyMine mineralization technology-a potential replacement for existing scrubber technology. The SkyMine process transforms CO2 into solid carbonate and/or bicarbonate materials while also removing sulfur oxides, nitrogen dioxide, mercury and other heavy metals from flue gas streams of industrial processes. Solid carbonates are ideal for long-term, safe aboveground storage without pipelines, subterranean injection, or concern about CO2 re-release to the atmosphere. The project team plans to process CO2-laden flue gas from a Capital Aggregates, Ltd. cement manufacturing plant in San Antonio, Texas. (DOE Share: $25,000,000)

Calera Corporation (Los Gatos, CA) Calera Corporation is developing a process that directly mineralizes CO2 in flue gas to carbonates that can be converted into useful construction materials. An existing CO2 absorption facility for the project is operational at Moss Landing, Calif., for capture and mineralization. The project team will complete the detailed design, construction, and operation of a building material production system that at smaller scales has produced carbonate-containing aggregates suitable as construction fill or partial feedstock for use at cement production facilities. The building material production system will ultimately be integrated with the absorption facility to demonstrate viable process operation at a significant scale. (DOE Share: $19,895,553)

Novomer Inc. (Ithaca, NY) Teaming with Albemarle Corporation and the Eastman Kodak Co., Novomer will develop a process for converting waste CO2 into a number of polycarbonate products (plastics) for use in the packaging industry. Novomer’s novel catalyst technology enables CO2 to react with petrochemical epoxides to create a family of thermoplastic polymers that are up to 50% by weight CO2. The project has the potential to convert CO2 from an industrial waste stream into a lasting material that can be used in the manufacture of bottles, films, laminates, coatings on food and beverage cans, and in other wood and metal surface applications. Novomer has secured site commitments in Rochester, NY, Baton Rouge, Louisiana, Orangeburg, SC and Ithaca, NY where Phase 2 work will be performed. (DOE Share: $18,417,989)

Alcoa, Inc. (Alcoa Center, PA) Alcoa’s pilot-scale process will demonstrate the high efficiency conversion of flue gas CO2 into soluble bicarbonate and carbonate using an in-duct scrubber system featuring an enzyme catalyst. The bicarbonate/carbonate scrubber blow down can be sequestered as solid mineral carbonates after reacting with alkaline clay, a by-product of aluminum refining. The carbonate product can be utilized as construction fill material, soil amendments, and green fertilizer. Alcoa will demonstrate and optimize the process at their Point Comfort, Texas aluminum refining plant. (DOE Share: $11,999,359)

Labels: ,

Sunday, July 25, 2010

11th Carnival of Nuclear Energy at NextBigFuture

Brian Wang presents 10 outstanding nuclear blog entries at the 11th Carnival of Nuclear Energy, hosted at NextBigFuture. Here is a sampling:

3. Atomic Rod (Rod Adams) has Proving a Negative - Why Modern Used Nuclear Fuel Cannot Be Used to Make a Weapon

My position is that there is no way for an ad hoc group of terrorists or nefarious state actors to build a weapon out of used commercial nuclear fuel. The challenge is that I have a self-appointed task of proving a negative when the positive assertion has been well publicized and firmly established as a "fact" by people with impressive credentials. I also have to figure out how to make this argument without access to technical details that remain classified. Finally, I have to do it in a way that does not require readers to work their way through the excellent, but lengthy explanation provided by Why You Can’t Build a Bomb From Spent Fuel (Depleted Cranium) or Alexander de Volpi's detailed Knol titled NUCLEAR WEAPONS PROLIFERATION: Controversy About Demilitarizing Plutonium.

4. Nuclear Green has the Fuji Molten Salt Project Seeking $300 Million in Funding for Thorium Molten Salt Reactor Development.

On July 7, 2010, Professor Kazuo Furukawa, head of the NPO "International Thorium and Molten-Salt Forum," announced that a new company aiming to produce commercially viable thorium nuclear power generation had been established. The goal is to build thorium nuclear power generators capable of producing 10,000 kilowatts of electricity within five years and 20,000 kilowatts within ten.

The company, established in June, is called "International Thorium Energy & Molten-Salt Technology Inc. (IThEMS)." Its office is in Chiyoda-ku, Tokyo (tel. no. +81-3-3239-2595), with a capitalization of 2 million Japanese yen. Its president is Mr. Keishiro Fukushima. A total of 300 million US dollars of capital from domestic and international companies and investors will be procured in order to produce a small-scale electric generator producing 10,000 kilowatts of electricity within five years.

This generation technology uses a fluoride molten salt to dissolve thorium fluoride and other materials as liquid fuel.
10. Rod Adams had an article published on The Oil Drum, "Possibilities for Small Modular Nuclear Reactors?". There are over 450 comments on this article, including comments by the author of this article under the alias advancednano. In the comments, I also have another set of bets with Dittmar, this time on Kazakhstan uranium production in 2010 and 2011.

Go to NextBigFuture for the entire carnival

More from Charles Barton on Gen IV small reactors we might be building very soon:
The Babcock & Wilcox mPower Reactor actually sets a benchmark for Generation IV reactors. Generation IV reactors will need to compete with the mPower and similar reactors both in capital costs and in operational and maintenance costs.

The ARC-100 reactor project in the main conforms to the practical approach. Its design tracks closely with the design of the Experimental Breeder Reactor-II as it evolved into an Integral Fast Reactor prototype. The ARC-100 will be a more powerful reactor than the EBR-II, but not by a problematic extent. Current thinking suggests that reactors capable of generating 100 MWe represent a convergence point between the maximum financial benefit of factory reactor production and grid usefulness. Smaller reactors, because they produce less electricity, may represent less attractive investments for utilities seeking to replace fossil fuel generation sources, while larger reactors may demand far more expensive field construction. Thus the ARC-100, with an electrical output of 100 MW, is size-competitive with the 125 MW mPower Reactor. _Much more with links at NuclearGreen

Safe, abundant, clean nuclear fission will provide the foundation for a successful transition from a fossil fuel economy to a sustainable energy economy. It is the key to long-term success in breaking free from the energy-starvation straitjacket that faux-environmentalist lefty Luddites are attempting to wrap around the planet.


Splitting CO2 for Fuel and Climate?

* 700 square kilometers (270 square miles) of this system would extract the excess CO2 within ten years
A team of scientists at George Washington University and Howard University has devised a theoretical means of splitting CO2, turning the demon gas into either solid carbon or carbon monoxide, CO. The CO could be used to generate hydrocarbon fuels with the aid of hydrogen -- a by-product of their theoretical process "STEP" (Solar Thermal Electrochemical Photo).
By using the sun's visible light and heat to power an electrolysis cell that captures and converts carbon dioxide from the air, a new technique could impressively clean the atmosphere and produce fuel feedstock at the same time. The key advantage of the new solar carbon capture process is that it simultaneously uses the solar visible and solar thermal components, whereas the latter is usually regarded as detrimental due to the degradation that heat causes to photovoltaic materials. However, the new method uses the sun’s heat to convert more solar energy into carbon than either photovoltaic or solar thermal processes alone.

...the process uses visible sunlight to power an electrolysis cell for splitting carbon dioxide, and also uses solar thermal energy to heat the cell in order to decrease the energy required for this conversion process. The electrolysis cell splits carbon dioxide into either solid carbon (when the reaction occurs at temperatures between 750°C and 850°C) or carbon monoxide (when the reaction occurs at temperatures above 950°C). These kinds of temperatures are much higher than those typically used for carbon-splitting electrolysis reactions (e.g., 25°C), but the advantage of reactions at higher temperatures is that they require less energy to power the reaction than at lower temperatures.
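The energy argument in that excerpt can be sketched numerically. Using round-number thermodynamic values for CO2 → CO + ½O2 (ΔH ≈ 283 kJ/mol, ΔG ≈ 257 kJ/mol at 298 K), and treating ΔH and ΔS as temperature-independent for illustration, the thermodynamic minimum cell voltage E = ΔG/(nF) falls substantially at SOEC-style temperatures:

```python
# Illustrative estimate of how the minimum electrolysis voltage for
# CO2 -> CO + 1/2 O2 falls with temperature. Thermodynamic constants are
# rounded, and dH/dS are assumed temperature-independent (a simplification).
F = 96485.0          # Faraday constant, C/mol
N_ELECTRONS = 2      # electrons transferred per CO2 molecule
DH = 283_000.0       # reaction enthalpy, J/mol (approximate)
DG_298 = 257_000.0   # Gibbs free energy at 298 K, J/mol (approximate)
DS = (DH - DG_298) / 298.0   # implied entropy change, J/(mol K)

def min_cell_voltage(temp_k: float) -> float:
    """Thermodynamic minimum voltage E = dG / (n F) at temperature temp_k."""
    dg = DH - temp_k * DS
    return dg / (N_ELECTRONS * F)

for t_c in (25, 850, 950):
    print(f"{t_c:4d} C -> {min_cell_voltage(t_c + 273.15):.2f} V")
```

The electrical input drops by roughly a third between room temperature and 950°C; the difference is supplied as heat, which is exactly why STEP feeds the cell with the solar thermal component that photovoltaics would otherwise waste.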

... The experiments in this study showed that the technique could capture carbon dioxide and convert it into carbon with a solar efficiency from 34% to 50%, depending on the thermal component. While the solid carbon could be stored, the carbon monoxide produced could later be used to synthesize jet, kerosene, and diesel fuels, with the help of hydrogen generated by STEP water splitting._Physorg_via_BrianWang

If humans develop the ability to control the concentration of CO2 in the atmosphere through technological means, there would be little reason for the widespread hysteria which is seen in the UN, the EU, and in the Obama Pelosi regime. Likewise, as humans develop better technological methods of weather control -- controlling solar insolation, cloud formation, and precipitation -- the modern over-hyped concern about the use of fossil fuels should eventually subside.

The topmost graphic demonstrates the historically very low levels of atmospheric CO2 found in our current atmosphere. Clearly the biosphere of Earth evolved under generally much higher levels of CO2. Modern high-tech greenhouses use expensive CO2 generators to boost the levels of CO2 to up to three times atmospheric levels -- for more optimal growth of a wide variety of plant life.

The modern obsession with "pre-industrial levels of CO2" displays a profound ignorance of this planet's atmospheric and biological history, as the graphic above demonstrates. Closer inspection of the motives of the leaders of the carbon hysteria orthodoxy demonstrates monetary payoffs via carbon trading, international carbon ransom payments, and other economic maneuvers of questionable legality and wisdom.

It is a good idea to develop the means to control basic atmospheric parameters, in order to provide for rapid recovery from unanticipated geologic or extraterrestrial events. Anyone who has looked at the details of Earth's carbon cycle intelligently and critically will not be alarmed at anthropogenic use of carbon. But the universe holds many surprises for a young race of slightly evolved apes, and it does not hurt to be prepared for as many of those surprises as we can anticipate -- if the results are potentially severe.

Previously published at Al Fin

More: We need all the fossil fuels we have in order to transition into a cleaner, more abundant, and more sustainable energy future. If we cut ourselves off at the neck now (via Obama Pelosi style energy starvation) we will not be able to develop the advanced technologies that will allow us to spread the miracle of Earth's ecosystems through the solar system and beyond. If we follow the political scams that are making the faux environmental movement wealthy and powerful, we may as well call it quits as a species and a planet. Because sooner or later something devastating is going to happen to this planet -- either via innate geological processes, or via an extraterrestrial event. If we follow the witless way of Greenpeace, WWF, Sierra Club, etc. that will be the end, because we will have abandoned technology and space in order to "save the planet." But what we will have actually done, is to allow the only known source of life and intelligence in the universe to die without a struggle. That is not only stupid, but it is cowardly. Do we really want to teach our children to be stupid cowards?

Labels: ,

Saturday, July 24, 2010

Another CO2 to Fuels Approach

Researchers at Columbia University’s Lenfest Center for Sustainable Energy, in collaboration with Risø National Laboratory for Sustainable Energy, DTU, are investigating the high-temperature co-electrolysis of CO2 and H2O using solid oxide electrolysis cells (SOECs) to produce a syngas for conversion into liquid hydrocarbon fuels. _GCC

The idea of "re-cycling" CO2 back into a fuel -- skipping the middleman of photosynthesis -- continues to be popular in certain scientific circles. This approach involves the co-electrolysis of H2O with CO2 to create an H2/CO syngas, which can be further processed into liquid hydrocarbon fuels.
The Lenfest/Risø team notes that high temperature electrolysis makes very efficient use of electricity and heat (near-100% electricity-to-syngas efficiency), provides high reaction rates (no need for precious metal catalysts), and the syngas produced can be catalytically converted to hydrocarbons in well-known fuel synthesis reactors (e.g. Fischer-Tropsch). There is no need for a separate reverse water-gas shift reactor to produce syngas, and the waste heat from exothermic fuel synthesis is useful in the process.

An analysis of the system energy balance presented by Christopher Graves at the May conference showed a 70% electricity to hydrocarbon fuel efficiency. Using solar photovoltaic energy at 10-20% efficiency, that would result in an overall 7-14% solar energy to liquid fuel efficiency, he said.

Their analysis of the economics of a co-electrolysis-based synthetic fuel production process, including CO2 air capture (earlier post) and Fischer-Tropsch fuel synthesis, determined that the price of electricity needed to produce competitive synthetic gasoline (at $2/gal wholesale) is $0.02 - $0.03 per kWh. _GCC
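The efficiency chain and the quoted electricity price can be cross-checked with simple arithmetic. The 70% electricity-to-fuel figure and the $0.02-0.03/kWh price come from the article; the gasoline energy content (roughly 33.7 kWh per gallon, lower heating value) is my assumed figure:

```python
# Back-of-envelope check of the Lenfest/Riso numbers. The gasoline energy
# content (approx. 33.7 kWh/gal LHV) is an assumption, not from the article.
GASOLINE_KWH_PER_GAL = 33.7
ELEC_TO_FUEL_EFF = 0.70          # quoted electricity-to-hydrocarbon efficiency

def overall_solar_to_fuel(pv_eff: float) -> float:
    """Solar-to-liquid-fuel efficiency for a given PV module efficiency."""
    return pv_eff * ELEC_TO_FUEL_EFF

def electricity_cost_per_gal(price_per_kwh: float) -> float:
    """Electricity cost embedded in one gallon of synthetic gasoline."""
    kwh_needed = GASOLINE_KWH_PER_GAL / ELEC_TO_FUEL_EFF
    return kwh_needed * price_per_kwh

print(overall_solar_to_fuel(0.10), overall_solar_to_fuel(0.20))  # roughly 0.07 and 0.14
print(electricity_cost_per_gal(0.02), electricity_cost_per_gal(0.03))
```

At those rates the electricity alone costs roughly $0.96-$1.44 per gallon, which is consistent with a $2/gal wholesale target once capital and operating costs are layered on top.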


The Columbia / Riso approach involves the use of Ni/YSZ based solid oxide cells for co-electrolysis. This high temperature approach is somewhat similar to the STEP process being developed by George Washington U. and Howard U. researchers, for the explicit purpose of splitting CO2 for purposes of reduction of atmospheric CO2 levels. The STEP process provides the option of turning atmospheric CO2 into solid carbon for easy sequestration, or turning CO2 into CO for conversion to liquid fuels -- as in the Columbia / Riso approach pictured above.

Al Fin engineers and atmospheric scientists are uncertain where the large quantities of CO2 required to make these processes economical will be obtained. If political activists are successful in shutting down large-scale coal and other hydrocarbon power generation processes, concentrated CO2 could become rather scarce. (The atmosphere contains only about 0.04% CO2 by volume.)
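The scarcity point can be made concrete. At roughly 390 ppm by volume (the approximate level circa 2010), CO2 is about 0.06% of the atmosphere by mass, so capturing it from ambient air means processing enormous tonnages of air; the molar masses below are standard values, the rest is arithmetic:

```python
# How much air must be processed per tonne of CO2 captured from ambient air.
# 390 ppmv is the approximate 2010 atmospheric concentration.
PPMV_CO2 = 390e-6
M_CO2 = 44.01        # g/mol
M_AIR = 28.97        # g/mol, mean molar mass of dry air

mass_fraction = PPMV_CO2 * M_CO2 / M_AIR    # CO2 mass fraction of air
air_per_tonne_co2 = 1.0 / mass_fraction     # tonnes of air per tonne CO2

print(f"CO2 mass fraction: {mass_fraction:.6f}")
print(f"Air processed per tonne CO2: {air_per_tonne_co2:,.0f} t")
```

Roughly 1,700 tonnes of air per tonne of CO2. Coal-plant flue gas, by contrast, runs on the order of 12-15% CO2, which is why these fuel-synthesis schemes implicitly presume a concentrated industrial source.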

The plant life of Earth evolved, for the most part, under conditions of much higher levels of atmospheric CO2, and conditions of much more "acidic" oceans than at present. The planet's natural systems are, if anything, starved for CO2. If political activists ever grow beyond their greedy scamming stage of carbon scheme and tax grifting -- if they become efficacious at carbon reduction despite themselves -- planet Earth might find itself in a desperate situation where the "great human dieoff" being promoted by political activists is no longer optional, but mandatory.

...In other words, we know that the planet can thrive at higher temperatures and higher levels of CO2 than at present -- because it has done so many times before, for prolonged periods of time. We do not know what may happen if humans -- out of a fanatical fear of a trace gas -- push levels of CO2 far below where natural homeostasis would have placed them. Particularly if Milankovitch cycles and other natural triggers of cooler climate happen to coincide with such sanctimonious human meddling. Earth life can be hurt badly if CO2 levels are pushed too low.


Friday, July 23, 2010

Coal Gasification Plant to be Built Near Odessa, TX

The new polygeneration IGCC project will use coal as its feedstock. With a gross capacity of 400 megawatts (MWe), the plant will also produce urea for the U.S. fertilizer market. With a carbon capture rate of 90 percent, the plant will have one of the highest carbon capture rates of any IGCC plant in the world. The CO2 will be used for enhanced oil recovery in the West... _Thomasnet
The Texas Clean Energy Project will combine Integrated Gasification Combined Cycle (IGCC) technology for efficient energy production from coal, with co-products of urea (for agricultural fertilisation) and CO2 (for oil well enhanced recovery).
The Texas Clean Energy Project will be located in Penwell, near Odessa, Texas. Siemens will deliver the gasification island technology, which will include two SFG-500 gasifiers. The power block will be based on an SGT6-5000F gas turbine modified to operate on high H2 syngas, which will allow the plant to have a very high carbon capture rate of about three million tons/year. The power block will also include a Siemens SST-900RH steam turbine, air-cooled generators and SPPA-T3000 controls.

...IGCC technology is part of Siemens' Environmental Portfolio. In fiscal 2009, revenue from the Portfolio totaled about EUR23 billion, making Siemens the world's largest supplier of ecofriendly technologies. In the same period, the company's products and solutions enabled customers to reduce their CO2 emissions by 210 million tons. This amount equals the combined annual CO2 emissions of New York, Tokyo, London and Berlin. _Thomasnet
IGCC technology is more efficient because it combines a gas turbine cycle and a steam turbine cycle to capture more energy from the coal. Capturing the CO2 for productive use -- enhanced oil recovery -- is far preferable to mere sequestration, which is wasteful of energy, money, time, and CO2.
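The efficiency gain from stacking the two cycles is easy to sketch: the steam bottoming cycle runs on heat the gas turbine would otherwise reject. The individual cycle efficiencies below are typical illustrative values, not figures for the Penwell plant:

```python
# Why a combined cycle beats either cycle alone: the steam bottoming cycle
# recovers part of the gas turbine's exhaust heat. Efficiencies are
# illustrative, not plant-specific.
def combined_cycle_eff(gas_eff: float, steam_eff: float,
                       heat_recovery: float = 1.0) -> float:
    """Overall efficiency when the steam cycle runs on the gas turbine's
    rejected heat. heat_recovery < 1 models losses in the recovery boiler."""
    return gas_eff + (1.0 - gas_eff) * heat_recovery * steam_eff

print(combined_cycle_eff(0.38, 0.0))         # gas turbine alone: 0.38
print(combined_cycle_eff(0.38, 0.35, 0.9))   # combined: ~0.575
```

Note that an IGCC plant's net efficiency comes in below a natural-gas combined cycle, because gasification and syngas cleanup consume part of the coal's energy before the turbines ever see it.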

As long as the effluent of a coal power plant is just steam and CO2, there is no need for further "cleanup" of the effluent. Most of Earth life evolved at times when atmospheric CO2 was much higher than at present. Most Earth life is literally "starving for CO2." Nature will welcome the additional CO2, if humans have no use for it.


Wednesday, July 21, 2010

Fast Pyrolysis Biomass to Liquids, Biomass to Gas

Pyrolysis oil is a clean and uniform liquid that can be used as a sustainable alternative to fossil fuels for the production of renewable energy and chemicals. It is obtained through a process called fast pyrolysis, which transforms biomass into a liquid. As a technology supplier BTG-BTL delivers the engineering package, plant automation and the core components for pyrolysis plants. Other plant components and auxiliaries are sourced locally. BTG-BTL strongly believes in cooperation models involving local partners. Working with local partners helps to ensure full consideration of local regulations, standards and safety requirements. BTG-BTL will work together with (EPC-) partners when turn-key delivery is requested. _BTG-BTL PDF
Thermochemical conversion of biomass to biofuels and high value chemicals can be done with available, conventional technology. Optimising the processes is a matter of chemical and industrial engineering.
(1) BTG-BTL’s fast pyrolysis technology is based on intensive mixing of biomass particles and hot sand in the absence of air in a modified rotating cone reactor. Pyrolysis oil, char and gas are the primary products from the process.
(2) The charcoal and the sand are recycled to a combustor where the charcoal is burned to reheat the sand.
(3) The vapours leaving the reactor are rapidly cooled in the condenser, yielding the oil and some permanent gases.
(4) The permanent gases and the surplus heat from the combustor can be used to generate steam for power generation, biomass drying or external use.

In the last few years, BTG-BTL has tested more than 45 different kinds of biomass feedstock including wood, rice husk, bagasse, sludge, tobacco, energy crops, palm-oil residues, straw, olive residues, chicken manure and many more. BTG-BTL is always interested in broadening the range of suitable feedstock for producing pyrolysis oil._BTG-BTL PDF

via BiofuelsDigest
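As a rough orientation to what the four steps above produce, typical fast-pyrolysis yields for dry woody biomass run on the order of 60-75% bio-oil, 15-25% char, and 10-20% gas by mass. These are illustrative literature ranges, not BTG-BTL's own figures:

```python
# Rough mass balance for fast pyrolysis of dry woody biomass.
# Yield fractions are illustrative literature values, not vendor data.
YIELDS = {"bio_oil": 0.70, "char": 0.15, "gas": 0.15}

def product_masses(feed_tonnes: float) -> dict:
    """Split a dry biomass feed into pyrolysis products by mass fraction."""
    assert abs(sum(YIELDS.values()) - 1.0) < 1e-9, "fractions must sum to 1"
    return {product: feed_tonnes * frac for product, frac in YIELDS.items()}

print(product_masses(100.0))   # per 100 t of dry feed
```

The char fraction is what gets burned in step (2) to reheat the sand, which is why a well-designed fast pyrolysis plant can be close to self-sustaining in process heat.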

Labels: ,

More on Small Modular Nuclear Reactors from Rod Adams

A growing body of plant designers, utility companies, government agencies and financial players are recognizing that smaller plants can take advantage of greater opportunities to apply lessons learned, take advantage of the engineering and tooling savings possible with higher numbers of units and better meet customer needs in terms of capacity additions and financing. The resulting systems are a welcome addition to the nuclear power plant menu, which has previously been limited to one size - extra large. Developing a broader range of system choices using nuclear fission energy could have a measurable impact on segments of the energy market that have been most often served by burning distillate fuel or natural gas. Small modular reactors offer a reason to be optimistic that human society will have access to all of the energy that it needs for increased prosperity for a larger portion of the population. _RodAdams
Small, factory-built, modular nuclear reactors have the potential to solve a lot of the problems with nuclear power development in both the advanced and the undeveloped world. Rod Adams recently published an article in The Oil Drum, providing more specifics:
... The complexities of putting together the very large systems and projects kept adding to the risk, which added to the cost and complexity of financing which added to the project complexity by requiring additional partners - including government agencies and public subsidies.

Some frustrated nuclear plant designers, inspired by talking with customers about their needs and remembering what was technically possible in terms of nuclear reactor sizing, determined that they might be able to solve some of the cost and schedule complaints by a complete rethinking of the old economy of scale paradigm. For anyone who has been paying attention during the past five years or so, the names of Hyperion, NuScale and Toshiba 4S have been increasingly frequent terms of discussion as start-ups and some established vendors began designing nuclear fission based systems sized at 10, 25, or 45 MWe, which is a radical departure from the 1000 MWe (plus) sizes of the AP1000 (Westinghouse), ESBWR (GE-Hitachi), or EPR (Areva).

Initially, the project leaders for these new designs thought about using them in distributed remote locations where power is either not available or is being supplied by expensively delivered diesel fuel. John (Grizz) Deal and his sister, Deborah Deal Blackwell, the Hyperion Power Generation founders, thought about how a simple, infrequently fueled nuclear plant could supply power to a remote area for up to a decade without refueling. They recognized the value that such a system could provide to the previously powerless people living in that remote area.

The system could provide power for refrigeration, water treatment and distribution systems, communications systems, and reliable, flicker free lighting. Unfortunately, the specific technologies needed for the Hyperion design - liquid metal (Pb-Bi) cooling and uranium nitride fuel elements - are not in commercial use. They have been used in several specialized reactors and proven to work reliably and safely, but starting up a new supply chain is just one of the many hurdles that Hyperion is diligently working to overcome. The Toshiba 4S sodium cooled power system faces similar challenges, but both concepts have their fans and both are moving forward.

A trio of project teams has recognized that the concept of small does not mean that you have to start from scratch with the supply chain, training programs, and safety analysis; it is possible to redesign light water reactors from the ground up to produce an economical design that achieves economy by both simplification and increased unit volume. All three of the teams - NuScale, B&W and Westinghouse - have designed systems that put the entire primary plant into a single pressure vessel. This choice eliminates the potential for a large pipe break loss of coolant accident. They have all chosen to include a large volume of water - relative to the core power output - that provides operators with a lengthy interval between any conceivable accident and required operator action. They also have chosen passive safety systems that do not require any outside power sources to operate, so they expect to be able to prove that they can meet existing safety criteria without redundant power sources. All of the iPWR systems envision using fuel assemblies that are essentially the same as commercial nuclear plant fuel elements - but they will be shorter and there will be fewer assemblies in each core. All of the systems have been designed for post-9/11 security and safety considerations, including the aircraft impact rule, through the use of below grade installation.

...The integrated pressurized water reactor (iPWR) that is gaining the most buzz from the business community and political leaders, however, is the 125 MWe mPower™. Yesterday, Bechtel Corporation, one of the largest privately held companies in the United States, with 57,000 employees and $30.8 billion in 2009 revenue, announced that it was joining with B&W as a 20% partner in an exclusive alliance that they have branded as Generation mPower to build complete, turn-key power plants.

B&W has an already existing and ASME 'N-stamp' certified US manufacturing base and 50 years worth of experience in building nearly all of the components required for the small, modular light water reactors that power ships and submarines. Bechtel has either built or participated in major renovation projects at 64 of the 104 nuclear plants operating in the United States.

The mPower™ modules will be about the same size as the NuScale modules, but each module will produce about 2.5 times as much power as a NuScale module because they include submerged reactor coolant pumps to provide forced flow through the core. The system is designed to supply a sufficient quantity of natural circulation to provide core cooling after shutdown without any pumps running, thus maintaining the passive safety characteristic. Like NuScale, Generation mPower expects that customers for its plants will probably want to plan to install multiple units on a single site, though they might start with just one or two and add additional units gradually over time. Generation mPower has informed the NRC that it will be submitted a design certification application by the end of 2012; that application might be filed at the same time as a construction and operating license for the first of a kind unit.... _TOD

Once the US Nuclear Regulatory Commission finally gets to work certifying the best of the SMRs, the small-scale nuclear renaissance should come into play. As their safety, reliability, affordability, and prompt delivery and installation are proven, the number of applications for these reactors will almost certainly multiply rapidly.


Tuesday, July 20, 2010

BP Looks to "Static Kill" -- A New Way to Kill Macondo

In Kent Wells' 19 July evening press briefing, he announced the possibility that BP would attempt a new way to kill the well from the top called "static kill." It is called "static kill" because the well is currently shut in, with no flow. In this situation, there is no need for ultra-high speed pumping of the kill mud, as in the failed top kill. Instead, heavy mud can be pumped into the top of the shut-in well through the choke and kill lines, in a relatively leisurely manner, under close monitoring. More from Kent Wells:
In terms of the static kill. And let me talk about this because this is – people are probably going gee, we haven’t heard about this. And I think there’s good reasons. This is very much in its infancy. This is not something that we’ve approved to do. We want to have a number of sessions going through all our procedures. But let me tell you what brought this into play.

There was two things that allowed this to become a reality. First of all was the possibility the well having integrity. We needed to have that. The tests are encouraging at this point but we haven’t made a firm decision on that. But that was – that was important.

And the second piece was the fact that it had a lower reservoir pressure. That was important as well to make sure we stay underneath the – any pressure constraints we might have with the system.

And so the big difference between the static kill and of course before when we talked about the top kill, which was a dynamic kill where we had to pump at tremendously high rates to try to overcome the flow of the well. It’s a very different situation when you actually have the well shut in. We can pump at low rates, we can keep it at low pressures and do it in a very different way.

So we’re going to work through with the teams and work with the scientists and see whether this is something we can do. It clearly has some advantages in lowering the well head pressure et cetera. Maybe even to the point of the well being killed. But these are all the things that we need to work through.

Now, what I want to stress through is that at the end of the day the relief well will still be the ultimate solution. We will still drill in with the relief well to make sure that the annulus is dead, et cetera. But this static kill does give us a new option like always we like to pursue parallel options, we’d like to use an overabundance of caution and that’s what we’re doing to move forward. so I’ll put it as – it’s encouraging at this point but there’s a couple days of work to do before we’d be in a position to make a decision. _Kent Wells Briefing PDF_and_audio

Here is more about the static kill:
BP’s new idea of a static kill would involve pumping heavy mud into the well bore, to choke off the oil. A similar technique, called a top kill, was attempted in May but failed. The top kill could not overcome the force of oil flowing out of the well. But now that the cap is on, and holding, the forces involved are weaker and a static kill might work.

“The static kill does give us a new option,” Mr. Wells said. “It’s encouraging at this point but there’s a couple days of work before we’d be in a position to make a decision.”

The move, if it is tried and is successful, would effectively do the job of the close-to-ready relief well, which is now about a metre away from the original well. However, it will take until at least early August, and possibly mid-August, until the relief well is finished. Mr. Wells said the relief well remains “the ultimate solution.” _GlobeandMail
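The mechanics of any mud kill, static or dynamic, reduce to hydrostatic balance: the mud column's pressure at depth must exceed the reservoir pressure. In oilfield units, hydrostatic pressure (psi) ≈ 0.052 × mud weight (lb/gal) × true vertical depth (ft). The depth and reservoir pressure below are assumed round numbers for illustration, not BP's actual well data:

```python
# Hydrostatic-balance sketch for a mud kill. The 0.052 factor converts
# (lb/gal x ft) to psi. Well depth and reservoir pressure are assumed
# round numbers, not actual Macondo figures.
PSI_PER_PPG_FT = 0.052

def mud_weight_to_kill(reservoir_psi: float, tvd_ft: float) -> float:
    """Minimum mud weight (lb/gal) whose column balances reservoir pressure."""
    return reservoir_psi / (PSI_PER_PPG_FT * tvd_ft)

def hydrostatic_psi(mud_ppg: float, tvd_ft: float) -> float:
    """Pressure exerted at depth by a static mud column."""
    return PSI_PER_PPG_FT * mud_ppg * tvd_ft

required = mud_weight_to_kill(12_000.0, 18_000.0)
print(f"required mud weight: {required:.1f} ppg")        # ~12.8 ppg
print(f"13.0 ppg column exerts {hydrostatic_psi(13.0, 18_000.0):,.0f} psi")
```

With the well shut in, the mud only has to sit there and out-weigh the reservoir. In the failed dynamic top kill it also had to be pumped fast enough to overcome the upward flow, which is exactly the difference Wells describes.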

From an earlier article posted to Al Fin


Monday, July 19, 2010

10th Carnival of Nuclear Energy at NextBigFuture

Brian Wang -- founder of the Carnival of Nuclear Energy -- presents the 10th Carnival at his home website. This carnival looks at topics from Thorium energy to climate science to shale gas to China's stockpiling of uranium to the future of nuclear energy projects -- including small modular fission reactors and fusion projects.

Entry number 5 from Power Industry Trends is a synopsis of small modular fission reactor projects, aiming to overturn the calculus of nuclear power.
The players:

NuScale
-Small modular reactor currently rated at 45MWe with up to 24 units at a single location (1080MWe)
-36 months from first concrete to power

-Passive cooling systems using natural circulation
-Proven LWR design which should provide faster regulatory review as it is not novel technology
-24 month refueling cycle
-500 tons as shipped via barge, truck or train, forged and fabbed at any mid-size facility
-Estimated cost advantages due to: simplicity, modular design, volume manufacturing and shorter construction times
-Filing with NRC for design certification in Q2 2012. NuScale expects the first nuclear facility will be operational sometime in 2018.
Data from Nuscale website

B&W mPower
-125 MWe to 750 MWe or more for a 4.5-year operating cycle without refueling
-Proven ALWR design which will reduce regulatory review time
-Design Certification submittal in 2011
-Letter of intent received from Tennessee Valley Authority (TVA) to begin the process of evaluating a potential lead plant site
Data and picture from B&W website

ARC-100 Advanced Reactor Concepts
-Sodium-cooled, metal fueled, fast-reactor currently rated at 50-100MWe
-Sodium-cooled primary loop feeding a supercritical CO2 secondary Brayton cycle
-Based on technology proven by over 30 years of successful operation of EBR II, an experimental program operated by the U.S. government
-20 year (yes year) refueling cycle
-Proliferation-proof fuel system
-10 acre footprint and less than 24 months construction time
-Target cost of $0.05 per KWh for electricity production
-Initial discussions with NRC, but no date for design certification submittal
Data from Advance Reactor Concepts website

Hyperion Power Module (HPM) or Mini Power Reactor (MPR)
-Formerly the Comstar reactor invented by Dr. Otis "Pete" Peterson at the United States' famed Los Alamos National Laboratory (LANL) in New Mexico. Through the commercialization program at LANL’s Technology Transfer Division, HPG was awarded the exclusive license
-Each liquid metal PbBi cooled HPM-based electric plant generates 25MWe and can be configured for steam only, co-generation, or electricity only. Two or more modules can be "teamed" together.
-$50 million for one 25Mwe module
-Fits into a standard fuel transport container; transported via ship, rail, or truck. Total mass < 20 metric tons
-Produces power for 8-10 years, after which the entire reactor module is replaced
-Expect design certification submittal to NRC within a year
-150 purchase commitments from customers such as mining and telecom companies, provided its technology gets licensed for operation
-Meets all the non-proliferation criteria of the Global Nuclear Energy Partnership (GNEP), as the entire module is fueled and sealed in the factory and returned to the factory once expended
-Alternate Energy Holdings Inc (AEHI) has signed a MoU with Hyperion Power Generation Inc of New Mexico which the companies have described as "the beginning of a joint venture" to build and market Hyperion's modular reactors around the world
Data from Hyperion website

_Several other SMR projects along with links provided at: PowerIndustryTrends
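Two of the quoted figures can be cross-checked. Hyperion's $50 million for a 25 MWe module works out to $2,000 per kilowatt of capacity, and the ARC-100's $0.05/kWh target can be tested with a simple levelized-cost sketch. The capital recovery factor, capacity factor, and O&M cost below are my assumptions, not figures from either company:

```python
# Quick cross-checks on the quoted SMR figures. The financing and O&M
# parameters are assumptions for illustration.
def per_kw(capital_usd: float, mwe: float) -> float:
    """Capital cost per kilowatt of electrical capacity."""
    return capital_usd / (mwe * 1000.0)

def lcoe(capital_per_kw: float, crf: float = 0.094,
         capacity_factor: float = 0.90, om_per_kwh: float = 0.015) -> float:
    """Levelized cost, $/kWh: annualized capital over annual output + O&M.
    crf = 0.094 corresponds very roughly to 7% over 20 years."""
    annual_kwh_per_kw = 8760.0 * capacity_factor
    return capital_per_kw * crf / annual_kwh_per_kw + om_per_kwh

hyperion = per_kw(50e6, 25.0)
print(f"Hyperion: ${hyperion:,.0f}/kW")
print(f"LCOE at that capital cost: ${lcoe(hyperion):.3f}/kWh")
```

At $2,000/kW and a 90% capacity factor, the levelized cost lands near $0.04/kWh under these assumptions, so targets in the $0.05/kWh range are at least internally plausible if the capital-cost claims hold.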

It seems as if small modular reactors are the most likely breakthrough nuclear energy product of the coming two decades. It is now up to the dozen or so companies behind SMR projects to bring their products to market-ready condition, and up to the nuclear regulatory agencies of various nations to get off their butts and certify the products that are ready for market.


Sunday, July 18, 2010

Utilities Starting to Like the Looks of Small Modular Reactors

For utilities, a small reactor has several advantages, starting with cost. Small reactors are expected to cost about $5,000 per kilowatt of capacity, or $750 million or so for one of Babcock & Wilcox's units. Large reactors cost $5 billion to $10 billion for reactors that would range from 1,100 to 1,700 megawatts of generating capacity. _WSJ

Power utility companies are beginning to see the wisdom in small, modular, factory-built reactors, which cost much less per unit than large, conventional nuclear reactors. Construction time for installation of small reactors could be cut in half -- or more. And everyone understands that construction time can make or break large industrial projects.

Companies such as NRG Energy Inc., Duke Energy Corp. and Southern Co. are planning large reactors that cost up to $10 billion apiece and can generate enough electricity to power a city the size of Tulsa, Okla.

But there is growing investor worry that reactors may have grown so big that they could sink the utilities that buy them. An increasingly global supply chain for big reactors also worries investors.

"We think the probability that things will go wrong with these large projects is greater than the probability that things will go right," said Jim Hempstead, senior vice president at Moody's Investors Service. He warns that nuclear-aspiring utilities with "bet the farm" projects face possible credit downgrades.

While large reactors are built on site, a process that can take five years, the mPower reactors would be manufactured in Babcock & Wilcox's factories in Indiana, Ohio or Virginia and transported by rail or barge. That could cut construction times in half, experts believe.

Because they could be water-cooled or air-cooled, mPower reactors wouldn't have to be located near large sources of water, another problem for big reactors that require millions of gallons of water each day. That could open up parts of the arid West for nuclear development.

The first units likely would be built adjacent to existing nuclear plants, many of which were originally permitted to have two to four units but usually have only one or two.

Down the road, utilities could replace existing coal-fired power plants with small reactors in order to take advantage of sites already served by transmission lines and, in some cases, needed for grid support. Like any other power plants, these small reactors could be easily hooked up to the power grid.

One of the biggest attractions, however, is that utilities could start with a few reactors and add more as needed. By contrast, with big reactors, utilities have what is called "single-shaft risk," where billions of dollars are tied up in a single plant. _WSJ_via_SeekerBlog
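The WSJ cost figures quoted above can be sanity-checked directly. Note that the mPower unit's capacity below is back-calculated from the quoted numbers, not a published spec:

```python
# Sanity check of the WSJ cost figures quoted above. The small unit's
# capacity is back-calculated from the quoted cost figures.

small_cost_per_kw = 5_000      # USD/kW, small-reactor estimate
small_unit_cost = 750e6        # USD, one Babcock & Wilcox unit

implied_kw = small_unit_cost / small_cost_per_kw
print(f"Implied small-unit capacity: {implied_kw / 1000:.0f} MWe")  # 150 MWe

# Large reactors: $5-10 billion for 1,100 to 1,700 MWe of capacity
for total_cost, mwe in [(5e9, 1100), (10e9, 1700)]:
    per_kw = total_cost / (mwe * 1000)
    print(f"{mwe} MWe at ${total_cost / 1e9:.0f}B -> ${per_kw:,.0f}/kW")
```

The interesting result is that the per-kilowatt costs come out roughly comparable (about $4,500-$5,900/kW for the large plants versus $5,000/kW for the small ones). The small reactor's real advantage, as the excerpt notes, is the far smaller total capital tied up in any single project -- the "single-shaft risk."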

Another advantage of small, modular reactors is that a lot more of them can be put into many more widespread locations -- providing a tougher, more resilient, distributed power generation system. It will also make it harder for faux environmentalist lefty-Luddites to stage their shut-down protests at all of them -- or even most of them.

Of course, when a nation is burdened with a regime such as Obama Pelosi, it has its work cut out for it just to survive. Creating a clean, abundant, and prosperous life might require a bit of rearranging of the national organisational chart.


"Crude Oil Itself Has Already Peaked!"

...crude oil itself has already peaked – at least five times since 1950, Prof. Boyce says – without beginning to approach the demise of oil anticipated by peak oil theory’s famous Bell curve. Indeed, crude oil reserves have doubled roughly every 15 years since 1850 and the world now has more proven reserves than it has ever had in the ensuing 150 years. _CTV
Crude oil may just peak a few more times before humans discard oil in favour of better sources of energy. Oil production is responsive to price signals, although oil producers will watch the price curves closely, to be sure that a rise in prices will last long enough to justify a ramp-up in production. The ephemeral rise in oil prices in the summer of 2008 clearly did not last long enough to justify a full-scale build-up in production.
THERE is only one rule when it comes to the availability of oil: experts are always convinced it will run out – but it never seems to. Brent crude is now at close to $75 a barrel, which is high but hardly excessively so by historical standards....

But one thing is clear: the peak oil theorists, who constantly predict that prices are about to explode, keep on getting it wrong but are never sufficiently held to account by a gullible media. As Mark Perry, a professor at the University of Michigan, reminds us, it was five years ago that Houston-based Matthew Simmons (a leading peak oil man) and the unusually sensible New York Times columnist John Tierney bet $5,000 on the price of oil. The wager was based on the average daily price for 2010, adjusted for inflation. If the inflation-adjusted oil price averages $200 or more, Simmons wins $10,000, and if the average price this year is less than $200, Tierney wins. As Perry points out, average oil prices this year won't even be anywhere close to $100, and will probably average less than $70, far below the $200 price predicted by Simmons. Only a total catastrophe would ensure the price would average $330 per barrel for the rest of 2010 – which is what would be needed for Simmons to win his bet. He will have to hand the cash over and eat humble pie. _CityAM
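The bet arithmetic in the excerpt is easy to check. The $200 threshold comes from the excerpt; the ~$70 year-to-date average and the seven-month split are assumptions for illustration (the excerpt's $330 figure was presumably computed with more of the year remaining, which makes the hurdle even higher now):

```python
# Back-of-the-envelope check of the Simmons-Tierney bet arithmetic.
# The $200 threshold is from the excerpt; the year-to-date average
# and month split below are assumptions for illustration.

target_avg = 200.0      # USD/bbl, full-year 2010 average Simmons needs
ytd_avg = 70.0          # USD/bbl, assumed average so far in 2010
months_elapsed = 7      # January through July 2010
months_left = 12 - months_elapsed

# Average price needed over the remaining months for the full-year
# average to reach the $200 threshold
required = (target_avg * 12 - ytd_avg * months_elapsed) / months_left
print(f"Required Aug-Dec average: ${required:.0f}/bbl")   # $382/bbl
```

Under these assumptions, oil would have to average roughly $382 per barrel for the rest of the year -- more than five times the prevailing price -- for Simmons to win.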
Matt Simmons is just one of the grifters who have latched on to the gullible mass of believers in Peak Oil Doom. Losing $5,000 on a bet is unlikely to discourage Matt, since he probably cons a lot more than that from his credulous disciples every week.

If you are a doomer, you have to ask yourself the question: "If I am wrong about this doom business, how much am I hurting myself, my family, and our future prospects for happiness and prosperity?" Most doomers just get off on the doom, and don't really care about the opportunity costs. Some of them want to starve the world of energy, so that the human population will be reduced by 50% to 90% from its current numbers (the "dieoff"). But there are likely to be some doomers who eventually get tired of the echo choirs, the genocidal undertones, and the role-playing.
The notion that the world is approaching "Peak Oil" is just not supported by the facts. We may still be getting closer to a peak, but the world has far more untapped oil than anyone could have imagined 10 years ago. _StreetAuthority
There is nothing new about the idea of an energy crisis. Back in Elizabethan times, England suffered from "peak wood." There have been peaks in production -- multiple peaks -- for anything that humans have ever used for energy. More peaks will certainly come and go. But the only form of "peak energy" that could be semi-permanent, is the type of "peak energy" that has come to North Korea -- "political peak energy."

In America, insightful persons might well ask: "Who needs peak oil when we've got Obama Pelosi?" They are referring to the energy starvation policies of the Obama Pelosi regime, which intends to bankrupt coal and oil companies -- and would like to do the same to gas and nuclear companies, if they could devise the proper excuse.

Energy is the life's blood of an advanced society. Persons who intentionally set about disrupting or shutting down a society's energy supply are essentially committing genocide. Voters should consider that when they go to the polls.

