Wednesday, April 27, 2011

Large, Lucrative Markets for n-Butanol and iso-Butanol from Biomass

Cobalt and others [ed: including Gevo and OPX] are hedging their bets. In addition to fuels, companies are also developing other products, such as biochemicals. Those products have higher price points that could create economic stability for the companies long before any big investments in biofuel production take place, Wilson said.

"This is a little company. Why take a molecule and turn it into $2 for fuels when you can turn it into a $5 chemical?" he said. _PaloAltoOnline
Gevo SEC Filing

Gevo Inc. is powering ahead with its renewable isobutanol product. Gevo's latest effort is a partnership with Mustang Engineering, LP, to produce jet fuel using isobutanol as feedstock.
Gevo, Inc., a renewable chemicals and advanced biofuels company, signed an engineering and consulting agreement with Mustang Engineering, LP for the conversion of Gevo’s renewable isobutanol to biojet fuel. This effort will focus on the downstream processing of isobutanol to paraffinic kerosene (jet fuel) for jet engine testing, airline suitability flights and advancing commercial deployment. (Earlier post.)

Gevo also announced that its “fit for purpose” testing at the Air Force Research Laboratory continues with a final report expected in June. Once completed successfully, the company will initiate jet engine testing with engine manufacturers. _GCC
Gevo competitor Cobalt Biofuels produces bio n-butanol (normal butanol), which is a straight chain 4 carbon alcohol, rather than the branched chain alcohol isobutanol, made by Gevo. Cobalt claims that its n-butanol product has access to even larger markets than does Gevo's isobutanol (see image at top).
The advantage of normal butanol is that you can take normal and isomerize it, but you can’t take an isomer and normalize it. So, with n-butanol, you have advantages as a platform for a wider variety of chemicals.

"For us, we like wood, bagasse and glycerol as feedstocks. We see costs there in the $60 per ton for wood biomass, $40 for bagasse and $20 for glycerol. Those change, but there's a significant enduring advantage compared to the cost of sugarcane or corn, which are well over $200 per ton. That’s lower than the cost of crude oil, but when you take into account the amount of energy you can access in corn or cane, the cost advantage can be minimal unless you have very high oil prices sustained for a very long time.

"The markets, for us," said Wilson, "are the $7 billion n-butanol market, for acetates, acrylates, and glycerol esters, where the current pricing is $2300 per metric ton. Compare that to diesel or gasoline, both under $1000 per ton. Also, we have the OXO derivatives, such as butyric acid or 2-ethyl hexanol. That's a $9 billion market trading at $2600 per ton. There are also the butene derivatives, such as isobutene, a $17 billion markt trading at between $1200 and $1500 per metric ton. There are paints, solvents, plasticizers, paint dyes, stabilizers, preservatives and more in those markets.

"The markets are very small, compared to the fuels markets, where you have $250 billion in jet fuel, $980 billion for gasoline and $1040 for diesel. Those are a great story, but its hard to make money, and the first goal of any company should be to make money. The cost of making petroleum-based butanol is around $1230 per metric ton. We can make it from corn at $1200 per ton, cane at $1170 per ton. But when we look at wood, our cost drops to $800 per ton; or $650 per ton from bagasse, or $380 from glycerol. _LowCarbonEconomy
OPX Biotechnologies has partnered with Dow Chemicals to produce specialty chemical products from biomass, locating their facility at the Dow plant to facilitate cost savings.
Over at OPX Bio, the market is acrylic acid, which is now at $8 billion and growing 3 to 4 percent per year. Acrylic acid is a key chemical building block used in a wide range of consumer goods including paints, adhesives, diapers and detergents. Later on, the company hopes to commercialize a new technology, converting syngas to fatty acid esters, in work funded by ARPA-E. Other partners will be signed by OPX to work that technology up to scale.

Earlier this week, Dow Chemical and OPXBIO announced that the two companies are collaborating to develop an industrial scale process for the production of bio-based acrylic acid from renewable feedstocks.

Both CEOs agree that the path to cost-competitive advanced biofuels runs first through the garden of renewable chemicals -- not only because chemicals sell at higher prices, but because they are made in smaller volumes. That means smaller commercial-scale plants, lower capital costs, and faster returns for investors, for whom time is money. Where fuel-centric companies are raising hundreds of millions for their commercial-scale plants, OPX, for example, is raising just $35 million in its series C round, which takes the company through completion of its demonstration-scale plant.

But there's one other advantage. That's the breadth of technical collaboration and opportunities to share cost on critical technologies. "That's where the Dow partnership comes in," said Eggert. "They are the largest acrylic acid producer from propylene, and have very strong chemical foundational knowledge, and the relationships in place for market development. Here's a benefit to us: neither Dow nor we feel that we need to raise and invest for the construction of a demo plant. Instead, we'll use standard contract fermenters to make hydroxypropionic acid (3-HP), and Dow already has the bioacrylic conversion from 3-HP at pilot and demo scale."

Scale? For OPX, their 100 million pounds (equivalent of 14 million gallons) biorefinery is expected to be cost-competitive with 350 million pound propylene-based systems. The market price? 70 cents per pound, or right on $5 per gallon. _LowCarbonEconomy
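The quoted price conversion checks out arithmetically. A quick sketch, taking the 100-million-pound / 14-million-gallon equivalence from the article:

```python
# Convert OPX's quoted 70 cents/lb acrylic acid price to a per-gallon figure.
pounds_per_year = 100e6    # stated biorefinery capacity, lb
gallons_per_year = 14e6    # stated gallon equivalent

lb_per_gal = pounds_per_year / gallons_per_year   # ~7.14 lb per gallon
price_per_gal = 0.70 * lb_per_gal                 # $/gallon

print(f"${price_per_gal:.2f} per gallon")   # $5.00 per gallon
```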
You can see that the market for fuels is much larger than the market for chemicals, but that the profit potential for chemicals is much higher. A biofuels company can become profitable more quickly and at smaller scale -- with less debt -- by exploiting the renewable chemicals market first. This allows the company to develop more efficient, streamlined, and economical approaches to the higher-volume fuels markets without bleeding capital in the process.

Labels: , ,

Tuesday, April 26, 2011

A Resilient Grid of Small Modular Reactors

In reality a world free of nuclear power would be less secure. For the foreseeable future, neither fossil fuels nor renewable sources will be able to replace the 14 per cent of global electricity generated by nuclear reactors, without risking severe instability and shortages in energy markets. Put simply, energy security requires a diversity of sources, including nuclear. _FT_via_Arevablog
The world not only needs nuclear power, but it needs smarter forms of nuclear power. Small modular nuclear reactors (SMRs) offer far more versatility, safety, reliability, and affordability than the traditional 1 GW+ nuclear plant. They can be ganged together to provide whatever level of power production you want, or they can be networked into a decentralised, resilient grid.
A resilient grid of SMRs built in a distributed network would be much less susceptible to damage from natural disasters or man-made disruptions. If one SMR goes out of service, it doesn't create a regional blackout for everyone else in the utility's service area.

...The ARC-100 reactor design concepts contain intriguing safety measures which might benefit highly industrialized countries seeking a more resilient power grid. Similar benefits might come from other SMR designs including those that use conventional LWR designs. It depends in part on the pace of advancement in fuel cladding materials science.

The key idea is to find ways to avoid future consequences of having too much electrical generation capacity invested in a single site. This is especially important in areas where there is a potential for earthquakes, tsunami, and other natural disasters or man-made disruption. SMRs buried underground add the natural containment of that design paradigm to their protective envelope. _Dan Yurman Energy Collective
Many legislators in the US are calling modular reactors the wave of the future.

Small modular reactors have many applications besides electrical power generation:
The main applications of SMRs apart from electricity generation are desalination, process heating and district heating. SMRs are gaining importance in the global nuclear power industry; according to recent news reports, the US's 2012 budget proposal allocates funds to the development and deployment of SMRs. Despite the constraints, SMRs have several key drivers supporting their deployment. SMR designs are also being considered as replacements for ageing fossil power plants of similar capacities across the world. _BusinessWire
SMRs are also likely to be key to the economical and clean unlocking of energy from oil sands, heavy oil deposits, and kerogen oil shales.

Interest in SMRs is growing, as more people begin to understand how vulnerable national and regional power supplies are -- to EMP attack, solar flares, computer hackers, etc. And unfortunately, the "smart grid" only makes it that much easier to shut the whole thing down remotely. A recent conference on SMRs in Columbia, South Carolina, drew 250 participants, and more individuals of influence are beginning to speak out about the US NRC's inability to facilitate a more rapid movement toward safer, more reliable, more economical, and more robust nuclear power infrastructures.


Monday, April 25, 2011

Bio-Oils to Diesel, Jet Fuel, Gasoline via North Carolina State U.


Researchers at North Carolina State University have devised a refining method for converting triglycerides, or biofats, into drop-in fuels for diesel, gasoline, and jet (turbine) engines. The technology is being developed by a small startup, Avjet Biotech Inc.
The Red Wolf Process (RWP) consists of three main steps: hydrolysis, deoxygenation and hydrocarbon reforming.

Hydrolysis. The first step of the RWP uses the well-established Colgate-Emery reaction (hydrolysis via high-pressure, high-temperature steam) to cleave the three fatty acid chains from the glycerol backbone of the triglycerides. This is accomplished by separately pumping steam and the feedstock oil at high temperature and pressures into a Colgate-Emery reactor. Two product streams exit this reactor, free fatty acids (mixed with water) and sweet water (glycerol mixed with water). Water is removed from the free fatty acid (FFA) stream prior to the next processing step: deoxygenation.

Deoxygenation. The free fatty acids (FFA) and low concentrations of hydrogen are fed into a deoxygenation reactor. This reaction has two steps: 1) remove any degrees of unsaturation from the FFA; 2) catalytically remove the oxygen from the FFA. This reaction step results in a saturated, straight-chain hydrocarbon with one less carbon than the reacted FFA (i.e. a C18 free fatty acid yields a C17 hydrocarbon). These n-alkanes are a common component of petroleum (which has a wider distribution of alkanes as well as other types of hydrocarbons). The n-alkanes derived using the RWP from the fat-containing oils are in the ranges needed for diesel and jet fuel.

Hydrocarbon reforming. This step converts the n-alkanes to the specifications necessary for jet, diesel and gasoline fuels. Two separate reactions (and reactors) are used to reform the n-alkane into the desired fuels: hydroisomerization and aromatization.

In the hydroisomerization reaction, the n-alkane is branched (no longer a straight chain). Within the hydroisomerization reaction, hydrocarbon cracking also occurs, which shortens the hydrocarbon branch size, which can be controlled to give the desired ranges for jet, diesel and gasoline fuels. Aromatization is a reaction in which the n-alkane is converted into aromatic (or cyclic) hydrocarbons. These aromatics formed are necessary to meet specifications for jet fuel and are also a common component in gasoline to raise the octane rating.

A parallel step to the fuel conversion is the separation of the by-product of hydrolysis, glycerol, from the sweet water. Then the glycerol is combusted to provide energy for the entire process. This step increases the energy efficiency of the process as well as minimizes waste streams.
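The three-step flow above amounts to simple carbon bookkeeping, which can be sketched as follows. This is an illustrative model, not Red Wolf's actual process chemistry; the function names are ours:

```python
# Carbon-number bookkeeping for the Red Wolf Process described above.

def hydrolyze(triglyceride_chains):
    """Colgate-Emery hydrolysis: a triglyceride yields its three free
    fatty acid chains plus a glycerol byproduct (sent to combustion)."""
    return list(triglyceride_chains), "glycerol"

def deoxygenate(ffa_carbons):
    """Saturation plus catalytic deoxygenation: a Cn fatty acid becomes
    a C(n-1) straight-chain alkane (one carbon lost with the oxygen)."""
    return ffa_carbons - 1

# A typical triglyceride carrying three C18 fatty acid chains:
ffas, byproduct = hydrolyze((18, 18, 18))
alkanes = [deoxygenate(c) for c in ffas]
print(alkanes, byproduct)   # [17, 17, 17] glycerol -- C17 is in diesel/jet range
```

Hydrocarbon reforming then branches, cracks, or aromatizes those C17 n-alkanes to hit jet, diesel, and gasoline specifications.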

Red Wolf says that modeling has shown economic viability for smaller plant sizes (10 MGPY and larger). Co-locating fuel processing plants near the feedstock-producing sites, rather than near oil refineries, is therefore advantageous in this schema. This mitigates transportation costs of both the feedstock and the produced fuel. Smaller plants require less land space and capital, and are optimized towards the most economical feedstock of the region. _More details and illustrations at GCC

The process above is not likely to bring high profits to developers and investors unless the price of oil remains inflated, as currently. If oil prices are driven to rise too quickly, too far, the resulting demand destruction and oil price crash will devastate all forms of alternative fuels production which rely upon the high price of crude in order to break even.


Sunday, April 24, 2011

49th Edition of Carnival of Nuclear Energy @ Yes Vermont Yankee

Yes Vermont Yankee hosts this week's edition -- #49 -- of the Carnival of Nuclear Blogs. Excerpts below. Follow the link to the full carnival.

Fear of Radiation and Effects of Fear

In a first for the ANS Nuclear Cafe, four contributing bloggers do a group post on this subject. They ask whether improvements are needed in explaining the significance of the numbers to the public. The answer, as an outgrowth of media and public confusion related to the ongoing Fukushima crisis, appears to be a resounding "yes."

The contributors are Stewart Brand, ecologist; Cheryl Rofer, chemist; Steve Aplin, management consultant, and Mimi Limbach, public relations executive. Dan Yurman of Idaho Samizdat pulled the group together and wrote the introduction.

This post is a reprint of a 1998 paper written by Ted Rockwell. In light of the reminders of the 25th anniversary of Chernobyl - with outrageous claims of long term health effects - and the continued confusion about the evacuation zone around Fukushima it seemed important to share this work more widely. Radiation is something to be understood, not something to be feared. Its health effects have been widely and extensively studied for more than a century.

Shaping the Energy Debate: Brian Wang at Next Big Future
About a month ago, Brian Wang wrote an excellent post on deaths per TWh for different energy technologies. In this week's post, he notes that even Greenpeace is quoting his earlier post. With careful research and honesty, pro-nuclear bloggers are shaping the energy debate.


Some New Directions Post-Fukushima: Posted by Gail Marcus at Nuke Power Talk
In this post, Gail Marcus looks at the future of regulation and research after Fukushima. Emphasis will probably shift to research and regulation about the systems that led to the accident (back-up generation and fuel storage), but research on non-traditional nuclear designs will also get a boost. Pebble Bed reactors, for example, do not use water for cooling.

Nuclear Fusion

Brian Wang examines the steady progress Lawrenceville Plasma Physics (LPP) is making toward nuclear fusion. LPP is solving one problem at a time, and Brian describes their latest advance.

Brian Wang has more news about Helion Energy's fusion approach

Energy drives all other industry, and energy itself is driven by human psychology along with basic economics. Basic economics is reasonably well understood, but the human psychology of the coming quasi-suicidal Idiocracy -- and how things came to be this way -- is something the powers-that-be do not wish you to even think about.

Who is John Galt?


Higher Oil Prices Seem to Prove Peak Oil Sceptics Right After All

Peak oil sceptics have been saying that if the price of oil rises high enough to offset the decline in the Obama dollar, oil production will rise correspondingly. And so it seems to be doing:
The US Energy Information Administration (EIA) is reporting a new monthly peak in crude oil and lease condensates for January 2011 at 75.282 million barrels per day. This is 600,000 barrels more than July, 2008 (74.669 million barrels per day).

The IEA had reported a new peak in world oil supply (which includes other fuel liquids) at 89 million barrels per day in February 2011.

These are important for the whole peak oil argument. World oil production is still slowly moving up. It is not declining yet. Claims of peak oil having already occurred in 2008 or 2005 are wrong. Even with Libyan oil production out, there could still be new highs in world oil production.

If world oil production keeps going up slowly to 2018-2025, then so what?

It means more time for improved biofuels to be created. Algae biofuel or other kinds of synthetic fuel. _NextBigFuture

There are many reasons to be sceptical of the peak oil doomer spiel:
1. There remains considerable scope for further oil discoveries. For example, the Arctic is effectively unexplored and is opening up, and there remain unexplored areas elsewhere, especially in deep water and onshore. Furthermore, the technology-driven successes in exploiting shale gas in the US should now be repeatable in shale oil.
2. Putting new discoveries to one side, globally only circa 10% of existing oil discovered volumes have been brought onto production so there remains a considerable resource to be exploited.
3. Considering fields that are already in production, average global recovery factors are relatively low. Some industry experts place the number as low as 22%, others in the low 30s. In either case, there is a considerable prize to be won by the use of improved recovery and enhanced recovery techniques. Some numbers illustrate this:
Shifting the average recovery factor offshore Norway from ~45% which is what it is today to their government’s target of 50% would add an extra 4 billion barrels or so.
Enhanced Oil Recovery using CO2 is capable of raising recovery factors offshore the UK by 4 to 12%, resulting in almost 3 to 8 billion barrels of technical reserves. This is 60 to 160 times the most exciting discovery made in the UKCS last year!
An increase of 1% in the aforementioned global recovery factor would yield almost 90 billion barrels, equivalent to roughly 3 years consumption at current rates.
4. Some would argue that ‘Peak Oil’ will occur for economic reasons, specifically because the price of oil will rise so high that it will become too expensive to use.... _OilVoice
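The recovery-factor arithmetic in point 3 is easy to sanity-check. In the sketch below, the daily consumption figure is our own assumption, chosen to roughly match the ~89 million barrels per day of total liquids cited earlier:

```python
# Does a 1% gain in global recovery factor really equal ~3 years of use?
extra_barrels = 90e9            # barrels, per the 1% recovery-factor claim
consumption_per_day = 87e6      # barrels/day of liquids, assumed

years = extra_barrels / (consumption_per_day * 365)
print(f"{years:.1f} years")     # ~2.8 years -- "roughly 3 years" holds up
```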
But the problem with the #4 argument is that as the price of oil rises, not only do new oil resources (deep sea etc.) become economical to exploit, but more alternatives to oil (GTL, CTL, BTL, and soon --- microbial fuels etc) become economical substitutes. As new resources and technologies come to bear -- on top of inevitable demand destruction -- the danger to oil markets becomes yet another price crash.

Peak oil religionists get hung up on "EROEI" and "exponentially rising global demand for oil" etc. without understanding all the drivers and inhibitors of demand, and without understanding all the dynamic economic and technological factors which affect EROEI. Consequently, many peak oil true believers tend to invest more than they can afford in long oil instruments, and when the inevitable collapse of the periodic bubble occurs, these investors get hammered -- as in 2008.

Things are setting up for a repeat of the 2008 crash much more quickly than is comfortable for most observers. Blaming it all on the Obama - Bernanke fiscal and monetary policies would be unkind. They are only two men out of many individuals whose corruption is bringing about massive economic hardship for hundreds of millions of people in the developed world. Yet, if you could select any two persons most proximately responsible for the disaster that is developing, you would not go far wrong by choosing Oba-Bernie.

Labels: ,

Saturday, April 23, 2011

About Fracking, GTL Offshore, and CTL in West Virginia Hills


Modern horizontal drilling, combined with fracking, has unleashed untold energy wealth upon a previously unsuspecting world. So, naturally, the faux environmental movement -- with a little encouragement from a much-inconvenienced Russian gas industry -- is pulling out all the stops to demonise fracking and shale gas / oil. The Luddites are ratcheting up the anti-fracking, anti-energy hysteria almost as high as it will go -- but the anti-energy campaign of the Luddites is based upon the flimsiest of delusions. About fracking:
...the whole anti-fracking movement has its head where the sun doesn’t shine – and here are just ten reasons why.
  1. Hydraulic fracking has been around for 60 years. Developments made by U.S. engineers around 2008-9 have simply made the process much more commercially viable.
  2. Since fracking was introduced in 1949, over 2 million frack treatments have been pumped without a single documented case of treatments polluting a water aquifer.
  3. 90 percent of all gas wells drilled in the United States since 1949 have been fracked.
  4. The depth of most shale gas deposits drilled is between 6,000 and 10,000 feet – water aquifers exist at an average depth of 500 feet.
  5. Claims of ‘migration’ between the shale gas layers and water aquifers due to fracking or for any other reason, are patently absurd as the gas would have to pass through millions of tons of impermeable rock. If the rock was that porous, neither the water nor the gas would have been there in the first place. (As the hard data in fig. 1 from a study of 15,000 frac treatments in the Barnett Shale Field reveals plainly.)
  6. Fracture design engineers go to great lengths to avoid fracture growth of even 100 feet to prevent losing production.
  7. The new eco-horror genre flicks like Josh Fox’s Gasland, create impact by making outrageous claims which include suggesting “569 chemicals” are used in a single “toxic cocktail” frack treatment. The reality is that 99.5 percent of the treatment is water and sand. Much of the remainder is made up of a maximum of 12 or so harmless gelling agents, like Guar gum (used in ice cream making), and chemicals commonly used around the house.
  8. Domestic running water faucets being set alight with a match might wow gullible film audiences, but dissolved methane found in well water may well be biogenic (naturally occurring). As the largest component in natural gas, methane is not even regulated as it is not toxic and escapes naturally like soda bubbles.
  9. Hydraulic fracking procedures are heavily regulated and not, as often claimed by eco-activists, exempt from drinking water and other key regulatory laws.
  10. Concerns about using “excessive water resources” in the process are already being assuaged by new developments, including recycling water. And the U.S. Ground Water Protection Council confirms that drilling with compressed air is becoming increasingly common.
_EnergyTribune Peter Glover
Just when the lefty-Luddites thought that energy starvation was a "done deal," just when peak oil doomers thought we had reached the end game stage, and just when the Russians thought they had Europe over a barrel -- fracking shale oil and gas come along to spoil their little doom and power party.

Such deep disruptions of doom fantasies are much to be welcomed, at any time.

More on the Oxford Catalysts microchannel F-T GTL process being employed by Petrobras offshore Brazil

West Virginia's Adams Fork Energy is to produce 18,000 barrels a day of premium grade gasoline, from West Virginia coal.

With the unleashing of the massive reserves of shale oil & gas, coal (CTL), and the coming clean and massive production of liquid fuels from heavy oils and oil sands -- the world may be ready to get its "second wind of energy." The world possesses several trillion barrels of oil equivalent in gas, coal, bitumens, methane clathrates, and kerogens. All that is needed is the right technology to turn them into prime liquid fuels.

Labels: , , , ,

Friday, April 22, 2011

Catalysts and Solvents: Making Everything Possible

While we like to dwell on more exotic technologies and scientific theories, it is the nuts and bolts of modern industry and industrial-scale agriculture which keep you safe, warm, dry, and well fed. Almost no one likes to think about catalysts and solvents, but the quality of those arcane, mundane, nitty-gritty ingredients of your hidden underworld determines much of what you can do with yourself.

Some interesting developments in catalysts:

New, cheaper nickel-based catalysts may spark a fuels and chemicals revolution

Newer, cheaper, platinum-free catalysts may open the door to cheap fuel cells, and fuel cell automobiles

Cheap molybdenum catalysts may make electrolysis of water to hydrogen / oxygen cheap and practical

Nanotechnology advances add an extra dimension to progress in catalysts

Solvents are even more easily ignored in everyday discussion than catalysts -- except in the context of a faux environmental armageddon. But they are no less important to everyday life for all of that.

New ionic solvents likely to revolutionise oil sands industry -- making oil sands and heavy oils environmentally friendly and setting back peak oil decades.

Supercritical CO2 and steam are proving to be effective solvents for more and more processes.

The movement toward cleaner, cheaper, more sustainable and effective solvents is accelerating, just like the movement toward better catalysts. And those are just two of the basic foundations of modern life where marginal improvements can pay huge dividends in quality of life.

Paying attention to such things can provide amazing investment opportunities as well.

Labels: , , ,

Oil Price Spikes and Excess Volatility: Investors Always Welcome

Financial markets have always been subject to erratic swings in prices due to investor whims, notions, and panics. Commodities markets -- including oil markets -- have become more like equity markets in that jittery respect. Fast and powerful electronic trading platforms, combined with a delusional mindset that says oil will always go up in price, bias oil markets in the upward direction -- particularly when the global economy is stuck in a stagnant funk thanks to generally dysfunctional policies of central banks and governments, and big institutional investors are desperate for ways to increase portfolio asset value.
Demand and geopolitical risks are no longer reliable tools for predicting commodity prices, and haven't been since the early 2000s. At that time, two major trends converged and altered financial and commodity markets.

First, the advent of widespread Internet trading platforms radically increased the number of people with access to commodity markets, decreased the amount of time it took for an investment decision to impact the market and expanded the amount of money that could be applied to those markets. In particular, the creation of energy-indexed investment vehicles created additional demand for commodities by people who have no intention of ever taking delivery of the commodity.

Second, this technological evolution occurred just as America’s Baby Boomers, the largest generation in American history as a proportion of the population, approached retirement. For the most part, their children had moved away and their homes were paid for, while their earning power was the highest in their lives. Consequently, this demographic had large savings, and over the last 10 years those savings have become available for investment just as more options for investing it into commodities have opened up. Most of the developed world has a similar demographic bulge.

This created a problem for predicting prices. Industrial demand is fairly easy to predict, since it is based on — and highly constrained by — actual structural realities. If one has a good feel for an economy, one can reasonably predict whether economic activity is rising or falling and how industrial firms will react to that.

Not so with investors, who — almost by definition — trade on intuition as they seek to outthink the markets and each other. But perhaps most important, unlike the industrial world, the world of investors has no single or collective pulse to take. Even if there were, investors often respond to price shifts in a manner opposite to industrial players. Rising prices draw them rather than scare them away. After all, no investor wants to miss out on a winning trend. And so those investors have become the oil market’s price setters.

In any other market, the presence of a mass of new players would obviously have a distorting effect, but in the oil market, the inelastic nature of oil demand magnifies the investor presence. Since oil is so essential to modern life — needed for everything from transportation to making plastics, fertilizer or paint — industrial and retail demand for oil is actually fairly stable. The introduction of dynamic actors into a normally static system results in periodic and disproportionate price shifts....

...investors make the system sufficiently erratic that forecasting its activity, aside from noting that price crashes are inevitable, is largely impossible.

There is one final factor in play that is driving the markets, and in the past five years it has greatly magnified the role that investors play: an increase in the money supply.

Over the past six years, the global money supply has roughly doubled. There are any number of reasons to expand money supply, but the most relevant ones of late have been to ensure that there is sufficient credit to stabilize the financial system. However, governments have few means of forcing such monies to go in any particular direction. And since the entire purpose of professional investors is to shuffle money to where it will earn them the highest return, some of the money from an expanded money supply often finds its way into commodity markets. _Forbes
Conventional wisdom has long proclaimed that out of control futures markets cannot affect the real price of a commodity. In the distant past, that was approximately true. But modern traders have more sophisticated tricks up their sleeves than you would believe. And the largest of them are quite well connected with the political power structures in any advanced nation with significant financial markets.

Popular delusions of "peak oil" and "climate catastrophe" can only aggravate political and economic forces at the highest levels, leading to stacked dysfunction and snowballing misallocation in government policies and financial strategies.

Wildly fluctuating energy costs and periodic commodities crashes are nuisances, to be sure, and often impossible to ignore. Nevertheless, in terms of personal and group planning, it is best to treat such movements as distractions -- unless you have special insights into the particular swings that are taking place moment by moment on the global stage.

But beware. Almost everything you think you know, is wrong. Well, actually, everything you think you know is wrong, but I wanted to leave you with at least a little hope. ;-)

Labels: , ,

Biofuels Production of Over US$ 11 Trillion by 2050: IEA

In Washington, the International Energy Agency (IEA) said that it expects biofuels to generate $11-$13 trillion in production between 2010 and 2050, and the global share of biofuel in total transport fuel to grow from 2% today to 27% in 2050. _BiofuelsDigest
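Two quick derived figures put the IEA projection in perspective. This is a sketch; the 40-year horizon and the flat averaging of the cumulative total are our simplifications, not the IEA's:

```python
# Implied growth behind the IEA's 2% -> 27% biofuel share projection.
share_2010, share_2050, years = 0.02, 0.27, 40

cagr = (share_2050 / share_2010) ** (1 / years) - 1
print(f"share growth: {cagr:.1%} per year")        # ~6.7% per year

# Average annual production value implied by the $11-13 trillion total:
low, high = 11e12 / years, 13e12 / years
print(f"average: ${low/1e9:.0f}-{high/1e9:.0f} billion per year")
```

A 13.5-fold rise in market share over four decades sounds dramatic, but it works out to sustained single-digit annual growth.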

UC Berkeley is "all in" on the project. Not only is UCB an integral part of the Joint BioEnergy Institute, the school has also just launched the Synthetic Biology Institute for new research & development in biological engineering, and scale the advances up to industrial levels.

In the meantime, dozens of well-financed biotech companies, such as Joule Unlimited, are rushing ahead to develop custom microbes tailored to produce specific fuels and high-value chemical products. Joule is based in Cambridge, Massachusetts, on the opposite side of the US from UC Berkeley, in the middle of a competing high-technology startup zone. Similar zones are located near Stanford U., around Austin, Texas, near La Jolla, California, and in dozens of other places across North America.

Clearly, if biofuels are to provide roughly 1/3 of global transportation fuels by the year 2050, a tremendous amount of feedstock will be required. Cellulosic biomass (both marine and terrestrial) will be one type of feedstock, as will waste streams of various types. All types of waste plastics, waste rubbers, waste papers and cardboards, waste foam packing, and so on will be routinely grabbed up by this growing industry as valuable feedstock. Anything considered "garbage" or "waste" today is likely to be seen as a feedstock to be turned into high-value product by the time period of 2025 to 2050.

This will be simple economics, not an effort to "save Gaia."

Virginia Tech has recently licensed an open software tool tailored for synthetic biology safety. It is meant to monitor chemical reagent and biological agent acquisitions, to minimise the danger that synthetic biology technologies are misused, e.g., for terrorism.

Labels: , ,

Thursday, April 21, 2011

Stark, Simple Reality of Energy Density

Material                  Energy Density (MJ/kg)    100 W light bulb time (1 kg)
Wood                      10                        1.2 days
Ethanol                   26.8                      3.1 days
Coal                      32.5                      3.8 days
Crude oil                 41.9                      4.8 days
Diesel                    45.8                      5.3 days
Natural uranium           5.7 x 10^5                182 years
Reactor-grade uranium     3.7 x 10^6                1,171 years
__Source of table
It should be remembered that the specific energy release from fission is many orders of magnitude larger than from chemical, mechanical or photoelectric processes (for example: O2 + C = CO2 yields 4.1 electron volts (ev) of energy; fission of a Uranium nucleus yields ~200 million electron volts), and thus it is not surprising that the resources besides Uranium, and some day Thorium, that a nuclear power plant requires (land, water, machinery etc), which do have substitution-value, are modest compared to other electrical energy sources. _Michael Natelson, Nuclear Engineer
Nuclear reactions may release from 10^6 to 10^8 times more energy than chemical reactions.
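The table's third column follows directly from the second: one kilogram of fuel holds (energy density x 10^6) joules, and a 100 W bulb draws 100 joules per second. A minimal sketch reproducing the figures, using the fuel names and MJ/kg values from the table above:

```python
# Time a 100 W light bulb runs on 1 kg of each fuel, from energy density alone.
SECONDS_PER_DAY = 86_400
BULB_WATTS = 100

fuels_mj_per_kg = {
    "Wood": 10,
    "Ethanol": 26.8,
    "Coal": 32.5,
    "Crude oil": 41.9,
    "Diesel": 45.8,
    "Natural uranium": 5.7e5,
    "Reactor-grade uranium": 3.7e6,
}

for fuel, mj_per_kg in fuels_mj_per_kg.items():
    seconds = mj_per_kg * 1e6 / BULB_WATTS   # joules available / watts drawn
    days = seconds / SECONDS_PER_DAY
    if days < 365:
        print(f"{fuel}: {days:.1f} days")
    else:
        # matches the table to within small rounding differences
        print(f"{fuel}: {days / 365:,.0f} years")
```

Wood at 10 MJ/kg gives 10^5 seconds, or about 1.2 days; the uranium rows land within a year or two of the table's values. The seven-orders-of-magnitude gap between wood and reactor-grade uranium is the point of the post.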

Even LENRs (low energy nuclear reactions) such as claimed for the Rossi / Focardi E-Cat reactor could have much higher energy density than simple combustion reactions. That is why a few grams of nickel may be able to release as much energy as many tonnes of coal.
This type of reaction, also called LENR (Low Energy Nuclear Reactions), belongs to the family of low energy nuclear reactions: it differs from the more famous hot fusion in the extremely low temperatures and pressures at which it operates, with the support of a catalyst -- such as palladium. In the case of the E-Cat, a few grams of hydrogen and nickel reportedly fuse their nuclei, releasing energy (12 kW, against 6 kW of input) and leaving copper as a residue -- in particular, a hydrogen proton enters the nickel nucleus, turning it into copper. The scientific community, meanwhile, remains divided over whether the Bologna experiment represents a genuine cold fusion reaction or some other, still unclear, natural phenomenon. _Italia (translated)
Clearly, those trying to develop small modular nuclear reactors, small fusion such as Bussard or Focus Fusion, and LENRs such as the E-Cat are all aiming at a much higher level of energy and power production than those stuck on coal, gas, and petroleum. Bioenergy has a significant distance to go to compete with fossil fuels, but it is indefinitely sustainable and provides necessary mobile liquid chemical energy for many applications. As for wind and solar, they have almost nothing going for them -- except for small niche applications such as off-grid isolated locations.

Much better to face the stark and simple reality of energy and power density, and to work with that reality toward an abundant energy future. Anything less than that is a failure to comprehend the obvious.

Labels: , ,

Wednesday, April 20, 2011

Algal Cropping for Drylands Such as New Mexico and Israel

In the context of NMSU's multifaceted algal research agenda, the photobioreactor has a dual purpose, according to Lammers. Not only will it help answer major research questions about how best to raise algae in the southern New Mexico climate, it will assume an expanding role as a production facility.

The standardized algal biomass it generates will be used for research on algal oil extraction and fuel conversion technologies, as well as the development of algal co-products such as high-protein animal or fish meal and fish-oil replacements. "The economics of algae-derived fuel will be very difficult without generating revenue from every portion of the algae biomass," Lammers said.
New Mexico State University is taking delivery of a new algal photobioreactor from Solix -- a Lumian AGS4000. The university will use the new photobioreactor (PBR) to carry out critical new research on dryland algal cropping for algal fuels. As noted above, key breakthroughs will be needed on several fronts in order to optimise the economics of algal fuels, co-products, and power.
Funds to purchase the system came from a recent $2.3 million U.S. Air Force grant; long-range operational costs will be covered by a $49 million Department of Energy grant that established the National Alliance for Advanced Biofuels and Bioproducts consortium.

The Solix BioSystems Lumian AGS4000 is an algae cultivation system with a 4,000-liter production capacity that allows faster and denser production of algae than open "raceway" systems. In the new photobioreactor, algae culture will grow in enclosed panels suspended in an open 61- by 11-foot water-filled basin. Control of various factors, such as temperature, carbon dioxide content and nutrient supply, is very precise and the panels are designed to optimize solar exposure. The result is a system that can accelerate the rate of CO2 absorption, and therefore the rate of algae growth, up to 10 times the rate of raceways and can produce up to three times the density of algae per liter of water.


There is a need for critical advances in algal growth, algal harvesting, energy extraction, and co-product / power production -- in continuous, closely integrated fashion. Drylands algal growth can utilise salt water, brine, or wastewater, and can take place year round.

PBRs will become more important as more specialised strains of algae are used, since such strains must be kept separate from wild airborne strains that could easily contaminate an open raceway or pond and interfere with growth.

Israel is another drylands area where algal research is on the fast track. In fact, any arid nation in relatively close proximity to bodies of salt water should be ideal for algal cropping -- since algae can grow so well in salt water.
Most companies pursuing algae as a source of biofuels are pumping nutrient-laden water through plastic tubes (called "bioreactors") that are exposed to sunlight (hence "photobioreactor," or PBR). Running a PBR is more difficult than an open pond, and more costly.

Algae can also grow on marginal lands, such as in desert areas where the groundwater is saline, rather than utilize fresh water.

Because algae strains with lower lipid content may grow as much as 30 times faster than those with high lipid content, the difficulties in efficient biodiesel production from algae lie in finding an algal strain with a combination of high lipid content and fast growth rate that isn't too difficult to harvest, and a cost-effective cultivation system (i.e., type of photobioreactor) best suited to that strain. There is also a need to provide concentrated CO2 to increase the rate of production. _PeaceCorpsConnect
As noted above, it is more efficient at this time to grow algae for biomass rather than for oil. As a biomass crop, algae is unsurpassed, and can be grown over roughly 90% of the Earth's surface (land and sea). The ability of algae to clean wastewater, gobble CO2 from power plants, ethanol plants, and cement plants, and provide fish and animal feed is an additional advantage.

A recent study by the Pacific Northwest National Labs suggested that 17% of US oil imports could be replaced by algal fuels, using roughly 5.5% of the lower 48 states' land area. This is an admirably cautious report from PNNL, which says little about what the state of the art will be in 5 years, and ignores the potential for genetic engineering of algae to increase sunlight-to-biomass or sunlight-to-lipids efficiencies.

Using small modular nuclear reactor heat and power, large quantities of algal biomass and fuels could be grown virtually anywhere on the planet or off it -- including polar stations, undersea stations, mid-ocean seasteads, orbital space stations, lunar colonies, etc. Algae modified to provide high-quality food and recycled water and air for humans would be ideal components of semi-hermetically sealed environments and outposts.

Labels: ,

Tuesday, April 19, 2011

Industrial Scale Bio-Butanol Production w/ Clever Co-Processing

Cobalt’s technology converts sugars from non-food feedstock, such as forest waste and mill residues, into biobutanol. Cobalt’s continuous butanol production system is based on advancements in biocatalyst selection, bioreactor design and process engineering, resulting in a productive, capital-efficient, low-cost solution. This foundation ensures the production process is able to scale up quickly while maintaining capital efficiency. _GCC
Butanol is a four-carbon alcohol with far superior properties to ethanol for burning in modern gasoline engines, or for blending with diesel in modern diesel engines.

Cobalt Technologies is partnering with American Process Inc. to build an industrial-sized bio-butanol plant. In addition, the partners are using the GreenPower+ process to add a clever carbohydrate extraction system and alcohol processing module to a biomass power generation system (see image below).
Under the agreement, Cobalt Technologies and American Process will integrate Cobalt’s patent-pending continuous fermentation and distillation technology into American Process’s Alpena Biorefinery, currently under construction in Alpena, Michigan. Slated to begin ethanol production in early 2012 with a switch to biobutanol in mid-2012, the API Alpena Biorefinery will produce 470,000 gallons of biobutanol annually, which will be pre-sold to chemical industry partners.

Funded in part by an $18-million US Department of Energy (DOE) grant and a $4-million grant from the State of Michigan, the API Alpena Biorefinery will demonstrate the conversion of hemicelluloses extracted from woody biomass to fermentable sugars that can be used for production of ethanol. Meanwhile, Cobalt’s technology will demonstrate that these sugars can also produce butanol.

GreenPower+. GreenPower+ adds a module in front of the biomass boiler that uses steam-extracted hydrolyzate as feedstock, plus an ethanol extraction module. Dewatered solids are then returned to the biomass boiler. The process significantly increases overall profitability by converting low-BTU hemicelluloses into high-value ethanol. The process enables cost-effective cellulosic ethanol production at a small scale of 10-20 MMUSG/year, with an ethanol production cost of around US$1/gallon, API says.

The process significantly increases overall profitability of the site by converting hemicelluloses into fermentable sugars, which can be converted to high value biofuels and biochemicals. The GreenPower+ technology is applicable in any industry employing biomass boilers or having organic effluent. _GCC
As you can see, by combining cellulosic power and heat generation with cellulosic alcohols production, a company is able to increase revenue streams while providing its own process heat and electric power.

Labels: ,

Sunday, April 17, 2011

48th Carnival of Nuclear Energy: ANS Nuclear Cafe

ANS Nuclear Cafe is hosting the 48th Carnival of Nuclear Blogs (via NextBigFuture). Some excerpts are published below:

Idaho Samizdat – Dan Yurman
Decommissioning plans at Fukushima must wait for stable reactor conditions. The world’s biggest nuclear energy firms are lining up with proposals to clean up a historically huge radioactive mess at the Fukushima, Japan, reactor site. There, six reactors in various states of damage present new engineering challenges on a daily basis, punctuated by earthquake aftershocks and the continuing threat of new tsunamis.
At the same time, the Japanese and U.S. news media are publishing stories about the early stages of the crisis which may partially explain why NRC Chairman Gregory Jaczko issued a call for Americans to evacuate to a distance of 50 miles from the site.
Idaho Samizdat – guest blog post by Jacques Besnainou, CEO, Areva Inc.
I am writing this essay today as a frustrated and fed up reader of nuclear-related stories originated by anti-nuclear organizations. While most recent reporting on the Fukushima reactors has been fair, some quite admirable, the coverage of MOX (mixed oxide) nuclear fuel has been mostly inaccurate and filled with half-truths.
As you may know, one of the reactors at Fukushima used MOX fuel. So what? The situation in Japan was not related to MOX fuel nor has its presence worsened the situation.
Next Big Future – Brian Wang
No deaths from radiation at Fukushima
The Register UK – The total non-story of the Fukushima nuclear power plant “disaster” – which has seen and will see no deaths or measurable health consequences for anyone anywhere – has received a shot in the arm today with the news that Japanese authorities have upgraded the incident to a Level 7 on the nuclear accident scale.
Fukushima at Level 7 on INES scale
Fukushima was raised to level 7, the same category as Chernobyl, but Chernobyl released 10 to 100 times more radiation. Japan raised the severity rating at the stricken Fukushima Daiichi nuclear power plant to level 7, the most serious on the international scale and the same rating that was given 25 years ago to Chernobyl, as aftershocks close to the facility heighten safety concerns.
The level 7 designation was made “provisionally,” and a final level won’t be set until the disaster is over and a more detailed investigation has been conducted. The previous event level of 5, equal to the 1979 accident at Three Mile Island in Pennsylvania, was also a provisional designation.
NEI Nuclear Notes
Nuclear Energy Workers in Japan and the U.S.
“First things first: nuclear workers in the United States, both employed by the plants and by contractors, are highly trained for their duties – no farmers plucked from their fields, no gangster-hires. Additionally, the safety culture implemented at plants applies to all workers, so any safety issue that arises can (really, must) be reported.”
Advances in nuclear safety – video from Idaho National Laboratory
Idaho National Laboratory’s Director John Grossenbacher explains how the U.S. nuclear industry has boosted its safety procedures as a result of the Three Mile Island (TMI) accident in 1979 and how the industry plans to use current events at Japan’s Fukushima nuclear plants to further enhance safety.
Yes Vermont Yankee – Meredith Angwin
Fukushima Oversimplified and Simplified – This Yes Vermont Yankee post tracks the evolution of our understanding of radiation sources and levels. The journey took us from chaos, to oversimplification, and finally, at this point, to some level of clarity.
ANS Nuclear Cafe
At the act of creation – Susie Hobbs
The nuclear crisis in Japan will undoubtedly change the nuclear industry forever. Due to the ongoing efforts of so many nuclear professionals and supporters, I am beginning to think that it will be a change for the better. Innovative technologies and creative outreach are already positively impacting the way we think about energy in America and around the world.

As the lessons from Fukushima are being learned and applied, other approaches to nuclear power continue to be pursued, such as small modular nuclear reactors, low energy nuclear reactions, large and small fusion reactors of multiple types, and more esoteric forms of energy which probably will never work out, but may actually change the world.  No one knows, because humans are typically highly fearful, shite-throwing monkeys who tend to huddle in dark corners for security.

It is clear from the mainstream reactions to Fukushima, that not everyone is capable of waking up. Most people do not even want to wake up, they are too comfortable sleeping. That may be the most important lesson to learn from the world's reaction to Japan's sad experience.


More on Algal Biomass, Microchannel Gas to Liquids, BTL


PetroAlgae is an algal fuels company aiming for the near-term production of fuels from algae, in addition to producing an animal feed co-product and electrical power. PetroAlgae is taking the algal biomass approach initially, and will presumably convert later to an "algal oils to biodiesel" approach to fuels as the technology matures over the next 10 to 20 years. This is the approach that Al Fin algal scientists and engineers have been recommending for early algal fuels production.
Through a new agreement with Haldor Topsoe A/S and its U.S. subsidiary Haldor Topsoe Inc., PetroAlgae will now use catalysts provided from the subsidiary's Houston headquarters to enhance the oils produced through its algae refining process that includes coking and pyrolysis.

The agreement will also allow PetroAlgae to test the algae biomass produced from its system in refinery cokers and “validate the commercial viability” of the process according to John Scott, chairman of PetroAlgae. _BiodieselMag

In a fascinating development, Oxford Catalysts has shipped a microchannel gas-to-liquids demonstration plant to Brazil, for use by Petrobras. The plant was assembled at a facility in Asia, disassembled for transport, and will be reassembled at a Petrobras refinery in Fortaleza, Brazil, over the next 4 months.
The integrated GTL demonstration plant incorporates the Group’s proprietary microchannel reactor and catalyst technologies for the key Steam Methane Reforming (“SMR”) and Fischer-Tropsch (“FT”) steps of the GTL process. The demonstration is fully funded and managed by the Group’s partners Toyo Engineering Corporation and MODEC, Inc., in collaboration with the Brazilian national oil company Petróleo Brasileiro S.A. (“Petrobras”) which is hosting the demonstration at its Lubnor refinery in Fortaleza, Brazil.

The GTL plant will be reassembled at the demonstration site, and will then progress to the pre-commissioning and commissioning stages. These are expected to be completed within four months. The demonstration plant is scheduled to start up in September, subject to successful commissioning and availability of the required utilities from Petrobras. Following start up, the demonstration will operate for approximately nine months. _OxfordCatalysts

KiOR is pushing ahead with its IPO, aiming for a $100 million max target for its US biomass-to-liquids technology.
$1.80 per gallon: KiOR says its technology, scaled up to oil industry size, can turn wood chips into “biocrude,” then ship it to existing oil refineries to crack it into gasoline or diesel fuel, at a price of $1.80 per gallon — without government subsidies. UPDATE: Crude oil is measured in 42-gallon barrels, which would set the cost of a barrel of KiOR crude at about $76 — and oil was trading at $106.25 a barrel on the New York Mercantile Exchange this morning, the lowest it’s been since March 30. As a rule of thumb, crude oil makes up about one-half to two-thirds of the price of a gallon of gas at the pump, which would price KiOR’s pump-ready output at roughly $2.70 to $3.60 per gallon. By way of comparison, conventional gasoline and diesel were $2.86 and $3.08 per gallon on the Gulf Coast as of March, and market prices for corn ethanol, biodiesel and sugarcane ethanol were $2.49, $4.78 and $3.50 per gallon, according to KiOR.
1,500 bone dry tons (BDT): That’s how much wood chip material KiOR will need to process every day to reach that super-low price of $1.80 per gallon. Its current demonstration plant, on the other hand, is set up to process 10 BDT per day and has been running since March 2010, which gives a sense of the scale KiOR is seeking to achieve in the space of a few years. _gigaom
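The numbers in the excerpt hang together arithmetically. A quick sketch using only the article's own figures (42-gallon barrels, and its one-half-to-two-thirds rule of thumb for crude's share of the pump price):

```python
# Sanity-check the quoted KiOR economics using the article's own figures.
GALLONS_PER_BARREL = 42

cost_per_gal = 1.80
cost_per_bbl = cost_per_gal * GALLONS_PER_BARREL   # $75.60, i.e. "about $76"

# Rule of thumb: crude is one-half to two-thirds of pump price,
# so implied pump price ranges from cost / (2/3) up to cost / (1/2).
pump_low = cost_per_gal / (2 / 3)    # $2.70 per gallon
pump_high = cost_per_gal / (1 / 2)   # $3.60 per gallon

# Scale-up from the 10 BDT/day demonstration plant to the 1,500 BDT/day target
scale_factor = 1500 / 10             # 150x

print(f"${cost_per_bbl:.2f}/bbl; pump ${pump_low:.2f}-${pump_high:.2f}/gal; "
      f"{scale_factor:.0f}x scale-up needed")
```

The $76/bbl biocrude and $2.70-$3.60/gal pump range check out; the 150x jump from the demonstration plant is the real hurdle.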
KiOR's plans illustrate the near-term thermochemical BTL approach, using conventional wood chip feedstock. Other companies may choose a similar approach, but using other biomass feedstocks.

It is too early in the BTL game to know which feedstocks (other than micro-algae and macro-algae) provide the greatest amount of biomass on a reliable and sustainable basis. Entire industries will be required for biomass production, preprocessing, refining, distribution, and sales.

Almost the entire surface of the planet -- except polar regions -- is suitable for growing biomass -- both marine and terrestrial. As the best biomass feedstocks prove themselves over the next ten years or so, it will become easier to calculate the true potential for biomass to liquids fuels.

Labels: , ,

Saturday, April 16, 2011

Ohio Looks Unapologetically to Utica Shale

In a presentation to the Ohio Oil and Gas Association last month, Larry Wickstrom, the state's geologist, estimated producers could recover as much as 15.7 trillion cubic feet of natural gas and 5.5 billion barrels of oil from Ohio's share of the Utica Shale.

That's a "very conservative estimation," said Mac Swinford, assistant chief of the Ohio Geologic Survey.

While far smaller than gas fields in other states, it's not small change. At today's values, the Utica Shale could contain more than $600 billion in oil and natural gas _PortClintonNewsHerald
Some US state governments have intentionally handicapped their states' ability to profit from rich shale hydrocarbon deposits. But oil & gas wealth is nothing to apologise for:
"Peak oil"—the theory that global oil production will soon hit maximum levels and begin to decline—is a favorite among this crowd, and it is one basis for their call for more biofuels and solar power. Mr. Watson doesn't dismiss the idea but explains why it remains largely irrelevant.

In theory, he says, "we've been running out of oil and gas for a long time," yet technology creates new opportunities. Mr. Watson cites a Chevron field long in decline down the road in Bakersfield—to the point that for every 100 barrels of oil "in place," the company was extracting only 10 or 20. But thanks to a new technology called steam flooding, Chevron is now getting 70 to 80 barrels. "Price creates incentive, and energy will be developed if there's demand for it at the price you can develop it," Mr. Watson says. In that sense, "oil and gas are plentiful."

Don't believe it? Over the past 30 years, even as "peak oil" was a trendy theme, the world's proven reserves of oil and natural gas increased 130%, to 2.5 trillion barrels.

...Or consider America's latest energy innovation: hydrofracking for abundant and cheap natural gas. This advance, says Mr. Watson, took even the industry "by surprise"—as evidenced by the many U.S. ports to import liquid natural gas that are now "sitting idle." _WSJ
The alternative to using hydrocarbons is a return to the stone ages, or worse, for billions of people. Lefty-Luddite greens of the dieoff.orgy persuasion actually long for a huge dieoff of the human population. It is possible that Mr. Obama and many of his closest aides consider themselves among such greens, in outlook. That would certainly explain President Obama's energy and economic policies.

But leftists and faux environmentalists have ridden their hysterias almost to the limit. It is becoming more difficult for them to disguise the horrific end result of their policies -- even from a massively dumbed down Idiocracy.

Labels: , ,

Deep Earth Hydrocarbons of Abiotic Origin

Scientists at Lawrence Livermore National Laboratory used supercomputers to simulate what would happen to carbon and hydrogen atoms buried 40 to 95 miles beneath the Earth’s crust, where they would be subjected to prodigious pressures and temperatures.

They found at temperatures greater than 2,240 degrees F and pressures 50,000 times greater than those at the Earth’s surface, methane molecules can fuse to form hydrocarbons with multiple carbon atoms. Interactions with metal or carbon sped up the fusion process, the researchers said. These conditions are present about 70 miles down, according to an LLNL news release. _PopSci

A team of scientists and engineers from UC Davis, Lawrence Livermore Labs, and Shell Projects and Technology have created sophisticated simulations which demonstrate that methane can be polymerised to multi-carbon chains under conditions similar to those in the deep crust and mantle of Earth. (Published in PNAS)
...hydrocarbons of purely chemical deep crustal or mantle origin (abiogenic) could occur in some geologic settings, such as rifts or subduction zones, said Galli, a senior author on the study.

"Our simulation study shows that methane molecules fuse to form larger hydrocarbon molecules when exposed to the very high temperatures and pressures of the Earth's upper mantle," Galli said. "We don't say that higher hydrocarbons actually occur under the realistic 'dirty' Earth mantle conditions, but we say that the pressures and temperatures alone are right for it to happen."

Galli and colleagues used the Mako computer cluster in Berkeley and computers at Lawrence Livermore to simulate the behavior of carbon and hydrogen atoms at the enormous pressures and temperatures found 40 to 95 miles deep inside the Earth. They used sophisticated techniques based on first principles and the computer software system Qbox, developed at UC Davis.

They found that hydrocarbons with multiple carbon atoms can form from methane (a molecule with only one carbon and four hydrogen atoms) at temperatures greater than 1,500 K (2,240 degrees Fahrenheit) and pressures 50,000 times those at the Earth's surface (conditions found about 70 miles below the surface). _PO

More information in an earlier AFE posting

Labels: , ,

Friday, April 15, 2011

Better Batteries, Cheaper Splitting of Water to Hydrogen, O2

Ionic Liquid Batteries from US Naval Research Lab
While working with ionic liquids based on mineral acids, such as hydrogen sulphates, it was observed that Zn metal would react to form zinc sulphate. Since this is similar to that observed for the zinc anode in a standard alkaline cell, a series of experiments were then performed to determine how different metal oxides reacted in these types of ionic liquids.

Electrochemical experiments demonstrate that not only can these reactive ionic liquids act as the electrolyte/separator in both solid state and liquid batteries, but they can also act as a reactive species in the cell’s electrochemical makeup. Using a non-aqueous approach to primary and secondary power sources, batteries are designed using standard cathode and anode materials such as manganese dioxide (MnO2), lead dioxide (PbO2) and silver oxide (AgO). The ionic liquid that is the main focus of this work is 1-ethyl-3-methylimidazolium hydrogen sulphate (EMIHSO4); however, other ionic liquids such as those based on the nitrate and dihydrogen phosphate anions have also been found to work well in this type of a battery design. _GCC

Cheap, molybdenum-based catalysts for inexpensive splitting of water
A team led by Ecole Polytechnique Fédérale de Lausanne (EPFL) Professor Xile Hu has discovered that a molybdenum-based catalyst allows the electrolytic production of hydrogen at room temperature, and is inexpensive and efficient. The results provide new opportunities for the development of renewable and economic hydrogen production technologies. A paper on the work appears in the RSC journal Chemical Science. _GCC

Xtreme Power's proprietary Power Cell technology will be used to provide 36 MW of storage at Duke Energy's "Notrees" wind farm in Texas. More: "World's Biggest Battery"
PDF details
Large-scale power storage may eventually make wind and solar more competitive power providers for particular locations where nuclear power is not practical -- e.g., isolated third-world locations with minimal technical expertise.

Labels: ,

Thursday, April 14, 2011

More on Rossi / Focardi LENR / "Cold Fusion"

Andrea Rossi is not wasting time. As noted here earlier, Rossi has invested almost all his money in developing and producing the LENR reactors for the Athens, Greece 1 MW plant. Rossi also claims to have signed a contract "of tremendous importance" in the USA, with a company he is not yet at liberty to name.
Rossi plans to install a one megawatt, American made E-Cat power station in a factory in Greece in October, 2011. Rossi believes that only a working commercial power station can definitively prove to the world that his creation is real. If E-Cats turn out to be as economical as expected, they will eventually be used to power cars, trucks, trains, ships, aircraft, and spacecraft. _OpEdNews Chris Calder
More background and several helpful links at above link.

Both Brian Wang and Brian Westenhaus have been following the progress of the Rossi / Focardi low energy nuclear reaction device.

Rossi claims that the reactor is able to obtain large amounts of heat energy from the low energy nuclear transmutation reactions that transform Nickel into Copper. Here is a more detailed look at the energy numbers involved in such a transmutation:
MeV for each Ni transformation

Starting from Ni58, we can obtain copper formation and its subsequent decay back into nickel, producing Ni59, Ni60, and Ni62. The chain stops at stable Cu63.

For simplicity I assume all the Nickel in the reactor in the form Ni58.

For simplicity I suppose for each Ni58 the whole sequence of events from Ni58 to Cu63 and as a rough estimate I calculate the mass defect between (Ni58 plus 5 nucleons) and the final state Cu63.

Ni58 mass is calculated to be 57.95380± 15 amu

The actual mass of a copper-Cu63 nucleus is 62.91367 amu

Mass of Ni58 plus 5 nucleons is 57.95380+5=62.95380 amu

Mass defect is 62.95380-62.91367=0.04013 amu

1 amu = 931 MeV is used as a standard conversion

0.04013×931 MeV=37.36 MeV

So each transformation of Ni58 into Cu63 releases 37.36MeV of nuclear energy.

Nickel consumption
One hundred grams of nickel powder can power a 10 kW unit for a minimum of six months.

How much of Ni58 should be transformed, in six months of continuous operation, in order to generate 10 kW?

10 kW is thermal or electrical power. The nuclear power must be larger; assume it is twice as large:
20 kW = 20,000 J/s = 1.25 x 10**17 MeV/s.

Each transformation of Ni58 into Cu63 releases 37.36MeV of nuclear energy.

The number of Ni58 transformations should thus be equal to (1.25 x 10**17)/37.36 = 3.346 x 10**15 per second.

Multiplying by the number of seconds in six months (1.55 x 10**7) the total number of transformed Ni58 nuclei is 5.186 x 10**22.

This means 5 grams.

The order of magnitude is not exactly the same, but it seems plausible. This also means 5 grams of nickel in Rossi’s reactor would be transmuted into (stable) copper after six months of continuous operation at the rate of 10 kW. _NextBigFuture
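The quoted estimate can be reproduced in a few lines, following its own simplifying assumptions (all nickel present as Ni58, nuclear power at twice the 10 kW unit output, ~1.55 x 10^7 seconds in six months); the 1.602e-13 J/MeV conversion is the only value added here:

```python
# Reproduce the back-of-envelope Ni58 -> Cu63 energy and consumption estimate.
AMU_TO_MEV = 931           # standard conversion used in the quote, MeV per amu
MEV_TO_J = 1.602e-13       # joules per MeV
AVOGADRO = 6.022e23

m_ni58 = 57.95380          # amu, as given above
m_cu63 = 62.91367          # amu
nucleons_added = 5         # Ni58 + 5 nucleons -> Cu63 (the quote's rough assumption)

mass_defect = (m_ni58 + nucleons_added) - m_cu63    # 0.04013 amu
mev_per_event = mass_defect * AMU_TO_MEV            # ~37.36 MeV per transformation

nuclear_power_w = 20_000                            # twice the 10 kW unit output
events_per_second = nuclear_power_w / (mev_per_event * MEV_TO_J)
seconds_in_six_months = 1.55e7
total_events = events_per_second * seconds_in_six_months

grams_ni58 = total_events / AVOGADRO * 58           # Ni58 molar mass ~58 g/mol
print(f"{mev_per_event:.2f} MeV per event; ~{grams_ni58:.1f} g of Ni58 in six months")
```

The result is about 5 grams of nickel transmuted, matching the quote -- versus the 100 grams of nickel powder said to charge the unit.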

This may seem incredible to most persons who know how many tons of coal are required to provide the same amount of power as 5 grams of nickel. But nuclear energy is on a far different level of scale than chemical energies, such as combustion energy.

Consult this table of energy densities provided at the Transtronics Wiki and you can clearly see the difference in scale between the energy of nuclear reactions and the energy of chemical reactions -- roughly 7 or 8 orders of magnitude, depending on the method of comparison.
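One way to see where the "7 or 8 orders of magnitude" comes from is to compare energy per reaction event. The sketch below sets the 37.36 MeV per nickel transformation quoted above against the roughly 4 eV released when a single carbon atom burns (from the standard ~393.5 kJ/mol heat of combustion of carbon); per-mass comparisons give somewhat smaller ratios, since a nickel atom far outweighs a carbon atom.

```python
import math

# Per-reaction-event comparison of nuclear vs chemical energy scales.
AVOGADRO = 6.022e23
EV_PER_J = 6.242e18

# Chemical side: C + O2 -> CO2 releases ~393.5 kJ per mole of carbon.
chem_ev = 393.5e3 / AVOGADRO * EV_PER_J    # eV per carbon atom burned
nuclear_ev = 37.36e6                       # 37.36 MeV per Ni58 -> Cu63 chain, in eV

orders = math.log10(nuclear_ev / chem_ev)
print(round(chem_ev, 1))    # 4.1 (eV per carbon atom)
print(round(orders, 1))     # 7.0 (orders of magnitude difference)
```

So on a per-event basis the gap is about seven orders of magnitude, consistent with the range quoted above.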

Imagine the savings in fuel transportation costs alone!

Will this sparkling new form of energy prove to be true gold, or just a fool's flash in the pan? Time will tell. _AlFin2100
Update: More from Next Big Future


Wednesday, April 13, 2011

A New Approach to "Cold Fusion" / LENR from the Ukraine

From out of the Ukraine in Eastern Europe come Professor Boris Bolotov and his engineer Waldemar Mordkovitch with a very different approach to fusion. It is still tabletop in size, runs at temperatures quite cool by fusion standards to date, and is reported to make electricity directly, skipping the heat step of power generation. The new fusion candidate uses the transmutation of zirconium, in zirconium oxide form, into other elements to produce energy.

For the demonstration, the tabletop-sized cold fusion reactor was pulsed with a nanosecond pulse generator. The pulses of electricity went into a cell filled with a "liquid metal," producing a kind of electrical arc in it. During the demonstration, reports have it that one hundred watts of input power produced three hundred watts of pure electrical output plus excess heat. _NewEnergyandFuel
Bolotov's work appears to be based on earlier Russian research which documented methods to transmute various elements. The Bolotov transmutation method apparently releases some potentially dangerous radiation, which the team is working to quench. Another way in which the zirconium transmutation of Bolotov differs from the nickel transmutation of Rossi, is that the Bolotov approach produces electricity directly -- without the need to produce heat energy as an intermediary.

More at PESN (via New Energy and Fuel)

In other fusion news, Brian Wang looks at research indicating that proton-Boron aneutronic fusion may be more feasible than previously thought.


Tuesday, April 12, 2011

Alta and Blackstone Aim to Turn EROEI On Its Head

The conventional wisdom among peak oil doomers and religionists says that the EROEI -- energy returned on energy invested -- is too low to allow most unconventional substitutes for crude oil to become economical.

But Blackstone Group and Alta Resources are teaming up -- investing US$ 1 billion -- to prove that unconventionals can indeed substitute for crude oil and other conventional fuels.
Blackstone Group LP (BX), the world’s biggest private-equity firm, agreed to form a joint venture with natural-gas explorer Alta Resources LLC to invest $1 billion in North American gas fields.

The entity will be called Alta Energy Partners, the companies said today in a joint statement. The new company will focus on acquiring leases and drilling wells in so-called unconventional fields, or geologic formations previously regarded as too hard to penetrate. _Bloomberg
Blackstone Group (BX.N) is investing in the lucrative area of shale, following rivals such as Kohlberg Kravis Roberts & Co (KKR.N).

North American shale fields are drawing billions of dollars from companies that are eager to learn the techniques to tap into the difficult geological formations.

...Unconventional assets include shale rock fields that may hold vast quantities of oil and gas but are more expensive to tap than traditional energy reservoirs. _Reuters
Devoted believers in EROEI as a sort of "barrier wall" preventing the economic use of difficult fossil fuel resources have been unable to wrap their minds around the concept of dynamic technology -- on many levels. At one time, the EROEI for conventional crude oil in Texas or Saudi Arabia was much too low for profitable production at any scale. At one time, the EROEI for coal in most coal mines of England was far too minuscule to allow Englishmen to substitute coal for wood as a common fuel. And so it goes.

EROEI is not a static ratio. It changes over time, as society's needs drive both technology and the broad range of economic and political considerations.

Alta and Blackstone are attempting to push EROEI to its limits in regard to various forms of unconventional fossil fuels. They are putting significant resources behind the attempt. By doing so, they are likely to push ahead of some significantly better financed -- but less bold -- multi-national oil giants.

Current artificially inflated oil prices will tend to drive investments into alternative fuels across the entire wide spectrum of possibilities. Until the speculative bubble of political peak oil pops, of course. Then the cyclic process begins again -- but on a somewhat higher cost level, reflecting the ongoing phenomenon of political peak oil (designed energy starvation) and the ongoing intentional debasing of fiat currencies by central banks.


Plentiful Bakken Shale Oil Profitable at $40 a Barrel and Up

North Dakota is likely to increase oil production by 300,000 to 700,000 barrels per day, which will make a difference, all else being equal.

The same production methods are being used in Canada's Bakken, Cardium in Alberta, Eagle Ford in Texas and many foreign countries.

Eagle Ford in Texas is producing about 80,000 barrels per day and that could double over the coming year. _NextBigFuture
Multi-stage horizontal drilling is picking up.

One example: Brigham Exploration is cranking up production from its Williston Basin shale oil wells in North Dakota, where Brigham holds leases on more than 368,000 net acres. As Brigham expands its drilling and production, it is likely to offer equity shares on the public market. North Dakota is more "frack-friendly" than some Eastern US states, and is consequently on a sounder financial footing than Pennsylvania or New York.
By the fourth quarter of this year, Brigham should be producing more than 18,000 BOE per day, according to Beskow, more than tripling the output of early last year.

And yet Beskow says Brigham is just getting started. "They are still in the very early stages of drilling," he said.

Brigham currently has 58 net wells (if a company has a 50% interest in 10 wells, it has five "net" wells) in the Williston Basin, Beskow says. That amounts to just 3% of its potential total of wells once it begins drilling more densely and in some of its less developed acreage. Beskow estimates that Brigham will have 190 net wells by the end of 2012.

Brigham is flourishing today because it made some aggressive investments even as other drillers cut back during the recent recession.

"Most of their success came after the 2008 crash," said Joel Musante, an analyst at C.K. Cooper. At that time, he says, there was "no clear consensus" on how to complete a well in the Bakken.

Brigham's contribution to fracking was to do more of it. "If you want to get more oil out of a formation, do more fracks," Musante said. _IBD
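The "net wells" convention used in the quoted passage is simply working interest times gross well count, summed over a company's positions. A minimal sketch, with invented positions (the article gives only Brigham's total of 58):

```python
# "Net wells": fractional working interest times gross well count, summed.
def net_wells(positions):
    """positions: list of (interest_fraction, gross_well_count) tuples."""
    return sum(interest * wells for interest, wells in positions)

print(net_wells([(0.50, 10)]))             # 5.0 -- the example from the article
print(net_wells([(0.50, 60), (1.0, 28)]))  # 58.0 -- one invented way to reach Brigham's total
```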

It is likely that recoverable reserves in the Bakken formation will continue to rise as exploration continues and technology improves. At today's over-inflated oil prices, the growing crowd of successful horizontal shale oil drillers is profiting very handsomely indeed.


Monday, April 11, 2011

Can Small Modular Nuclear Reactors Save America?

...mPower™ reactor...represents a new generation of smaller, scalable nuclear power plants on track to be deployed by 2020. Babcock & Wilcox Nuclear Energy, Inc. (B&W NE) and Bechtel Power Corporation have formed a joint company, Generation mPower LLC, to design, license and build the next generation of nuclear power plants based on B&W mPower reactor technology _BusinessWire
B&W has received a $5 million grant from a Virginia community revitalisation commission, to promote the company's development of a new, safer, cheaper, factory-built small modular nuclear reactor. The fact that the funding comes from a state-level community development agency indicates how quickly the strong need for energy development is being felt across the US at all levels. The US energy giant TVA is also interested in mPower's approach to modular reactors.
Watts Up With That
The US national economy is being hurt by dysfunctional federal government decisions being made from the White House to the NRC to all agencies of the US government. Policies of energy starvation -- anti-oil, anti-coal, anti-nuclear, anti-gas, anti-oil sands, etc etc -- have set the current US administration apart from all previous administrations except perhaps the Carter administration. Unfortunately, being anti-energy is the same as being anti-prosperity in the modern world.
If the US acts now to certify new, safer, cheaper, more reliable, more secure small modular reactors, the first reactors could provide energy to the grid by 2020.
"How do you reduce the upfront cost [of nuclear power - ed.]? Make them much smaller - we're talking about 100 megawatts, a power a tenth of one of the plants at Fukushima," he told BBC News.

"They would be factory-built, not built on site. You could combine them, gang them together - put 10 together and you get a gigawatt plant. The point is the utility could buy into nuclear power in stages, they wouldn't have to come up with the entire cost of a gigawatt plant."

What the Fukushima crisis casts light on, though, is that the SMR idea takes safety into account from the outset, said Victor Reis, senior adviser in the Office of the Undersecretary of Energy for Science.

"This is a reactor that is designed safety first, not one that you do the physics first and then add the safety on," he told the conference.

SMR designs all run on the idea of "passive safety" - that is, protective measures run without human intervention and even without power; cooling is done by natural convection, rather than with the kinds of pumps that were at the heart of the Fukushima plant's problems.

They are also small enough to be built underground, making them less vulnerable to severe weather, unauthorised access, impacts, and to some degree, seismic events. _BBC
Unfortunately, these reactors may never be built, since the priorities of the Obama administration lie elsewhere than in promoting a prosperous future for the United States. Look, we know that Mr. Obama has a plan. Hopefully, one day, we will all know what it was.


Saturday, April 09, 2011

47th Carnival of Nuclear Energy at Cool Hand Nuke

Cool Hand Nuke hosts the 47th edition of the Carnival of Nuclear Energy. Here are a few excerpts:

"When the prospect of a single nuclear-related fatality is judged more newsworthy than the plight of half a million homeless survivors of an unprecedented natural disaster, then something has gone egregiously wrong in the editorial rooms of mainstream media vehicles. It is time we admit that we do risk wrong in our public conversations."

It notes TVA's efforts to educate the public on the critical design differences that distinguish Browns Ferry from the Fukushima BWR Mark 1s.  Such understanding is important to public acceptance.  

"George Monbiot Declares the Era of Confusion Over:" What Monbiot does is to demonstrate that Caldicott systematically evades her responsibility to prove the things she claims to be proven true by scientific evidence. Caldicott has been playing this game for years, but her day of reckoning has arrived, and Monbiot gives her unwillingness or inability to provide evidence the exposure it richly deserves. Caldicott is confused. No doubt a lot of people are, but George Monbiot, who has finally worked through his own confusion, has clearly announced that the Era of Confusion is over. _CoolHandNuke

Visit the entire carnival at the links above. You will find more stories on the Fukushima nuclear reactors incident, and other stories dealing with the quest for more new safe nuclear power.

MidAmerican Energy in Iowa is almost unfazed by Japan's nuclear incident, and appears determined to go ahead with a new plant incorporating one or more small modular reactors -- to go online around 2020.


Friday, April 08, 2011

A Cold Look at Big Wind in the UK 2008 to 2010

It is clear from this analysis that wind cannot be relied upon to provide any significant level of generation at any defined time in the future. There is an urgent need to re-evaluate the implications of reliance on wind for any significant proportion of our energy requirement. _UKWindReport PDF
Stuart Young Consulting has released a 28 page PDF review of UK wind power from 2008 to 2010 (via TheRegister), which contradicts many of the green talking points used to promote wind energy. Policy analysts and decision makers must have access to reliable facts and figures, if they are to help businesses and governments make the best possible decisions regarding vital energy supplies.
1. During the study period, wind generation was:
• below 20% of capacity more than half the time.
• below 10% of capacity over one third of the time.
• below 2.5% of capacity for the equivalent of one day in twelve.
• below 1.25% of capacity for the equivalent of just under one day a month.
The discovery that for one third of the time wind output was less than 10% of capacity, and often significantly less than 10%, was an unexpected result of the analysis.

2. Among the 124 days on which generation fell below 20MW were 51 days when generation was 10MW or less. In some ways this is an unimportant statistic because with 20MW or less output the contribution from wind is effectively zero, and a few MW less is neither here nor there. But the very existence of these events and their frequency - on average almost once every 15 days for a period of 4.35 hours - indicates that a major reassessment of the capacity credit of wind power is required.
3. Very low wind events are not confined to periods of high pressure in winter. They can occur at any time of the year.
4. The incidence of high wind and low demand can occur at any time of year. As connected wind capacity increases there will come a point when no more thermal plant can be constrained off to accommodate wind power. In the illustrated 30GW connected wind capacity model with “must-run” thermal generation assumed to be 10GW, this scenario occurs 78 times, or 3 times a month on average. This indicates the requirement for a major reassessment of how much wind capacity can be tolerated by the Grid.
5. The frequency of changes in output of 100MW or more over a five minute period was surprising. There is more work to be done to determine a pattern, but during March 2011, immediately prior to publication of this report, there were six instances of a five minute rise in output in excess of 100MW, the highest being 166MW, and five instances of a five minute drop in output in excess of 100MW, the highest being 148MW....
6. The volatility of wind was underlined in the closing days of March 2011 as this Report was being finalised.

• At 3.00am on Monday 28th March, the entire output from 3226MW capacity was 9MW.
• At 11.40am on Thursday 31st March, wind output was 2618MW, the highest recorded to date.
• The average output from wind in March 2011 was 22.04%.
• Output from wind in March 2011 was 10% of capacity or less for 30.78% of the time.

The nature of wind output has been obscured by reliance on “average output” figures. Analysis of hard data from National Grid shows that wind behaves in a quite different manner from that suggested by study of average output derived from the Renewable Obligation Certificates (ROCs) record, or from wind speed records which in themselves are averaged.

It is clear from this analysis that wind cannot be relied upon to provide any significant level of generation at any defined time in the future. There is an urgent need to re-evaluate the implications of reliance on wind for any significant proportion of our energy requirement.
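The report's headline statistics are straightforward to reproduce given the underlying output time series. A minimal sketch of that kind of analysis, using an invented sample series rather than the actual National Grid data (the 3,226 MW capacity figure is the one quoted in the report):

```python
# Given a time series of metered wind output (MW) and the connected capacity,
# compute the fraction of the record spent below various fractions of capacity
# -- the form of statistic the Stuart Young report uses.

def availability_stats(output_mw, capacity_mw,
                       thresholds=(0.20, 0.10, 0.025, 0.0125)):
    """Return {capacity_fraction: share_of_time_below_it}."""
    n = len(output_mw)
    return {t: sum(1 for x in output_mw if x < t * capacity_mw) / n
            for t in thresholds}

# Illustrative (invented) sample of half-hourly readings, capacity 3226 MW.
sample = [9, 150, 420, 80, 1200, 2618, 300, 45, 700, 60]
for frac, share in availability_stats(sample, 3226).items():
    print(f"below {frac:.2%} of capacity: {share:.0%} of the time")
```

Run against the real half-hourly record, this is exactly the computation behind figures like "below 10% of capacity over one third of the time."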

Most people are unwilling to delve into the technical details of energy cost-benefit analysis, but it is the specific technical details of big wind energy which damn the entire enterprise.
Wind output is actually below 20 per cent of maximum most of the time; it is below 10 per cent fully one-third of the time. Wind power needs a lot of thermal backup running most of the time to keep the lights on, but it also needs that backup to go away rapidly whenever the wind blows hard, or it won't deliver even 25 per cent of capacity.

Quite often windy periods come when demand is low, as in the middle of the night. Wind power nonetheless forces its way onto the grid, as wind-farm operators make most of their money not from selling electricity but from selling the renewables obligation certificates (ROCs) which they obtain for putting power onto the grid. Companies supplying power to end users in the UK must obtain a certain amount of ROCs by law or pay a "buy-out" fine: as a result ROCs can be sold for money to end-use suppliers.

Thus when wind farmers have a lot of power they will actually pay to get it onto the grid if necessary in order to obtain the lucrative ROCs which provide most of their revenue, forcing all non-renewable providers out of the market. If the wind is blowing hard and demand is low, there may nonetheless be just too much wind electricity for the grid to use, and this may happen quite often...

...there is little point building more wind turbines above a certain point: after that stage, not only will they miss out on revenue by often being at low output when demand is high, but they will also miss out by producing unsaleable surplus electricity at times of low demand. The economic case for wind – already unsupportable without the ROC scheme – will become even worse, and wind will require still more government support (it already often needs large amounts [3] above and beyond ROCs).

The idea that pumped storage will be able to compensate for absent wind – meaning that there will be no need for full thermal capacity able to meet peak demand – is also exposed as unsound. The UK has just 2,788 megawatts of pumped-storage capacity and it can run at that level for just five hours. UK national demand is above 40,000 megawatts for 15 hours a day and seldom drops below 27,000. Pumped storage would have to increase enormously both in capacity and duration – at immense cost – before it could cope even with routine lulls hitting the planned 30-gigawatt wind sector, let alone rare (but certain to occur) prolonged calms. _TheRegister
Big wind without big storage becomes a big drain on the entire economy. But big, utility-scale storage is expensive to build, and not available everywhere.
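The pumped-storage point is easy to verify with the figures quoted: 2,788 MW for five hours is only about 14 GWh, a small fraction of what even one calm day would leave missing from a 30 GW wind fleet. The 30% capacity factor assumed below is illustrative, not a figure from the report:

```python
# UK pumped storage vs a day-long wind lull, using the quoted figures.
storage_mw = 2_788
storage_hours = 5
storage_gwh = storage_mw * storage_hours / 1_000         # ~13.9 GWh usable

# Hypothetically, suppose the planned 30 GW wind fleet had been running at a
# 30% capacity factor before a 24-hour calm; the energy to replace is then:
wind_gw, capacity_factor, lull_hours = 30, 0.30, 24
shortfall_gwh = wind_gw * capacity_factor * lull_hours   # 216 GWh

print(round(storage_gwh, 1))                 # 13.9
print(round(shortfall_gwh / storage_gwh))    # 15 -- storage covers ~1/15 of one calm day
```

Under these assumptions, existing pumped storage covers well under a tenth of a single calm day, before considering the prolonged lulls the report documents.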

It has been argued that hydro power provides a good backup to wind, but given what we are learning about big wind's unreliability and huge costs, why would anyone even bother trying to make big wind palatable? The underlying compulsion to build big wind and solar -- carbon hysteria -- rests upon unscientific claims and unreliable computer models.

Big wind is a huge and ruinously expensive house of cards, built up for reasons both corrupt and emotional, with no rational regard for the public purse that is called upon to prop up the faulty enterprise. It is long since time to call it quits.

More: See this GWPF article linking to this John Muir Trust coverage of the same report.

