Tuesday, January 31, 2012

Oil Dictatorships Require High Oil Prices: Can They Hold?

Oil dictatorships from Saudi Arabia to Iran to Venezuela to Russia have grown dependent upon $100-a-barrel oil to placate their people with handouts, social welfare programs, and Potemkin-village displays of "prosperity and power." But there is a very real question as to whether these heretofore "masters of the oil universe" will be able to hold the line on oil prices over the long term.
Only three years ago, it was thought that Saudi Arabia – the world’s largest oil exporter and second largest producer – could generate large budget surpluses with oil at $70/barrel. In recent weeks, new estimates suggest that the country now needs oil at $75/barrel just to balance its budget – never mind posting a surplus. The country’s oil minister has stated that the nation would work to stabilize prices at the $100/barrel level – a first. Saudi Arabia has traditionally played the role of OPEC moderate, while Iran and Venezuela have been the hawks favoring higher prices. The Saudis have always balanced their need for oil revenues against the knowledge that, left unchecked, high oil prices tend to precede recessions.

The most likely reason for this change in policy is the country’s response to the uprisings across the Middle East last year. Fearing unrest, the government of Saudi Arabia unveiled a huge increase in public spending, totaling almost $130 billion. The Saudi commitment to stabilizing oil in the $100/barrel range should serve as a wake-up call for consumers and investors alike.

...it is not just Saudi Arabia that needs high oil prices to meet its spending commitments. Russia needs prices of over $100/barrel to balance its budget. Together, Saudi Arabia and Russia account for a little over 20% of the world’s oil production. It would be hard to argue therefore that these two major oil producers would be willing to bring down prices. _Financial Post
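As a rough check on that "a little over 20%" figure, here is the arithmetic. The production numbers below are assumed round values for roughly 2011 (they are not from the Financial Post piece), so treat this as a back-of-envelope sketch:

```python
# Back-of-envelope check of the claim that Saudi Arabia and Russia
# together account for "a little over 20%" of world oil production.
# Production figures are assumed round numbers, not from the article.
saudi_mbd = 10.0    # assumed Saudi output, million barrels/day
russia_mbd = 10.2   # assumed Russian output, million barrels/day
world_mbd = 88.0    # assumed total world output, million barrels/day

share = (saudi_mbd + russia_mbd) / world_mbd
print(f"Combined share: {share:.1%}")  # roughly 23%
```

Even with generous rounding either way, the two countries plausibly sit in the low-20% range, consistent with the quote.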
Of course, the higher the oil price, the greater the incentive for wildcatters and other entrepreneurs to come up with new sources of crude, and with new substitutes for crude oil across its many application markets.

One source of new oil is shale oil -- a resource with massive potential for new supply. Another is the Arctic.
“The race is on for positions in the new oil provinces.” That starting-gun quote was fired last week by Tim Dodson, executive vice-president of the Norwegian oil and gas company Statoil. The ‘new oil provinces’ are in the Arctic, which brims with untapped resources amounting to 90 billion barrels of oil, up to 50 trillion cubic metres of natural gas and 44 billion barrels of natural gas liquids, according to a 2008 estimate by the US Geological Survey. That’s about 13% of the world’s technically recoverable oil, and up to 30% of its gas — and most of it is offshore.

...On 17 January, Moe awarded 26 production licences for developed offshore oil areas in the Norwegian and Barents Sea to companies including Statoil, Total, ExxonMobil and ConocoPhillips. And the settlement in 2010 of a long-running row between Norway and Russia over their Arctic maritime boundary will allow more exploration in formerly disputed parts of the Barents Sea (see ‘Frozen fuels’). “There’s an ocean of new opportunities that we will grasp with both hands,” says Moe. _Nature
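To put the USGS Arctic estimates quoted above on a common footing, the gas figure can be converted to barrels of oil equivalent (boe) using the standard approximation of roughly 170 cubic metres (about 6,000 cubic feet) of gas per boe. Only the resource figures come from the USGS estimate; the conversion factor is the usual industry rule of thumb:

```python
# Rough oil-equivalent comparison of the 2008 USGS Arctic estimates.
# Conversion factor: ~170 cubic metres of natural gas per barrel of
# oil equivalent (boe) -- the standard industry approximation.
oil_bbl = 90e9          # barrels of oil (USGS estimate)
ngl_bbl = 44e9          # barrels of natural gas liquids (USGS estimate)
gas_m3 = 50e12          # cubic metres of natural gas (USGS estimate)
M3_PER_BOE = 170.0      # approximate conversion factor

gas_boe = gas_m3 / M3_PER_BOE
total_boe = oil_bbl + ngl_bbl + gas_boe
print(f"Gas alone:  {gas_boe / 1e9:.0f} billion boe")    # ~294
print(f"All three: {total_boe / 1e9:.0f} billion boe")   # ~428
```

On an energy-equivalent basis, the gas dominates the Arctic endowment -- roughly twice the combined oil and NGL figure.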
Of course, no matter how much oil & gas the USGS thinks is in the Arctic, there is certain to be much more. As long as prospectors are looking mainly "under the streetlights," they will find only a small portion of the world's oil.

Of course the oil and gas resource shrinks in relative magnitude next to the massive global methane hydrate resource, which is merely waiting for smart and wise humans to find safe and efficient ways to scoop it up.

More on the desperate need of oil dictatorships to maintain high oil prices.

It is quite easy for peak oil doomers to misapprehend the reasons for high oil prices and "stalled" oil production levels. That is because their brains can only hold one idea: peak oil doom. To consider the dozens of other more important factors involved, would entail a massive and intolerable cognitive dissonance, which must be avoided at all costs.


Monday, January 30, 2012

Can Canada Leverage Its Hydrocarbon Wealth to Create An Abundant Future Based Upon Advanced Energy Sources?

Canadians are understandably upset at the US in general, and at US President Obama in particular. Obama's abrupt and corrupt killing of the Keystone XL pipeline sent a message of contempt and disregard to the US's neighbors to the north.

While the US Republican Congressional members are attempting to devise a way to approve the Keystone XL pipeline's completion and border crossing, Canadians are forced to contemplate other markets for their increasingly valuable oil sands product. After all, oil sands production has been ramped up in anticipation of cheap & ready pipeline transport to the south, all the way to refineries on the Gulf of Mexico. With the destruction of that rational plan by the congenitally feckless Obama, Canada is forced to look to China and India as alternative markets:
While the media fixates on the political spin around the Obama government's rejection of TransCanada's Keystone XL pipeline, there's another, more important element to this story that has been grossly underplayed:... in Asia...demand for energy of all kinds will continue to soar, according to BP's 20-year forecast....

``By 2030 China and India will be the world's largest and third-largest economies and energy consumers, jointly accounting for about 35 per cent of global population, GDP (Gross Domestic Product) and energy demand,'' the report says.

``Rapid economic development means industrialisation, urbanisation, and motorisation. Over the next 20 years China and India combined (will) account for all the net increase in global coal demand, 94 per cent of net oil demand growth, 30 per cent of gas and 48 per cent of the net growth in non-fossil fuels.''

No, that's not a typo. Let me repeat that: over the next 20 years, BP says China and India will account for 94 per cent of the net worldwide increase in oil demand. _Canada.com
The author goes on to warn of Canada's dependency on the US as an export market, at a time when US consumption of imported oil continues to decline.

Canada also needs to think about what kind of energy infrastructure it wishes to build for itself, using its oil sands wealth as a springboard. Over the medium and long term, advanced nuclear fission (and later, fusion) technologies make the most sense. Canada has rich ore resources for the production of nuclear fuels, and with rational recycling and breeding technologies, the resource could last almost indefinitely in practical terms.

Wisely, Canadian utilities are beginning to look toward building their nuclear infrastructure:
Utilities in Canada are expressing interest in the Westinghouse AP1000 pressurized water reactor (PWR) and the Westinghouse Small Modular Reactor, a 200 MWe-class integral PWR currently under development that is suited for smaller electrical grids, distributed generation, and process heat requirements. _Power-Eng
In an attempt to create a more sustainable oil sands industry, engineers are looking for ways to substitute geothermal heat for natural gas in oil sands production. The key factor is industrial heat, which can be provided in multiple ways.

Al Fin energy analysts prefer the use of gas-cooled nuclear reactors for long term in situ production of oil sands, oil shales, heavy oils, and for even more unconventional fuels such as coal to liquids (CTL) and gas to liquids (GTL). Canada is rich in several hydrocarbon resources which could be economically converted to high quality fuels and high value chemicals, using process heat from gas cooled nuclear reactors.

All of these projects will require significant capital, which can be at least partially provided from export profits derived from sales of oil sands. The key issue is to convert a modern-day source of export wealth into a long term foundation for energy abundance, industrial sustainability, and commercial viability.

Neither large-scale wind power nor big solar power is a rational foundation for a prosperous Canadian future: both of these green approaches are inherently unreliable, expensive, intermittent, and difficult to manage, and both lead to sharp increases in customer energy bills.


Sunday, January 29, 2012

Small Modular Reactors and Obama's Kiss of Death

The Obama administration’s next move in boosting energy technology will be nearly $500 million in support of small modular reactors. Individually, these nuclear reactors would produce less energy than traditional reactors, but could be used more flexibly and operate more efficiently. _PopularMechanics
President Obama's promotion of big wind and big solar has been a huge bust, and the US President is increasingly known as an enemy of reliable forms of energy, thanks to his administration's attacks on hydrocarbon and nuclear power -- including his recent killing of the Keystone XL pipeline.

Mr. Obama is more commonly seen as an ideological extremist than as someone who wants the best for America and for Americans. In order to change that perception before the November 2012 general elections, Obama's speechwriters included some words of praise and promise for nuclear power in the president's latest State of the Union address. The promise is for $500 million in support from the Department of Energy for development of small nuclear reactors. But the important thing is what happens behind the scenes, when the cameras are off. Americans have very little reason to be hopeful, based on the actions of this president.

Regardless, here is a brief look at the small nuclear reactor technologies which will be considered by the DOE for support:
An SMR would generate one-tenth to one-third the energy of a conventional reactor. Rather than producing 1000 megawatts of electricity, for example, an SMR might produce 300 MW or less. The company NuScale Power is developing a 45 MW SMR that would be able to supply electricity to 45,000 American homes for a year, making it well suited for smaller towns and cities where a conventional reactor would be overkill. And because SMRs are modular, they’re scalable. The power plant can install additional SMRs as electricity demand grows.

There are three main varieties of SMR in development.

Light-Water SMRs

These are basically a scaled-down version of the light-water reactors already working in the United States. Inside a light-water reactor, heat from the uranium core turns water into steam, which spins turbines that generate electricity. The same thing happens in a light-water SMR, with a few modifications.

Unlike traditional reactors, which position the steam generators outside the reactor vessel, some SMRs, such as the Babcock & Wilcox 125 MW "mPower" reactor, locate the steam generators inside the reactor vessel. John Kelly, the energy department’s deputy assistant secretary for nuclear reactor technologies, says this makes manufacturing easier and eliminates the piping between reactor and steam generators, which is a safety liability. (If a pipe breaks, it becomes difficult to deliver coolant back to the hot core.)

Some light-water SMRs also incorporate what engineers call passive safety features—in an emergency, they could cool a reactor core even if the power goes out. At Fukushima Daiichi in Japan, the site of last year’s post-tsunami nuclear disaster, the plant relied on electrically driven pumps to deliver water to the hot core and cool it down. When the power went out and diesel backups failed, operators had to resort to desperate measures to prevent total catastrophe.

By contrast, small reactors such as the Westinghouse SMR would rely on gravity and thermodynamics to circulate coolants. As the radioactive core heats the water surrounding it, that hot water becomes less dense and flows upward toward the heat exchangers that turn the heat into electricity. As the water loses heat to the exchangers, it cools, becomes more dense, and falls back toward the core—no electricity required.

"The new plans are elegant in their simplicity," Genoa says. "Passive features allow reactors to go without operator interaction, and without pumps to move water around." To further improve on safety, several SMRs are meant to be installed and operated underground.

The light-water SMRs in development have been slightly less efficient than normal reactors, meaning less of the uranium’s potential energy is turned into electricity. But small light-water reactors may eventually deliver electricity that is less expensive than what larger reactors can produce simply because construction and installation costs would be lower. The Nuclear Regulatory Commission expects to approve the first light-water SMR power plants in the early 2020s.

Gas-Cooled SMRs

The idea behind gas-cooled reactors, Genoa says, is to rule out even the possibility of a meltdown. "It is physically impossible for the reactor to get hot enough to damage the fuel," he says. That’s because rather than using water as a coolant, gas-cooled SMRs would use helium.

As water boils it can build up pressure inside a reactor. Under extreme heat it can also react with zirconium alloys in the core. At Fukushima Daiichi, water-zirconium reactions caused a hydrogen explosion that blew the roofs off several reactors.

But unlike water, helium doesn’t boil or react. This allows the gas-cooled reactor to operate safely at temperatures up to 1000 degrees C, which increases the reactor’s efficiency. While a light-water reactor typically extracts roughly 34 percent of its core’s potential energy, a gas-cooled reactor would operate at more than 40 percent efficiency. A gas-cooled reactor developed by the Japanese Atomic Energy Research Institute has achieved 45 percent efficiency, and General Atomics’ Modular Helium Reactor achieves up to 47 percent.

To accommodate the high heat needed to achieve such high efficiencies, engineers must modify other elements of the gas-cooled reactor. The fuel requires a heat-tolerant carbon coating, for example, and metal parts of the reactor are replaced with ceramics, Genoa says. Because gas-cooled reactors require these new technologies, the Nuclear Regulatory Commission estimates they won’t come on line until the mid-2020s.

Fast Reactors

Normal nuclear reactors use what are called moderators to slow down neutrons and control the chain reactions that happen during fission. That’s because the "fast neutrons" created when uranium splits are less likely to cause fission in the neighborhood—and keep the chain reaction going—than slightly slower neutrons are.

Fast reactors, though, are optimized for fast neutrons, which allows them to extract 60 times more energy from uranium than a typical light-water reactor can. That also means that fast reactors can digest the nuclear waste of other reactors, reducing the waste’s radiotoxicity while extracting energy in the process.

Fast reactors already in development include Argonne National Lab’s 175 MW reactor, Advanced Reactor Concepts’ sodium-cooled ARC-100, and the 25 MW Hyperion Power Module. But because uranium is still in abundant supply, and because fast reactors can be used to breed weapons-grade plutonium, these SMRs are not economical (or legal) at this point. _Popular Mechanics
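The NuScale sizing quoted in the excerpt above implies an average load of about 1 kW per home, which is on the same order as typical US residential consumption. A quick sketch of that arithmetic, using only the figures from the excerpt:

```python
# What the "45 MW serves 45,000 homes" figure implies per household.
smr_mw = 45           # NuScale module output, MW (from the article)
homes = 45_000        # homes served (from the article)
HOURS_PER_YEAR = 8760

avg_kw_per_home = smr_mw * 1000 / homes          # MW -> kW, divided out
annual_kwh_per_home = avg_kw_per_home * HOURS_PER_YEAR
print(avg_kw_per_home)        # 1.0 kW average load per home
print(annual_kwh_per_home)    # 8760.0 kWh per home per year
```

That 8,760 kWh/year figure is a bit below average US household usage, so the "45,000 homes" claim is a reasonable round-number approximation rather than a generous one.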
The gas cooled reactors and the fast reactors are the more advanced types of reactors -- and are thus less likely to receive significant federal support.
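The efficiency advantage of the gas-cooled designs follows directly from the Carnot limit, eta = 1 - T_cold/T_hot (temperatures in kelvin). The sketch below assumes a typical ~320 °C light-water outlet temperature and a ~30 °C heat sink; only the 1000 °C gas-cooled figure comes from the excerpt above.

```python
# Carnot upper bound on thermal efficiency: eta = 1 - T_cold / T_hot.
# Shows why a hotter coolant outlet raises achievable efficiency.
def carnot(t_hot_c, t_cold_c):
    """Ideal efficiency for hot/cold temperatures given in Celsius."""
    return 1 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

lwr_limit = carnot(320, 30)    # assumed ~320 C light-water outlet
gcr_limit = carnot(1000, 30)   # 1000 C gas-cooled outlet (from the article)
print(f"Light-water Carnot limit: {lwr_limit:.0%}")   # ~49%
print(f"Gas-cooled Carnot limit:  {gcr_limit:.0%}")   # ~76%
```

Real steam and gas cycles reach only a fraction of these ideal limits, which is consistent with the 34% and 45-47% practical efficiencies quoted above.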

Light water reactors will probably receive the bulk of any of the promised funds which are actually delivered. Any companies working on SMRs which also have close ties to the Obama campaign will be first in line for any disbursed funding.

As Brian Westenhaus explained in a recent article, this promised funding to come from the DOE is no indication that a different federal agency -- the NRC -- will actually work in good faith to get the SMR designs licensed and the actual facilities built.

In fact, this entire episode looks more like a campaign promise and brush-off than a genuine effort to promote a critically important set of technologies. In Obama's mind, if he has said something in a speech, then he deserves credit from voters for having already done the thing in reality.

Most perceptive people understand that although Mr. Obama talks a lot, he does very little, except to promote himself and his pet causes -- which have nothing to do with reliable energy or with a smoothly functioning and prosperous private sector economy. In other words, the smart bet is that very little of worth will come from Obama's promises for funding of small modular reactors. Perhaps one or two of the dozen or so companies working on SMR projects will ever receive any funds -- the company or companies best placed in the ranks of political supporters for the president, perhaps.

Which means that if this critical work is ever to get done, it will have to get done despite false promises from this president, and despite the energy obstructionism of Obama's Nuclear Regulatory Commission and the dozens of other bureaucracies of mediocrity that have grown and proliferated under this pro-big-government, anti-private-sector president.


A Contrarian Take on Peak Oil Doom and Peak Everything Doom

In 1971, the Limits to Growth team forecast that the world’s oil supply would run out 10 years from today. And yet according to renowned oil analyst Daniel Yergin, technology advances and new discoveries have allowed oil reserves worldwide to keep growing. For every barrel of oil produced in the world from 2007 to 2009, 1.6 barrels of new reserves were added. The World Energy Council reports that global proven recoverable reserves of natural gas liquids and crude oil amounted to 1.2 trillion barrels in 2010. That’s enough to last another 38 years at current usage. Add in shale oil, and that’s an additional 4.8 trillion barrels, or a century and a half’s worth of supply at present usage rates. Tar sands, including some huge Canadian deposits, add perhaps 6 trillion barrels more.

We’re awash in more than oil. One British study from the 1930s predicted an acute global shortage of copper “within a generation.” Not so much. The U.S. Geological Survey estimates global land-based copper resources to be 3 billion tons or more—the equivalent of 185 years at current production. That’s almost double the estimate of resources from 11 years ago, which means the number may have further to climb. And when we do finally run out of land-based supplies, there are still the undersea sources to use up.

The long-term picture for phosphate, vital for fertilizer production, is also reassuring, despite a price spike in 2008: Estimated global phosphate reserves climbed from 11 billion tons in 1995 to 65 billion tons in 2010—equal to 369 years of current production. The list goes on: Current resource estimates suggest it will take 347 years to run out of helium, 890 for beryllium, centuries for chromium, more than a millennium for lithium and strontium. And for those Americans worried about the price of makeup, resources of talc in the U.S. alone are enough to provide more than 1,000 years of supplies at current rates of domestic production.

...There are still plenty of good reasons to conserve the world’s mineral resources—just as there are very good reasons to avoid another war in the Middle East. But fear that the resources will run out isn’t one of them. _BW
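The reserves-to-production arithmetic behind the "38 years" and "century and a half" figures quoted above is easy to reproduce. The consumption figure below is an assumed ~86.5 million barrels per day of world usage — roughly what the quoted numbers imply, but not stated in the excerpt:

```python
# Reserves-to-production (R/P) arithmetic behind the quoted figures.
reserves_bbl = 1.2e12    # proven crude + NGL reserves, 2010 (from the quote)
shale_bbl = 4.8e12       # estimated shale oil (from the quote)
usage_bpd = 86.5e6       # assumed world consumption, barrels/day

annual_usage = usage_bpd * 365
years_conventional = reserves_bbl / annual_usage
years_shale = shale_bbl / annual_usage
print(f"Conventional: {years_conventional:.0f} years")  # ~38
print(f"Shale oil:    {years_shale:.0f} years")         # ~152
```

Note that R/P ratios are static snapshots: as the excerpt itself points out, reserves have historically grown faster than they have been drawn down.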
While Al Fin energy analysts do not consider themselves cornucopians, neither do they consider themselves doomers. They understand that the true endowment of energy and minerals inside the Earth is certain to be far higher than the most optimistic estimates, but these resources will not necessarily be easy to extract -- particularly under the types of lefty-Luddite green dieoff.orgy political leadership that more and more populations are choosing these days.

If your governments are sabotaging your society's best efforts to create an abundant future, you need to stop electing that kind of government! If you can't learn to stop hitting yourself on the head with a hammer, you probably have bigger problems than "peak oil doom" or "carbon hysteria." Think about it.


Saturday, January 28, 2012

Are Arguments for Peak Oil Doom Scraping the Bottom of the Barrel?

Are Arguments for Peak Oil Doom Scraping the Bottom of the Barrel? That's what CFR Fellow Michael Levi suggests in a recent blog post. Levi addresses a recent editorial in Nature claiming that world oil production has already peaked, and that peak oil is upon us.

Levi demonstrates outright mistakes, mischaracterisations, and shifty reasoning on the part of the peak oilers, and says that the Nature piece is a good example of How Not to Argue that We're Running Out of Oil. One thing that makes Levi's piece interesting is that he himself is a true believer in the carbon hysteria orthodoxy, and agrees with many of the solutions proposed by the authors of the very piece he has set about contradicting. Keep in mind that science is about ideas that can be falsified. Everything else is opinion and speculation.

Another interesting response to the Nature editorial comes from North Dakota:
World production of oil peaked in 2008, argue James Murray of University of Washington and David King of Oxford University, in the Nature commentary. It has flattened or declined ever since, as North Dakota and Canadian crude fail to offset production drops elsewhere.

And “if oil production can’t grow, the implication is that the economy can’t grow either,” Murray and King write.

“This is such a frightening prospect that many have simply avoided considering it.”

But, for one thing, the claim that world production peaked in 2008 is arguable. “Oil company BP found in its most recent analysis that oil production was actually more than 82 million barrels per day in 2010, higher than the proposed plateau of 75 million,” a Scientific American analysis reports.

Furthermore, “adjusted for inflation, today’s $100 per barrel is roughly equivalent to prices in 1981,” the Scientific American story continues.

And “in the past 20 years, enough oil has been found to satisfy the demands of two new consumers — China and India — nations that now import more oil than is consumed by Germany and Japan.”

Given that gigantic surge, the fact that inflation-adjusted oil prices match those of 1981 is the market’s way of saying, “Don’t worry.” Because supply is keeping up perfectly well with demand.

And there’s another reason to believe Murray and King are too pessimistic. It’s clear during even a casual visit to Williston, N.D.

As North Dakotans know, Williston and the rest of western North Dakota have been utterly transformed. This transformation is almost entirely due to fracking.

And fracking, of course, is just the latest in a series of innovations that energy explorers have unleashed. These innovations have proven skeptics wrong time and again, from the whale-oil suppliers of 1859 (who mocked the “rock oil” drillers of Pennsylvania), to the 1970s claims of resource depletion and global collapse, to today.

Has that innovating stopped? Where fossil fuels are concerned, have all of the technological breakthroughs been found?

Of course not. As the Bakken boom shows, the greatest resource of all is the one that is in truly inexhaustible supply. The late economist Julian Simon identified it a generation ago: It’s the innovative power of the human mind. _Tom Dennis: There Will Be Oil
Tom Dennis is arguing from the facts on the ground. And he is also pointing out factual errors in the Nature editorial itself, just as Levi does in the CFR blog piece.

What seems clear is that a lot of academics, politicians, intellectuals, assorted opportunists, and media personalities simply hate oil -- or are willing to pretend to in order to make a buck. They are also willing to bend the facts, or make up entirely new facts out of thin air, to support their desired narrative.

Such flimsy arguments have their parallel within the vastly larger and more powerful carbon hysteria orthodoxy, led by the IPCC and abetted by politicians throughout the developed world. But since Climategate, the carbon hysteria orthodoxy has been coming under attack from more legitimate thinkers and scientists.

The same situation is likely to arise if the "peak oil doom" establishment ever grows out of its clownish past and present.


Friday, January 27, 2012

How Obama Could Help US Energy: Get Government Out of the Way!

Without his nose growing visibly, the President claimed the government was behind the technological advances that led to the current shale gas boom, and even suggested that he might take credit for the rise in domestic oil production. In fact, Mr. Obama's administration has hampered and castigated oil companies at every turn. In the light of the hysterical grandstanding over the BP Gulf spill (whose impact proved to be greatly exaggerated), it was ironic indeed to hear the President now declare a great opening up of offshore exploration.

The industry has responded to attacks by becoming more innovative and productive. According to the U.S. Energy Information Administration, between 2007 and 2010, U.S. oil production grew from 5.1 million barrels a day (mbd) to 5.5 mbd. The agency predicts domestic production will hit 6.7 mbd by 2020, helping take imports down to 36% of domestic usage in 2035 from 60% in 2005. So much for peak oil. Meanwhile, the EIA also predicts that by 2016, thanks to the shale boom, the U.S. will be a natural gas exporter. _NatPost
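The EIA figures quoted above imply modest, sustained growth rather than any dramatic ramp. A quick check of the implied compound annual growth rates, using only the production numbers from the excerpt:

```python
# Compound annual growth rates implied by the quoted EIA figures.
p2007, p2010, p2020 = 5.1, 5.5, 6.7   # US oil output, million barrels/day

cagr_past = (p2010 / p2007) ** (1 / 3) - 1      # actual, 2007-2010
cagr_needed = (p2020 / p2010) ** (1 / 10) - 1   # required, 2010-2020
print(f"2007-2010 actual growth:   {cagr_past:.1%} per year")    # ~2.5%
print(f"2010-2020 required growth: {cagr_needed:.1%} per year")  # ~2.0%
```

In other words, the EIA projection only requires the industry to sustain a slightly slower pace than it already achieved from 2007 to 2010.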
In other words, the US oil & gas sector has grown and prospered despite Obama's agenda of energy starvation. Imagine how much healthier US energy and US industry would be without vicious governmental harassment and regulatory handicapping.

After taking credit for prosperity that has occurred despite everything he could do to shut it down, Mr. Obama goes on to promote the green energy scams which are helping to kill Europe, and which will certainly destroy any economy foolish enough to depend upon them:
One wonders if the President has the slightest clue about the flagging state of the wind and solar industries in Germany, or that what is boosting China's alternatives industry is government subsidies ... from other countries.

The President announced a plan to devote huge swathes of public land to the development of clean energy to power "three million homes." He also apparently committed the Navy to buying a chunk of this power, as if it weren't expensive enough to guard the Strait of Hormuz.

Mercantilist alternative energy strategies represent - as Jimmy Carter famously suggested - the "moral equivalent of war." The problem is that it is war on one's own economy. At least, with his partial ceasefire against the oil industry, President Obama is now only shooting himself in one policy foot rather than both. _NatPost
Obama says he is promoting clean oil technologies, and takes credit for the economic success of technologies which he has tried to kill, but more intelligent people can see through his endless crap. Obama's ongoing (although publicly undeclared) war against coal, oil sands, oil shale kerogens, shale oil & gas, advanced nuclear power, arctic oil, offshore oil, etc. etc. amounts to a total policy of energy starvation -- relentlessly pursued by the EPA, NRC, Interior Department, and a score of other agencies and politically controlled bureaucratic entities.

Warren Meyer: Obama Deserves No Credit for US Oil & Gas Boom

Master Resource: Did the Government Invent the Shale Gas Revolution?

US President Obama Misrepresents His Own Record on Oil & Gas in Televised Speech


Thursday, January 26, 2012

Advanced Biomass Feedstocks for High Value Chemicals & Fuels

Energy analysts cannot allow themselves to get stuck on one form of energy or fuel. In fact, the best energy analysts familiarise themselves with parallel industries and processes which overlap with or complement the energy and fuel processes on which they most closely focus.

So while cheap natural gas has temporarily postponed the economical development of many types of advanced biofuels and renewable chemical feedstocks, in the long run natural gas will be used to facilitate the production of chemicals and fuels from renewable biomass feedstocks.
Researchers at the University of Wisconsin led by Dr. Jim Dumesic report the conversion of the hemicellulose fraction of lignocellulosic biomass to furfural and levulinic acid using biphasic reactors with alkylphenol solvents in a new paper in the journal ChemSusChem. The furfural and levulinic acid products are valuable compounds for a variety of chemical applications, and they serve as precursors for the synthesis of liquid transportation fuels.
The conversion of lignocellulosic biomass into fuels and chemicals requires effective utilization of the C5 and C6 sugars present in hemicellulose and cellulose, respectively, by either processing these fractions together or separating and processing them separately. While simultaneous processing, such as in gasification or pyrolysis, offers the potential for simplicity of operation, the fractionation of hemicellulose and cellulose allows the processing of each fraction to be tailored to take advantage of the different chemical and physical properties of these fractions, and provides increased flexibility of operation.

For example, chemical processing methods can be employed to convert the C5 sugars in hemicellulose into fuels/chemicals, while recent advances in biological conversions allow the C6 sugars in cellulose to be converted into fuels and/or chemicals. One can also take advantage of the physical properties of cellulose for pulp and paper applications.

Herein, we show that the hemicellulose fraction of lignocellulosic biomass can be converted into furfural [FuAL] and levulinic acid [LA] by using biphasic reactors with alkylphenol solvents that selectively partition furanic compounds from acidic aqueous solutions. These furfural and levulinic acid products are valuable compounds for a variety of chemical applications, and they serve as precursors for the synthesis of liquid transportation fuels.

—Gürbüz et al.

...The basic steps of the process include:

1. Solid biomass is subjected to mild pretreatment in a dilute-acid, aqueous solution to solubilize the hemicellulose as xylose.

2. After filtering the solution from the solid cellulose and lignin, an organic solvent is added to the aqueous solution, and these liquids are heated in a biphasic reactor to achieve dehydration of xylose to FuAL.

3. FuAL can be distilled from the solvent and sold as a chemical, or converted to LA by first hydrogenating FuAL to furfuryl alcohol (FuOH) over a metal-based catalyst (e.g., copper) and then reacting the FuOH with water in a biphasic reactor to form LA.

4. Similar to FuAL, the LA product can be distilled from the organic solvent and sold as a chemical.

In the paper, they demonstrated three organic solvents—2-sec-butylphenol (SBP), 4-n-hexylphenol (NHP) and 4-propyl guaiacol (PG)—to be effective extracting agents for the production of FuAL and LA in these biphasic systems. These solvents (i) have high partition coefficients for extraction of FuAL, FuOH, and LA; (ii) do not extract significant amounts of mineral acids from aqueous solutions; (iii) have higher boiling points than the final product; and (iv) could potentially be synthesized directly from biomass (i.e., lignin), such that these solvents would not have to be transported to the site of the biomass conversion steps. _GCC
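The value of a high partition coefficient in these biphasic systems can be illustrated with the standard single-stage liquid-liquid extraction formula: the fraction of solute transferred to the organic phase is K·R / (1 + K·R), where K is the partition coefficient (C_org / C_aq at equilibrium) and R is the organic-to-aqueous volume ratio. The K values below are illustrative assumptions, not measurements from the Gürbüz et al. paper:

```python
# Single-stage liquid-liquid extraction: fraction of solute pulled
# into the organic phase for partition coefficient K = C_org / C_aq.
# K values and the 1:1 phase ratio are illustrative assumptions only.
def fraction_extracted(k, v_org_over_v_aq=1.0):
    """Equilibrium fraction of solute in the organic phase."""
    x = k * v_org_over_v_aq
    return x / (1 + x)

for k in (1, 5, 20):   # hypothetical partition coefficients
    print(f"K={k:2d}: {fraction_extracted(k):.0%} extracted")
# K= 1: 50% extracted
# K= 5: 83% extracted
# K=20: 95% extracted
```

This is why solvents with high partition coefficients for FuAL, FuOH, and LA matter: a single pass through the biphasic reactor already captures most of the product, reducing downstream separation costs.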
It will take time for biomass feedstocks to find their niches in the larger scheme of things. In the near future, cheap natural gas will overwhelm many markets and postpone many alternative fuels projects.

But as more and more uses are found for natural gas -- including lucrative gas to liquids (GTL) processes which are beginning to catch on -- methane is likely to become ubiquitous as a feedstock, reactant, and heat source in a wide range of new industrial processes, including the conversion of biomass to fuels and chemicals.

Labels: ,

Wednesday, January 25, 2012

China Has Abundant Shale Gas, But North America Has the Expertise

China in January approved shale gas as an independent mining resource, a legal status that may allow more Chinese firms to develop the unconventional energy source.

Foreign companies would not be able to participate in the tenders but could partner with the winning Chinese firms.

The world’s top energy user could hold shale gas reserves of around 1,275 tcf, according to the EIA, exceeding those of the United States (862 tcf). _FinancialPost
China is the world's largest energy consumer. While China is heavily dependent upon coal power plants and is working to build up its nuclear power plant infrastructure, developing its massive reserves of shale gas would be a huge economic and energy boon to China. Therefore, the Chinese government is beginning to change the rules dealing with shale gas.
Shale gas, or natural gas trapped inside deposits of shale rock, has been a game changer in the United States, where its commercial-scale production has reduced the country's dependence on imported gas and lowered its energy expenses. China, which sits on even bigger shale gas reserves but has yet to tap them, now also hopes to give that fuel a role in its energy mix.

To do so, Beijing on Dec. 31 approved shale gas to be an independent mining resource, a step that opens up its exploration to more participants. The sector previously belonged only to government-controlled companies, but now also welcomes Chinese private firms.

This policy shift will create competition and therefore boost China's shale gas development. It may also help China reduce its greenhouse gas emissions, because burning gas produces about half as much carbon dioxide as the coal China depends upon for its primary energy source.

Although foreign companies are still not allowed to participate independently, they can gain more partnership opportunities as more Chinese players enter this business and need foreign expertise in extracting the fuel from hard-to-access deposits locked in shale rock.

Besides encouraging more companies to produce shale gas, there is also a new incentive for companies to produce more of it. The Chinese government last month began reforming its pricing mechanism in two pilot provinces, for the first time allowing the market to decide wholesale prices for unconventional gas, including shale.

Unlike the government-controlled pricing mechanism, which made producing shale gas unprofitable, this new scheme will raise prices energy companies can charge for their output in China's fast-growing natural gas market, with annual consumption set to triple during the next decade. _eenews
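The "about half as much" claim in the excerpt can be roughly checked against typical published combustion emission factors. The figures below are approximate round numbers per unit of heat, not official statistics.

```python
# Rough check of the "gas emits about half the CO2 of coal" claim,
# using typical published combustion emission factors. Approximate.

ef_gas  = 53.1  # kg CO2 per MMBtu, natural gas
ef_coal = 95.0  # kg CO2 per MMBtu, bituminous coal (approximate)

ratio = ef_gas / ef_coal
print(f"Gas emits roughly {ratio:.0%} of coal's CO2 per unit of heat")
```

Per kWh of electricity the gap is wider still, since combined-cycle gas plants convert heat to power more efficiently than typical coal plants.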
As shale gas production continues to gear up across North America, and begins to pick up in China and other energy import markets, the impact of this huge new energy bonanza cannot easily be overstated.

China's markets will require significant liberalisation in order to make best use of resources. Such market liberalisation is apt to be applied in very uneven fashion, due to the age-old Chinese fear of disorder and collapse of authority.

Labels: ,

Tuesday, January 24, 2012

Carbon Sciences Inc. Aims for $150 Billion / Year Market

Carbon Sciences Inc. is probably best known for its fledgling technology to "dry reform" methane into liquid fuels. But it will take some time to perfect that technology and scale it up. What can Carbon Sciences do in the meantime, in terms of maintaining a healthy cash flow? The company is aiming for the lucrative $150 billion/yr market in the steam reforming of methane to produce syngas -- primarily CO and H2. Carbon Sciences believes that its proprietary catalysts provide it with an advantage over competing purveyors of catalytic systems for steam reforming methane. Here is the basic schema for H2 production via steam reforming of methane.
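A minimal sketch of that schema, assuming the standard reforming and water-gas-shift reactions (idealized stoichiometry only; real plants operate below theoretical yield):

```python
# Basic H2-from-methane schema: steam methane reforming followed by
# the water-gas shift. Idealized stoichiometry, approximate molar masses.

M_CH4, M_H2 = 16.04, 2.016  # g/mol

# Reforming:        CH4 + H2O   -> CO  + 3 H2   (strongly endothermic)
# Water-gas shift:  CO  + H2O   -> CO2 + H2
# Net:              CH4 + 2 H2O -> CO2 + 4 H2
h2_per_kg_ch4 = 4 * M_H2 / M_CH4

print(f"Theoretical H2 yield: {h2_per_kg_ch4:.2f} kg per kg methane")
```

The endothermic reforming step is why these plants burn additional fuel for process heat, and why catalyst performance matters so much to plant economics.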
Carbon Sciences Inc. (CABN), the developer of a breakthrough technology to make transportation fuels and other valuable products from natural gas, today announced that it will add to its core technology by accelerating the development of a steam reforming version of its proprietary catalyst for use by existing synthetic gas (syngas) plants. There are more than 2,000 plants worldwide that use steam reforming of natural gas to make syngas for the production of large volume chemicals such as hydrogen, methanol, ammonia, solvents and detergent alcohols.

Byron Elton, CEO of Carbon Sciences, commented, "We are accelerating our development efforts to adapt our proprietary catalyst to meet the needs of this valuable market. By the end of 2012, we plan to demonstrate that our catalyst will deliver more output at a lower cost and will be an attractive drop-in replacement for existing steam reforming plants. The financial rewards are enormous. The current global hydrogen market exceeds $150 billion/year with methanol at more than $20 billion annually. The 2,000 existing steam reforming plants in the world usually replace their catalysts every 3-5 years. Many of these catalyst replacements cost as much as $5-$10MM." _MarketWatch
Carbon Sciences is not giving up on its primary long-term market -- the methane to liquid fuels market. Nor is the company giving up on its most prized long-term technology -- the dry reforming of methane, using CO2 instead of steam. But market realities have to dictate short and intermediate term actions, and the hydrogen (and methanol) markets are ready in the short and intermediate term, with little additional capital investment required by Carbon Sciences.
Uses of Hydrogen

There is a large global market for hydrogen in the production of distilled petroleum products, ammonia, methanol, and more. Methanol itself, produced from steam-reformed methane, is a lucrative worldwide commodity used in a wide range of applications.
Another Use for Steam Reformed Methane
Carbon Sciences specialises in catalysts, and hopes to sell its catalytic system for steam reforming to as many distinct industries as possible that use the process (for making H2, CH3OH, syngas, etc.). That is how startups often succeed -- by utilising pre-existing niches for a sub-part of their overall process. Cash flow is cash flow, as they say. Sometimes startups learn that the short-term markets-of-convenience work out better in the long run than their initial long-term goals.

In a rapidly changing environment such as today's energy and petrochemicals markets -- where everyone is looking for more economical substitutes for crude oil -- smart entrepreneurs will learn to think on their feet.


Monday, January 23, 2012

Biogas Methane: Harvesting Renewable Hydrocarbons

Methane can be readily produced by humans in much the same way it has been produced by under-the-seafloor micro-organisms for eons of geologic time. By using methanogenic microbes in anaerobic digesters, humans can convert a large outflow of garbage and waste into a renewable hydrocarbon product (plus heat) which can help to heat and power residences, farms, businesses, and societies.
Methane within biogas can be concentrated via a biogas upgrader to the same standards as fossil natural gas (which itself has had to go through a cleaning process), and becomes biomethane. If the local gas network allows for this, the producer of the biogas may utilize the local gas distribution networks. Gas must be very clean to reach pipeline quality, and must be of the correct composition for the local distribution network to accept. Carbon dioxide, water, hydrogen sulfide and particulates must be removed if present. If concentrated and compressed it can also be used in vehicle transportation. Compressed biogas is becoming widely used in Sweden, Switzerland, and Germany. A biogas-powered train has been in service in Sweden since 2005. _Noenigma
Descriptions of single-stage and multi-stage anaerobic digestors

Big plans in the UK for the integral use of anaerobic digestion in the complete food processing cycle

Economical new approach for upgrading biogas to pure methane for integration into municipal natural gas distribution networks

Uses of the residual digestate left over from the anaerobic digestion process

Labels: ,

Sunday, January 22, 2012

Super Frackin' Gasolicious Extra Oiliosis!

...oil services companies including Baker Hughes (BHI) and Schlumberger (SLB) are continuing their quest to devise ways to create longer, deeper cracks in the earth to release more oil and gas. These companies are no longer content to frack—they want to super frack.

High crude prices and newly accessible oil and gas embedded in shale rock in North America are driving the wave of innovation. The more thoroughly that petroleum-saturated rock is cracked, the more oil and gas is freed to flow from each well, raising the efficiency—and profit—of the expensive process. For example, the growing use of movable sleeves, a tubelike device with holes that fits inside a well bore, lets drillers target multiple spots to dislodge entrapped oil. This technique can reduce the $2.5 million startup cost of a fracking well near the Canadian border by up to two-thirds, according to a recent analysis by JPMorgan Chase (JPM). Multiply such savings by hundreds of wells added in that area each year, and you start to understand why the industry is so eager to hone the process. “I want to crack the rock across as much of the reservoir as I can,” says David A. Pursell, a former fracking engineer who’s now an analyst at Tudor Pickering Holt in Houston. “That’s the Holy Grail.” _BW
The combined technologies of horizontal drilling and advanced fracking have changed the global energy balance. If you have not noticed any difference yet, it is only because it takes time for some revolutions to set in and shift the action schemes and frames that make up the foundations of our world.

Of course the dieoff.orgiasts of the leftist Luddite Green coalition are not going to take all of this lying down. If they cannot come up with rational reasons why these technologies should not be used, they will take the underhanded pathway of government edict. While Obama is king, nothing must stand in the way of energy starvation and the economic crippling of the foci of Mr. Obama's grand strategy of adjusting the scales. Hence, Mr. Obama's EPA is working diligently to find ways to hamper fracking and shale gas & oil production -- perhaps the only bright economic spot in Mr. Obama's entire presidency.

Meanwhile, fracking -- and now super-fracking -- is likely to proceed with all due haste.

Three Faces of Super-Fracking
“Super fracking,” as it’s becoming known, is based on three basic improvements. The first is Schlumberger’s “HIWAY” idea, an innovation in the material forced into the rock. (The linked page has a good animation to explain the process in detail.) The new idea is to add fibers to the mix of hard small grains used to hold open the cracks. The fiber is seen as a major production improver. Much more flow for a longer period is the result.

...The second idea, called “RapidFrac,” comes from Halliburton: a set of highly developed specialized pipe fittings that go into a newly drilled hole. (This page also has a high quality animated video, though quite a large file.) Much like valves, these sections of the pipe when activated open passages to the rock.

...The third idea, from Baker Hughes, is disintegrating frack balls (no company info yet). This removes the need to have a drilling rig return to the well and spend several days drilling and fishing out the perhaps as many as 20 or even 30 balls dropped in to do the frack in stages. _BrianWestenhaus
Halliburton (via NewEnergyandFuel)

Schlumberger (via NewEnergyandFuel)

Baker-Hughes PDF (via New Energy and Fuel)

Brian Wang has more

Labels: , ,

An Estimated 3 - 5 Trillion Barrels of Oil Equivalent in the Continental US -- Not Counting the 3 Trillion BOE in Oil Shales

It is generally best to suppress our wilder instincts toward optimism, so as to prevent painful disappointments in the future. On the other hand, if we are not careful we are likely to vastly underestimate our possibilities, and live much smaller lives than necessary. There are times when we need to go crazy-optimistic, in order to try to define the upper bounds of what is possible.

For oil & gas reserves in the continental US, new production capacities have already proven a lot of doomers wrong. I've got a feeling that -- if Americans can get rid of their current destructive leadership of energy starvation -- US hydrocarbon production has only just gotten started.

California's Monterey Shale is estimated to contain 500 billion barrels of oil equivalent in place. Estimates for North Dakota's oil shales fall into a similar range (PDF) for oil equivalent in place. But the continental US is underlain with hydrocarbon-bearing shales at various depths and ages. Add the estimates all together, and you might just reach the trillions of barrels, in oil equivalent. Current estimates of economical production below 50 billion barrels are almost certain to be proved wrong, in time.
My estimate of oil in place in the continental US is from about 3 trillion to 5 trillion barrels of oil not including the 3 trillion barrels of oil shale. See this shale play website for a partial list of Shale oil plays and basins in the US. I know this seems very high, but it was only a few short years ago that we were going to need to import huge amounts of liquefied natural gas to meet our demand for natural gas, and now we have a glut of natural gas in the market place because of all the shale natural gas.

We should be able to produce at least 150 billion barrels of oil to maybe 1.0 trillion barrels of oil if the majority of these plays can be water flooded and CO2 injected as in the Canadian Bakken. I used 5% for the low estimate of 3 trillion barrels and 20% for the high estimate of 5 trillion barrels, figuring they could do some water flood and CO2 tertiary treatment to a large part of this land. For this oil to be recovered, it will require that the oil price stays above $70 a barrel so the economics are in place to fully develop these areas. _SeekingAlpha
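The recovery arithmetic quoted above is simple enough to check directly:

```python
# Quick check of the quoted recovery estimates:
# 5% of the low 3-trillion-barrel figure, 20% of the high 5-trillion figure.

oil_in_place_low  = 3e12   # barrels in place, low estimate
oil_in_place_high = 5e12   # barrels in place, high estimate

recoverable_low  = 0.05 * oil_in_place_low    # ~150 billion barrels
recoverable_high = 0.20 * oil_in_place_high   # ~1.0 trillion barrels

print(f"Low case:  {recoverable_low / 1e9:.0f} billion barrels")
print(f"High case: {recoverable_high / 1e12:.1f} trillion barrels")
```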
The truth is, there will be no need to develop even a fraction of these many barrels of oil equivalent in place, if the energy starvationists would simply get out of the way, and allow human ingenuity to devise safer and more advanced means of utilising non-combustion, nuclear energy technologies. For that to happen, the merry band of energy starvationists in the US White House will require jettisoning, in favour of a more rational group of government administrators.

H/T Brian Wang

Labels: ,

A Basic Understanding of Oil

The creation of oil, gas, coal, and kerogen is an ancient process, which has taken place over the eons ever since photosynthetic life first occurred in the oceans and seas. For example, did you know that the Alberta oil sands area was once part of a prehistoric sea?
Alberta's oilsands are in an area that was once part of a prehistoric sea and have yielded several important marine reptile fossils. _CBC.ca

Oil creation is a renewable process, but over quite a long time span. Gas is made more quickly and more ubiquitously under the seabed than oil, and is becoming so cheap and common as to be thought of as a nuisance in many locations.

But it is crude oil about which such a fuss has been made for the past 100 years or so. And a well educated person should know more about crude oil than he is likely to find in the media or on the doomer sites. This embedded book by oil insider Leonardo Maugeri is likely to fill a lot of holes in the oil education of most ordinary people.
"The Age of Oil" by Leonardo Maugeri is a basic-level primer on the various facets of the modern petroleum age, from past, to present, and to future. It is best to start with basic history and basic supportable facts. Then, if you wish to go out on a limb, at least you will have a solid foundation from where to start.

Where oil comes from, and a hint of where new oil may be found
Looking at changes in atmospheric concentrations of O2 and CO2 over time is another way of noting the underlying biological processes involved in making the plants and microbes that go into making fossil fuels.

Oil shale sediments were deposited on large lake beds in the US western states:
Lacustrine sediments of the Green River Formation were deposited in two large lakes that occupied 65,000 km2 in several sedimentary-structural basins in Colorado, Wyoming, and Utah during early through middle Eocene time....The warm alkaline lake waters of the Eocene Green River lakes provided excellent conditions for the abundant growth of blue-green algae (cyanobacteria) that are thought to be the major precursor of the organic matter in the oil shale. _geology.com
How old is the oldest oil? No one knows, since it hasn't yet been found. But some oil has reportedly been found in rock that was billions of years old. Photosynthetic life has been around almost 3 billion years, so that provides for a lot of oil creation in deep rock layers.
Geologists usually don't bother looking for oil in very ancient (Precambrian) rocks for two reasons:

Conventional wisdom insists that oil is derived almost exclusively from organic matter, and additional conventional wisdom assures us that life was exceedingly scarce on earth billions of years ago.

Any oil that was created billions of years ago would have surely been destroyed by intense pressures and high temperatures over the eons.

Yet, Precambrian oil in commercial quantities has been found in formations up to 2 billion years old (in Siberia, Australia, Michigan, for example). While some of this oil might have migrated into the Precambrian rocks from younger source rocks, some of it does seem indigenous and, therefore, ancient.

...Now, three Australian scientists (R. Buick, B. Rasmussen, B. Krapez) have discovered tiny nodules of bitumen (lumps of hydrocarbons) in sedimentary rocks up to 3.5 billion years old in Africa and Australia. These bitumen nodules were formed when natural hydrocarbons were irradiated by radioactive isotopes that coexisted in the ancient rocks. Furthermore, these African and Australian rock formations were never severely deformed or subjected to high temperatures. The possibility exists, therefore, that some of the earth's oldest rocks may contain substantial oil reserves. So far, no one has seriously looked for oil in Precambrian rocks because of the two preconceptions noted above. _Science-Frontiers
The planet has gone through a large number of cycles over the past few billion years. Unless you can go back through time and trace the large numbers of optimal areas for oil, gas, coal, kerogen, and bitumen formation which have come and gone, come and gone, come and gone -- and been hopelessly changed and disguised by ongoing geologic processes -- you may be easily persuaded that almost all the fossil fuels have already been found.

The "abiotic oil" concept is not discussed here because the concepts behind biotic oil are difficult enough for most people to understand. And most hydrocarbons produced in the mantle by abiotic processes are shorter chain hydrocarbons, as you might find in "wet gas." Biotic and abiotic hydrocarbons tend to mix in the crust and follow much the same routes of migration upward in many cases. But if you want a good example of quick renewable hydrocarbons, the abiotic variety might qualify.

Labels: ,

Saturday, January 21, 2012

Hot Plasma Makes Quick Work of Garbage -- And Generates Power Too!

From the highway, one of the biggest landfills in the US doesn’t look at all like a dump. It’s more like a misplaced mesa. Only when you drive closer to the center of operations at the 700-acre Columbia Ridge Landfill in Arlington, Oregon, does the function of this place become clear. Some 35,000 tons of mostly household trash arrive here weekly by train from Seattle and by truck from Portland....

On the southwest side of the landfill, bus-sized containers of gas connect to ribbons of piping, which run into a building that looks like an airplane hangar with a loading dock. Here, dump trucks also offload refuse. This trash, however, is destined for a special kind of treatment—one that could redefine how we think about trash.... _Wired
Here’s how it works: The household waste delivered into this hangar will get shredded, then travel via conveyer to the top of a large tank. From there it falls into a furnace that’s heated to 1,500 degrees Fahrenheit and mixes with oxygen and steam. The resulting chemical reaction vaporizes 75 to 85 percent of the waste, transforming it into a blend of gases known as syngas (so called because they can be used to create synthetic natural gas). The syngas is piped out of the system and segregated. The remaining substances, still chemically intact, descend into a second vessel that’s roughly the size of a Volkswagen Beetle.

This cauldron makes the one above sound lukewarm by comparison. Inside, two electrodes aimed toward the middle of the vessel create an electric arc that, at 18,000 degrees, is almost as hot as lightning. This intense, sustained energy becomes so hot that it transforms materials into their constituent atomic elements. The reactions take place at more than 2,700 degrees, which means this isn’t incineration—this is emission-free molecular deconstruction. (The small amount of waste material that survives falls to the bottom of the chamber, where it’s trapped in molten glass that later hardens into inert blocks.)

The seemingly sci-fi transformation occurs because the trash is blasted apart by plasma—the forgotten-stepsister state of matter. Plasma is like gas in that you can’t grip or pour it. But because extreme heat ionizes some atoms (adding or subtracting electrons), causing conductivity, it behaves in ways that are distinct from gas. _Wired

Curiously, in Cleveland, Ohio, bug-witted politician Dennis Kucinich is leading the rabble in protest against the building of a similar garbage gasification plant. Is it that people in Cleveland like trash too much to give it up, or is it that their trashy politicians enjoy stuffing the garbage down their throats too much to give it up?


Why Did Al Fin Change His Mind About Big Wind and Big Solar?

Al Fin has changed his mind on many things over the years. But this is one topic that you can check for yourselves. Do a topic search on "Wind Energy". You can trace the progression of Fin's attitude toward big wind and big solar from "very favourable" all the way to "very unfavourable." What happened? No money changed hands to prompt the transformation. It was merely a question of looking at verifiable facts over time, and being compelled logically to change.

But in the larger world, big wind and big solar still have their champions -- from the US White House to the big money green activist groups to the EU bureaucratic apparatus to big money investors who benefit from government subsidies and tax breaks, such as Warren Buffett.

Now we are being told that "renewable energy" has surpassed nuclear power in terms of "energy generation." Does this mean that big wind and big solar are delivering on their promises? Well, no, not really. Look at the chart below, which breaks down the categories of "renewable energy."
The recent reports that renewable energy has overtaken nuclear power as a source of primary energy for the nation have created the mistaken impression that all the windmill and solar panel construction is having a decisive impact. In fact, as the Energy Information Administration's December Monthly Report reveals, 80 percent of "renewable energy" is still supplied by hydroelectricity, wood and biofuels. Twelve percent comes from wind and 1.2 percent from solar. An additional 6 percent comes from burning waste - which not everyone regards as "renewable" - and 2.5 percent comes from geothermal energy.

...Under this set of definitions, the consumption of renewables actually exceeded nuclear power before 1987, until nuclear gained ground as more reactors were completed. Renewables declined after 2000 while nuclear continued to expand from improved performance by existing reactors, even though no new reactors have been built. The slight ascent in renewables over the last few years has come from the expansion of wood, biomass and wind. _RealClearEnergy
It is important to emphasise that no matter how much big wind "capacity" is built, that is not the same thing as power production. And just as important, one must point out that the wind tends not to blow at the time the power is needed. This is a fatal flaw in the big wind scheme. Not only must expensive backup power capacity be built and kept on constant standby to supply any wind deficits, but if wind power output should happen to be excessive in relation to demand, the utility must find a way to dump significant power. In the US Pacific northwest, federal judges have forced utilities to pay wind developers for power, even if the utility could not use it!
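The capacity-versus-production distinction above is simple arithmetic. The sketch below uses a hypothetical 1,000 MW wind farm and an assumed 30% capacity factor (a typical round figure for onshore wind, not a measured value):

```python
# Capacity vs. production, for a hypothetical 1,000 MW wind farm
# at an assumed 30% capacity factor.

nameplate_mw    = 1000.0   # hypothetical nameplate capacity
capacity_factor = 0.30     # assumed; typical round figure for onshore wind
hours_per_year  = 8760

actual_mwh    = nameplate_mw * capacity_factor * hours_per_year
nameplate_mwh = nameplate_mw * hours_per_year

print(f"Actual output:   {actual_mwh / 1e6:.2f} TWh/yr")
print(f"Nameplate basis: {nameplate_mwh / 1e6:.2f} TWh/yr")
```

And even that average figure says nothing about *when* the power arrives, which is the intermittency problem the paragraph above describes.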

This moment-to-moment unreliability of wind generation has been likened to "throwing a live grenade into the power grid control booth." Wind farms are also harmful to the health of people living nearby, and to birds and bats -- for what that is worth to your tender hearts.

Big solar power has many of the same problems, except it is even more expensive than wind. Mr. Obama invested billions of US taxpayer dollars into already failed or soon-to-fail big wind and big solar projects. But since the backers of these projects were political backers of Mr. Obama, a few $billion wasted here or there doesn't amount to much.

The Obama administration -- for reasons of its own -- is rather slow at catching on to the ruinous effects of the mad pursuit of big wind and big solar on the national power system and economy. But the governments of Spain and Japan have already been forced to drop most of their generous government subsidies for these green wastrels, out of a return to basic economic common sense.

Sharp contraction ahead for solar power industry

Vestas Wind Systems shares lost 92% of value since 2008

General problems with wind power ... articles from Master Resource blog

This slideshare presentation on wind power started the shift in Al Fin's opinion of big wind power

It should be noted that Al Fin took a sabbatical from his day job a number of years ago, in order to get involved in one of his greatest enthusiasms -- renewable energy, esp. wind and solar power. He expanded his knowledge of power systems, electrical engineering, and residential, commercial, and industrial electricity, in order to be able to participate in the installation of renewable power systems. He had a great deal of fun in the process, and felt he was accomplishing some good things.

He had to return to his day job, but he retained his warm feeling toward wind and solar power. Along the way, though, Fin learned that there is a tremendous difference in justifiability between a small, off-grid wind or solar installation and a giant wind farm or solar plant. With a small installation, one keeps a close watch on his power usage, power generation, and battery storage state. With large wind or solar installations, there is no way to control for intermittency, unreliability, the huge cost of standby power, and many other problems.

And thus was a mind changed. And a voice that had once promoted big renewables changed to one that criticises them quite harshly.

Minds that are incapable of changing are minds that have passed their due date -- regardless of the age of the individual. The most fruitful way of understanding a controversial field where the opponents are closely matched is to study the arguments of those who have changed their minds. Sometimes the arguments justify the change, and sometimes not, but they are typically informative and educational both for what they include and for what they leave out.

Trillions of dollars are on the line in connection with the catastrophic anthropogenic global warming argument. If you want to study a high stakes disagreement, that would be an excellent place to start. BTW, Al Fin changed his mind on that topic as well.

This article was first published on Al Fin blog


Childhood Leukemia in France and Proximity to Nuclear Power Plants

This PDF reprint of an article published in the International Journal of Cancer provides a glimpse into the epidemiological battlefield for those laymen who wish to grapple with the ideas directly.

Childhood leukemia (CL) is one of the tragic cancers of childhood. The cause of CL is not known, but CL has been studied in association with maternal age, birth weight, maternal alcohol intake, and most recently proximity to nuclear power plants (NPPs).

Most previous studies -- including studies by an author of the study -- did not find an association between NPP proximity and CL rates.
In the authors' previous multisite incidence studies [29, 30], no association between proximity to NPPs and AL was observed. This was in line with most multisite studies [1, 2, 8, 12], and is also in line with the results of the authors' incidence analysis over the whole period, 1990-2007.

...Overall, the estimated doses due to NPPs were very low compared to the doses due to natural radiation sources. Such doses are not expected to result in an observable excess risk on the basis of the available evidence [41]. _Study PDF
The authors introduced a new measure -- DBGZ (Dose Based Geographic Zoning) -- in an attempt to more precisely estimate the likely radioactive exposure of subjects due to the regulated release of radioactive materials from NPPs. Although the authors considered their new DBGZ metric a success, they failed to show any significant association between CL rates and the metric.

Let's go through the study in relation to Hill's Criteria of Causation:

1. Temporal Relationship:

Exposure always precedes the outcome. If factor "A" is believed to cause a disease, then it is clear that factor "A" must necessarily always precede the occurrence of the disease. This is the only absolutely essential criterion. This criterion negates the validity of all functional explanations used in the social sciences, including the functionalist explanations that dominated British social anthropology for so many years and the ecological functionalism that pervades much American cultural ecology.

AF: Since the authors were looking at childhood leukemias for children 5 years and under, it is assumed that the nuclear power plants existed before the children were conceived.

2. Strength:

This is defined by the size of the association as measured by appropriate statistical tests. The stronger the association, the more likely it is that the relation of "A" to "B" is causal. For example, the more highly correlated hypertension is with a high sodium diet, the stronger is the relation between sodium and hypertension. Similarly, the higher the correlation between patrilocal residence and the practice of male circumcision, the stronger is the relation between the two social practices.

AF: The authors calculated an "odds ratio" of 1.9 and a "standardised incidence ratio" of 1.9, comparing the exposed to the non-exposed. This would be roughly interpreted as nearly double the risk of CL for children living within 5 km of a NPP. This association was only found for the 2002 through 2007 time span -- not for the full 1990 - 2007 period of the study.

It should be noted for comparison that the "odds ratio" for cigarettes and lung cancer has been calculated as close to 9 (PDF) in some studies -- i.e., roughly a 9× greater risk of lung cancer for smokers compared with non-smokers.
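The arithmetic behind an odds ratio is simple enough to sketch. A minimal illustration, using hypothetical counts (not the study's actual case/control tallies) chosen only so the result lands near the reported 1.9:

```python
# Odds ratio from a 2x2 exposure/outcome table.
# All counts below are hypothetical, for illustration only.

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """OR = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

# Hypothetical: 14 cases and 74 controls within 5 km of a NPP,
# 2739 cases and 27390 controls beyond 5 km.
or_ = odds_ratio(14, 74, 2739, 27390)
print(round(or_, 2))  # ~1.89
```

With only 14 exposed cases, the confidence interval around such a ratio is wide -- which is why confirmation over other time periods matters.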

3. Dose-Response Relationship:

An increasing amount of exposure increases the risk. If a dose-response relationship is present, it is strong evidence for a causal relationship. However, as with specificity (see below), the absence of a dose-response relationship does not rule out a causal relationship. A threshold may exist above which a relationship may develop. At the same time, if a specific factor is the cause of a disease, the incidence of the disease should decline when exposure to the factor is reduced or eliminated. An anthropological example of this would be the relationship between population growth and agricultural intensification. If population growth is a cause of agricultural intensification, then an increase in the size of a population within a given area should result in a commensurate increase in the amount of energy and resources invested in agricultural production. Conversely, when a population decrease occurs, we should see a commensurate reduction in the investment of energy and resources per acre. This is precisely what happened in Europe before and after the Black Plague. The same analogy can be applied to global temperatures. If increasing levels of CO2 in the atmosphere cause increasing global temperatures, then, other things being equal, we should see commensurate increases and decreases in global temperatures following increases or decreases, respectively, in atmospheric CO2 levels.

AF: The dose response measures used were either "distance from NPP" or "Dose Based Geographic Zone" (DBGZ), a measure devised by the authors specifically for this study. This was an important failing of the study, since key information on radiation dose as a function of distance was assumed and estimated rather than carefully collected. Duration of exposure -- time spent at particular addresses, etc. -- was likewise estimated rather than collected.

4. Consistency:

The association is consistent when results are replicated in studies in different settings using different methods. That is, if a relationship is causal, we would expect to find it consistently in different studies and among different populations. This is why numerous experiments have to be done before meaningful statements can be made about the causal relationship between two or more factors. For example, it required thousands of highly technical studies of the relationship between cigarette smoking and cancer before a definitive conclusion could be made that cigarette smoking increases the risk of (but does not cause) cancer. Similarly, it would require numerous studies of the difference between male and female performance of specific behaviors by a number of different researchers and under a variety of different circumstances before a conclusion could be made regarding whether a gender difference exists in the performance of such behaviors.

AF: As noted above, this study was inconsistent not only with similar studies by other authors, but also with previous studies by the same authors.

5. Plausibility:

The association agrees with currently accepted understanding of pathological processes. In other words, there needs to be some theoretical basis for positing an association between a vector and disease, or one social phenomenon and another. One may, by chance, discover a correlation between the price of bananas and the election of dog catchers in a particular community, but there is not likely to be any logical connection between the two phenomena. On the other hand, the discovery of a correlation between population growth and the incidence of warfare among Yanomamo villages would fit well with ecological theories of conflict under conditions of increasing competition over resources. At the same time, research that disagrees with established theory is not necessarily false; it may, in fact, force a reconsideration of accepted beliefs and principles.

AF: The authors admit that natural radiation exposures for subjects would almost certainly overwhelm any likely radiation exposures from proximity to the NPPs, according to known mechanisms of radiation exposure.

6. Consideration of Alternate Explanations:

In judging whether a reported association is causal, it is necessary to determine the extent to which researchers have taken other possible explanations into account and have effectively ruled out such alternate explanations. In other words, it is always necessary to consider multiple hypotheses before making conclusions about the causal relationship between any two items under investigation.

AF: The authors fail to provide alternate explanations for their findings.

7. Experiment:

The condition can be altered (prevented or ameliorated) by an appropriate experimental regimen.

AF: Experiments in this setting would be unethical. [More: A prohibition zone of 5 km around NPPs could be created, and subsequent rates of CL could be measured and compared with earlier rates. This would not be a valid experiment, but might satisfy a small subset of bureaucrats and/or activists.]

8. Specificity:

This is established when a single putative cause produces a specific effect. This is considered by some to be the weakest of all the criteria. The diseases attributed to cigarette smoking, for example, do not meet this criterion. When specificity of an association is found, it provides additional support for a causal relationship. However, absence of specificity in no way negates a causal relationship. Because outcomes (be they the spread of a disease, the incidence of a specific human social behavior or changes in global temperature) are likely to have multiple factors influencing them, it is highly unlikely that we will find a one-to-one cause-effect relationship between two phenomena. Causality is most often multiple. Therefore, it is necessary to examine specific causal relationships within a larger systemic perspective.

AF: Causes of acute childhood leukemias are poorly understood in general.

9. Coherence:

The association should be compatible with existing theory and knowledge. In other words, it is necessary to evaluate claims of causality within the context of the current state of knowledge within a given field and in related fields. What do we have to sacrifice about what we currently know in order to accept a particular claim of causality? What, for example, do we have to reject regarding our current knowledge in geography, physics, biology and anthropology in order to accept the Creationist claim that the world was created as described in the Bible a few thousand years ago? Similarly, how consistent are racist and sexist theories of intelligence with our current understanding of how genes work and how they are inherited from one generation to the next? However, as with the issue of plausibility, research that disagrees with established theory and knowledge is not automatically false. It may, in fact, force a reconsideration of accepted beliefs and principles. All currently accepted theories, including Evolution, Relativity and non-Malthusian population ecology, were at one time new ideas that challenged orthodoxy. Thomas Kuhn has referred to such changes in accepted theories as "Paradigm Shifts".

AF: The claimed association is neither compatible with nor explainable by existing theory.

It should be pointed out that out of 2,753 cases of CL in France from 2002 to 2007, 14 cases were estimated to have occurred within 5 km of a NPP. Given the small number of cases in question, an odds ratio of 1.9 would need to be confirmed over other time periods besides 2002 to 2007. The fact that no association was found over the entire 1990 to 2007 time period suggests that the "significant" odds ratio for the 2002 to 2007 time period was due to chance.
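A quick way to gauge how surprising 14 cases would be under the null hypothesis is a one-sided Poisson tail probability. This is only a rough sketch: the expected count of ~7.4 is back-derived from the reported ratio (14 / 1.9), an assumption rather than a figure taken from the study, and the calculation ignores the multiple time windows and distance bands examined:

```python
# How surprising are 14 observed cases when ~7.4 were expected?
# The expected count is an assumption back-derived from the reported
# ratio of 1.9 (14 / 1.9 ≈ 7.4), not a number from the study itself.
from math import exp, factorial

def poisson_tail(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k))

p = poisson_tail(14, 7.4)
print(round(p, 3))  # one-sided upper-tail probability
```

Even a small nominal tail probability is not decisive here: when many periods and distance cutoffs are tested, some "significant" results are expected by chance alone -- which is exactly why the null result over the full study period is telling.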

Read the entire study (PDF) to answer any further questions you may have.

Al Fin epidemiologists were quite disappointed at the low quality of the media coverage offered so far.


Friday, January 20, 2012

As Energy Mix Changes, Dependency on Oil Will Decrease

Oil is expected to be the slowest-growing fuel in terms of demand over the next 20 years as biofuels and other renewable energy sources take a larger role in meeting the world's additional energy needs, BP said Wednesday.

...Renewables on their own contribute more to world energy growth than oil, BP said, with the largest single fuel contribution coming from gas, which will meet about a third of the projected growth in global energy demand....Shale gas and coal bed methane will account for almost two thirds of US production by 2030 as the country looks at potential LNG exports. _Platts
BP Energy Outlook 2030

The North American shale gas bonanza will not only feed into the global LNG market, but also into the gas to liquids (GTL) market.
With North American natural gas reserves climbing to more than a century of production, developers are considering the once unthinkable alternative of converting North American natural gas to petroleum fuels and chemicals through Fischer-Tropsch synthesis. Several projects are proposed to use gas from conventional and unconventional production. From the North Slope of Alaska to the marshes of Louisiana, developers are evaluating full-scale GTL. This meeting will review the projects as well as the dynamics behind this shift. _Zeus Houston 7 March 2012
And besides GTL and LNG, other markets will grow to utilise the unconventional gas windfall. Markets for ethylene, other high value chemicals, plastics, and more, exceed $400 billion a year in the US alone. When the price spread for natural gas is so favourable, one cannot expect big chemical companies to ignore potential savings.

All of the estimates displayed above tend to significantly underestimate GTL, and probably CTL as well. Whether biomass-to-liquids technologies grow as quickly as they could will depend upon the economics of competing feedstocks such as methane. If breakthrough technologies are developed for harvesting methane hydrates, advanced biomass-to-liquids fuels may be delayed, or may be combined with CTL and GTL in a variety of configurations -- depending upon the end product desired.

As the greedy oil dictatorships of the world (Russia, Iran, Venezuela, etc.) see more substitute liquid fuels coming onto the market in greater volumes and with greater speed, they will discover a sudden need to reinvest in oil & gas field technologies AND a need to begin "playing nice" with international investors.


Brown Seaweed to Biofuels and Chemicals ... Breakthrough?

The key benefits of BAL technology are:
Single Platform. BAL converts seaweed carbohydrates into one renewable chemical intermediate that is affordable and scalable for both fuels and chemicals.

Commercial Focus. Leveraging the single platform, BAL will first commercialize high-value products to generate early cash flow that simultaneously paves the path for larger market opportunities.

First Mover Advantage. With over 60 patents or patents pending, BAL has carved a broad IP estate for the use of seaweed as a biomass for chemicals and fuels.

BAL has developed a diverse product portfolio that provides large market opportunities at varying price points. Products include road transport fuels, green plastics, surfactants, agrochemicals, synthetic fibers and nutraceuticals. _BioArchitectureLab
What did the researchers at Bio Architecture Lab actually achieve?
Prospecting macroalgae (seaweeds) as feedstocks for bioconversion into biofuels and commodity chemical compounds is limited primarily by the availability of tractable microorganisms that can metabolize alginate polysaccharides. Here, we present the discovery of a 36–kilo–base pair DNA fragment from Vibrio splendidus encoding enzymes for alginate transport and metabolism. The genomic integration of this ensemble, together with an engineered system for extracellular alginate depolymerization, generated a microbial platform that can simultaneously degrade, uptake, and metabolize alginate. When further engineered for ethanol synthesis, this platform enables bioethanol production directly from macroalgae via a consolidated process, achieving a titer of 4.7% volume/volume and a yield of 0.281 weight ethanol/weight dry macroalgae (equivalent to ~80% of the maximum theoretical yield from the sugar composition in macroalgae). _Science Abstract
They increased the fermentation yield of ethanol from brown algae by genetic tweaking of their microbial fermentation platform.
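The abstract's yield figures can be sanity-checked with simple fermentation stoichiometry. A sketch, assuming the standard stoichiometric maximum of 0.511 g ethanol per g hexose sugar (2 EtOH + 2 CO2 per glucose); the implied fermentable-sugar fraction is an inference, not a value stated in the paper:

```python
# Back-of-the-envelope check on the reported ethanol yield.
# 0.511 g ethanol / g hexose is the stoichiometric fermentation maximum;
# the sugar fraction derived below is inferred, not measured.

reported_yield = 0.281         # g ethanol / g dry macroalgae (from the abstract)
fraction_of_theoretical = 0.80 # "~80% of the maximum theoretical yield"
stoich_max = 0.511             # g ethanol / g fermentable sugar

theoretical_yield = reported_yield / fraction_of_theoretical   # ~0.351
implied_sugar_fraction = theoretical_yield / stoich_max        # ~0.69

print(round(theoretical_yield, 3), round(implied_sugar_fraction, 2))
```

The numbers hang together: a fermentable carbohydrate content of roughly two thirds of dry mass is plausible for brown macroalgae, which is consistent with the abstract's "~80% of theoretical" claim.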
Seaweed can be an ideal global feedstock for the commercial production of biofuels and renewable chemicals because in addition to its high sugar content it has no lignin, and it does not require arable land or freshwater to grow. Globally, if three percent of the coastal waters were used to produce seaweed, then more than 60 billion gallons of fossil fuel could be replaced. Today, in many parts of the world, seaweed is already grown at commercial scale. BAL currently operates four seaweed farms in Chile and has had great success in growing seaweed at economically viable production yields.

...“BAL's technology to ferment a seaweed feedstock to renewable fuels and chemicals has created an entirely new pathway for biofuels development, one that is no longer constrained to terrestrial sources,” says ARPA-E Program Director Dr. Jonathan Burbaum. “When fully developed and deployed, large scale seaweed cultivation combined with BAL’s technology promises to produce renewable fuels and chemicals without forcing a tradeoff with conventional food crops such as corn or sugarcane.” _BAL (PDF)
It is far easier to grow large quantities of macro-algae in the sea than micro-algae. Macro-algae is much tougher and holds together in large masses for easier harvesting. Up to 4 crops a year can be grown, with very rapid biomass accumulation.

It should be clear that by adding roughly 70% of the earth's surface area to one's potential crop growing area, the limits to biomass growth have been expanded considerably.

H/T NextBigFuture

More from Green Car Congress


Thursday, January 19, 2012

Limits? What Limits?

How would an advanced galactic empire generate its power?
Some fraction of the radiation seething from the disk would be reflected and focused onto the power plants. Each power plant would transmit collected energy as a collimated microwave beam from a 100-mile diameter antenna. _Discovery

A truly advanced civilisation on the Kardashev scale would harness the power of black holes to drive starships (PDF), power their industries, and gain control over both matter and time. We may have a few years to go before reaching that level.
A consortium of super-civilizations might pool resources to build a chain of power stations encircling the black hole. It would be the heart of a robust and fault-tolerant energy grid connecting numerous worlds like a fantasy scene out of the film "Tron."

However, I think it is more likely that a federation of expanding space colonies, spawned from a single mother civilization, would work together to maintain their viability. This wouldn't run into the thorny question of how two or more independent but similarly co-evolved species manage to contact each other and work out a practical energy infrastructure. _Discovery

Researchers at the U.S. Department of Energy's Los Alamos National Laboratory believe that magnetic field lines extending a few million light years from galaxies into space may be the result of incredibly efficient energy-producing dynamos within black holes that are somewhat analogous to an electric motor....The energy in these huge magnetic fields is comparable to that released into space as light, X-rays and gamma rays. In other words, the black hole energy is being efficiently converted into magnetic fields.

Colgate and Los Alamos colleagues Vladimir Pariev and John Finn have developed a model to perhaps explain what is happening. They believe that the naturally magnetized accretion disk rotating around a black hole is punctured by clouds of stars in the vicinity of the black hole, like bullet holes in a flywheel. This, in turn, leads nonlinearly to a system similar to an electric generator that gives rise to a rotating, but invisible magnetic helix.
In this way, huge amounts of energy are carried out and away from the center of a galaxy as a set of twisted magnetic field lines that eventually appear via radio waves from luminous cloud formations on opposite sides of the galaxy. _SD

So you see, humans do not yet understand how black hole energy is converted into all the forms of energy that are propagated within and throughout the galaxy. But give us some time -- and a respite from all the energy starvationists hounding our steps -- and we just might take it to the next level.

Cross-posted from Al Fin blog

