Scientists from Jülich and Berlin have developed a material for converting hydrogen and oxygen to water in a fuel cell using one tenth of the platinum previously required. Fuel cell development has long been stuck in the search for economically viable, efficient and robust units that can be built at commercial, mass-market scale.
The German researchers discovered with the aid of state-of-the-art electron microscopy that the function of the nanometer-scale catalyst particles is decisively determined by their geometric shape and atomic structure. This discovery opens up new paths for further improving catalysts for energy conversion and storage.
Hydrogen-powered fuel cells are regarded as a clean alternative to conventional combustion engines because, aside from electrical power, the only substance produced during operation is water. At present the implementation of hydrogen fuel cells is being held up by the high price of platinum. Large quantities of the expensive noble metal are still required for the electrodes in the fuel cells where the chemical conversion processes take place; without the catalytic effect of the platinum, the necessary conversion rates cannot be achieved in units currently in production.
Catalysis takes place only at the surface of the platinum, so material can be saved. The efficiency of the electrodes is being improved by using platinum nanoparticles, which increases the ratio of platinum surface to material required. Although the tiny particles are around ten thousand times smaller than the diameter of a human hair, the surface area of a kilogram of such particles is equivalent to that of several football fields. Even so, a lot of platinum is still involved.
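The football-field claim is easy to check with the surface-to-mass scaling of spheres. A minimal sketch, assuming monodisperse 10 nm spherical particles and a nominal pitch area of about 7,140 m² (both illustrative values, not from the article):

```python
def specific_surface_area(diameter_m, density_kg_m3):
    """Surface area per kilogram of monodisperse spheres: A/m = 6 / (rho * d)."""
    return 6.0 / (density_kg_m3 * diameter_m)

PT_DENSITY = 21_450      # kg/m^3, platinum
FIELD_AREA = 7_140       # m^2, roughly one football pitch (illustrative)

area = specific_surface_area(10e-9, PT_DENSITY)  # 10 nm particles, ~1/10,000 of a hair
print(f"{area:,.0f} m^2 per kg, about {area / FIELD_AREA:.1f} football fields")
```

The inverse dependence on diameter is the whole incentive for going nano: halving the particle size doubles the catalytic surface per kilogram of metal.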
More platinum can be saved by alloying it with other, less valuable metals such as nickel or copper. Scientists from Forschungszentrum Jülich and Technische Universität Berlin have proven the point, developing efficient metallic catalyst particles that convert hydrogen and oxygen to water using only a tenth of the platinum previously required.
The new catalyst consists not of the round nanoparticles previously in widespread use, but of octahedral nanoparticles of a platinum-nickel alloy. The researchers discovered that the unique manner in which the platinum and nickel atoms arrange themselves on the surfaces of these particles serves to optimally accelerate the chemical reaction between hydrogen and oxygen to form water. Round or cubic particles, on the other hand, have different atomic arrangements at the surface and are therefore less effective catalysts for the chemical reaction, something which would have to be compensated for by using increased amounts of noble metal.
How the catalysts behave over their life-cycle, and how that can be optimized through their atomic composition, was the subject of the research team’s investigation, which made use of ultrahigh-resolution electron microscopy at the Ernst Ruska-Centre (ER-C), a facility of the Jülich Aachen Research Alliance.
Dr. Marc Heggen from ER-C and the Peter Grünberg Institute at Forschungszentrum Jülich explains, “A decisive factor for understanding the life-cycle of the catalysts was the observation that nickel and platinum atoms prefer not to be evenly distributed at the surface of the nano-octahedra. Although this is advantageous for reactivity, it limits lifetime.”
To identify the location of each element with atomic precision, the researchers used a method in which the electron beam of one of the world’s leading ultrahigh-resolution electron microscopes is finely focused and sent through the specimen, losing part of its energy through interactions with the specimen. Each element in the specimen can thus be identified like a fingerprint. Conventional electron microscopes lack the capability to detect such chemical signatures with atomic resolution.
Prof. Peter Strasser from Technische Universität Berlin said, “This pioneering experimental work provides direct evidence for the fact that the choice of the correct geometric shape for the catalyst particles is as important for optimizing their function as the choice of their composition and size. This provides researchers with new possibilities for further improving functional materials, especially catalysts, for energy storage.”
The latest experiments from Strasser’s research group indicate that substantial increases in efficiency may also be possible for the reaction splitting water to produce oxygen in electrolysers, in which the even more expensive noble metal iridium is used.
This is quite an innovative way to look at solving the problem that platinum presents. There are many efforts underway worldwide to get a solution that can scale up to commercial mass-market at pricing which will trigger huge growth. The one it will be isn’t known yet, but as these ideas pile up, one is surely going to make it over the top. When that happens and the market takes off, even more research will get done driving fuel cell and water splitting pricing to ever-lower costs.
Let it come, it can’t happen soon enough.
A published study of hydraulic fracking shows that its energy return on investment (aka EROI), comparing total input energy with the energy in the natural gas expected to be made available to end users, is similar to or better than that of coal.
The news is tremendously helpful for the natural gas industry, consumers, and landowners lucky enough to get a well on their land. It’s also a huge and crushing disappointment for the anti-fracking crowd. Hydraulic fracturing pays off in a very big way.
The analysis indicates that the EROI ratio of a typical well is likely between 64:1 and 112:1, with a mean of approximately 85:1. This range assumes an estimated ultimate recovery (EUR) of 3.0 billion cubic feet per well. This is similar to, but significantly higher than, the EROI of coal, which falls between 50:1 and 85:1.
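The arithmetic behind those ratios can be sketched directly from the numbers given. A minimal check, assuming a typical heating value of about 1,037 BTU per cubic foot of natural gas (an assumed figure, not from the study):

```python
BTU_PER_CF = 1_037        # assumed heating value of natural gas, BTU per cubic foot
J_PER_BTU = 1_055.06

def eroi(energy_out_j, energy_in_j):
    """Energy return on investment: useful energy delivered per unit invested."""
    return energy_out_j / energy_in_j

eur_cf = 3.0e9                                   # estimated ultimate recovery per well
energy_out = eur_cf * BTU_PER_CF * J_PER_BTU     # lifetime energy delivered, joules
energy_in = energy_out / 85                      # input implied by the mean 85:1 EROI
print(f"energy out ~{energy_out:.2e} J, implied input ~{energy_in:.2e} J")
```

In other words, a mean EROI of 85:1 says a well delivering roughly 3.3 × 10¹⁵ J over its life cost on the order of 4 × 10¹³ J of energy to drill, frack and operate.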
Obviously the coal folks are less than thrilled, too. For now, though, over 75% of our current electricity needs come from a mix of gas and coal, and 83% of our homes are heated by gas. Luckily, these are the low-cost leaders, nuclear excepted.
U.S. utilities have ordered 20 reactors shut, the most in a three-year span since Chernobyl’s aftermath, saddling the industry with a possible $26 billion in costs to pass along to consumers. The nuclear fraction of power generation is going to shrink – a massive cost instead of a savings, due to the political environment built up over two presidential election cycles. And the technology remains stalled in a bureaucratic red-tape machine and a paper blizzard.
Lead author Michael L. Aucott said in the Wiley press release about the study, “Our analysis indicates that gas can be extracted from shale efficiently, from an energy perspective. The energy return on (energy) investment ratio (EROI) does seem to be at least as favorable as coal. However, a comparison with coal is difficult. There appear to be large amounts of coal still available. Estimates of the amount of gas available from the shale plays vary widely. It is not clear yet whether there is anywhere near enough to rival coal over the long haul.”
Aucott concluded with, “There are concerns about water pollution and other environmental impacts associated with shale gas production. With the assumption that these can be managed, and that production quantities remain consistent with initial production data, the favorable EROI suggests that shale gas will be a viable energy source for quite some time.”
A fuel’s long-term usefulness and viability are judged through its EROI. For a few years the EROI was the darling of the peak oil enthusiasts. Now it’s come full circle for consumers, business and policy makers to use productively again.
There is sure to be some blowback from the opposition. But to temper things right at the start, Aucott and his coauthor Jacqueline M. Melillo used natural gas records obtained from horizontal, hydraulically fractured wells only in the Marcellus Shale region of Pennsylvania and New York. The study used a net external energy ratio methodology together with available data and estimates of energy inputs and outputs. The Marcellus Shale is only one of several fields in active development, and not necessarily the one with the best economics, but it is certainly close to a huge, hungry market.
The curious part of the back-story is Aucott and Melillo aren’t claiming a university association. That comes as no surprise as the results are going to be quite a political “hot potato”, right or wrong, for years to come. Still, a little googling will reveal the pair is worthy of the peer-reviewed paper getting published. Your humble writer will respect their privacy.
The last point from here is that the study illustrates why the board of directors at Chesapeake Energy ran off co-founder, retired chief executive officer and former chairman Aubrey McClendon. It also shows the grounding of the latest news that McClendon is pitching Wall Street on a new energy company, taking another shot at the U.S. energy boom with perhaps $1 billion of startup funds.
The only way – the only way possible to screw up the U.S. energy economy is for politics and bureaucrats to get further involved.
Chinese scientists have already succeeded in recovering a sensational 15% of the residual oil in their test reservoir, and have formed a collaboration with researchers at the Centre for Integrated Petroleum Research (CIPR) in Bergen, Norway, to find out what actually took place down in the reservoir.
Oil in reservoirs is confined in tiny pores within rock, often sandstone. In the “old days” of easy oil the natural pressure in a reservoir was so high that the oil flowed upwards when drilling reached the rocks containing the oil.
When the pressure is used up and the petroleum companies abandon an oil well, more than half the reservoir’s oil is usually left behind as too difficult to recover. Now, however, much of the residual oil can be recovered with the help of nanoparticles and a simple law of physics.
To maintain the pressure within a reservoir, petroleum companies have learned to displace the oil by injecting water. The water forces out the oil located in areas near the injection point, which may be hundreds or even thousands of meters away from the production well.
But eventually water injection loses its effect. Once the oil from all the easily reached pores has been recovered, water begins emerging from the production well instead of oil, at which point the petroleum engineers have few choices other than to shut down the well.
The petroleum industry and research community have been working for decades on various solutions to increase recovery rates. One group of researchers at CIPR, collaborating with researchers in China, has developed a new method for recovering more oil from wells – and not just more, far more.
The big news is the Norwegian partner in the collaboration has succeeded in recovering up to 50% of the oil remaining in North Sea rock samples.
The technology used to achieve these high recovery rates rests on a simple physical phenomenon, depicted in the figure above.
Water in an oil reservoir flows much like the water in a river, accelerating in narrow stretches and slowing where the path widens.
When water is pumped into a reservoir, the pressure difference forces the water away from the injection well and towards the production well through the tiny rock pores. These pores are all interconnected by very narrow tunnel-like passages, and the water accelerates as it squeezes its way through them.
The new method is based on infusing the injection water with particles that are considerably smaller than the tunnel diameters. When the particle-enhanced water reaches a tunnel opening, it will accelerate faster than the particles, leaving the particles behind to accumulate and plug the tunnel entrance, ultimately sealing the tunnel.
This forces the following water to take other paths through the rock’s pores and passages – and in some of these there is oil, which is forced out with the water flow. The result is more oil extracted from the production well earning more revenue for the petroleum companies.
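The acceleration through the narrow passages follows from simple mass conservation: the volumetric flow through a pore body and its throat must be equal, so velocity scales with the inverse of cross-sectional area. A minimal sketch, with the pore and throat diameters chosen purely for illustration:

```python
def throat_velocity(v_pore, d_pore_m, d_throat_m):
    """Continuity: Q = v * A is conserved, so v scales as (d_pore / d_throat)^2."""
    return v_pore * (d_pore_m / d_throat_m) ** 2

# Water moving at 1 unit/s through a 50-micron pore body speeds up 25x
# squeezing into a 10-micron tunnel, outrunning the suspended particles.
speedup = throat_velocity(1.0, 50e-6, 10e-6)
print(speedup)
```

It is this velocity mismatch at each tunnel entrance that lets the slower-moving particles pile up and seal the passage.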
The new particles are quite interesting: they are elastic, so their shapes can change. The particles used are typically 100 nanometers in diameter – 100 times smaller than the 10-micron-wide tunnels currently under research.
The Bergen and Beijing researchers have tested a variety of particle sizes and types to find those best suited for plugging the rock pores, which turned out to be elastic nanoparticles made of polymer threads that retract into coils. The particles are made from commercial polyacrylamide such as that used in water treatment plants. Nanoparticles in solid form such as silica were less effective.
The idea for this method of oil recovery came from the two Chinese researchers Bo Peng and Ming yuan Li who completed their doctorates in Bergen 10 and 20 years ago, respectively. The University of Bergen and China University of Petroleum in Beijing have been cooperating for over a decade on petroleum research, and this laid the foundation for collaboration on understanding and refining the particle method.
Field studies in China not only yielded more oil, but also demonstrated that the nanoparticles indeed formed plugs that subsequently dissolved during the water injection process. Nanoparticles were found in the production well 500 meters away.
Arne Skauge, Director of CIPR said, “The Chinese were the first to use these particles in field studies. The studies showed that they work, but there were still many unanswered questions about how and why. At CIPR we began to categorize the particles’ size, variation in size, and structure.”
At first it was not known if the particles could be used in seawater, since the Chinese had done their trials with river water and onshore oilfields. Trials in Bergen using rock samples from the North Sea showed that the nanoparticles also work in seawater, helping to recover an average of 20 to 30 percent, and up to 50 percent, more residual oil.
The Centre for Integrated Petroleum Research is the only institution for petroleum research under the Norwegian Centres of Excellence scheme. CIPR is now supplementing its expertise on oil reservoirs with nanotechnology know-how in seeking ways to recover residual oil.
Success could have far-reaching impacts. The state-owned petroleum company, Statoil, is seeking to increase current recovery rates, which now sit just under 50%, to roughly 60%.
“We hope this new method can help to raise recovery rates to 60 or 65 percent,” Mr. Skauge said.
There is opportunity here for oil production firms. The Bergen researchers want to test out the method large-scale. “We’d like to try it in the North Sea and are in contact with Statoil, but we are certainly not the only ones hoping for a chance. We are competing with many promising methods for raising recovery rates,” explains Mr. Skauge. “That is why we may well test the method onshore in other regions, such as the Middle East. Several actors from there have contacted us after reading our published papers.”
The technology isn’t fully worked out for any reservoir service work. The researchers will be learning as much as they can about the particles, the pores and how the activity can be optimized.
“We are working hard to understand why the particles work well in some rock types and more marginally in others,” says Kristine Spildo, project manager at CIPR. “This is critical for determining which North Sea fields are best suited to the method.”
Three research papers have been published so far, and the CIPR site offers many informative links on the nanoparticle work and other ideas for lay persons as well as industry professionals.
The secondary, tertiary and even further efforts to get more oil from the oil fields already found and producing will proceed. The end of oil availability isn’t anywhere close.
Raymond Schaak, a professor of chemistry at Penn State University, and his research team have found that an important chemical reaction that generates hydrogen from water is effectively triggered – or catalyzed – by a nanoparticle composed of nickel and phosphorus, two inexpensive elements that are abundant on Earth.
The new discovery may well lead to cheaper hydrogen production technologies.
Schaak explained that the purpose of the nickel phosphide nanoparticle is to help produce hydrogen from water, which is a process that is important for many energy-production technologies, including fuel cells and solar cells.
“Water is an ideal fuel, because it is cheap and abundant, but we need to be able to extract hydrogen from it,” Schaak said. Hydrogen has a high energy density and is a great energy carrier, Schaak explained, but it requires energy to produce. To make its production practical, scientists have been hunting for a way to trigger the required chemical reactions with an inexpensive catalyst.
Schaak noted that this feat is accomplished very well by platinum but, because platinum is expensive and relatively rare, he and his team have been searching for alternative materials. “There were some predictions that nickel phosphide might be a good candidate, and we had already been working with nickel phosphide nanoparticles for several years,” Schaak said. “It turns out that nanoparticles of nickel phosphide are indeed active for producing hydrogen and are comparable to the best known alternatives to platinum.”
To create the nickel phosphide nanoparticles, the team members began with commercially available metal salts. They then dissolved these salts in solvents, added other chemical ingredients, and heated the solution to allow the nanoparticles to form. The researchers were able to create a nanoparticle that was quasi-spherical – not a perfect sphere, but spherical with many flat, exposed edges. “The small size of the nanoparticles creates a high surface area, and the exposed edges mean that a large number of sites are available to catalyze the chemical reaction that produces hydrogen,” Schaak explained.
The next step took place at the California Institute of Technology where team members tested the nanoparticles’ performance in catalyzing the necessary chemical reactions. This segment of the research was led by Nathan S. Lewis, the George L. Argyros Professor of Chemistry at the California Institute of Technology. The researchers performed the tests by placing the nanoparticles onto a sheet of titanium foil and immersing that sheet in a solution of sulfuric acid. Next, the researchers applied a voltage and measured the current produced. They found that, not only were the chemical reactions happening as they had hoped, they also were happening with a high degree of efficacy.
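That “apply a voltage, measure the current” test maps current against overpotential. The article gives no kinetic numbers, but the general shape of such a measurement can be sketched with the generic Butler–Volmer model of electrode kinetics; the exchange current density and transfer coefficient below are purely illustrative assumptions, not values from the paper:

```python
import math

def butler_volmer(eta_v, j0=1e-3, alpha=0.5, n=2, temp_k=298.15):
    """Generic Butler-Volmer current density (A/cm^2) at overpotential eta_v (V)."""
    faraday, gas_const = 96_485.0, 8.314
    f = n * faraday / (gas_const * temp_k)
    return j0 * (math.exp(alpha * f * eta_v)
                 - math.exp(-(1.0 - alpha) * f * eta_v))

# Current climbs steeply with overpotential; a good catalyst needs less
# overpotential (less wasted voltage) to drive the same current.
for eta in (0.05, 0.10, 0.15):
    print(f"eta = {eta:.2f} V -> j = {butler_volmer(eta):.3e} A/cm^2")
```

The practical figure of merit in such tests is how little overpotential a catalyst needs to reach a benchmark current density, which is how nickel phosphide gets compared against platinum.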
“Nanoparticle technology has already started to open the door to cheaper and cleaner energy that is also efficient and useful,” Schaak said. “The goal now is to further improve the performance of these nanoparticles and to understand what makes them function the way they do. Also, our team members believe that our success with nickel phosphide can pave the way toward the discovery of other new catalysts that also are composed of Earth-abundant materials. Insights from this discovery may lead to even better catalysts in the future.”
The researchers working with Schaak and Lewis who contributed to this study include Eric J. Popczun, Carlos G. Read, Adam J. Biacchi, and Alex M. Wiltrout from Penn State; and James R. McKone from the California Institute of Technology.
A curious fact is that the nanoparticles of nickel phosphide are hollow and faceted so as to expose a high density of the nickel phosphide surface that had previously been predicted, on theoretical grounds, to be an active HER (hydrogen evolution reaction) catalyst.
The press release and the abstract both leave us without an efficiency result. Avoiding platinum alone will make a huge difference, but both nickel phosphide and platinum will need a source of energy to perform the electrolysis.
Hydrogen could very well be a fuel of the future, if the economic climate and political interference allow the lowest-possible-cost means of producing the electric current that drives the reaction.
This team’s work is at such an early stage that many questions are due for answers soon. We’d like to know whether the products are monatomic or diatomic hydrogen and oxygen, the watts needed per unit of output, and the proposed mechanics for separating the gases.
Looks good as far as they’ve gotten, let’s hope they get much further successfully.
Scientists at the Technische Universitaet Muenchen (TUM) have synthesized a novel framework structure of boron and silicon that could serve as a lithium-ion battery electrode material. In the novel lithium borosilicide (LiBSi2), the boron and silicon atoms are interconnected tetrahedrally, much like the carbon atoms in diamond.
The new material adds a new dimension: as the atoms form, they create channels within the structure with many more sites where lithium ions can locate. That suggests laptops could work longer and electric cars could drive farther with increases in the capacity of their lithium-ion batteries.
The electrode material has a decisive influence on a battery’s capacity. Today’s lithium ion negative electrode typically consists of graphite, whose layers can store lithium atoms. The scientists at (TUM) have developed a process to build a material made of boron and silicon that could enable systems with higher capacities.
Charging a lithium-ion battery drives lithium atoms into the graphite layers of the negative electrode. However, the capacity of graphite is limited to one lithium atom per six carbon atoms. Silicon could take up to ten times more lithium. But unfortunately, its dimensions expand during this process – which leads to unsolved problems in battery applications.
But a 10-fold capacity increase potential is a powerful incentive.
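The one-lithium-per-six-carbons limit and the ten-fold silicon figure both follow from the standard theoretical-capacity formula, capacity = n·F / (3.6·M). A quick check, assuming full lithiation of graphite to LiC6 and of silicon to Li3.75Si (the commonly cited room-temperature limit, an assumption on our part):

```python
F = 96_485.0  # Faraday constant, C/mol

def capacity_mah_per_g(li_per_host, host_molar_mass_g):
    """Theoretical gravimetric capacity of a host material, in mAh/g."""
    return li_per_host * F / (3.6 * host_molar_mass_g)

graphite = capacity_mah_per_g(1.0, 6 * 12.011)  # LiC6      -> ~372 mAh/g
silicon = capacity_mah_per_g(3.75, 28.086)      # Li3.75Si  -> ~3580 mAh/g
print(f"graphite {graphite:.0f} mAh/g, silicon {silicon:.0f} mAh/g, "
      f"ratio {silicon / graphite:.1f}x")
```

The roughly ten-to-one ratio is exactly the incentive the TUM team is chasing; the open question is whether a borosilicide framework can capture some of that headroom without silicon’s swelling problem.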
Thomas Fässler, professor at the Institute of Inorganic Chemistry at TUM said, “Open structures with channels offer in principle the possibility to store and release lithium atoms. This is an important requirement for the application as anode material for lithium-ion batteries.”
In the high-pressure laboratory of the Department of Chemistry and Biochemistry at Arizona State University, the scientists brought the starting materials, lithium boride and silicon, to reaction. At a pressure of 100,000 atmospheres and temperatures around 900 degrees Celsius, the desired lithium borosilicide formed.
“Intuition and extended experimental experience is necessary to find out the proper ratio of starting materials as well as the correct parameters,” added Fässler.
As a bonus, lithium borosilicide is stable in air and moisture and withstands temperatures up to 800 °C (1,472 °F).
Next, Fässler and his graduate student Michael Zeilinger want to examine more closely how many lithium atoms the material can take up and whether it expands during charging. Because of its crystal structure the material is also expected to be very hard, which would make it attractive as a diamond substitute as well.
It will be interesting to see if a strong and hard alloy structure can remain stable, crack apart or stuff itself with a huge load of lithium ions over many cycles. Then comes the question of production costs and scaling. There is a long way to go.
Because the framework structure of the lithium borosilicide is unique, Fässler and Zeilinger had the opportunity to give a name to their new framework material. In honor of their university, they chose the name “tum.” We’ll be watching for it.
The research effort was widespread. Cooperation partners of the project were the Department of Physics at University of Augsburg and the Department of Materials and Environmental Chemistry at Stockholm University. The work was funded by the TUM Graduate School, the German Chemical Industry Fund, the German Research Foundation, the Swedish Research Council and the National Science Foundation, USA.
One hopes this works with an incredible number of charge and discharge cycles made commercially at low cost. Lithium-ion technology is going to get competition and more advantages need to come soon. We consumers will be pleased with more and better choices.
Japan’s National Institute of Advanced Industrial Science and Technology (AIST) and Sumitomo Chemical Co Ltd have developed a “power-saving” sheet that blocks sunlight in summer and lets it through in winter. The idea is to stop the heat gain from summer infrared, yet let the infrared come in during winter, while keeping the window glass transparent.
The sheet can be attached to a glass window or other kinds of materials. Development exploits the fact that the incidence angle of sunlight in summer is different from that in winter. As the sun sinks lower in the sky more of the infrared is allowed through. Surprisingly, the new sheet design does not change the view through the window.
The idea offers a vast array of private, commercial and public buildings a means to reduce the heat gain from sunshine during air conditioning periods. By simply installing the sheet it is possible to adjust the amount of light coming inside. Sumitomo Chemical expects that the sheet will reduce the amount of power consumed by air conditioners and assist with heating, saving a worthwhile amount of money.
The sheet is made by fitting together two specially designed transparent sheets. Generally, light inside a transparent sheet striking the exit surface at more than a certain angle is totally reflected rather than passing out of the sheet. If the front and back sides of the sheet are not parallel to each other, total reflection occurs at a smaller angle, making it possible, for example, to block sunlight during daylight hours in summer. Sunlight in winter, on the other hand, passes through the sheet because its incidence angle is small.
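That “certain angle” is the critical angle from Snell’s law: light inside a medium of refractive index n hitting the exit surface beyond arcsin(1/n) is totally internally reflected. A quick illustration, assuming an acrylic-like sheet with n ≈ 1.49 (an assumed value; the article does not name the material):

```python
import math

def critical_angle_deg(n_sheet, n_air=1.0):
    """Angle beyond which light inside the sheet is totally internally reflected."""
    return math.degrees(math.asin(n_air / n_sheet))

print(f"critical angle ~{critical_angle_deg(1.49):.1f} degrees")
```

Tilting the exit face – the non-parallel front and back sides – is what shifts which incoming sun angles end up beyond this threshold, so high summer sun gets reflected while low winter sun passes through.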
The idea is quite a clever innovation.
If only one transparent sheet whose front and back sides are not parallel is attached to a window, the view through the window is shifted by refraction. AIST and Sumitomo Chemical therefore fitted together two sheets to cancel out the refraction and preserve a normal window view. The two-sheet solution is a breakthrough idea.
The AIST folks reasoned that a window glass capable of adjusting the amount of incoming sunlight depending on its incidence angle could be built by using the total reflection phenomenon on the surface of a transparent material. So the AIST folks developed a light ray-tracing program for analyzing the reflection and transmission of sunlight.
That’s how AIST found a structure that blocks direct solar radiation in summer as much as possible while keeping the window transparent. Sumitomo Chemical prototyped the infrared managing sheet by using its processing technologies.
Sumitomo Chemical will make improvements to the manufacturing method of the sheet to achieve a better light blocking effect. The company intends to develop a better method to attach the sheet to a glass window and plans to commercialize the sheet in two or three years. Sumitomo has already exhibited the sheet at the 2013 Automotive Engineering Exposition last month in Yokohama City, Japan.
Sumitomo and AIST are both rather circumspect by U.S. standards; there is no press release to be found, at least in English. But Masaru Yoshida of Nikkei Monozukuri at Nikkei Business Publications’ Tech-On dug up the information, and the story has been a leading item there for days.
Let’s hope the development continues and we get to use these. The idea of blocking the summer heat while allowing the winter heat in would make a very attractive product.
A new enzyme technology that lets the corn ethanol biofuels industry produce more ethanol with less corn, while saving energy and improving profits, was announced yesterday. The technology is a new pair of enzymes combined with a third; Novozymes shows the trio saves up to 5% of the corn used in U.S. ethanol production. Even more useful in the food-vs.-fuel debate, the technology also increases corn oil extraction by 13%. As a practical matter, it saves 8% of the energy needed during production as well.
The efficiency improvements can be achieved when two new enzymes, Spirizyme® Achieve and Olexa®, are used together with another Novozymes enzyme, Avantec®.
Andrew Fordyce, Executive Vice President for Business Operations at Novozymes explains the effect with, “These new enzyme innovations offer strong benefits to ethanol producers. It allows our customers to make more from less and substantially improve their profit margins”.
Take a typical U.S. ethanol plant, for example. Such a plant uses around 36 million bushels (900,000 tons) of feed-grade corn per year to produce 100 million gallons of fuel ethanol, 300,000 tons of an animal feed called Dried Distillers Grains with Solubles (DDGS) [the Wikipedia link, while informative, is quite out of date] and 8,500 tons of corn oil. By using Avantec, Olexa and Spirizyme Achieve, such a plant can save up to 1.8 million bushels (45,000 tons) of corn while maintaining the same ethanol output, increasing the corn oil extraction, and generating up to $5 million in additional profit.
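The corn savings quoted above are straightforward to verify. A sketch using the article’s figures plus the standard 56 lb (25.4 kg) bushel of shelled corn:

```python
BUSHELS_PER_YEAR = 36_000_000  # annual corn use of a typical 100M-gallon plant
SAVINGS_FRACTION = 0.05        # up to 5% less corn for the same ethanol output
KG_PER_BUSHEL = 25.4           # one bushel of shelled corn = 56 lb

saved_bushels = BUSHELS_PER_YEAR * SAVINGS_FRACTION
saved_tonnes = saved_bushels * KG_PER_BUSHEL / 1_000
print(f"{saved_bushels:,.0f} bushels (~{saved_tonnes:,.0f} t) saved per year")
```

The result lines up with the 1.8 million bushels (roughly 45,000 tons) Novozymes claims per plant.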
The Avantec product was introduced in October 2012, and Novozymes says it has been well received in the U.S. ethanol industry. “Our customers demand risk-free options that do not require major investments. That is exactly what our enzymes offer. We are the first to market this full package and are looking forward to implementing it together with our customers, trialing the technology at their plants, and getting the solutions out there. It’s a competitive industry and only via innovation like this can Novozymes continue to be the leading supplier of enzymes to the ethanol industry,” Fordyce added.
In the U.S., corn is the key raw material in biofuel production and by far the biggest cost component for an ethanol plant. After the corn is harvested, the kernels are ground into corn meal and water is added to make a mash. The enzymes convert the starch in the mash to sugar, which can then be fermented with brewer’s yeast to make ethanol. Avantec and Spirizyme Achieve convert starch to sugar more efficiently than any other enzyme product on the market, while Olexa works by freeing up oil bound in the corn germ.
Corn oil is used in a huge array of products. It’s used in food preparation, the production of animal feed, biodiesel and soaps and other products. Corn oil has become an increasingly important revenue stream for ethanol producers. Extensive implementation of extraction technology from 2008 to 2012 has seen the industry record a nearly five-fold increase in corn oil production, according to a study by the University of Illinois at Chicago.
Novozymes estimates that approximately 80% of operating U.S. ethanol capacity will have incorporated oil extraction by the end of 2013. There is too much opportunity in these products and revenue streams for them to be ignored.
A bit of background about corn: there are four primary corn crops. The most familiar is sweet corn, the kind people eat. This variety doesn’t make starch – it makes sugar, at least until it over-matures or sits too long after harvest, when the sugar degrades to starch. The second is waxy corn, used to make the corn oil found on grocers’ shelves in the cooking and baking section, along with myriad other products. Both of these crops are small markets and require a great deal more hands-on attention; they are strong attractants for vermin, wild animals and insects. The third is the popcorn we are all familiar with.
The fourth, and the huge market, is field or flint corn. Field corn is starch-rich and thus not such a strong attractant for pests, so it can be grown in huge amounts all around the world without the intense labor and capital needed to keep a crop up to food quality. The future will see corn grown for primary proteins and pharmaceutical production.
Meanwhile, the field corn used for ethanol is only stripped of its carbohydrate, or starch, leaving a very desirable set of components: the protein, fiber and oil. The resulting DDGS (dried distillers grains with solubles), dried and with most of the oil removed, is still a third of the mass of the original corn feedstock. Taking out the oil offers further savings, since the de-oiled grains are easier and more efficient to dry.
Novozymes’ technology is welcome not just for lower-cost ethanol and less pressure on corn prices; it’s also welcome because DDGS is a necessary animal feed product and offers researchers great potential for protein products.
Those making the food vs. fuel argument rely on the ignorance of the audience. The ethanol industry is continually making improvements and seeking higher value from the process. It’s only a matter of time before corn that has never been directly used as human food indirectly becomes a protein-based human food product. As the world’s population increases, there will be a great incentive to use that huge reservoir of protein directly to feed people instead of feeding it to animals and then eating them. The future, and it’s coming fast, won’t be food vs. fuel; it will be fuel and food.
Almost everyone from the consumer to the giant oil company is tired of the very high crude oil prices seen for over a year now. For consumers they are a drain and a competitor to other disposable-income purchases. For oil companies, sustained high prices set up an expectation that they will persist, which misdirects investment funds, encourages excess spending and raises internal costs. The ride down, when it comes, will be much better for the consumer than for the oil industry.
But that just sets up another price rise cycle.
Two factors move crude oil prices up and down: demand pulling prices and production pushing them. For Americans and other developed economies, demand has backed off, but the developing economies have kept right on demanding, so world crude prices haven’t gone down.
But supplies, especially in the U.S., have risen dramatically. That should push prices lower, and by a lot, but something is in the way.
Geoffrey Styles writes the blog Energy Outlook, the most succinct and on-point source of commentary and information on the complex petroleum industry. Last Friday Mr. Styles posed, and set out to answer, the question of whether U.S. oil trends might change how crude oil prices are set.
The facts are not a happy circumstance for energy consumers or the oil industry.
Here’s why: crude oil comes in grades ranging from something like the familiar motor oil in your car’s engine to nearly as thick as hot asphalt laid out to be compressed and leveled into a road surface. Which crude is where, and where it needs to go to be made into useful gasoline, diesel, jet fuel and other products, is a major logistical problem.
It would amaze most people to know how much petroleum, from crude oil to little cans of lighter fluid, gets moved around. There’s the clue: one major difficulty in getting consumer prices down is getting the crude to a suitable refinery.
Another part of the problem is that refineries are designed to work with particular ranges of crude. A refinery built for light, motor-oil-like crudes is years away from being rebuilt to handle oil with the consistency of play dough, and vice versa.
The world is using about 85 million barrels of oil a day, and a few weeks’ worth is in motion at any given time. We can move the crude; moving the oil fields isn’t possible, and moving or refurbishing the refineries is economically impractical.
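Simple arithmetic shows the scale of that daily movement. The 85 million barrels a day is the article’s figure; the two-to-three-week in-transit window is my assumption for illustration only.

```python
WORLD_DEMAND_MBPD = 85  # million barrels per day (figure from the article)

def in_transit_million_bbl(weeks):
    """Million barrels in motion, assuming `weeks` weeks of world supply is in transit."""
    return WORLD_DEMAND_MBPD * 7 * weeks

# Assumed window of 2-3 weeks, for illustration only.
for weeks in (2, 3):
    print(f"{weeks} weeks of supply in transit = {in_transit_million_bbl(weeks):,} million barrels")
```

Even at the low end of that assumption, over a billion barrels are on the water, in pipelines or on rail at any moment, which is why logistics, not geology, dominates the pricing story told here.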
Many hold the idea that new oil discoveries and production technologies like hydraulic fracturing will make the U.S. energy independent. That could become fact, with a few caveats.
The crude oil needs to get to where it can be processed, and the refineries in the U.S. do not match the crude supply to the product demand. To get the products made, much of the new U.S. production will have to be exported to refineries that can get the job done.
That’s where the transport issue comes in: most U.S. crude oil exports are banned by law, and the exceptions are less than optimal for U.S. consumers.
Even more economically damaging, crude we can use is bottled up in rail cars while the government dithers over a pipeline to get heavy Canadian crude to the heavy-oil refineries in Texas and Louisiana.
As Mr. Styles rather kindly put it, saying “the US oil export policy merits a thorough reevaluation” is quite an understatement.
The reality is likely to be a huge but quiet lobbying effort to get the crude where it needs to be and the products to consumers. The effort might well fail.
But the truth will likely be that consumers won’t get a price break anytime soon. Instead, due to mismatched crude-to-refinery capacity, we’ll see regional price spikes, gluts, transportation problems and still higher prices.
Someday an enterprising soul will map the transit routes of various crude oils to the products sold to consumers. That will be an eye popper.
Meanwhile, the government sets out barriers with regulations, delays and permit denials; lawsuits are tolerated; and people understandably don’t want development in “their back yard”.
Everything from crude oil to gasoline is available because businesses expect to make some money. As society adds on costs, those costs get rolled into prices. We will see some break in crude oil prices, but next time consumer prices will not fall as far. Too many costs are being added in for the full benefit to reach you and me.