Stanford University scientists are another step closer to building a better battery after putting in long shifts at the helm of a highly sophisticated microscope, recording chemical reactions at near-atomic-scale resolution.
The researchers are camped out with one of the most advanced microscopes in the world, in the Dionne lab 18 feet below Stanford's Engineering Quad, to capture an unimaginably small reaction.
The lab team members conducted arduous experiments – sometimes requiring a continuous 30 hours of work – to capture real-time, dynamic visualizations of atoms that could someday help our phone batteries last longer and our electric vehicles go farther on a single charge.
Toiling underground in the tunneled labs, they recorded atoms moving in and out of nanoparticles less than 100 nanometers in size, with a resolution approaching 1 nanometer.
Jen Dionne, associate professor of materials science and engineering at Stanford and senior author of the paper detailing this work, said, “The ability to directly visualize reactions in real time with such high resolution will allow us to explore many unanswered questions in the chemical and physical sciences. While the experiments are not easy, they would not be possible without the remarkable advances in electron microscopy from the past decade.”
Their experiments focused on hydrogen moving into palladium, a class of reactions known as an intercalation-driven phase transition. This reaction is physically analogous to how ions flow through a battery or fuel cell during charging and discharging. Observing this process in real time provides insight into why nanoparticles make better electrodes than bulk materials and fits into Dionne’s larger interest in energy storage devices that can charge faster, hold more energy and stave off permanent failure.
For these experiments, the Dionne lab created palladium nanocubes, a form of nanoparticle ranging in size from about 15 to 80 nanometers, and then placed them in a hydrogen gas environment within an electron microscope. The researchers knew that hydrogen would change both the dimensions of the lattice and the electronic properties of the nanoparticle. They thought that, with the appropriate microscope lens and aperture configuration, techniques called scanning transmission electron microscopy and electron energy loss spectroscopy might show hydrogen uptake in real time.
After months of trial and error, the results were extremely detailed, real-time videos of the changes in the particle as hydrogen was introduced. The entire process was so complicated and novel that the first time it worked, the lab didn’t even have the video software running, leading them to capture their first movie success on a smartphone.
Following these videos, they examined the nanocubes during intermediate stages of hydrogenation using a second technique in the microscope, called dark-field imaging, which relies on scattered electrons. To pause the hydrogenation process, the researchers plunged the nanocubes into a bath of liquid nitrogen mid-reaction, dropping their temperature to 100 kelvin (about -280 °F). These dark-field images served as a check that the electron beam hadn’t influenced the previous observations, and allowed the researchers to see detailed structural changes during the reaction.
Fariah Hayee, lead co-author of the study and graduate student in the Dionne lab, tells the story: “With the average experiment spanning about 24 hours at this low temperature, we faced many instrument problems and called Ai Leen Koh [co-author and research scientist at Stanford’s Nano Shared Facilities] at the weirdest hours of the night. We even encountered a ‘ghost-of-the-joystick problem,’ where the joystick seemed to move the sample uncontrollably for some time.”
While most electron microscopes operate with the specimen held in a vacuum, the microscope used for this research has the advanced ability to allow the researchers to introduce liquids or gases to their specimen.
Tarun Narayan, lead co-author of this study and recent doctoral graduate from the Dionne lab said, “We benefit tremendously from having access to one of the best microscope facilities in the world. Without these specific tools, we wouldn’t be able to introduce hydrogen gas or cool down our samples enough to see these processes take place.”
Aside from being a widely applicable proof of concept for this suite of visualization techniques, watching the atoms move provides greater validation for the high hopes many scientists have for nanoparticle energy storage technologies.
The researchers saw the atoms move in through the corners of the nanocube and observed the formation of various imperfections within the particle as hydrogen moved within it. That might sound like an argument against the promise of nanoparticles, but it’s not the whole story.
“The nanoparticle has the ability to self-heal,” said Dionne. “When you first introduce hydrogen, the particle deforms and loses its perfect crystallinity. But once the particle has absorbed as much hydrogen as it can, it transforms itself back to a perfect crystal again.”
The researchers describe this as imperfections being “pushed out” of the nanoparticle. This ability of the nanocube to self-heal makes it more durable, a key property needed for energy storage materials that can sustain many charge and discharge cycles.
As the efficiency of renewable energy generation increases, the need for higher quality energy storage is more pressing than ever. The future of storage will likely rely on new chemistries, and the findings of this research, including the microscopy techniques the researchers refined along the way, will apply to nearly any of them.
For its part, the Dionne lab has many directions it can go from here. The team could look at a variety of material compositions, or compare how the sizes and shapes of nanoparticles affect the way they work, and, soon, take advantage of new upgrades to their microscope to study light-driven reactions. At present, Hayee has moved on to experimenting with nanorods, which have more surface area for the ions to move through, promising potentially even faster kinetics.
For humans, the ability to see is critical. Seeing feeds understanding, understanding leads to insight, and insight leads to more innovation. Go Stanford!
Pacific Northwest National Laboratory (PNNL) scientists have developed a new streamlined process that could quickly pare down the stack of algae species into just a few that hold the most promise for making biofuel. The new, approximately $6-million collaborative project is using a unique climate-simulating laboratory system.
A dozen glass cylinders containing a potential payload of bright green algae are exposed to hundreds of multi-colored lights, which provide all of sunlight’s natural hues. The tiny LEDs brighten and dim to mimic the outdoors’ constantly changing conditions. To simulate a cloud passing overhead, chillers kick in and nudge the algae a little cooler.
Discovering which algae species is best suited to make biofuel is no small task. Researchers have tried to evaluate algae in test tubes, but lab results often fail to mirror what happens when the green goo is grown in outdoor ponds.
The Algae DISCOVR Project – short for Development of Integrated Screening, Cultivar Optimization and Validation Research – is trying out a new approach that could reduce the cost and the time needed to move promising algal strains from the laboratory and into production. At the end of the three-year pilot project, scientists hope to identify four promising strains from at least 30 initial candidates.
Project lead researcher Michael Huesemann of the Department of Energy’s Pacific Northwest National Laboratory said, “Algae biofuel is a promising clean energy technology, but the current production methods are costly and limit its use. The price of biofuel is largely tied to growth rates. Our method could help developers find the most productive algae strains more quickly and efficiently.”
The project started this fall and is led by PNNL, out of its Marine Sciences Laboratory in Sequim, Washington. The project team includes three other DOE labs – Los Alamos National Laboratory, National Renewable Energy Laboratory and Sandia National Laboratories – as well as Arizona State University’s Arizona Center for Algae Technology and Innovation.
The project’s early work relies on PNNL’s Laboratory Environmental Algae Pond Simulator mini-photobioreactors, also known as LEAPS. The system mimics the frequently shifting water temperatures and lighting conditions that occur in outdoor ponds at any given place on earth.
The system consists of glass column photobioreactors that act like small ponds and are placed in rows to allow scientists to simultaneously grow multiple different types of algae strains. Each row of LEAPS mini-photobioreactors is exposed to unique temperature and lighting regimens thanks to heaters, chillers and heat exchangers, as well as colored lights simulating the sunlight spectrum – all of which can be changed every second.
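The climate-replay idea behind the LEAPS columns can be sketched in a few lines: drive the LEDs with a daylight curve and let the temperature controller react when the light dips. This is a minimal illustrative sketch, not PNNL's control software; the half-sine daylight profile, the cloud timing, and the 30 percent shading factor are all assumptions.

```python
import math

DAY_MIN = 720  # minutes of daylight in this toy profile

def clear_sky(minute):
    """Half-sine clear-sky daylight curve, 0..1 relative intensity."""
    return max(0.0, math.sin(math.pi * minute / DAY_MIN))

def irradiance(minute, cloud_start=300, cloud_len=30):
    """Clear-sky curve with a passing cloud that cuts light to 30 percent."""
    level = clear_sky(minute)
    if cloud_start <= minute < cloud_start + cloud_len:
        level *= 0.3  # cloud shadow
    return level

def controller(minute):
    """LED level tracks irradiance; the chiller kicks in while the
    simulated cloud shades the column, nudging the culture cooler."""
    light = irradiance(minute)
    chiller_on = light < 0.5 * clear_sky(minute)  # sharp dip => cloud
    return light, chiller_on

# At minute 310 the simulated cloud is overhead: lights dim, chiller runs.
light_level, chiller = controller(310)
```

A real system would replay recorded weather-station data for a chosen site rather than a synthetic curve, but the control loop has the same shape.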
The first phase of the team’s multi-step screening process uses PNNL’s photobioreactors to cultivate all 30 strains under consideration and evaluate their growth rates. Algae strains with suitable growth will be studied further to measure their oil, protein and carbohydrate content, all of which could be used to make biofuels. The algae will also be tested for valuable co-products such as the food dye phycocyanin, which could make algae biofuel production more cost-effective. The first phase will also involve evaluating how resistant strains are to harmful bacteria and predators that can kill algae.
Next, the team will look for strains that produce 20 percent more biomass, or organic matter used to make biofuel, than two well-studied algae strains. The top-performing strains will then be sorted to find individual cells best suited for biofuel production, such as those that contain more oil. Those strains will also be exposed to various stresses to encourage rapid evolution so they can, for example, survive in the higher temperatures outdoor ponds experience in the summer.
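The staged down-selection described above reads like a filter chain: grow all 30 strains, keep the ones that grow well, then keep the ones beating the reference biomass by 20 percent. A schematic sketch follows; the strain names, measurements, and the reference biomass value are invented for illustration, since the project's actual data are not in this article.

```python
# Illustrative sketch of the DISCOVR-style staged screen.
# Strain names, numbers and the reference biomass value are invented.
strains = {
    "strain_" + str(i): {"growth_ok": i % 2 == 0, "biomass": 80 + i}
    for i in range(30)
}
REFERENCE_BIOMASS = 90  # stand-in for the two well-studied benchmark strains

# Phase 1: keep strains with suitable growth in the LEAPS photobioreactors.
phase1 = {k: v for k, v in strains.items() if v["growth_ok"]}

# Phase 2: keep strains producing at least 20 percent more biomass
# than the reference, per the project's stated cutoff.
phase2 = {k: v for k, v in phase1.items()
          if v["biomass"] >= 1.2 * REFERENCE_BIOMASS}
```

Later phases (cell sorting, stress-driven evolution, outdoor ponds) would narrow the survivors further toward the four target strains.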
After passing those tests, the remaining strains will be grown in large outdoor ponds in Arizona. Researchers will examine how algae growth in the outdoor ponds compares with the algal biomass output predicted in earlier steps. Biomass will also be harvested from outdoor-grown algae for future studies.
Finally, the team will further study the final algae strains that fare best outdoors to understand how fast they grow in different lighting and temperature conditions. That data will then be entered into PNNL’s Biomass Assessment Tool, which uses detailed data from weather stations and other sources to identify the best possible locations to grow algae. The tool will crunch numbers to help the team generate maps that illustrate the expected biomass productivity of each algae species grown in outdoor ponds at any location in the U.S.
Data and strains will be made public in the hopes that algae companies and other researchers will consider growing the most productive strains identified by the project.
Potential future work not included in the current project could include converting harvested algae into biofuels, examining operational changes such as crop rotation to further increase biomass growth, and assessing the technical feasibility and economic costs of making biofuel from algae selected through this process.
This kind of environmental research is needed, especially for outdoor production. Algae has great potential, but the enthusiasm has greatly exceeded the science. Now, finally, some basic research is underway. The algae industry has legs, short ones, and this work is sure to grow ’em out some more.
Technical University of Denmark (DTU) scientists have engineered E. coli cells to produce large quantities of serine. Serine is used in detergents, tube feeding formula, and as a building block for many important chemicals. In fact, serine has been mentioned as one of the 30 most promising biological substances to replace chemicals from the oil industry, if production costs can be reduced.
If you ran a company that manufactured valuable ingredients for products like detergents or paint, you would probably like to produce those ingredients in large quantities, sustainably, and at low cost. Serine is an amino acid important to humans because it is one of the 20 amino acids that form the proteins in our bodies. Being highly water soluble, serine also finds application as a moisturizer in lotions in the pharmaceutical and cosmetic industries.
Professor Alex Toftgaard Nielsen from DTU Biosustain, The Novo Nordisk Foundation Center for Biosustainability, said, “This discovery is quite unique and proves that we can actually adapt cells to tolerate large amounts of serine – something many people thought wasn’t possible. In order to develop these cells, we used highly specialized robots that exist only at our Center in Denmark and in the US.”
There is a huge market for serine in the chemical industry, because it can be converted into other chemicals such as plastics, detergents, dietary supplements and a variety of other products. Fermentation by bacteria is the most common method of producing amino acids. However, serine is toxic to the laboratory workhorse E. coli, which quickly “gives up” if the bacterium is made to produce large amounts of the substance.
The first step in the development process was to produce E. coli cells that could survive high concentrations of serine. To achieve this, the scientists used so-called automated ‘Adaptive Laboratory Evolution’ (ALE) in which they first exposed the cells to a small amount of serine. When the cells had grown accustomed to these conditions, the bacteria were transferred to a slightly higher concentration. The experiment was repeated several times with the cells best suited to tolerate serine.
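The serial-transfer logic of ALE can be sketched as a simple loop: grow the culture at one serine concentration and, once the population tolerates it, step the concentration up and reseed with the best-adapted cells. This is a conceptual sketch with invented numbers (starting dose, step factor, target), not DTU's actual robot protocol.

```python
def ale_schedule(start=1.0, step=1.5, target=50.0):
    """Return the serine concentrations (g/L, invented values) used in
    successive rounds of adaptive laboratory evolution. Each entry
    represents one grow-and-transfer cycle: the cells best suited to
    tolerate serine are moved to a slightly higher concentration."""
    conc = start
    rounds = []
    while conc < target:
        rounds.append(round(conc, 2))
        conc *= step  # step up once the culture grows at this level
    return rounds

# Each listed concentration is one growth-and-transfer cycle.
schedule = ale_schedule()
```

In the real experiment the step size and timing are set by monitoring growth around the clock, which is why the robots described below are essential.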
The experiment required highly specialized robots, as lead author of the study Hemanshu Mundhada from DTU Biosustain explained: “Cell growth must be monitored 24 hours a day, and the cells must be transferred to new medium at a certain time of growth. Moreover, we have so many samples, it would be almost impossible to monitor all the cells manually. Therefore, it is crucial that we use ALE robots.”
The tolerant E. coli cells were subsequently optimized genetically to produce serine, and in this way they could suddenly produce 250 to 300 grams of serine for each kilogram of sugar (glucose) added – the highest yield ever reported for serine.
Today, serine is already produced in other microbes by converting glycine and methanol. But these microbes must first be grown in large quantities, after which the glycine – which is also chemically produced – is then added. Glycine is relatively expensive, and therefore many are looking for cheaper and more sustainable production methods.
Mundhada said, “We have shown that our E. coli cells can use regular sugar and even residues from sugar production, molasses, in lower concentrations. And we have seen promising results with less expensive sugars, which makes it even more attractive to produce serine in E. coli.”
The research team is now working to establish a company which will be responsible for producing serine on a larger scale.
“The goal is to make this cell line useful for society. And the best way to do that, is by getting a company to further develop and commercialize our results,” said Toftgaard Nielsen.
This is a great example of how human insight and innovation can really change the future. This research shows very favorable fundamentals and should get to market with major advantages. There’s still a way to go, from extraction out of the E. coli broth to the preparation steps, but this looks very, very good indeed.
The theory behind this kind of heat storage is fairly straightforward: if you pour water into a beaker containing solid or concentrated sodium hydroxide (NaOH), the mixture heats up. The dilution is exothermic: chemical energy is released in the form of heat. Moreover, the sodium hydroxide solution is highly hygroscopic and able to absorb water vapor. The heat of condensation obtained as a result warms up the sodium hydroxide solution even more.
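A rough back-of-the-envelope calculation shows how much heat the beaker experiment involves. The sketch below uses the enthalpy of solution of solid NaOH, about -44.5 kJ/mol, as an assumed textbook value; diluting an already-concentrated solution releases less than dissolving the solid, so treat this as an upper-bound illustration.

```python
# Back-of-the-envelope: heat released dissolving solid NaOH in water.
# The 44.5 kJ/mol enthalpy of solution is an assumed textbook value;
# diluting an already-concentrated solution releases less than this.
M_NAOH = 40.0          # g/mol, molar mass of NaOH
H_SOLUTION = 44.5      # kJ/mol released on dissolution (magnitude)
C_WATER = 4.18         # kJ/(kg*K), specific heat of water

mass_naoh = 1.0                        # kg of solid NaOH
moles = mass_naoh * 1000 / M_NAOH      # 25 mol
heat_kj = moles * H_SOLUTION           # about 1,100 kJ

# If all of that heat went into 5 kg of water, the temperature rise:
delta_t = heat_kj / (5.0 * C_WATER)    # roughly 53 K
```

Per kilogram of NaOH, that is on the order of a megajoule, which is why the concentration cycle described below is worth engineering into a seasonal store.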
We are still far from a sustainable energy supply. In Switzerland during 2014, 71 percent of all privately-owned apartments and houses were heated with fossil fuels, and 60 percent of the hot water consumed in private households was generated the same way. In other words, a considerable amount of fossil energy could be saved if we were able to store heat from sunny summer days until winter time and retrieve it at the flick of a switch.
Materials capable of storing heat include those, such as bricks or concrete, that slowly release stored heat, and others, such as water or ethylene glycol, that take in heat when they transform from a solid to a liquid. However, none of these materials can store heat over a long period; they naturally release it over time. A material that could store heat energy for a long time and release it exactly when desired would be a boon for the field of renewable energy.
Is there a way to do this? It certainly looks like the Swiss have it. Since autumn 2016, following several years of research, Empa has had a lab-scale plant in operation that works reliably and can store heat for the long term. But the road to get there was long and winding.
When energy is fed into a dilute sodium hydroxide solution in the form of heat, water evaporates and the solution becomes more concentrated, thereby storing the supplied energy. The solution can be kept for months, even years, or transported in tanks. When it comes into contact with water (vapor) again, the stored heat is re-released.
That’s how the theory works. But could the beaker experiment be replicated on a scale capable of storing enough energy for a single-family household? The Empa researchers Robert Weber and Benjamin Fumey rolled up their sleeves and got down to work. They used an insulated sea container as an experimental laboratory on Empa’s campus in Dübendorf – a safety precaution as concentrated sodium hydroxide solution is highly corrosive. If the system were to spring a leak, it would be preferable for the aggressive liquid to slosh through the container instead of Empa’s laboratory building.
Unfortunately, the first prototype didn’t work as anticipated. The researchers had opted for a falling film evaporator – a system used in the food industry to concentrate liquids, most commonly orange juice. Instead of flowing evenly around the heat exchanger, however, the thick sodium hydroxide solution formed large drops. It absorbed too little water vapor, and the amount of heat transferred remained too low.
Then Fumey had a brainstorm: the viscous storage medium should trickle along a pipe in a spiral, absorb water vapor on the way and transfer the generated heat to the pipe. The reverse – charging the medium – should be possible using the same technique, only the other way round. The idea worked. Best of all, spiral-shaped heat exchangers are already off-the-shelf components, the same ones used in continuous-flow water heaters.
Fumey then optimized the lab system further by asking: Which swings in NaOH concentration are optimal for efficiency? What temperatures should the inflowing and outflowing water have? Water vapor at a temperature of five to ten degrees Celsius is required to drain the store; it can be produced with heat from a geothermal probe, for example.
During the heat recovery process, a 50-percent sodium hydroxide solution runs down the outside of the spiral heat exchanger pipe and is thinned to 30 percent in the steam atmosphere. The water inside the pipe heats up to around 50 °C – just about right for radiant floor heating.
To recharge the heat inventory, the 30-percent, “discharged” sodium hydroxide solution again trickles down around the spiral heat exchanger pipe, but now 60 °C water, which could be produced by a solar collector, flows inside the pipe. The water in the sodium hydroxide solution evaporates; the vapor is removed and condensed. The heat of condensation is conducted into a geothermal probe, where it is stored. The sodium hydroxide solution that leaves the heat exchanger after recharging is concentrated back up to 50 percent, i.e. “charged” with thermal energy.
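The concentrations quoted in the article imply a simple mass balance for one discharge step: the NaOH itself stays in the loop, so going from a 50 percent to a 30 percent solution fixes how much water vapor each kilogram of "charged" solution can absorb. A quick check:

```python
# Mass balance for one discharge step, using the article's concentrations.
charged = 1.0        # kg of 50 percent "charged" NaOH solution
w_charged = 0.50     # NaOH mass fraction when charged
w_discharged = 0.30  # NaOH mass fraction after absorbing vapor

naoh = charged * w_charged             # 0.5 kg NaOH, unchanged by the cycle
discharged = naoh / w_discharged       # total mass at 30 percent: ~1.667 kg
water_absorbed = discharged - charged  # ~0.667 kg vapor per kg of solution
```

So every kilogram of charged solution takes up roughly two-thirds of a kilogram of water vapor on discharge, and the recharge step must evaporate the same amount back out.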
Fumey said, “This method enables solar energy to be stored in the form of chemical energy from the summer until the wintertime, and that’s not all. The stored heat can also be transported elsewhere in the form of concentrated sodium hydroxide solution, which makes it flexible to use.”
The search for industrial partners to help build a compact household system on the basis of the Empa lab model has now begun. The next prototype of the sodium hydroxide storage system could then be used in Switzerland’s NEST building innovation project.
There have been many attempts at heat storage over the years, most too vulnerable to heat loss to justify the investment. Here the Swiss have a fully static storage system. This concept might well get to market. There are many questions yet to ask and answer, but this looks very good indeed.
NuScale Power is a step closer to seeing its nuclear reactor built in eastern Idaho. Having completed a 12,000-page design application for the federal regulator, the Nuclear Regulatory Commission, the firm expects to deliver the document describing the first-of-its-kind small modular reactor design tomorrow, Thursday, in Washington, D.C.
NuScale Power is one of several companies researching small modular reactors, but none of the others has made it this far in the expensive planning, testing and application process. Company officials plan a formal announcement of the milestone later Thursday in the capital. Lynn Orr, undersecretary for science and energy, is scheduled to speak.
Mike McGough, NuScale’s chief commercial officer, said finishing the document is the biggest accomplishment for the company since winning $217 million in matching funds from the Department of Energy in 2013 to accelerate the reactor’s development. Under the agreement with the DOE, NuScale was to finalize the design application and submit it to the NRC by late 2016.
The Oregon-based company finished the document just in time, holding a last-minute afternoon meeting Dec. 31 and signing the application before midnight, McGough explained. The NRC application review process is expected to take more than three years and cost NuScale $45 million.
This kind of documentation costs tens of millions of dollars to produce and requires paying NRC officials $258 per hour to review it and provide feedback at various stages of completion, McGough explained. NuScale took out an NRC “project number” – indicating it wanted to move forward with the regulatory design certification process – in 2008. Some 800 people from NuScale and its partners pitched in along the way.
Advanced modular nuclear reactors could restart the atomic age by providing cheap, meltdown-resistant nuclear power that produces far less waste. Reactor concepts along these lines were originally developed at the Department of Energy’s Oak Ridge National Laboratory, but were abandoned because they couldn’t be used by the military.
Small modular reactors could be a massive contributor to the power generation market. These reactor designs have the potential to be much cheaper than conventional reactors because they would be completely manufactured in a factory. The format would also require far less up-front investment, making the reactors cost competitive with natural gas. Their smaller size suits distributed power installations, making them useful for base load power in dense demand areas as well as better able to power remote areas.
The aforementioned enormous regulatory expense, combined with policies intended to support wind and solar power, makes it incredibly difficult to profitably operate a nuclear power plant, according to a study published in October by the free-market R Street Institute.
An American Action Forum report showed conventional U.S. nuclear plants spend an estimated $4.2 million each every year to meet government paperwork requirements and another $4.4 million to pay government-mandated security staff. In addition to paperwork requirement costs, the average plant spends approximately $14 million on various government fees.
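Summing the American Action Forum figures gives a sense of the total annual regulatory burden per conventional plant:

```python
# Annual regulatory burden per conventional U.S. nuclear plant,
# summing the American Action Forum figures quoted above ($ millions).
paperwork = 4.2        # government paperwork requirements
security_staff = 4.4   # government-mandated security staff
government_fees = 14.0 # various government fees

total = paperwork + security_staff + government_fees  # $22.6 million/year
```

That is roughly $22.6 million a year per plant before a single kilowatt-hour is sold, which frames the hope expressed below that smaller reactors can escape some of this overhead.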
One hopes that the advanced modular reactors will short-circuit these regulations, getting very low-cost power onto the market. More, better, cheaper electrical power is far more important to the national economy than any other input.