Rice University researchers have produced triboelectric nanogenerators with laser-induced graphene. The flexible devices turn movement into electrical energy and could enable wearable, self-powered sensors and devices.

Wearable devices that harvest energy from movement are not a new idea, but a material created at Rice University may make them more practical.

The Rice lab of chemist James Tour has adapted laser-induced graphene (LIG) into small, metal-free devices that generate electricity. Like rubbing a balloon on hair, putting LIG composites in contact with other surfaces produces static electricity that can be used to power devices.

For that, thank the triboelectric effect, by which materials gather a charge through contact. When they are put together and then pulled apart, surface charges build up that can be channeled toward power generation.

In experiments, the researchers connected a folded strip of LIG to a string of light-emitting diodes and found that tapping the strip produced enough energy to make them flash. A larger piece of LIG embedded within a flip-flop let a wearer generate energy with every step, as the graphene composite’s repeated contact with skin produced a current to charge a small capacitor.

Tour said, “This could be a way to recharge small devices just by using the excess energy of heel strikes during walking, or swinging arm movements against the torso.”

The research paper has been published in the American Chemical Society journal ACS Nano.

LIG is a graphene foam produced when a laser heats the surface of a polymer or other carbon-containing material, leaving only interconnected flakes of two-dimensional carbon. The lab first made LIG on common polyimide, but has since extended the technique to plants, food, treated paper and wood.

The lab turned polyimide, cork and other materials into LIG electrodes to see how well they produced energy and stood up to wear and tear. They got the best results from materials on the opposite ends of the triboelectric series, which quantifies their ability to generate static charge by contact electrification.

In the folding configuration, LIG from the tribo-negative polyimide was sprayed with a protective coating of polyurethane, which also served as the tribo-positive material. When the electrodes were brought together, electrons transferred from the polyurethane to the polyimide. Subsequent contact and separation drove charges through an external circuit, where they could be stored, to rebalance the built-up static charge. The folding LIG generated about 1 kilovolt and remained stable after 5,000 bending cycles.

The best configuration, with electrodes of the polyimide-LIG composite and aluminum, produced voltages above 3.5 kilovolts with a peak power of more than 8 milliwatts.

Rice postdoctoral researcher Michael Stanford, lead author of the paper, said, “The nanogenerator embedded within a flip-flop was able to store 0.22 millijoules of electrical energy on a capacitor after a 1-kilometer walk. This rate of energy storage is enough to power wearable sensors and electronics with human movement.”
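Stanford’s figure can be put in perspective with some quick arithmetic. The 0.22 millijoules and the 1-kilometer walk come from the article; the walking pace (12 minutes per kilometer) and the capacitor size (47 microfarads) below are illustrative assumptions, not values from the paper.

```python
# Back-of-envelope check of the flip-flop nanogenerator figures.
# 0.22 mJ and 1 km are from the article; the 12 min/km pace and the
# 47 uF capacitor are illustrative assumptions.

energy_j = 0.22e-3           # 0.22 millijoules stored on the capacitor
walk_time_s = 12 * 60        # assumed ~12 minutes to walk 1 km

avg_power_w = energy_j / walk_time_s
print(f"Average stored power: {avg_power_w * 1e9:.0f} nW")   # ~306 nW

# Voltage reached on a hypothetical 47 uF capacitor: E = (1/2) C V^2
cap_f = 47e-6
voltage_v = (2 * energy_j / cap_f) ** 0.5
print(f"Voltage on a 47 uF capacitor: {voltage_v:.2f} V")    # ~3.06 V
```

At a few hundred nanowatts of average harvested power, such a device would suit heavily duty-cycled sensors that wake briefly, take a reading and sleep, rather than continuous loads.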

Co-authors of the paper are Rice graduate students Yieu Chyan and Zhe Wang and undergraduate students John Li and Winston Wang. Tour is the T.T. and W.F. Chao Chair in Chemistry as well as a professor of computer science and of materials science and nanoengineering at Rice.

The 5,000-cycle result is encouraging. While that cycle count is probably not yet enough for a shoe application, it suggests the approach has considerable room to grow.

One doesn’t expect this work to affect grid-level load – until one recalls how much current today’s device chargers collectively draw, and adds the prospect of not needing batteries at all. When such harvesters number in the millions, the effort will be well worth it.

University of Illinois at Urbana-Champaign chemists have successfully produced fuels using water, carbon dioxide and visible light through artificial photosynthesis.

By converting carbon dioxide into more complex molecules like propane, green energy technology is now one step closer to using excess carbon dioxide to store solar energy – in the form of chemical bonds – for use when the sun is not shining and in times of peak demand.

Plants use sunlight to drive chemical reactions between water and CO2, creating and storing solar energy in the form of energy-dense glucose. In their new study, the researchers developed an artificial process that uses the same green portion of the visible light spectrum that plants use during natural photosynthesis to convert CO2 and water into fuel, with electron-rich gold nanoparticles serving as the catalyst.

The new findings are published in the journal Nature Communications.

Jain, left, and Yu performing artificial photosynthesis experiments using green light. Photo by Fred Zwicky, University of Illinois at Urbana-Champaign.

Prashant Jain, a chemistry professor and co-author of the study, said, “The goal here is to produce complex, liquefiable hydrocarbons from excess CO2 and other sustainable resources such as sunlight. Liquid fuels are ideal because they are easier, safer and more economical to transport than gas and, because they are made from long-chain molecules, contain more bonds – meaning they pack energy more densely.”
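Jain’s point about liquid fuels packing energy densely can be illustrated with widely cited textbook ballpark figures for heating values and storage densities (the specific numbers below are approximations for illustration, not values from the study):

```python
# Approximate lower heating values and as-stored densities.
# Textbook ballpark figures for illustration, not from the study.
fuels = {
    # name: (gravimetric MJ/kg, as-stored density in kg/L)
    "liquid propane":     (46.4, 0.493),   # LPG under modest pressure
    "hydrogen (700 bar)": (120.0, 0.042),  # compressed gas
}

for name, (mj_per_kg, kg_per_l) in fuels.items():
    print(f"{name}: {mj_per_kg} MJ/kg, {mj_per_kg * kg_per_l:.1f} MJ/L")

propane_mj_per_l = 46.4 * 0.493    # ~22.9 MJ/L
hydrogen_mj_per_l = 120.0 * 0.042  # ~5.0 MJ/L
```

Hydrogen wins handily per kilogram but loses badly per liter, which is why liquefiable hydrocarbons like propane remain attractive for transport and storage.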

In Jain’s lab, Sungju Yu, a postdoctoral researcher and first author of the study, uses metal catalysts to absorb green light and transfer electrons and protons needed for chemical reactions between CO2 and water – filling the role of the pigment chlorophyll in natural photosynthesis.

Gold nanoparticles work particularly well as a catalyst, Jain said, because their surfaces interact favorably with the CO2 molecules, are efficient at absorbing light and do not break down or degrade like other metals that can tarnish easily.

There are several ways in which the energy stored in the bonds of a hydrocarbon fuel can be freed. However, the easy conventional method of combustion ends up producing more CO2 – which is counterproductive to the notion of harvesting and storing solar energy in the first place, Jain noted.

“There are other, more unconventional potential uses from the hydrocarbons created from this process,” he said. “They could be used to power fuel cells for producing electrical current and voltage. There are labs across the world trying to figure out how the hydrocarbon-to-electricity conversion can be conducted efficiently.”

As exciting as the development of this CO2-to-liquid fuel may be for green energy technology, the researchers acknowledge that Jain’s artificial photosynthesis process is nowhere near as efficient as it is in plants.

“We need to learn how to tune the catalyst to increase the efficiency of the chemical reactions,” he said. “Then we can start the hard work of determining how to go about scaling up the process. And, like any unconventional energy technology, there will be many economic feasibility questions to be answered, as well.”

One day the world’s economy will participate in the short-term planetary carbon cycle, and this work gets us closer. Carbon fuels are inexpensive for consumers to adopt, since most of the devices that burn them are already in place. Even if the only end product is propane, a little pressure liquefies it, and it has a long history of uses, from fueling automobiles to heating homes.

As for efficiency, note that plants on the whole are not amazingly efficient, and for all the attention paid to atmospheric CO2, its share of the atmosphere is quite small. But it’s not hard to imagine today’s oil companies harvesting propane and other products from solar farms.

Brookhaven National Laboratory scientists have developed a highly efficient catalyst for extracting electrical energy from ethanol. Ethanol is an easy-to-store liquid fuel made from renewable resources.

The catalyst, described in the Journal of the American Chemical Society, steers the electro-oxidation of ethanol down an ideal chemical pathway that releases the liquid fuel’s full potential of stored electrical energy.

Jia Wang, the Brookhaven Lab chemist who led the work said, “This catalyst is a game changer that will enable the use of ethanol fuel cells as a promising high-energy-density source of ‘off-the-grid’ electrical power.” He noted that one particularly promising application is liquid fuel-cell powered drones.

A close-up of the platinum/iridium (green/blue) shell over a gold nanoparticle core (yellow), showing how this catalyst cleaves the carbon-carbon (gray) bonds in ethanol while initially leaving hydrogen atoms attached. The hydrogen protects the carbon in the early stages of the reaction, preventing the formation of catalyst-poisoning carbon monoxide, which enables complete oxidation and the release of 12 electrons. Image Credit: Brookhaven National Lab.

“Ethanol fuel cells are lightweight compared to batteries. They would provide sufficient power for operating drones using a liquid fuel that’s easy to refill between flights – even in remote locations,” he said.

Much of ethanol’s potential power is locked up in the carbon-carbon bonds that form the backbone of the molecule. The catalyst developed by Wang’s group reveals that breaking those bonds at the right time is the key to unlocking that stored energy.

“Electro-oxidation of ethanol can produce 12 electrons per molecule,” Wang said. “But the reaction can progress by following many different pathways.”

Most of these pathways result in incomplete oxidation: The catalysts leave carbon-carbon bonds intact, releasing fewer electrons. They also strip off hydrogen atoms early in the process, exposing carbon atoms to the formation of carbon monoxide, which “poisons” the catalyst’s ability to function over time.

Wang explained, “The 12-electron full oxidation of ethanol requires breaking the carbon-carbon bond at the beginning of the process, while hydrogen atoms are still attached, because the hydrogen protects the carbon and prevents the formation of carbon monoxide.” Then, multiple steps of dehydrogenation and oxidation are needed to complete the process.
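The 12-electron count follows from the standard full-oxidation half-reaction, C2H5OH + 3 H2O → 2 CO2 + 12 H+ + 12 e−. A short script can confirm the atom and charge bookkeeping (the half-reaction is textbook electrochemistry, not something specific to this paper):

```python
# Verify atom and charge balance for the 12-electron full
# electro-oxidation of ethanol: C2H5OH + 3 H2O -> 2 CO2 + 12 H+ + 12 e-
# Each species: (stoichiometric coefficient, {element: count}, charge).

reactants = [
    (1, {"C": 2, "H": 6, "O": 1}, 0),   # ethanol, C2H5OH
    (3, {"H": 2, "O": 1}, 0),           # water
]
products = [
    (2, {"C": 1, "O": 2}, 0),           # carbon dioxide
    (12, {"H": 1}, +1),                 # protons
    (12, {}, -1),                       # electrons
]

def totals(side):
    """Sum atoms and net charge over one side of the reaction."""
    atoms, charge = {}, 0
    for coeff, formula, q in side:
        charge += coeff * q
        for elem, n in formula.items():
            atoms[elem] = atoms.get(elem, 0) + coeff * n
    return atoms, charge

left, right = totals(reactants), totals(products)
print("atoms balanced: ", left[0] == right[0])    # True
print("charge balanced:", left[1] == right[1])    # True
```

The incomplete pathways Wang describes stop short of this bookkeeping, leaving a carbon-carbon bond intact and surrendering fewer than 12 electrons per molecule.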

The new catalyst – which combines reactive elements in a unique core-shell structure that Brookhaven scientists have been exploring for a range of catalytic reactions – speeds up all of these steps.

To make the catalyst, Jingyi Chen of the University of Arkansas, who was a visiting scientist at Brookhaven during part of this project, developed a synthesis method to co-deposit platinum and iridium on gold nanoparticles. The platinum and iridium form “monoatomic islands” across the surface of the gold nanoparticles. That arrangement, Chen noted, is the key that accounts for the catalyst’s outstanding performance.

“The gold nanoparticle cores induce tensile strain in the platinum-iridium monoatomic islands, which increases those elements’ ability to cleave the carbon-carbon bonds, and then strip away its hydrogen atoms,” she said.

Zhixiu Liang, a Stony Brook University graduate student and the first author of the paper, performed studies in Wang’s lab to understand how the catalyst achieves its record-high energy conversion efficiency. He used “in situ infrared reflection-absorption spectroscopy” to identify the reaction intermediates and products, comparing those produced by the new catalyst with reactions using a gold-core/platinum-shell catalyst and also a platinum-iridium alloy catalyst.

“By measuring the spectra produced when the infrared light is absorbed at different steps in the reaction, this method allows us to track, at each step, what species have been formed and how much of each product,” Liang said. “The spectra revealed that the new catalyst steers ethanol toward the 12-electron full oxidation pathway, releasing the fuel’s full potential of stored energy.”

The next step, Wang noted, is to engineer devices that incorporate the new catalyst.

The mechanistic details revealed by this study may also help guide the rational design of future multicomponent catalysts for other applications.

In addition to the details described here, the scientists used the Inner Shell Spectroscopy (ISS) beamline at the National Synchrotron Light Source II (NSLS-II) – a DOE Office of Science User Facility – to characterize the relative amounts of each element in the catalyst samples. The paper’s additional co-authors are: Liang Song and Radoslav R. Adzic of Brookhaven Lab’s Chemistry Division, Shiqing Deng and Yimei Zhu of the Lab’s Condensed Matter Physics and Materials Science Division, and Eli Stavitski of NSLS-II.

Obviously this is a breakout development worthy of significant attention. The catch is the cost of platinum, gold and iridium before processing even begins. Perhaps, though, the success here will stimulate more multicomponent catalyst research, and one day there may be a cost-efficient solution.

For now, though, the cost per watt is an undiscussed and unknown metric. What is certain is that wide deployment would require more platinum mines and a tremendous reduction in platinum’s price.

University of Chicago researchers, working with an international team of scientists, have discovered superconductivity – the ability to conduct electricity perfectly – at the highest temperatures ever recorded.

Using advanced technology at the University of Chicago-affiliated Argonne National Laboratory, the team studied a class of materials in which they observed superconductivity at temperatures of about minus-23° Celsius (minus-9° Fahrenheit) – a jump of about 50 degrees compared to the previous confirmed record.

Scientists bombarded a sample of a new superconducting material (center) with X-rays to study its structure at the Advanced Photon Source. Image Credit: Courtesy of Drozdov et al via the University of Chicago.

Though the superconductivity happened under extremely high pressure, the result still represents a big step toward creating superconductivity at room temperature – the ultimate goal for scientists to be able to use this phenomenon for advanced technologies.

The results have been published in the journal Nature. Vitali Prakapenka, a research professor at the University of Chicago, and Eran Greenberg, a postdoctoral scholar at the University of Chicago, are co-authors of the research.

Just as a copper wire conducts electricity better than a rubber tube, certain kinds of materials are better at becoming superconductive, a state defined by two main properties: The material offers zero resistance to electrical current and cannot be penetrated by magnetic fields. The potential uses for this are as vast as they are exciting: electrical wires without diminishing currents, extremely fast supercomputers and efficient magnetic levitation trains.

But so far scientists have only been able to create superconducting materials when they are cooled to extremely cold temperatures – initially, minus-240° Celsius and more recently about minus-73° Celsius. Since such cooling is expensive, it has limited their applications in the world at large.

Recent theoretical predictions have shown that a new class of materials of superconducting hydrides could pave the way for higher-temperature superconductivity. Researchers at the Max Planck Institute for Chemistry in Germany teamed up with University of Chicago researchers to create one of these materials, called lanthanum superhydrides, test its superconductivity, and determine its structure and composition.

The only catch was that the material needed to be placed under extremely high pressure – between 150 and 170 gigapascals, about one and a half million times the pressure at sea level. Only under these high-pressure conditions did the material – a tiny sample only a few microns across – exhibit superconductivity at the new record temperature.
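The pressure and temperature figures above can be cross-checked with simple unit conversions:

```python
# Sanity-check the article's figures with standard unit conversions.
ATM_PA = 101_325  # one standard atmosphere, in pascals

low_atm = 150e9 / ATM_PA    # ~1.48 million atmospheres
high_atm = 170e9 / ATM_PA   # ~1.68 million atmospheres
print(f"150 GPa ≈ {low_atm / 1e6:.2f} million atm, "
      f"170 GPa ≈ {high_atm / 1e6:.2f} million atm")
# Consistent with the article's "one and a half million times" figure.

c = -23
f = c * 9 / 5 + 32
print(f"{c} °C = {f:.1f} °F")   # -9.4 °F, rounded to -9 in the article
```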

In fact, the material showed three of the four characteristics needed to prove superconductivity: It dropped its electrical resistance, decreased its critical temperature under an external magnetic field and showed a temperature change when some elements were replaced with different isotopes. The fourth characteristic, called the Meissner effect, in which the material expels any magnetic field, was not detected. That’s because the material is so small that this effect could not be observed, researchers said.

To analyze the material, the team used the Advanced Photon Source at Argonne National Laboratory, which provides ultra-bright, high-energy X-ray beams that have enabled breakthroughs in everything from better batteries to understanding the Earth’s deep interior. In the experiment, researchers within the University of Chicago’s Center for Advanced Radiation Sources squeezed a tiny sample of the material between two tiny diamonds to exert the needed pressure, then used the beamline’s X-rays to probe its structure and composition.

Because the temperature at which superconductivity appeared falls within the range found in many places in the world, the ultimate goal of room-temperature superconductivity – or at least 0 degrees Celsius – seems within reach.

The team is already continuing to collaborate to find new materials that can create superconductivity under more reasonable conditions.

Professor Prakapenka said, “Our next goal is to reduce the pressure needed to synthesize samples, to bring the critical temperature closer to ambient, and perhaps even create samples that could be synthesized at high pressures, but still superconduct at normal pressures. We are continuing to search for new and interesting compounds that will bring us new, and often unexpected, discoveries.”

This research marks another milestone in the superconductivity hunt, at minus-9° F – a temperature ordinary refrigeration equipment can approach. The pressure requirement remains, but this level of success bodes well for further improvement. With today’s skill sets and instrumentation, more milestones are sure to follow as the needed elements, their assembly and their operating conditions are discovered and understood.

Machine learning can help bring to Earth the clean fusion energy that powers the sun and stars. Researchers are using this form of artificial intelligence to create a model for rapid control of plasma – the state of matter composed of free electrons and atomic nuclei, or ions – that fuels fusion reactions.

Machine learning (ML), a form of artificial intelligence that recognizes faces, understands language and navigates self-driving cars, can help bring to Earth the clean fusion energy that lights the sun and stars.

Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) are using ML to create a model for rapid control of plasma.

The sun and most stars are giant balls of plasma that undergo constant fusion reactions. Here on Earth, scientists must heat and control the plasma to cause the particles to fuse and release their energy. PPPL research shows that ML can facilitate such control.

Researchers led by PPPL physicist Dan Boyer have trained neural networks – the core of ML software – on data produced in the first operational campaign of the National Spherical Torus Experiment-Upgrade (NSTX-U), the flagship fusion facility, or tokamak, at PPPL. The trained model accurately reproduces predictions of the behavior of the energetic particles produced by powerful neutral beam injection (NBI) that is used to fuel NSTX-U plasmas and heat them to million-degree, fusion-relevant temperatures.

These predictions are normally generated by a complex computer code called NUBEAM, which incorporates information about the impact of the beam on the plasma. Such complex calculations must be made hundreds of times per second to analyze the behavior of the plasma during an experiment. But each calculation can take several minutes to run, making the results available to physicists only after an experiment that typically lasts a few seconds is completed.

The new ML software reduces the time needed to accurately predict the behavior of energetic particles to under 150 microseconds – enabling the calculations to be done online during the experiment.
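The speedup comes from replacing an iterative physics code with a trained neural network whose evaluation is just a handful of matrix multiplies. Here is a minimal illustration of that idea – a toy two-layer network with made-up sizes and random stand-in weights, not the actual NSTX-U model or NUBEAM:

```python
import numpy as np

# Toy surrogate model: a small multilayer perceptron maps a vector of
# plasma parameters to predicted beam-driven quantities. The layer
# sizes and random weights are illustrative stand-ins; in practice the
# weights come from training against a database of NUBEAM runs.
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 8, 32, 4   # assumed, illustrative dimensions

W1 = rng.standard_normal((n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_out))
b2 = np.zeros(n_out)

def surrogate(x):
    """One forward pass -- the entire run-time 'calculation'."""
    h = np.tanh(x @ W1 + b1)   # hidden layer
    return h @ W2 + b2         # linear output layer

x = rng.standard_normal(n_in)  # stand-in for a measured plasma state
y = surrogate(x)
print(y.shape)                 # one fixed-cost prediction per call
```

Because the forward pass is a short, fixed sequence of small matrix operations, its cost is constant and tiny – which is what makes sub-150-microsecond evaluation inside a real-time control loop plausible, in contrast to a physics code that iterates for minutes.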

Initial application of the model demonstrated a technique for estimating characteristics of the plasma behavior not directly measured. This technique combines ML predictions with the limited measurements of plasma conditions available in real-time. The combined results will help the real-time plasma control system make more informed decisions about how to adjust beam injection to optimize performance and maintain stability of the plasma – a critical quality for fusion reactions.

The rapid evaluations will also help operators make better-informed adjustments between experiments that are executed every 15-20 minutes during operations. “Accelerated modeling capabilities could show operators how to adjust NBI settings to improve the next experiment,” said Boyer, lead author of a paper in Nuclear Fusion that reports the new model.

Boyer, working with PPPL physicist Stan Kaye, generated a database of NUBEAM calculations for a range of plasma conditions similar to those achieved in experiments during the initial NSTX-U run. Researchers used the database to train a neural network to predict effects of neutral beams on the plasma, such as heating and profiles of the current. Software engineer Keith Erickson then implemented software for evaluating the model on computers used to actively control the experiment to test the calculation time.

Efforts coming up will include development of neural network models tailored to the planned conditions of future NSTX-U campaigns and other fusion facilities. In addition, researchers plan to expand the present modeling approach to enable accelerated predictions of other fusion plasma phenomena. Support for this work comes from the DOE Office of Science.

It may be safe to assume this work applies to the tokamak branch of fusion research. There does seem to be an immense amount of effort involved in trying to keep the plasma in the donut. One still worries that the tokamak design will become so complex and expensive that almost all other power sources will seem cheap by comparison, and will survive, perhaps even prosper, alongside the net output of a fusion power plant.