Physics Newsletter October #2
- physicspulse
- Oct 29, 2024
- 4 min read
Physics Pulse: Physics Newsletter
By: Bhavya Goel - Researcher

Researchers in Japan made progress in improving solar energy technology

Scientists in Japan have made an exciting breakthrough that could improve solar energy technology. They observed a phenomenon called the bulk photovoltaic (BPV) effect in a material called alpha-phase indium selenide (α-In2Se3). Because this effect does not rely on a junction, it is not bound by the Shockley–Queisser limit that caps the efficiency of conventional solar cells. Unlike regular solar cells, which generate current at the junction between two materials, the BPV effect exploits the broken inversion symmetry of certain crystals: when light strikes the material, electrons are driven in a preferred direction throughout its bulk.
The research team confirmed this effect for the first time in α-In2Se3 by building a dedicated device and testing it under light of different frequencies. They found that the material produces what are called "shift currents," which lead to higher efficiency in converting light into electricity. Even more impressive, the conversion efficiency is much higher than that of similar materials, showing great potential for future solar cells and sensitive light detectors.
This discovery could have a big impact on renewable energy by improving the performance of solar panels. The researchers believe their work will help make solar power more efficient and contribute to a carbon-neutral society. This advancement opens up new possibilities for developing better solar cells and energy-harvesting technologies that are key to tackling environmental challenges.
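The Shockley–Queisser limit mentioned above comes from detailed-balance bookkeeping: photons below the bandgap are lost entirely, and photons above it give up their excess energy as heat. A minimal sketch of the "ultimate efficiency" version of this argument can be computed numerically (this treats the sun as a 6000 K blackbody and ignores recombination losses, so the true single-junction limit is lower, around 33%):

```python
import math

KT_SUN_EV = 0.5166  # kT of a ~6000 K blackbody sun, in eV (assumed model)

def photon_flux(x):
    # dimensionless blackbody photon flux density x^2 / (e^x - 1), x = E / kT
    return x * x / math.expm1(x)

def power_density(x):
    # dimensionless blackbody power density x^3 / (e^x - 1)
    return x ** 3 / math.expm1(x)

def integrate(f, a, b, n=20000):
    # simple midpoint rule; fine for these smooth, rapidly decaying integrands
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def ultimate_efficiency(x_gap):
    # every absorbed photon with E >= E_gap contributes exactly E_gap of output
    absorbed = x_gap * integrate(photon_flux, x_gap, 40.0)
    total = integrate(power_density, 1e-6, 40.0)  # ~ pi^4 / 15
    return absorbed / total

# scan dimensionless bandgaps to find the optimum
best_x = max((x / 100 for x in range(50, 500)), key=ultimate_efficiency)
print(f"optimal gap ~ {best_x * KT_SUN_EV:.2f} eV, "
      f"ultimate efficiency ~ {ultimate_efficiency(best_x):.2f}")
```

The scan peaks near a bandgap of roughly 1.1 eV at about 44% efficiency; including the radiative recombination that Shockley and Queisser also accounted for brings this down to roughly a third. The point of the BPV result is that a bulk, junction-free mechanism is not subject to this junction-based bound at all.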
New light-induced material shows powerful potential for quantum applications

Scientists are exploring new materials to help develop advanced quantum technologies, and one promising candidate is a class of materials called perovskites. These materials, which have special electronic properties, could be useful in spintronics—a field that focuses on controlling the "spin" of electrons. Spintronics could revolutionize technologies like quantum sensors and memory devices by using electron spins to store and process information more efficiently than traditional electronics.
In a recent breakthrough, researchers at the U.S. Department of Energy's Argonne National Laboratory and Northern Illinois University discovered how to use light to control electron spins in a perovskite material called methylammonium lead iodide (MAPbI3). This material is already known for its use in solar panels, but scientists now see its potential in quantum technology. By using light to excite electrons in the material, the team created special "excitons"—pairs of an excited electron and the empty space (or "hole") it left behind. The challenge was that these excitons typically only last for a very short time before the electron falls back into the hole, releasing energy.
However, the researchers found a way to make these excitons last much longer by adding a rare earth metal called neodymium to the perovskite. Neodymium's unpaired electrons interact with the excited electrons in the exciton, creating a spin-entangled state. This connection between electrons in different atoms is key for quantum sensing and computing. The researchers hope that by adjusting the neodymium concentration, they could potentially control multiple electron spins, making perovskites a promising material for future quantum devices.
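The exciton described above is loosely hydrogen-like: a bound electron-hole pair whose binding energy can be estimated by rescaling the hydrogen Rydberg with the pair's reduced mass and the crystal's dielectric screening. A back-of-the-envelope sketch, using illustrative parameter values typical of lead-halide perovskites (the reduced mass and dielectric constant below are assumptions, not the paper's measurements):

```python
RYDBERG_EV = 13.606  # hydrogen ground-state binding energy, in eV

def exciton_binding_ev(reduced_mass_ratio, eps_r):
    """Hydrogen-like estimate: E_b = Ry * (mu / m_e) / eps_r^2."""
    return RYDBERG_EV * reduced_mass_ratio / eps_r ** 2

# Illustrative values (assumed):
mu = 0.10   # electron-hole reduced mass, in units of the free-electron mass
eps = 10.0  # effective relative dielectric constant of the crystal
e_b = exciton_binding_ev(mu, eps)
print(f"estimated exciton binding energy ~ {e_b * 1000:.0f} meV")
```

A binding energy of order 10 meV is smaller than room-temperature thermal energy (about 25 meV), which is one reason such excitons are fragile and short-lived—and why strategies like the neodymium doping described above are interesting.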
Brain delays could be a computational advantage, researchers say

Delays in how neurons process signals might seem like a disadvantage, slowing down brain function and making it less efficient compared to electrical systems. Unlike machines that instantly transmit signals, the brain experiences varying delays in signal arrival, forcing neurons to process information over time. To overcome this, the brain doesn’t rely on individual neurons but instead uses groups of neurons to ensure consistent firing, even when some cells are temporarily inactive. While these delays appear to be a drawback, could they actually improve how the brain learns?
A new study from Bar-Ilan University, published in Physica A: Statistical Mechanics and its Applications, has shown that the answer is a surprising "yes." Led by Prof. Ido Kanter, researchers discovered that these delays may enhance learning efficiency and flexibility without requiring any changes in brain structure. Rather than being a hindrance, the brain’s ability to integrate signals over time allows it to recognize objects more efficiently. Unlike artificial neural networks that need a separate output for each object, the brain can use a single output where timing differentiates between objects, effectively "learning with time," while computers rely on spatial recognition.
This means that the brain can adapt to new information without needing to alter its architecture. As Yarden Tzach, a Ph.D. student involved in the research, explains, learning new things just requires recognizing a signal at a different time, allowing even combinations of objects to be recognized at specific moments. For example, a horse and a person can be recognized at different times, but a person riding a horse can be recognized at a time between the two. This discovery highlights how the brain's learning mechanisms can actually surpass machine learning, offering insights into developing faster and more advanced artificial learning systems.
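The "learning with time" idea can be caricatured in a toy model (this is an illustration of the timing principle, not the study's actual neuronal model): each learned feature is assigned a hypothetical signal delay, and a single output channel identifies what it sees by when its response arrives, with a combined object landing between its parts:

```python
# Toy model (assumed, for illustration): one output line, where the arrival
# time of the response -- not a separate output unit -- says what was seen.
FEATURE_DELAY_MS = {"horse": 5.0, "person": 9.0}  # hypothetical delays

def response_time_ms(features):
    """Mean delay of the active features: the single output's timing."""
    delays = [FEATURE_DELAY_MS[f] for f in features]
    return sum(delays) / len(delays)

print(response_time_ms(["horse"]))            # -> 5.0 (horse alone)
print(response_time_ms(["person"]))           # -> 9.0 (person alone)
print(response_time_ms(["horse", "person"]))  # -> 7.0 (rider: in between)
```

Adding a new object in this picture just means learning a new arrival time, with no new output units—which mirrors the paper's claim that the brain can adapt without changing its architecture.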
Nobel Prize in physics awarded to 2 scientists for discoveries that enabled machine learning
John Hopfield and Geoffrey Hinton, two pioneers of artificial intelligence, were awarded the Nobel Prize in physics for their groundbreaking work that laid the foundation for machine learning, which is transforming our world. AI now plays a critical role in various fields like medicine, science, and everyday life, though it also raises concerns about potential risks. Hinton, often called the “godfather of AI,” works at the University of Toronto, while Hopfield is a professor at Princeton. According to the Nobel committee, these two scientists’ work has shaped the way technology evolves today, marking a profound impact on society.
Artificial neural networks, which mimic the way neurons in the brain interact, were developed through the contributions of these researchers. These networks are used in everything from diagnosing medical conditions to powering the smart systems we use daily. Ellen Moons, a member of the Nobel committee, highlighted the wide-reaching effects of their work. Hinton noted the incredible influence AI will have on civilization, comparing it to the Industrial Revolution. He believes that AI will bring significant advancements in fields like health care and productivity, improving life in ways we’ve yet to fully imagine.
However, both Hinton and Hopfield recognize the potential dangers that come with these advancements. As AI continues to progress, there are growing concerns about losing control over these systems. Hinton specifically cautioned about the risks of AI becoming more intelligent than humans, something humanity has never experienced before. While the technology holds immense promise, the challenge lies in ensuring that it’s developed responsibly to avoid unintended consequences.