As it turns out, the truth is both less and more than meets the eye. The article above was eventually edited to better reflect the truth that, alas, we have not yet found a way to create objects so massive that even light cannot escape them. Instead, physicist [Jeff Steinhauer] and colleagues at the Technion-Israel Institute of Technology have developed an acoustic model of black holes, which they used to observe the equivalent of Hawking radiation for the first time. Hawking radiation is the theoretical exception to the rule that nothing makes it out of a black hole, and it would imply that black holes evaporate over time. The predicted radiation would be orders of magnitude weaker than the background radiation, though, making it all but impossible to detect directly.
That’s where [Steinhauer]’s sonic black holes come in. In these experiments, phonons, packets of mechanical vibrations that stand in for photons, are trapped in a fast-moving stream of fluid. The point in the stream where the flow speed crosses the local speed of sound is the equivalent of a real black hole’s event horizon; phonons inside that boundary can never escape. Except, of course, for the sonic equivalent of Hawking radiation, which the researchers found after 97,000 attempts.
When we first stumbled upon this story, we assumed a lab-grown black hole, even an acoustic analog, would take a CERN’s-worth of equipment to create. It turns out to be far simpler than that; [Steinhauer], in fact, built his black hole machine singlehandedly from relatively simple equipment. The experiments do require temperatures near absolute zero and a couple of powerful lasers, so it’s not exactly easy stuff; still, we can’t help but wonder if sonic black holes are within the reach of the DIY community. Paging [Ben Krasnow] and [Sam Zeloof], among others.
[Featured image credit: Nitzan Zohar, Office of the Spokesperson, Technion]
According to Spectrum, several companies are poised to make a splash storing energy with gravity. That sounds fancy and high tech at first, but is it, really? Sure, we usually think of energy storage as some sort of battery, but there are many energy storage systems that use falling water, for example, which is almost what this new technology is all about. Almost, since instead of water these new systems move multi-ton blocks around.
The idea itself is nothing new. You probably learned in high school that a rock rolling down a hill has kinetic energy, while a rock sitting immobile on a mountain has potential energy. These systems use the same idea. Moving the “rock” up stores energy, and letting it fall releases the same energy. The big difference between the systems is what “up” means.
For Swiss company Energy Vault, the 35-metric-ton bricks are hoisted into the air by towers that look like alien construction cranes. To store energy, the crane builds a tower of bricks around itself. When the bricks return to the ground, releasing that energy, they form a lower ring around the tower.
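For a rough sense of scale, the stored energy is just the classic potential-energy formula E = mgh. Here’s a quick back-of-the-envelope sketch in Python: the 35-metric-ton brick mass comes from the article, but the 100-meter lift height is purely our own assumption for illustration.

```python
# Gravitational potential energy of one lifted brick: E = m * g * h
m = 35_000       # brick mass in kg (35 metric tons, per the article)
g = 9.81         # gravitational acceleration, m/s^2
h = 100          # assumed lift height in meters (illustrative only)

energy_joules = m * g * h
energy_kwh = energy_joules / 3.6e6     # 1 kWh = 3.6 MJ

print(f"{energy_joules / 1e6:.1f} MJ per brick (~{energy_kwh:.1f} kWh)")
# roughly 34 MJ, or about 9.5 kWh
```

A bit under 10 kWh per brick is why these systems stack a great many bricks as high as they can: useful grid-scale storage takes both mass and height.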
Over the years, humans have come up with four forces that can be used to describe every single interaction in the physical world. They are gravity, electromagnetism, the weak nuclear force that causes particle decay, and the strong nuclear force that binds quarks together into protons and neutrons. The latter three are captured by the standard model of particle physics. But the existence of dark matter makes this picture seem incomplete. Surely there must be another force (or forces) that explain both its existence and the reason for its darkness.
Hungarian scientists from the Atomki Nuclear Research Institute, led by Professor Attila Krasznahorkay, believe they have found evidence of a fifth force of nature. While monitoring the decay of excited helium nuclei, they observed pairs of particles (an electron and a positron) being emitted, which is not unusual in itself. What is unusual is that the pairs consistently split at an angle of about 115 degrees, as though they were knocked off course by an invisible force.
The scientists dubbed this particle X17 because they calculated its mass at 17 megaelectronvolts (MeV). One electronvolt is the kinetic energy gained by a single electron as it is accelerated through a potential difference of one volt, so a megaelectronvolt is the energy gained when an electron is accelerated through one million volts.
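To put that figure in more familiar units, here’s a quick conversion of 17 MeV into joules and into an equivalent rest mass via E = mc². This is just our own back-of-the-envelope Python sketch using standard physical constants; the comparison to the electron is for perspective only.

```python
# Rough conversion of the X17 particle's quoted 17 MeV into SI units.
eV = 1.602176634e-19       # joules per electronvolt
c = 2.99792458e8           # speed of light, m/s
m_electron = 9.109e-31     # electron rest mass, kg

energy_joules = 17e6 * eV              # 17 MeV expressed in joules
mass_kg = energy_joules / c**2         # equivalent rest mass, from E = m * c^2

print(f"{energy_joules:.3e} J  ->  {mass_kg:.3e} kg")
print(f"~{mass_kg / m_electron:.0f} electron masses")
# prints roughly 2.7e-12 J and 3.0e-29 kg, i.e. about 33 electron masses
```

That puts X17 at roughly 33 times the mass of an electron, yet only about 2% of the mass of a proton.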
What Are Those First Four, Again?
Let’s start with the easy one, gravity. It gives objects weight, and keeps things more or less glued in place on Earth. Though gravity is a relatively weak force, it dominates on a large scale and holds entire galaxies together. Gravity helps us work and have fun. Without gravity, there would be no water towers, hydroelectric power plants, or roller coasters.
The electromagnetic force is a two-headed beast that dominates at the human scale. Almost everything we are and do is underpinned by this force that surrounds us like an ethereal soup. Electricity and magnetism are considered a dual force because they work on the same principle: opposite charges (or poles) attract, and like ones repel.
This force holds atoms together and makes electronics possible. It’s also responsible for visible light itself. Each fundamental force has a carrier particle, and for electromagnetism, that particle is the photon. What we think of as visible light is made of the same photons that carry the electromagnetic force between electrons and protons.
The weak and strong nuclear forces aren’t as easy to grasp because they operate at the subatomic level. The weak nuclear force is responsible for beta decay, one type of radioactive decay, in which a neutron turns into a proton plus an electron and an antineutrino. Weak interactions explain how particles can transform by changing the quarks inside them.
The strong nuclear force is the strongest force in nature, but it only dominates at the nuclear scale. Imagine a nucleus with multiple protons. All those protons are positively charged, so why don’t they repel each other and rip the nucleus apart? The strong nuclear force is about 130 times stronger than the electromagnetic force, so when protons are close enough together, it wins out. The strong nuclear force holds together both the nucleus and the nucleons themselves.
The Force of Change
Suspicion of a fifth force has been around for a while. Atomki researchers observed a similar effect in 2015 when they studied the light emitted during the decay of a beryllium-8 isotope. As it decayed, the emitted electrons and positrons consistently flew apart at another strange angle, around 140 degrees. They dubbed it a “protophobic” force, as in a force that’s afraid of protons. Labs around the world made repeated attempts to prove the discovery a fluke or a mistake, but they all produced the same results as Atomki.
Professor Attila Krasznahorkay and his team published their observations in late October, though the paper has yet to be peer-reviewed. Now, the plan at Atomki is to observe other atoms’ decay. If they can find a third atom that exhibits this strange behavior, we may have to take the standard model back to the drawing board to accommodate this development.
So what happens if science concludes that the X17 particle is evidence of a fifth force of nature? We don’t really know for sure. It might offer clues into dark matter, and it might bring us closer to a unified field theory. We’re at the edge of known science here, so feel free to speculate wildly in the comments.
Gravity can be a difficult thing to simulate effectively on a traditional CPU. The amount of calculation required grows with the square of the number of particles in the simulation. This makes it a perfect application for parallel processing.
For their final project in ECE5760 at Cornell, [Mark Eiding] and [Brian Curless] decided to use an FPGA to rapidly process gravitational calculations. This allows them to simulate a thousand particles at up to 10 frames per second. With every particle having an attraction to every other, this works out to an astonishing 1 million inverse-square calculations per frame!
The team used an Altera DE2-115 development board to build the project. General operation is run by a Nios II processor, which handles the VGA display, loads initial conditions, and controls memory. The FPGA is used as an accelerator for the gravity calculations and lends the additional benefit of requiring fewer memory accesses, as it runs all operations in parallel.
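For comparison, here’s what the same brute-force pairwise calculation looks like as plain software. This is our own illustrative Python sketch, not the team’s FPGA design; the constants, timestep, and softening factor are placeholders.

```python
# Naive O(N^2) gravity step: every particle attracts every other particle.
import numpy as np

N = 1000            # particle count, as in the project
G = 1.0             # gravitational constant in simulation units
dt = 0.01           # integration timestep
soften = 1e-3       # softening length avoids divide-by-zero for close pairs

pos = np.random.rand(N, 3)      # random starting positions
vel = np.zeros((N, 3))          # particles start at rest
mass = np.ones(N)               # unit masses

def step(pos, vel):
    acc = np.zeros_like(pos)
    for i in range(N):                         # each particle...
        r = pos - pos[i]                       # ...against every other: N^2 work
        d2 = (r * r).sum(axis=1) + soften**2
        inv_r3 = d2 ** -1.5                    # 1/r^3; times the vector r gives the inverse-square pull
        inv_r3[i] = 0.0                        # no self-interaction
        acc[i] = G * (mass[:, None] * r * inv_r3[:, None]).sum(axis=0)
    vel += acc * dt                            # simple Euler integration
    pos += vel * dt
    return pos, vel

pos, vel = step(pos, vel)   # one frame: about a million pair evaluations
```

Even well vectorized, a CPU has to churn through those million-ish pair evaluations every frame; the FPGA evaluates many of them simultaneously in dedicated hardware, which is where the speedup comes from.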
This project is a great example of how FPGAs can be used to create serious processing muscle for massively parallel tasks. Check out this great article on sorting with FPGAs that delves deeper into the subject. Video after the break.
[Bigelow Brook Farm] has a cool geodesic dome greenhouse that needs to stay warm in the winter. There are a lot of commercial solutions for greenhouse heating, but if you’re the kind of person who researches and develops aquaponics solutions, a greener approach may have more appeal.
A rocket mass heater is a combination of a rocket stove and underfloor heating. A rocket stove works by creating such a strong draft, as heat rises up the chimney, that the flames can’t crawl back up the fuel and burn in the open air; combustion stays confined to a controlled burn zone. Unfortunately, a plain rocket stove needlessly loses a lot of heat to the atmosphere; you only need enough of it to maintain the draft.
The mass part solves this. Routing the exhaust under the floor and through radiators passively retains a lot of heat inside the space to be heated. It’s a bit of a trick to balance the system so it puts as much heat into the space as possible without stalling, which can be dangerous due to carbon monoxide, among other things. Once the balance is achieved, the user gets a stove that burns fuel very efficiently and, best of all, passively.
[Bigelow Brook Farm] has been working on the heater for quite some time. We really enjoy their test-driven development and iteration, and they post really interesting autopsies when a component of the heater fails and needs replacing. Right now they have a commercial-sized operation heated by their latest iteration, and it’s completely passive, being gravity fed. Video after the break.
When you hear “gravity waves” or “sprites”, you’d think you know what is being discussed. After all, those ripples in space-time that Einstein predicted would emanate from twin colliding black holes were recently observed to much fanfare. And who doesn’t love early 8-bit computer animations? So when we were browsing over at SpaceWeather we were shocked to find that we were wrong twice, in one photo.
It was the year 1687 when Isaac Newton published “The Principia”, which revealed the first mathematical description of gravity. Newton’s laws of motion, along with his description of gravity, laid before the world a revolutionary concept that could be used to describe everything from the motions of heavenly bodies to a falling apple. Newton would remain the unequivocal king of gravity for the next several hundred years. But that would all change at the dawn of the 20th century, when a young man working at a Swiss patent office began to ask some profound questions. Einstein had come to the conclusion that Newtonian physics was not adequate to describe the findings of the emerging electromagnetic field theories. In 1905, he published a paper entitled “On the Electrodynamics of Moving Bodies”, which corrected Newton’s laws so they hold when describing the motion of objects near the speed of light. This new description became known as Special Relativity.
It was ‘Special’ because it didn’t deal with gravity or acceleration. It would take Einstein another 10 years to work these two concepts into his relativity theory. He called it General Relativity – an understanding of which is necessary to fully grasp the significance of gravitational waves.