Ambrosianus L 99 sup., p. 190, ll. 14–23, UV fluorescence image by Lumière Technology. Upside-down Latin overtext in dark brown and Greek undertext in light brown.

Reading Ptolemy’s Treatise On The Meteoroscope On Palimpsests After Centuries Of Recovery Attempts

During the Middle Ages, many Ancient Greek and Roman scientific, legal, and similarly significant texts written on parchment were erased, mostly because of the high cost of new parchment and the little regard given to these secular texts. Although recovery of the remaining faint traces of the old text has been attempted since at least the 19th century, these attempts often involved aggressive chemical means. Now researchers have managed to recover a text written by Ptolemy on a parchment that suffered such a previous recovery attempt.

The term for a parchment or similar writing surface from which the existing text was washed or scraped off is a palimpsest, via Latin from Ancient Greek παλίμψηστος (palímpsēstos, from πάλιν + ψάω = ‘again’ + ‘scrape’). This particular treatise is part of codex Ambrosianus L 99 sup., which is kept at the Biblioteca Ambrosiana in Milan, Italy. The codex contains fifteen palimpsest parchment leaves previously used for three Greek scientific texts: a text of unknown authorship on mathematical mechanics and catoptrics, known as the Fragmentum Mathematicum Bobiense (three leaves), Ptolemy’s Analemma (six leaves), and a so-far-unidentified astronomical text on six leaves.

Outermost six rings of the meteoroscope, not to scale. Nh, Sh, Eh, and Wh are cardinal points of the horizon; Ne and Se are the north and south celestial poles; Nz and Sz are the north and south poles of the ecliptic; and Z is the zenith. (Gysembergh et al., 2023)

It is this last text that has now been identified, courtesy of work by Victor Gysembergh and colleagues. Whereas 19th century palimpsest recovery attempts by Angelo Mai involved chemical reagents, during the 20th century ultraviolet illumination became the preferred method, followed by similar non-destructive analysis methods. For this study, UV fluorescence and multispectral reflectance imaging were employed, which allowed significantly more of the original Greek text to be uncovered. Most notably, this revealed Ptolemy’s treatise on the meteoroscope, an instrument for measuring the position, length, and direction of the apparent path of a shooting star.
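
To see why imaging in multiple wavelength bands helps at all, consider that overtext and undertext inks absorb differently per band. The toy Python sketch below (invented reflectance numbers, not the researchers' actual processing pipeline) shows how a simple per-pixel band difference suppresses the parchment and overtext while leaving the faint undertext visible:

```python
# Toy illustration of why multispectral imaging helps: different inks
# reflect differently per wavelength band, so combining bands can make
# the erased undertext stand out. All reflectance values are invented.
bands = {                        # (visible, UV-fluorescence) reflectance
    'parchment': (0.90, 0.85),
    'overtext':  (0.20, 0.15),   # dark Latin overtext, dark in both bands
    'undertext': (0.90, 0.75),   # erased Greek ink, visible mainly in UV
}

# Per-pixel band difference: anything that behaves the same in both bands
# (parchment, overtext) is suppressed; the undertext pops out.
diff = {k: vis - uv for k, (vis, uv) in bands.items()}

undertext_contrast = diff['undertext'] - diff['parchment']      # ~0.10
overtext_residue = abs(diff['overtext'] - diff['parchment'])    # ~0
print(undertext_contrast, overtext_residue)
```

In a single band the undertext is barely distinguishable from parchment, but in the difference image the overtext vanishes while the undertext stands out, which is essentially what makes the erased Greek legible again.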

This new recovery builds upon text previously recovered by other researchers since Mai’s attempts and fills in more details, although it must be noted that not all of the text has been recovered. It’s hoped that in future imaging sessions more can be recovered of this irreplaceable text, which, like so many of its kind, nearly got destroyed during Europe’s darkest era.


Generating Entangled Qubits And Qudits With Fully On-Chip Photonic Quantum Source

As the world of computing and communication draws ever closer to a quantum future, researchers are faced with many of the same challenges encountered with classical computing and the associated semiconductor hurdles. For the use of entangled photon pairs, for example, it was already possible to perform the entanglement using miniaturized photonic structures, but these still required a bulky external laser source. In a recently demonstrated first, a team of researchers has created a fully on-chip integrated laser source and photonic circuitry that can perform all of these tasks without external modules.

In their paper published in Nature Photonics, Hatam Mahmudlu and colleagues cover the process in detail. Key to this achievement was finding a way to integrate the laser and photonics sides into a single, hybrid chip while overcoming the refractive index mismatch between the InP optical amplifier and the Si3N4 waveguide feedback circuit. The appeal of photon-based quantum entanglement should be obvious when one considers the relatively stable nature of these pairs and their compatibility with existing optical (fiber) infrastructure. What was missing previously was an economical and compact way to create these pairs outside of a laboratory setup. Assuming that the described approach can be scaled up for mass-production, it may just make quantum communications a realistic option outside of government organizations.
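
To get a feel for the scale of that index mismatch: using typical literature values for the refractive indices near 1550 nm (assumed here, not figures from the paper), the normal-incidence Fresnel reflectance at a bare butt-coupled InP/Si3N4 interface works out to roughly 5%:

```python
# Normal-incidence Fresnel reflectance at the facet between an InP gain
# chip and a Si3N4 feedback circuit. Index values are typical literature
# numbers near 1550 nm, not taken from the paper.
n_inp = 3.17   # InP waveguide effective index (assumed)
n_sin = 2.00   # Si3N4 waveguide index (assumed)

R = ((n_inp - n_sin) / (n_inp + n_sin)) ** 2
print(f"Facet reflectance: {R:.1%}")
```

A few percent of stray reflection per pass is enough to destabilize a laser cavity, which hints at why coupling the two materials into one hybrid chip required careful interface engineering.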

Soft Robotic System For In Situ 3D Bioprinting And Endoscopic Surgery

The progress of medical science has brought increasingly sophisticated ways to inspect and repair the body, with a shift towards ever less invasive and more effective technologies. An exciting new field is that of in situ tissue replacement in a patient, whether of individual cells or entire 3D-printed tissues. Culturing replacement tissue in vitro, however, comes with its share of issues, such as the need for a bioreactor. A more straightforward approach is to print the cells in vivo, meaning directly inside the patient’s body, as demonstrated by a team at the University of New South Wales Sydney with a soft robot that can print layers of living cells inside, for example, a GI tract.

In their paper, the team — led by [Dr Thanh Nho Do] and PhD student [Mai Thanh Thai] — describe the soft robot, which is akin to a standard endoscope, but with a special head that has four soft microtubule artificial muscles (SMAM) for three degrees of freedom and fabric bellows actuators (FBA) that provide the motion desired by the remote operator. The system is configured in such a way that the operator inputs the rough intended motions, which are then smoothed by the software before the hydraulics actuate the head.
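
The paper's actual smoothing algorithm is not described here, but a minimal sketch of the idea (turning jittery operator commands into a gentler trajectory before they reach the actuators) could be as simple as a moving-average filter:

```python
# One plausible way to smooth jerky operator commands before they reach
# the hydraulic actuators: a simple moving-average low-pass filter.
# This is an illustrative stand-in, not the team's actual algorithm.
def smooth(commands, window=3):
    """Return a causal moving average of the command stream (same length)."""
    out = []
    for i in range(len(commands)):
        lo = max(0, i - window + 1)          # only past samples are known
        out.append(sum(commands[lo:i + 1]) / (i + 1 - lo))
    return out

raw = [0.0, 1.0, 0.0, 1.0, 0.0]   # jittery joystick input
smoothed = smooth(raw)
print(smoothed)
```

Hydraulic actuators respond poorly to step inputs, so even a crude filter like this removes the hard edges; the real system presumably applies something more sophisticated between the operator console and the head.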

In a test on a simulated GI tract, the researchers were able to manipulate a prototype and deposit a range of materials from the installed syringes. They envision that a system like this could be used alongside endoscopes and laparoscopy to not only accurately deposit replacement cells inside the patient’s body, but also to perform a range of other surgical interventions, whereby the surgeon is supported by the system’s software rather than manipulating the instruments directly.

NASA’s Ingenuity Mars Helicopter Completes 50th Flight

While NASA’s Perseverance rover brought an array of impressive scientific equipment to the surface of Mars, certainly its most famous payload is the stowaway helicopter Ingenuity. Despite being little more than a restricted-budget experiment using essentially only off-the-shelf components that you can find in your smartphone and e-waste drawer, the tenacious drone managed to complete its fiftieth flight on April 13 — just days before the second anniversary of its first flight, which took place on April 19th of 2021.

Engineers hoped that Ingenuity would be able to show that a solar-powered drone could function in the extremely thin atmosphere of Mars, but the experiment ended up wildly exceeding expectations. No longer a simple technology demonstrator, the helicopter has become an integral part of Perseverance’s operations. Through its exploratory flights Ingenuity can scout ahead, picking the best spots for the much slower rover, with rough terrain only becoming a concern when it’s time to land.

Since leaving the relatively flat Jezero Crater floor on January 19th of 2023, Ingenuity has had to contend with significantly harsher terrain. Thanks to upgraded navigation firmware, the drone is better able to determine safe landing locations, but each flight remains a white-knuckle event. This is also true for each morning’s wake-up call. Although the rover is powered and heated continuously due to its nuclear power source, Ingenuity goes into standby mode overnight, after which it must re-establish its communication with the rover.

Though there’s no telling what the future may hold for Ingenuity, one thing is certain — its incredible success will shape upcoming missions. NASA is already looking at larger, more capable drones to be sent on future missions, which stand to help us explore the Red Planet faster than ever. Not bad for a flying smartphone.


Detecting Anti-Neutrinos From Distant Fission Reactors Using Pure Water At SNO+

Although neutrinos are exceedingly common, their near-massless nature means that their presence is rather ephemeral. Despite billions of them radiating towards Earth every second from sources like our Sun, most of them zip through our bodies and this very planet without ever interacting with either. This property is also what makes studying these particles, which are so fundamental to our understanding of the universe, so complicated. Fortunately, recently published results by researchers behind the SNO+ neutrino detector project show that we may see a significant bump in our neutrino detection sensitivity.

The Sudbury Neutrino Detector (Courtesy of SNO)

In their paper (preprint) in APS Physical Review Letters, the researchers describe how, during the initial run of the new SNO+ neutrino detector, they were able to detect anti-neutrinos originating from nuclear fission reactors over 240 kilometers away, including Canadian CANDU and US LWR types. This demonstrated the low detection threshold of the SNO+ detector even in its still-incomplete state between 2017 and 2019. Filled with just ultrapure water, with nitrogen added during the second run to keep out radioactive radon gas from the surrounding rock of the deep mine shaft, SNO+ as a Cherenkov detector accomplished a threshold of 1.4 MeV at its core, more than sufficient to detect the 2.2 MeV gamma radiation from neutron capture following the inverse beta decay (IBD) events that the detector is set up for.
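
That 2.2 MeV capture gamma follows an IBD reaction whose own energy threshold can be worked out from simple kinematics using standard particle masses:

```python
# Kinematic threshold for inverse beta decay (anti-nu_e + p -> n + e+),
# computed from standard particle rest masses (PDG values, MeV/c^2).
m_p = 938.272   # proton
m_n = 939.565   # neutron
m_e = 0.511     # positron

E_threshold = ((m_n + m_e) ** 2 - m_p ** 2) / (2 * m_p)
print(f"IBD threshold: {E_threshold:.3f} MeV")
```

In other words, only anti-neutrinos above roughly 1.8 MeV can trigger an IBD event in the first place, and the reactor anti-neutrino spectrum extends well above that.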

The SNO+ detector is the evolution of the original Sudbury Neutrino Observatory (SNO), located 2.1 km below the surface in the Creighton Mine. SNO ran from 1999 to 2006 and was part of the effort to solve the solar neutrino problem, which ultimately revealed the shifting nature of neutrinos via neutrino oscillation. Once fully filled with 780 tons of linear alkylbenzene as a scintillator, SNO+ will investigate a number of topics, including neutrinoless double beta decay (which would make the neutrino a Majorana fermion), specifically the confounding question of whether neutrinos are their own antiparticle or not.

The focus of SNO+ on nearby nuclear fission reactors is due to the constant beta decay that occurs in their nuclear fuel, which produces a large number of electron anti-neutrinos in a very predictable manner, thanks to the careful composition of the fuel. As the researchers noted in their paper, SNO+ is accurate enough to detect when a specific reactor is due for refueling, on account of the change in its anti-neutrino emissions. This does not, however, apply to Canadian CANDU PHWRs, as these are refueled continuously, making their anti-neutrino production highly constant.

Each experiment at SNO+ produces immense amounts of data (hundreds of terabytes per year) that take a while to process, but if these early results are anything to go by, SNO+ may advance neutrino research as much as SNO and its kin did previously.

Noninvasive Sensors For Brain–Machine Interfaces Based On Micropatterned Epitaxial Graphene

As fun as brain-computer interfaces (BCI) are, for the best results they tend to come with the major asterisk of requiring a section of the skull to be cut and lifted in order to implant a Utah array or similar electrode system. A non-invasive alternative consists of electrodes placed on the skin, albeit at a reduced resolution. These electrodes are the subject of a recent experiment by [Shaikh Nayeem Faisal] and colleagues in ACS Applied Nano Materials, employing graphene-coated electrodes in an attempt to optimize their performance.

Impedance values of eight-channel FEG and eight-channel HPEG sensor systems placed on the occipital area of the head. (Faisal et al., 2023)

Although external electrodes can be acceptable for basic tasks, such as registering a response to a specific (visual) stimulus or for EEG recordings, they can be impractical in general use. Much of this is due to the disadvantages of the ‘wet’ and ‘dry’ varieties, with the former, as the name suggests, involving an electrically conductive gel.

This gel ensures solid contact and an impedance of no more than 5–30 kΩ at 50 Hz, whereas dry sensors perform rather poorly at >200 kΩ at 50 Hz, with worse signal-to-noise characteristics, even before adding in issues such as using the sensor on a hairy scalp, as tends to be the case for most human subjects.
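
Those impedance figures matter because the electrode forms a voltage divider with the amplifier's input impedance. Assuming a typical 1 MΩ EEG front end (an assumed value, not one from the paper), the fraction of the scalp signal that actually reaches the amplifier can be estimated as:

```python
# Why electrode impedance matters: the electrode and the amplifier's
# input impedance form a voltage divider. Z_amp is an assumed typical
# EEG front-end value, not a figure from the paper.
Z_amp = 1_000_000.0  # amplifier input impedance, ohms (assumed 1 MOhm)

def fraction_retained(z_electrode):
    """Fraction of the scalp potential that reaches the amplifier."""
    return Z_amp / (Z_amp + z_electrode)

wet = fraction_retained(30_000.0)    # worst-case wet electrode, 30 kOhm
dry = fraction_retained(200_000.0)   # dry electrode, 200 kOhm
print(f"wet: {wet:.1%}, dry: {dry:.1%}")
```

The already-microvolt-level EEG signal thus loses several times more amplitude through a dry electrode, on top of the extra noise picked up at the higher-impedance contact.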

In this study, the researchers created electrode arrays in a number of configurations, each of which used graphene as the interface material. The goal was to get a signal even through human hair — such as on the back of the head near the visual cortex — that would be on par with wet electrodes. They got very promising results with hex-patterned epitaxial graphene (HPEG) sensors, and even at this early prototype stage the technique could offer an alternative where wet electrodes are not an option.

While the subject is complex, brain-computer interfaces don’t have to be the sole domain of research laboratories. We recently covered an open hardware Raspberry Pi add-on that can let you experiment with detecting and filtering biosignals from the comfort of your own home.

Uranium-241 Isotope Created And Examined Via Multinucleon Transfer Reactions And Mass Spectrometry

A recent paper (PDF) in Physical Review Letters by T. Niwase and colleagues covers a fascinating new way to both create and effectively examine isotopes by employing a cyclotron and a mass spectrograph. In the paper, they describe the process of multinucleon transfer (MNT) and analysis at the recently commissioned KEK Isotope Separation System (KISS), located at the RIKEN Nishina Center in Japan.

Sketch of the KISS experimental setup. The blue- and yellow-colored areas are filled with Ar and He gases, respectively. Differential pumping systems are located after the doughnut-shaped gas cell as well as before and after the GCCB. (Credit: Niwase et al., 2023)

The basic process involves the RIKEN Ring Cyclotron, which for this particular experiment was loaded with the uranium-238 isotope. Over the course of four days, 238U particles impinged on a 198Pt target, after which the resulting projectile-like fragments (PLF) were led through the separation system (see sketch). This prepared the newly created ions for injection into the multi-reflection time-of-flight mass spectrograph (MRTOF MS), a highly refined instrument that was recently installed at the facility.
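
The mass measurement itself rests on a simple relation: ions of equal charge accelerated through the same potential have flight times proportional to the square root of their mass, so an unknown mass follows from timing against a well-known reference ion. A sketch with purely illustrative numbers (not values from the paper):

```python
import math

# Core idea of time-of-flight mass measurement: for ions of equal charge
# accelerated through the same potential, t scales as sqrt(m), so
# m_unknown = m_ref * (t_unknown / t_ref)^2.
def mass_from_tof(m_ref, t_ref, t_unknown):
    """Infer an unknown ion mass from its flight time vs a reference ion."""
    return m_ref * (t_unknown / t_ref) ** 2

m_ref = 238.0508      # 238U atomic mass (u), the beam species
t_ref = 10.0e-3       # illustrative flight time, seconds
# A slightly heavier ion arrives later; recover its mass from the timing:
t_x = t_ref * math.sqrt(241.06 / m_ref)
recovered = mass_from_tof(m_ref, t_ref, t_x)
print(f"{recovered:.4f} u")
```

The multi-reflection design folds the flight path back on itself many times, stretching the effective path length so that tiny mass differences between isobars become resolvable timing differences.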

Using this method, the researchers were able to establish that during the MNT process in the cyclotron, the transfer of nucleons from the collisions had resulted in the production of 241U as well as 242U. The former had never before been produced in an experimental setting, while the mass of 242U had not previously been accurately determined. During this experiment, the two uranium isotopes, as well as neptunium and other isotopes, were led through the MRTOF MS instrument, allowing for the accurate measurement of the characteristics of each.

The relevance of producing new artificial isotopes of uranium lies not so much in the isotopes themselves, but rather in how producing them allows us to experimentally confirm theoretical predictions and extrapolations from previous data. This may one day lead us to amazing discoveries such as the famously predicted island of stability, with superheavy, stable elements with as-yet-unknown properties.

Even if such astounding discoveries are not in the future for theoretical particle physics, merely having another great tool like MNT to ease the burden of experimental verification would seem to be more than worth it.