The Usage Of Embedded Linux In Spacecraft

As the first part of a series, [George Emad] takes us through a few examples of the Linux operating system being used in spacecraft. These range from SpaceX’s Dragon capsule to everyone’s favorite Martian helicopter. An interesting aspect is that the freshest Linux kernel isn’t necessarily onboard, as stability is far more important than having the latest whizzbang features. This is why SpaceX uses Linux kernel 3.2 (with real-time patches) on the primary flight computers of both Dragon and its rockets (Falcon 9 and Starship).

SpaceX’s flight computers use a typical triple-redundancy setup: three independent dual-core processors run the exact same calculations, with a separate Linux instance on each core, and the results are compared afterwards. If any processor’s result doesn’t match that of the other two, it is dropped. This approach also allows SpaceX to use fairly off-the-shelf (OTS) x86 computing hardware, with the flight software written in C++.
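As a minimal sketch of the voting idea — to be clear, not SpaceX’s actual flight code, and with an invented `vote()` helper — a two-out-of-three majority check in C++ could look like this:

```cpp
#include <array>
#include <optional>

// Hypothetical sketch of two-out-of-three majority voting, as used in
// triple-redundant flight computers. Each "lane" is one processor's result.
template <typename T>
std::optional<T> vote(const std::array<T, 3>& lanes) {
    // Accept a value as soon as any two independent lanes agree on it.
    if (lanes[0] == lanes[1] || lanes[0] == lanes[2]) return lanes[0];
    if (lanes[1] == lanes[2]) return lanes[1];
    return std::nullopt;  // No majority at all: drop the result entirely.
}

int main() {
    // Lane 1 produced a corrupted value; the other two out-vote it.
    std::array<int, 3> results{42, 41, 42};
    auto agreed = vote(results);
    return agreed ? 0 : 1;  // agreed holds 42 here
}
```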

NASA’s efforts are similar, with Ingenuity in particular heavily using OTS parts, along with NASA’s open-source, C++-based F’ (F Prime) framework. The chopper also runs some version of the Linux kernel on a Snapdragon 801 SoC, which, as we have seen over the past 72 flights, works very well.
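F Prime generates its real component scaffolding from model files, so rather than guess at that API, here is a plain C++ sketch (all names invented) of the pattern the framework is built around: components that only communicate through typed, connectable ports, wired together in a topology:

```cpp
#include <cstdio>
#include <functional>

// NOT the actual F' API: a stripped-down illustration of its core idea.
// Components expose typed ports; a topology step connects them.
using TelemetryPort = std::function<void(float)>;

class ImuComponent {
public:
    TelemetryPort attitudeOut;  // Output port, wired up externally.
    void run() {
        if (attitudeOut) attitudeOut(1.57f);  // Emit a (fake) attitude reading.
    }
};

class DownlinkComponent {
public:
    void attitudeIn(float radians) { std::printf("attitude: %f\n", radians); }
};

int main() {
    ImuComponent imu;
    DownlinkComponent downlink;
    // Topology setup: connect output to input. F' does this from its
    // topology description rather than in hand-written code.
    imu.attitudeOut = [&](float v) { downlink.attitudeIn(v); };
    imu.run();
}
```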

Which is not to say that using Linux is a no-brainer in avionics and similar critical applications. The monolithic Linux kernel contains a massive amount of code, which means trimming and customizing it for a specific task, especially on a resource-constrained platform. Nor is Linux particularly good at hard real-time applications, but using it does provide access to a wealth of software and documentation, something that needs to be weighed against the project’s needs.
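To make the real-time point concrete: on a PREEMPT_RT-patched kernel, a periodic control loop would typically lock its memory and request a FIFO real-time priority through the standard POSIX calls. A minimal sketch (error handling and the required privileges omitted):

```cpp
#include <ctime>
#include <sched.h>
#include <sys/mman.h>

int main() {
    // Keep all pages resident so a page fault can't blow a deadline.
    mlockall(MCL_CURRENT | MCL_FUTURE);

    // Request a real-time FIFO priority; under PREEMPT_RT this thread
    // can then preempt nearly everything else on the system.
    sched_param param{};
    param.sched_priority = 80;
    sched_setscheduler(0, SCHED_FIFO, &param);

    // Fixed-rate loop against the monotonic clock: a 1 kHz control cycle.
    timespec next{};
    clock_gettime(CLOCK_MONOTONIC, &next);
    for (;;) {
        // ... read sensors, run the control law, command actuators ...
        next.tv_nsec += 1'000'000;  // 1 ms period
        if (next.tv_nsec >= 1'000'000'000) {
            next.tv_nsec -= 1'000'000'000;
            next.tv_sec++;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, nullptr);
    }
}
```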

Breaking Through The 1 MB Barrier In DOS With Unreal Mode And More

The memory map of the original 8086-based PC, with its conventional memory and upper memory area, made the original PC rather straightforward, but it also posed countless issues for DOS-based applications as they tried to make use of memory beyond the legacy 1 MB address space. The initial workarounds like EMS, XMS and UMBs were rather cumbersome and often impractical, but with the arrival of the 80286 and 80386 processors more options opened up, including protected mode. More interestingly, this led to unreal mode, DOS extenders and the somewhat more obscure LOADALL instruction, as covered by [Julio Merino] in a new article.

This article builds on an earlier one that covered the older methods along with the basics of protected mode. What makes protected mode attractive compared to real mode is that memory accesses go through the MMU, allowing access to 16 MB of memory on the 80286 and 4 GB on the 80386. The segment descriptors that make this possible can be (ab)used on the 80286 and up, because the CPU keeps using its cached segment descriptors after switching back to real mode. The trick thus boils down to switching to protected mode, loading segment descriptors with much larger limits, and switching back to real mode. As this falls outside the original processor specification, it is commonly called ‘unreal mode’.
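The trick hinges on the hidden descriptor cache: every segment register carries a cached base and limit, and real-mode segment loads rewrite the base while leaving the limit untouched. A toy C++ model of that behavior (the real mechanism of course lives inside the CPU) shows why the large limit survives the switch back:

```cpp
#include <cstdint>
#include <cstdio>

// Toy model of one segment register's hidden descriptor cache. Real-mode
// segment loads rewrite the base but, crucially, leave the limit alone.
struct SegmentCache {
    uint32_t base  = 0;
    uint32_t limit = 0xFFFF;  // Power-on default: the 64 KB real-mode limit.

    void realModeLoad(uint16_t selector) { base = uint32_t(selector) << 4; }
    void protectedModeLoad(uint32_t newBase, uint32_t newLimit) {
        base  = newBase;
        limit = newLimit;
    }
    bool canAccess(uint32_t offset) const { return offset <= limit; }
};

int main() {
    SegmentCache ds;
    // Unreal mode recipe: hop into protected mode, load a descriptor with
    // a 4 GB limit, then drop back to real mode without reloading DS.
    ds.protectedModeLoad(0, 0xFFFFFFFF);
    std::printf("offset 2 MB ok? %d\n", ds.canAccess(0x200000));  // 1: 'unreal'
    // A later real-mode segment load changes the base but the cached
    // 4 GB limit sticks around, so huge offsets keep working.
    ds.realModeLoad(0x1000);
    std::printf("still ok? %d\n", ds.canAccess(0x200000));        // 1
}
```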

Continue reading “Breaking Through The 1 MB Barrier In DOS With Unreal Mode And More”

NIF’s Laser Fusion Experiment’s Energy Gain Passes Peer Review

Back in December of 2022, a team of researchers at the USA’s National Ignition Facility (NIF) announced that they had exceeded ‘scientific breakeven’ with their laser-based inertial confinement fusion (ICF) system. Their work has now been peer-reviewed and passed scrutiny, confirming that the energy put into fusing a small amount of deuterium-tritium fuel resulted in a net gain (Q) of 1.5.
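Going by the widely reported figures for that shot, about 2.05 MJ of laser energy delivered to the target produced roughly 3.15 MJ of fusion energy, which is where the gain figure comes from:

$$Q = \frac{E_{\text{fusion}}}{E_{\text{laser}}} = \frac{3.15\ \text{MJ}}{2.05\ \text{MJ}} \approx 1.5$$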

Laser Bay 2, one of NIF’s two laser bays.

The key take-away of course remains that ICF is not a viable method of producing energy, as we detailed back in 2021 when we covered the 1.3 MJ yield announcement, and again in 2022 when covering the shot that is the subject of this now-completed peer review. The sheer amount of energy required to generate the laser pulse that targets the fuel capsule (and the losses along the way), as well as the energy required to manufacture each fuel capsule and its surrounding hohlraum, make sustaining a cycle a highly impractical proposition for anything except weapons research.

Despite this, it’s good to see that the NIF’s ICF research is bearing fruit, even if for energy production we should look towards magnetic confinement fusion (MCF) instead. This includes the many tokamaks active today, such as Japan’s JT-60SA, as well as stellarators like Germany’s Wendelstein 7-X, and other efforts to make MCF a major clean-energy source for the future.

How Airplanes Mostly Stopped Flying Into Terrain And Other Safety Improvements

We have all heard the statistics on how safe air travel is, with more people dying or getting injured on their way to and from the airport than while traveling by airplane. Things weren’t always this way, of course. Throughout the early days of commercial air travel and well into the 1980s, many crashes served as harsh lessons in basic air safety. The most tragic are probably those with a human cause, whether improper maintenance or pilot error, as we generally assume that the human element exists in the chain of events explicitly to prevent tragedies like these.

Among the worst pilot errors is the phenomenon of controlled flight into terrain (CFIT), which usually sees the pilots lose track of their bearings for a variety of reasons before a usually high-speed, fatal crash. When it comes to keeping airplanes away from the ground until they reach their destination, ground proximity warning systems (GPWS) and their successors have added a layer of safety, along with stall warnings and other automatic alerts provided by the avionics.
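As a rough illustration of the kind of check a GPWS performs, here is a sketch in the spirit of the Mode 1 ‘sink rate’ alert. The real alert envelopes are certified lookup curves; the linear threshold below is purely an invented placeholder:

```cpp
#include <cstdio>

// Hypothetical sketch of a GPWS Mode 1 style check: warn when the
// descent rate is excessive for the current height above terrain.
// Real envelopes are certified curves; these numbers are made up.
enum class Alert { None, SinkRate, PullUp };

Alert mode1Check(float radioAltitudeFt, float descentRateFpm) {
    // Tolerate steeper descents higher up; demand a gentler sink
    // rate near the ground. Placeholder thresholds only.
    float warnLimit   = 1000.0f + 1.5f * radioAltitudeFt;
    float pullUpLimit = 1.8f * warnLimit;
    if (descentRateFpm > pullUpLimit) return Alert::PullUp;
    if (descentRateFpm > warnLimit)   return Alert::SinkRate;
    return Alert::None;
}

int main() {
    // 2500 ft/min down at only 500 ft above terrain: "SINK RATE".
    if (mode1Check(500.0f, 2500.0f) == Alert::SinkRate)
        std::printf("SINK RATE\n");
}
```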

With the recent passing of C. Donald Bateman, who has been credited with designing the GPWS, it seems like a good time to appreciate the technology that makes flying the relatively safe experience that it is today.

Continue reading “How Airplanes Mostly Stopped Flying Into Terrain And Other Safety Improvements”

A schematic representation of the different ionospheric sub-layers and how they evolve daily between day and night. (Credit: Carlos Molina)

Will Large Satellite Constellations Affect Earth’s Magnetic Field?

Imagine taking a significant amount of metals and other materials out of the Earth’s crust and scattering it into the atmosphere from space. This is effectively what we have been doing since the beginning of the Space Age, with an increasing number of rocket stages, satellites and related objects ending their existence by burning up in the Earth’s atmosphere. Yet rather than vanishing into nothing, the debris remains partially in the atmosphere, where it forms pockets of material. As this material is often conductive, it may well affect the Earth’s magnetic field, as argued by [Sierra Solter-Hunt] in a pre-publication article.

A summary by [Dr. Tony Phillips] references a 2023 NASA research article by [Daniel M. Murphy] et al., which describes the discovery that about 10% of the aerosol particles in the stratosphere contain aluminium and other metals that can be traced back to the burn-up of the aforementioned space objects. This is a factor that can increase the Debye length of the ionosphere. What the exact effects may be is still largely unknown, but the fact remains that we are launching massively more objects into space than even a decade ago, with the number of LEO objects increasing accordingly.
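For reference, the electron Debye length is the distance over which a plasma screens out electric fields:

$$\lambda_D = \sqrt{\frac{\varepsilon_0 k_B T_e}{n_e e^2}}$$

with $T_e$ the electron temperature and $n_e$ the free electron density, so anything that changes how many free electrons are around, such as charged metal dust soaking them up, shifts this screening distance.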

Although [Sierra]’s speculation could be called alarmist, the underlying research question of what will happen when Starlink and other satellites disintegrate in the atmosphere on a daily basis over the coming years is a valid one. As this looks set to coat the stratosphere and ionosphere in particular with metal aerosols at levels never seen before, it might be worth doing the research up-front, rather than waiting until we see something odd happening.

Vesuvius Challenge 2023 Grand Prize Awarded And 2024’s New Challenge

In the year 79 CE, a massive cloud of volcanic ash rained down on the Roman city of Herculaneum after an eruption of Mount Vesuvius. Along with the city of Pompeii, Herculaneum was subsequently engulfed and buried by a pyroclastic flow that burned everything in its path, including the scrolls in the library of what is today known as the Villa of the Papyri. After the charred but still recognizable scrolls were found in the 18th century, many fruitless attempts were made to recover the text hidden within them, and not until 2023 did we get our first real glimpse of their contents, with the awarding of the Vesuvius Challenge 2023 grand prize.

We previously covered the run-up to this award, but with only a small fraction of the scrolls now readable, there’s still a long way to go. This leads to the 2024 prize challenge, which sees teams strive to read 90% of each of scrolls 1-4, for a $100,000 award. The expectation is that with this ability it should be possible to read all 800 scrolls known today, but as detailed in the Master Plan, there is still more to come. Scanning and processing scrolls faster and more efficiently is one of the biggest challenges, as is recovering any more scrolls that may still be stuck in the mud at the Villa of the Papyri. As easy as it may sound to pull things out of the mud, archaeological excavations are expensive and time-consuming.

With time running out for both the recovered and the still-lost scrolls, it’s important that we don’t squander this opportunity to double our knowledge of historical texts from this era.

Evidence For Graphite As A Room Temperature Superconductor

Magnetization M(H) hysteresis loops measured for the HOPG sample, before and after 800 K annealing to remove ferromagnetic influences. (Credit: Kopelevich et al., 2023)

Little needs to be said about why superconducting materials are so tantalizing, or what the benefits of an ambient-pressure, room-temperature superconductor would be. The main problem here is not so much the ‘room temperature’ part, as metallic hydrogen is already thought to be capable of this feat, albeit at pressures far too high for practical use. Now a recent research article in Advanced Quantum Technologies by Yakov Kopelevich and colleagues provides evidence of superconducting properties in cleaved highly oriented pyrolytic graphite (HOPG). That this was reportedly measured at ambient pressure and room temperature makes it quite noteworthy.

The claimed difference from plain HOPG is the presence of parallel linear defects resulting from the cleaving process. Along these defect lines, the authors speculate, strain gradient fluctuations lead to the formation of superconducting islands, coupled to one another through the Josephson effect. The article describes resistance and magnetization measurements on the sample, the results of which provide evidence for the presence of such Josephson junctions linking superconducting islands on the cleaved HOPG surface.
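The Josephson effect referenced here is the supercurrent that can flow without resistance between two weakly coupled superconducting regions, governed by the phase difference $\varphi$ between them:

$$I_s = I_c \sin\varphi$$

where $I_c$ is the junction’s critical current. A chain of such junctions along a defect line would let current hop from island to island as if the whole path were superconducting.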

As with any such claim, independent reproduction is essential, and we are likely to see the first results of that before long. An interesting part of the claim is that this type of superconductivity along linear defects in stacked materials could apply more universally, beyond just graphite. Assuming the results hold up, the next step would likely be finding ways to turn this effect into practical applications over the coming years and decades.