Arctic Adventures With A Data General Nova II — The Equipment

As I walked into the huge high bay that was to be my part-time office for the next couple of years, I was greeted by all manner of abandoned equipment haphazardly scattered around the room. As I later learned, this place was a graveyard for old research projects, cast aside to be later gutted for parts or forgotten entirely. This was my first day on the job as a co-op student at the Georgia Tech Engineering Experiment Station (EES, since renamed to GTRI). The engineer who gave me the orientation tour that day, Steve, pointed to a dusty electronic rack in one corner of the room and said my job would be to bring that old minicomputer back to life. Once running, I would operate it as directed by the radar researchers and scientists in our group. Thus began a journey that resulted in an Arctic adventure two years later.

The Equipment

The computer in question was a Data General (DG) minicomputer. DG was founded by former Digital Equipment Corporation (DEC) employees in the 1960s. They introduced the 16-bit Nova computer in 1969 to compete with DEC’s PDP-8. I was gawking at a fully equipped Nova 2 system, which had been introduced in 1975. This machine and its accessories occupied two full racks, with an adjacent printer and a table with a terminal and pen plotter. There was little to no documentation. Just to turn it on, I had to pester engineers until I found one who could teach me the necessary front-panel switch incantation to boot it up. Continue reading “Arctic Adventures With A Data General Nova II — The Equipment”

Could Moon Mining Spoil Its Untouched Grandeur And Science Value?

It’s 2024. NASA’s Artemis program is in full swing, and we’re hoping to get back to the surface of the Moon real soon. Astronauts haven’t walked on the beloved sky rock since 1972! A human landing was scheduled for 2025, which has now been pushed back to 2026, and we’re all getting a bit antsy about it. Last time we wanted to go, it only took 8 years!

Now, somehow, it’s harder, but NASA also has its sights set higher. It no longer wants to just toddle about the Moon for a bit to wave at the TV cameras. This time, there’s talk of establishing permanent bases on the Moon, and actually doing useful work, like mining. It’s a tantalizing thought, but what does this mean for the sanctity of one of the last pieces of real estate yet to be spoilt by humans? Researchers are already arguing that we need to move to protect this precious, unique environment.

Continue reading “Could Moon Mining Spoil Its Untouched Grandeur And Science Value?”

Bell Labs Is Leaving The Building

If you ever had the occasion to visit Bell Labs at Murray Hill, New Jersey, or any of the nearby satellite sites, but you didn’t work there, you were probably envious. For one thing, some of the most brilliant people in the world worked there. Plus, there is the weight of history — Bell Labs had a hand in ten Nobel prizes, five Turing awards, 22 IEEE Medals of Honor, and over 20,000 patents, including several that have literally changed the world. They developed, among other things, the transistor, Unix, and a host of other high-tech inventions. Of course, Bell Labs hasn’t been Bell for a while — Nokia now owns it. And Nokia has plans to move the headquarters lab from its historic Murray Hill campus to nearby New Brunswick. (That’s New Jersey, not Canada.)

If your friends aren’t impressed by Nobels, it is worth mentioning that the lab has also won five Emmy Awards, a Grammy, and an Academy Award. Not bad for a bunch of engineers and scientists. Nokia bought Alcatel-Lucent, which had wound up with Bell Labs after the phone company was split up and AT&T spun off Lucent.

Continue reading “Bell Labs Is Leaving The Building”

How IBM Stumbled Onto RISC

There are a ton of inventions out in the world that are almost complete accidents but are still ubiquitous in our day-to-day lives. Things like bubble wrap, which was originally intended to be wallpaper, or even superglue, a plastic compound whose sticky properties were only discovered later on. IBM found themselves in a similar predicament in the 1970s after working on a type of mainframe computer made to be a phone switch. Eventually the phone switch was abandoned in favor of a general-purpose processor, but not before they stumbled onto the RISC design that eventually became the IBM 801.

As [Paul] explains, the major design philosophy at the time was to use a large number of instructions to do specific tasks within the processor. When designing the special-purpose phone switch processor, IBM removed many of these instructions and then, after the project was cancelled, tested the incomplete platform to see how it performed as a general-purpose computer. They found that by eliminating all but a few instructions and running those without a microcode layer, the performance gains were far greater than expected: up to three times as fast as comparable hardware.
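The idea is easy to sketch. Below is a toy register machine with just four RISC-style operations; the opcodes and encoding are invented for illustration (this is not the actual IBM 801 instruction set), but they show the flavor: loads and stores are the only memory operations, everything else runs register-to-register, and a plain decoder replaces the microcode interpreter.

```python
# A toy register machine with four RISC-style instructions. The encoding is
# invented for illustration (not the real IBM 801 ISA): loads/stores are the
# only memory operations, ALU ops are register-to-register, and there is no
# microcode layer underneath.

def run(program, memory):
    """Execute a list of (op, *args) tuples on 8 registers; return them."""
    regs = [0] * 8
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "load":          # load rd, addr: the only way to read memory
            rd, addr = args
            regs[rd] = memory[addr]
        elif op == "store":       # store rs, addr: the only way to write it
            rs, addr = args
            memory[addr] = regs[rs]
        elif op == "add":         # add rd, ra, rb: pure register-to-register
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        elif op == "bnez":        # bnez rs, target: branch if rs is non-zero
            rs, target = args
            if regs[rs] != 0:
                pc = target
        else:
            raise ValueError(f"unknown op: {op}")
    return regs

# Sum four memory words into r1 and store the result at address 5.
mem = [3, 1, 4, 1, 0, 0]
prog = [
    ("load", 1, 0),
    ("load", 2, 1), ("add", 1, 1, 2),
    ("load", 2, 2), ("add", 1, 1, 2),
    ("load", 2, 3), ("add", 1, 1, 2),
    ("store", 1, 5),
]
regs = run(prog, mem)
```

Running the little program leaves 3 + 1 + 4 + 1 = 9 in both r1 and mem[5]. A hardwired decoder for a handful of such instructions is the sort of simplification that let RISC designs skip the microcode interpreter entirely.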

These first forays into the world of simplified processor architecture not only paved the way for the RISC platforms we know today, such as ARM and RISC-V, but also helped CISC platforms make tremendous performance gains. In fact, RISC-V is a direct descendant of these early RISC processors, with three intermediate designs between then and now. If you want to play with RISC-V yourself, our own [Jonathan Bennett] took a look at a recent RISC-V SBC and its software this past March.

Thanks to [Stephen] for the tip!

Photo via Wikimedia Commons

Generating 3D Scenes From Just One Image

The LucidDreamer project ties a variety of functions into a pipeline that can take a source image (or generate one from a text prompt) and “lift” its content into 3D, creating highly detailed Gaussian splats that look great and can even be navigated.

Gaussian splatting is a fast way to render radiance fields. Like NeRFs (Neural Radiance Fields), it reconstructs complex scenes from sparse 2D sources, but it represents the scene as a cloud of blended Gaussians rather than as a neural network, which makes rendering quick. If that is all news to you, that’s probably because this stuff has sprung up with dizzying speed since the original NeRF concept was thought up barely a handful of years ago.
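The rendering trick itself is simple enough to sketch in 2D. The NumPy snippet below is written for this article and is unrelated to LucidDreamer’s actual code: it blends a couple of anisotropic Gaussians onto an image with the classic alpha “over” operator, which is the same compositing step real 3D Gaussian splatting applies to millions of depth-sorted Gaussians projected through a camera.

```python
import numpy as np

def splat(image, center, cov, color, opacity):
    """Blend one anisotropic 2D Gaussian onto image (in place) and return it."""
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    d = np.stack([xs - center[0], ys - center[1]], axis=-1)   # pixel offsets
    inv = np.linalg.inv(np.asarray(cov, dtype=float))
    # Gaussian falloff exp(-0.5 * d^T Sigma^-1 d), scaled by peak opacity
    alpha = opacity * np.exp(-0.5 * np.einsum("...i,ij,...j->...", d, inv, d))
    image *= (1.0 - alpha)[..., None]              # attenuate what's already there
    image += alpha[..., None] * np.asarray(color)  # and add this splat on top
    return image

img = np.zeros((64, 64, 3))
splat(img, (20, 30), [[40, 10], [10, 20]], (1.0, 0.2, 0.2), 0.8)  # reddish blob
splat(img, (44, 30), [[20, 0], [0, 60]], (0.2, 0.4, 1.0), 0.6)    # bluish blob
```

Because each splat is just one alpha-blend over a small footprint of pixels, the whole scene can be rasterized far faster than marching rays through a neural network, which is a big part of the technique’s appeal.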

What makes LucidDreamer neat is the fact that it does so much with so little. The project page has interactive scenes to explore, but there is also a demo for those who would like to try generating scenes from scratch (some familiarity with the basic tools is expected, however).

In addition to the source code itself, the research paper is available for those with a hunger for the details. Read it quick, because at the pace this stuff is expanding, it honestly might be obsolete if you wait too long.

Slab Casting – A New Way To Combine 3D Printing And Ceramics

Slip casting can be messy both in processing and in making the original plaster mold. What if there was a better way, thanks to 3D printing?

[Allie Katz] has developed a new technique using 3D printed slab molds to make ceramics. By combining the ability of 3D printing to make intricate designs and the formability of clay, they have found a way to make reproducible clay objects without all that tedious mucking about with liquid clay.

[Katz] takes us through a quick “Mould Making 101” before showing how the slab casting press molds were made. Starting with a positive CAD design, the molds were designed to eliminate undercuts and allow for air infiltration, since a plastic mold can’t suck the water out of the clay like a plaster one would. Some cookie clay cutters were also designed to help with the trickier bits of geometry. Once everything was printed, the molds were coated with cornstarch and clay was pressed in. After removal, any final details like handles can be added, and the pieces are then fired as normal.

If you’d like to see some more 3D printing mixed up with ceramics, check out 3D printing glass with a laser, reliable ceramic slurry printing, or this TPU-based approach.

Continue reading “Slab Casting – A New Way To Combine 3D Printing And Ceramics”

A Transistor, But For Heat Instead Of Electrons

Researchers at UCLA recently developed what they are calling a thermal transistor: a solid-state device able to control the flow of heat with an electric field. This opens the door to controlling the transfer of heat in some of the same ways we are used to controlling electronics.

Heat management can be a crucial task, especially where electronics are involved. The usual way to manage heat is to draw it out with things like heat sinks. If heat isn’t radiating away fast enough, a fan can be turned on (or sped up) to meet targets. Compared to the precision and control with which modern semiconductors shuttle electrons about, the ability to actively manage heat seems lacking.
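Just how blunt that knob is can be sketched in a few lines. The fan curve below is a hypothetical example (the thresholds are invented, and a real system would read a sensor and drive a PWM fan header rather than print numbers), but it captures the state of the art the paragraph describes: off below a setpoint, a linear ramp above it, full blast past a limit.

```python
# Hypothetical fan curve: duty cycle 0 below a setpoint, a linear ramp up to
# full speed at a limit, with a minimum spin-up duty once the fan is on.
# All thresholds are invented example values, not from any real controller.

def fan_duty(temp_c, idle_below=45.0, full_above=80.0, min_duty=0.2):
    """Map a temperature in Celsius to a PWM duty cycle in [0, 1]."""
    if temp_c <= idle_below:
        return 0.0                      # cool enough: fan off
    if temp_c >= full_above:
        return 1.0                      # too hot: full speed
    # linear ramp between the thresholds, never below the spin-up minimum
    frac = (temp_c - idle_below) / (full_above - idle_below)
    return max(min_duty, frac)

for t in (30, 50, 70, 90):
    print(t, round(fan_duty(t), 2))
```

Contrast that single coarse actuator with a transistor’s fine-grained, per-channel control of electrons, and the appeal of doing the same for phonons is obvious.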

This new device can rapidly adjust the thermal conductivity of a channel based on an electric field input, which is very similar to what a transistor does for electrical conductivity. Applying an electric field modifies the strength of molecular bonds in a cage-like array of molecules, which in turn adjusts their thermal conductivity.

It’s still early, but this research may open the door to better control of heat within semiconductor systems. This is especially interesting considering that 3D chips have been picking up speed for years (stacking components is already a thing; it’s called Package-on-Package assembly), and the denser and deeper semiconductors get, the harder it is to passively pull heat out.

Thanks to [Jacob] for the tip!