Experiencing Visual Deficits And Their Impact On Daily Life, With VR

Researchers presented an interesting project at the 2024 IEEE Conference on Virtual Reality and 3D User Interfaces: it uses VR and eye tracking to simulate visual deficits such as macular degeneration, diabetic retinopathy, and other visual diseases and impairments.

Typical labels and pill bottles can be shockingly inaccessible to people with a variety of common visual deficits.

VR offers a unique method of allowing people to experience the impact of living with such conditions, a point driven home particularly well by having the user see for themselves the effect on simple real-world tasks such as choosing a pill bottle or picking up a mug. Conditions like macular degeneration (which causes loss of central vision) are more accurately simulated by using eye tracking, a technology much more mature nowadays than it was even just a few years ago.
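The core trick behind simulating central vision loss can be sketched in a few lines. The idea, in this minimal NumPy-only sketch (the function name and the hard-coded gaze sample are illustrative assumptions, not from the project): occlude a disc centered on wherever the eye tracker says the user is looking, so the blind region follows the gaze just as a real scotoma does.

```python
import numpy as np

def apply_scotoma(frame, gaze_xy, radius):
    """Simulate central vision loss by darkening a disc around the gaze point.

    frame   -- H x W x 3 uint8 image (e.g. one rendered VR eye buffer)
    gaze_xy -- (x, y) pixel coordinates reported by an eye tracker
    radius  -- scotoma radius in pixels
    """
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    gx, gy = gaze_xy
    # Squared distance of every pixel from the gaze point
    mask = (xs - gx) ** 2 + (ys - gy) ** 2 <= radius ** 2
    out = frame.copy()
    out[mask] = 0  # a fuller simulator might blur or distort instead of blacking out
    return out

# Example: a gray test frame with the "blind spot" following one gaze sample
frame = np.full((480, 640, 3), 128, dtype=np.uint8)
occluded = apply_scotoma(frame, gaze_xy=(320, 240), radius=60)
```

A real simulator would run this per frame at the headset's refresh rate and swap in different masks (peripheral loss, patchy retinopathy spots) for other conditions, but the gaze-following occlusion is the part eye tracking makes possible.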

The abstract for the presentation is available here, and if you have some time be sure to check out the main index for all of the VR research demos because there are some neat ones there, including a method of manipulating a user’s perception of the shape of the ground under their feet by electrically stimulating the tendons of the ankle.

Eye tracking is in a few consumer VR products nowadays, but it’s also perfectly feasible to roll your own in a surprisingly slick way. It’s even been used on jumping spiders to gain insights into the fascinating and surprisingly deep perceptual reality these creatures inhabit.

Reprogrammable Transistors

Not every computer can make use of a disk drive when it needs to store persistent data. Embedded systems especially have pushed the development of a series of erasable programmable read-only memories (EPROMs) because of their need for speed and reliability. But erasing memory and writing it over again, whether it’s an EPROM, an EEPROM, an FPGA’s configuration memory, or some other type of configurable solid-state storage, is just scratching the surface of what it might be possible to get integrated circuits and their transistors to do. This team has created a transistor that is itself programmable.

Rather than doping the semiconductor material with impurities to create the electrical characteristics needed for the transistor, the team from TU Wien in Vienna has developed a way to “electrostatically dope” the semiconductor, using electric fields instead of physical impurities to achieve the performance needed in the material. A second gate, called the program gate, can be used to reconfigure the electric fields within the transistor, changing its properties on the fly. This still requires some electrical control, though, so the team doesn’t expect their new invention to outright replace all transistors in the future, and they also note that it’s unlikely that these could be made as small as existing transistors due to the extra complexity.

While the article from IEEE lists some potential applications for this technology in the broad sense, we’d like to see what these transistors are actually capable of doing on a more specific level. It seems like these types of circuits could improve efficiency, as fewer transistors might be needed for a wider variety of tasks, and that there are certainly some enhanced security features these could provide as well. For a refresher on the operation of an everyday transistor, though, take a look at this guide to the field-effect transistor.

Arctic Adventures With A Data General Nova II — The Equipment

As I walked into the huge high bay that was to be my part-time office for the next couple of years, I was greeted by all manner of abandoned equipment haphazardly scattered around the room. As I later learned, this place was a graveyard for old research projects, cast aside to be later gutted for parts or forgotten entirely. This was my first day on the job as a co-op student at the Georgia Tech Engineering Experiment Station (EES, since renamed to GTRI). The engineer who gave me the orientation tour that day pointed to a dusty electronic rack in one corner of the room. Steve said my job would be to bring that old minicomputer back to life. Once running, I would operate it as directed by the radar researchers and scientists in our group. Thus began a journey that resulted in an Arctic adventure two years later.

The Equipment

The computer in question was a Data General (DG) minicomputer. DG was founded by former Digital Equipment Corporation (DEC) employees in the 1960s. They introduced the 16-bit Nova computer in 1969 to compete with DEC’s PDP-8. I was gawking at a fully-equipped Nova 2 system, which had been introduced in 1975. This machine and its accessories occupied two full racks, with an adjacent printer and a table with a terminal and pen plotter. There was little to no documentation. Just to turn it on, I had to pester engineers until I found one who could teach me the necessary front-panel switch incantation to boot it up. Continue reading “Arctic Adventures With A Data General Nova II — The Equipment”

Could Moon Mining Spoil Its Untouched Grandeur And Science Value?

It’s 2024. NASA’s Artemis program is in full swing, and we’re hoping to get back to the surface of the Moon real soon. Astronauts haven’t walked on the beloved sky rock since 1972! A human landing was scheduled for 2025, which has now been pushed back to 2026, and we’re all getting a bit antsy about it. Last time we wanted to go, it only took 8 years!

Now, somehow, it’s harder, but NASA also has its sights set higher. It no longer wants to just toddle about the Moon for a bit to wave at the TV cameras. This time, there’s talk of establishing permanent bases on the Moon, and actually doing useful work, like mining. It’s a tantalizing thought, but what does this mean for the sanctity of one of the last pieces of real estate yet to be spoilt by humans? Researchers are already arguing that we need to move to protect this precious, unique environment.

Continue reading “Could Moon Mining Spoil Its Untouched Grandeur And Science Value?”

Bell Labs Is Leaving The Building

If you ever had the occasion to visit Bell Labs at Murray Hill, New Jersey, or any of the nearby satellite sites, but you didn’t work there, you were probably envious. For one thing, some of the most brilliant people in the world worked there. Plus, there is the weight of history — Bell Labs had a hand in ten Nobel prizes, five Turing awards, 22 IEEE Medals of Honor, and over 20,000 patents, including several that have literally changed the world. They developed, among other things, the transistor, Unix, and a host of other high-tech inventions. Of course, Bell Labs hasn’t been Bell for a while — Nokia now owns it. And Nokia has plans to move the headquarters lab from its historic Murray Hill campus to nearby New Brunswick. (That’s New Jersey, not Canada.)

If your friends aren’t impressed by Nobels, it is worth mentioning the lab has also won five Emmy awards, a Grammy, and an Academy award. Not bad for a bunch of engineers and scientists. Nokia bought Alcatel-Lucent, who had wound up with Bell Labs after the phone company was split up and AT&T spun off Lucent.

Continue reading “Bell Labs Is Leaving The Building”

How IBM Stumbled Onto RISC

There are a ton of inventions out in the world that are almost complete accidents, but are still ubiquitous in our day-to-day lives. Things like bubble wrap, which was originally intended to be wallpaper, or even superglue, a plastic compound whose sticky properties were only discovered later on. IBM found themselves in a similar predicament in the 1970s after working on a type of mainframe computer made to be a phone switch. Eventually the phone switch was abandoned in favor of a general-purpose processor, but not before they stumbled onto the RISC processor which eventually became the IBM 801.

As [Paul] explains, the major design philosophy at the time was to use a large number of instructions to do specific tasks within the processor. When designing the special-purpose phone switch processor, IBM removed many of these instructions and then, after the project was cancelled, performed some testing on the incomplete platform to see how it performed as a general-purpose computer. They found that by eliminating all but a few instructions and running those without a microcode layer, the performance gains were far larger than expected: up to three times as fast on comparable hardware.

These first forays into the world of simplified processor architecture not only paved the way for the RISC platforms we know today, such as ARM and RISC-V, but also helped CISC platforms make tremendous performance gains. In fact, RISC-V is a direct descendant of these early RISC processors, with three intermediate designs between then and now. If you want to play with RISC-V yourself, our own [Jonathan Bennett] took a look at a recent RISC-V SBC and its software this past March.

Thanks to [Stephen] for the tip!

Photo via Wikimedia Commons

Generating 3D Scenes From Just One Image

The LucidDreamer project ties a variety of functions into a pipeline that can take a source image (or generate one from a text prompt) and “lift” its content into 3D, creating highly-detailed Gaussian splats that look great and can even be navigated.

Gaussian splatting is a fast rasterization technique for rendering scenes represented as clouds of 3D Gaussians; like NeRFs (Neural Radiance Fields), it reconstructs complex scenes from sparse 2D sources, but it renders them far more quickly. If that is all news to you, that’s probably because this stuff has sprung up with dizzying speed from when the original NeRF concept was thought up barely a handful of years ago.
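To get a feel for the “splatting” part, here is a minimal 2D sketch (assumptions: isotropic Gaussians, a pre-sorted back-to-front list, everything else from the real pipeline, such as projecting anisotropic 3D Gaussians and per-tile sorting, is omitted). Each splat has a position, scale, color, and opacity, and rendering just alpha-composites them in depth order.

```python
import numpy as np

def render_splats(splats, h, w):
    """Alpha-composite isotropic 2D Gaussians onto an image, back to front.

    splats -- list of (x, y, sigma, (r, g, b), opacity), farthest first
    """
    img = np.zeros((h, w, 3))
    ys, xs = np.mgrid[:h, :w]
    for x, y, sigma, color, opacity in splats:
        # Gaussian falloff gives each splat a soft circular footprint
        alpha = opacity * np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
        # Standard "over" compositing: nearer splats partially cover earlier ones
        img = img * (1 - alpha[..., None]) + np.array(color) * alpha[..., None]
    return img

# Two overlapping splats: the nearer (later) red one partially covers the blue one
scene = [
    (40, 32, 12.0, (0.0, 0.0, 1.0), 0.9),  # far, blue
    (50, 32, 10.0, (1.0, 0.0, 0.0), 0.8),  # near, red
]
img = render_splats(scene, 64, 96)
```

The real systems optimize millions of these Gaussians against the source photos by gradient descent, but the rendering core is the same accumulate-and-blend idea, which is why it runs so much faster than ray-marching a neural field.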

What makes LucidDreamer neat is the fact that it does so much with so little. The project page has interactive scenes to explore, but there is also a demo for those who would like to try generating scenes from scratch (some familiarity with the basic tools is expected, however.)

In addition to the source code itself, the research paper is available for those with a hunger for the details. Read it quick, because at the pace this stuff is expanding, it honestly might be obsolete if you wait too long.