Genetic Therapy Aims To Bring Hearing To Those Born Deaf

For those born with certain types of congenital deafness, the cochlear implant has been a positive and enabling technology. It uses electronics to step in as a replacement for the biological ear that doesn’t quite function properly, and provides a useful, if imperfect, sense of hearing to its users.

New research has promised another potential solution for some sufferers of congenital deafness. Instead of a supportive device, a gene therapy is used to enable the biological ear to function more as it should. The result is that patients get their sense of hearing not from a prosthetic, but from their own ears.

Continue reading “Genetic Therapy Aims To Bring Hearing To Those Born Deaf”

ESP32-P4 Powers Retro Handheld After A Transplant

The ESP32-P4 is the new hotness on the microcontroller market. With RISC-V architecture and two cores running at 400 MHz, to ears of a certain vintage it sounds more like the heart of a Unix workstation than a traditional MCU. Time’s a funny thing like that. [DynaMight] was looking for an excuse to play with this powerful new system on a chip, so he put together what he calls the GB300-P4: a commercial handheld game console with an Espressif brain transplant.

Older ESP32 chips weren’t quite up to 16-bit emulation, but that hadn’t stopped people from trying; the RetroGo project by [ducalex] already has SNES and Genesis/Mega Drive emulation modes, along with all the 8-bit systems you could ask for. But the higher-tech consoles can run a bit slow in emulation on other ESP32 chips. [DynaMight] wanted to see if the P4 performed better, and to no one’s surprise, it did.

If the build quality on this handheld looks suspiciously professional, that’s because it is: [DynaMight] started with a GB300, a commercial emulator platform. Since the ESP32-P4 is replacing a MIPS chip clocked at 914 MHz in the original (which sounds even more like the heart of a Unix workstation, come to think of it), the machine probably doesn’t have better performance than it did from the factory, unless its original code was terribly unoptimized. In this case, performance was not the point. The point was to have a handheld running RetroGo on this specific chip, which the project has evidently accomplished with flying colours. If you’ve got a GB300 you’d rather put an “Espressif Inside” sticker on, the project is on GitHub. Otherwise you can check out the demo video below. (DOOM starts at 1:29, because of course it runs DOOM.)

The last P4 project we featured was a Quadra emulator; we expect to see a lot of projects with this chip in the new year, and they’re not all going to be retrocomputer-related, we’re sure. If you’re cooking up something using the new ESP32, or know someone who is, you know what to do.

Continue reading “ESP32-P4 Powers Retro Handheld After A Transplant”

Clone Wars: IBM Edition

If you search the Internet for “Clone Wars,” you’ll get a lot of Star Wars-related pages. But the original Clone Wars took place a long time ago in a galaxy much nearer to ours, and it has a lot to do with the computer you are probably using right now to read this. (Well, unless it is a Mac, something ARM-based, or an old retro-rig. I did say probably!)

IBM is a name that, for many years, was synonymous with computers, especially big mainframe computers. However, it didn’t start out that way. IBM originally made mechanical calculators and tabulating machines. That changed in 1952 with the IBM 701, IBM’s first computer that you’d recognize as a computer.

If you weren’t there, it is hard to understand how IBM dominated the computer market in the 1960s and 1970s. Sure, there were others like Univac, Honeywell, and Burroughs. But especially in the United States, IBM was the biggest fish in the pond. At one point, the computer market’s estimated worth was a bit more than $11 billion, and IBM’s five biggest competitors accounted for about $2 billion, with almost all of the rest going to IBM.

So it was somewhat surprising that IBM didn’t roll out the personal computer first, or at least very early. Even companies that made “small” computers for the day, like Digital Equipment Corporation or Data General, weren’t really expecting the truly personal computer. That push came from companies no one had heard of at the time, like MITS, SWTP, IMSAI, and Commodore. Continue reading “Clone Wars: IBM Edition”

Pushing China’s EAST Tokamak Past The Greenwald Density Limit

Getting a significant energy return from tokamak-based nuclear fusion reactors depends in large part on plasma density, but increasing said density is tricky, as beyond a certain point the plasma transitions back from the much more stable high-confinement mode (H-mode) into L-mode. Recently, Chinese researchers reported that they managed to push the plasma density in the EAST tokamak beyond the Greenwald Density Limit (GDL), as this previously assumed upper bound is known.
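
The GDL itself is a simple empirical scaling: the limiting line-averaged density, in units of 10^20 m^-3, is the plasma current in megaamperes divided by the plasma cross-sectional area, πa², with a the minor radius in meters. As a rough illustration, here is that scaling worked through with plausible EAST-like numbers; the current and radius below are illustrative, not figures from the paper:

```python
import math

def greenwald_density(I_p_MA, minor_radius_m):
    """Greenwald density limit, n_G = I_p / (pi * a^2), in units of 10^20 m^-3."""
    return I_p_MA / (math.pi * minor_radius_m ** 2)

# Illustrative EAST-like numbers (assumed, not taken from the paper):
I_p = 1.0   # plasma current in MA
a   = 0.45  # minor radius in m

n_G = greenwald_density(I_p, a)
print(f"Greenwald limit: {n_G:.2f} x 10^20 m^-3")  # roughly 1.57 x 10^20 m^-3
```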

We covered these aspects of nuclear fusion reactors in great detail last year, noting the importance of plasma edge stability, as instability there causes tokamak wall erosion as well as loss of energy. The EAST tokamak (HT-7U) is a superconducting tokamak that was upgraded and resumed operations in 2014, featuring a 1.85 meter major radius and 7.5 MW of heating power. For a tokamak, plasma and edge stability are major concerns even in H-mode, requiring constant intervention.

Continue reading “Pushing China’s EAST Tokamak Past The Greenwald Density Limit”

An Open Source Electromagnetic Resonance Tablet

Drawing tablets have been a favorite computer peripheral of artists since their inception in the 1980s. If you have ever used a drawing tablet of this nature, you may have wondered how it works, and whether you could make one yourself. Well, wonder no longer, as [Yukidama] has demonstrated an open source electromagnetic resonance (EMR) drawing tablet build!

The principle of simple EMR tablets is quite straightforward. A coil in the tablet oscillates at somewhere between roughly 400 kHz and 600 kHz. This induces a current in a coil within the pen at its resonant frequency, which in turn results in a voltage spike within the tablet around the pen’s resonant frequency. For pressure sensing, a simple circuit within the pen can shift its resonant frequency, which is likewise picked up by the tablet. The tablet’s input buttons work in a similar way!
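
That resonance is just an LC tank in the pen, with frequency f = 1/(2π√(LC)). Here is a minimal sketch of the idea, using made-up component values that happen to land in the 400 kHz to 600 kHz window; they are not [Yukidama]’s actual parts:

```python
import math

def resonant_freq(L, C):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Illustrative pen tank values (assumed, not from the real design):
L = 4.7e-3   # 4.7 mH coil
C = 22e-12   # 22 pF capacitor
print(f"rest frequency: {resonant_freq(L, C) / 1e3:.0f} kHz")      # ~495 kHz

# Pressing the nib adds capacitance, pulling the resonance down,
# which the tablet detects as a shifted peak:
C_pressure = 4e-12   # extra 4 pF from a pressure-variable capacitor
print(f"pressed frequency: {resonant_freq(L, C + C_pressure) / 1e3:.0f} kHz")  # ~455 kHz
```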

But this is merely one-dimensional. To sample two dimensions, two arrays of coils are needed: one to sample the horizontal axis, and one the vertical. The driver circuit simply sweeps over each array and samples every coil, at whatever speed the driver can achieve.
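
In software terms, a position fix on one axis amounts to sweeping the coils, finding where the received amplitude peaks, and interpolating between neighbors. The sketch below is only a rough illustration of that idea; sample_coil(), the coil pitch, and the interpolation scheme are assumptions, not details of [Yukidama]’s firmware:

```python
# Rough sketch of one axis of the 2D scan: listen on each coil in turn,
# find where the received amplitude peaks, and interpolate between
# neighbors. sample_coil() is a hypothetical stand-in for the analog front end.

def scan_axis(num_coils, sample_coil, pitch_mm=5.0):
    amplitudes = [sample_coil(i) for i in range(num_coils)]
    peak = max(range(num_coils), key=lambda i: amplitudes[i])
    offset = 0.0
    if 0 < peak < num_coils - 1:
        a, b, c = amplitudes[peak - 1], amplitudes[peak], amplitudes[peak + 1]
        denom = a - 2 * b + c
        if denom != 0:
            # Vertex of the parabola fitted through the three points around the peak
            offset = 0.5 * (a - c) / denom
    return (peak + offset) * pitch_mm  # pen position along this axis, in mm

# A full position fix is just two sweeps: x = scan_axis(...), y = scan_axis(...)
```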

Finally, [Yukidama] refined the driver board for one last demo, designing it to drive a flexible circuit containing the coils. This then sits behind the screen of a Panasonic RZ series laptop, turning the device into a rather effective drawing tablet!

If tablets aren’t your style, check out this drawing pen. 

Continue reading “An Open Source Electromagnetic Resonance Tablet”


Atari Brings The Computer Age Home

[The 8-Bit Guy] tells us how 8-bit Atari computers work.

The first Atari came out in 1977, originally called the Atari Video Computer System. It was followed two years later, in 1979, by the Atari 400 and Atari 800. The Atari 800 had a music synthesizer, bit-mapped graphics, and sprites, which compared favorably with the capabilities of the other systems of the day, the so-called Trinity of 1977: the Apple II, Commodore PET, and TRS-80. [The 8-Bit Guy] says the only real competition in terms of features came from the TI-99/4, which was released around the same time.

The main way to load software into the early Atari 400 and 800 computers was to plug in cartridges. The Atari 400 supported one cartridge and the Atari 800 supported two. The built-in keyboards were pretty terrible by today’s standards, but as [The 8-Bit Guy] points out, there weren’t really any expectations around keyboards back in the late 1970s; everything was new and few precedents had been set.

Continue reading “Atari Brings The Computer Age Home”

When Electricity Doesn’t Take The Shortest Path

Everyone knows that the path of least resistance is the path that will always be taken, be it by water, electricity, or the feet of humans. This is where the PCB presented by [ElectrArc240] on YouTube is rather confusing, as it demonstrates two similar traces, one much shorter than the other, where the current nevertheless opts to travel via the much longer trace. If you measure each path’s DC resistance, the shorter path comes in lower at 0.44 Ω, while the longer path measures 1.44 Ω. Did the laws of physics break down here?

Of course, this is just a trick question, as the effective opposition to current in an electrical circuit isn’t just about ohmic resistance. The relevant phrase here is ‘path of least impedance’, which this PCB demonstrates excellently. Note that the return path sneaks along the back side of the board, following the same route as the long trace on the front. Add a 1 MHz high-current source and the impact of alternating current becomes apparent, with reactance combining with the resistance.

Although for direct current it’s fair to say that impedance is the equivalent of resistance, once the inductance of a trace has to be taken into account, as in the case of AC and high-frequency signaling, the much higher loop inductance of the short path means that the long path is now actually the path of least impedance.
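
To put rough numbers on it, the impedance magnitude of a series R-L path is |Z| = √(R² + (2πfL)²). The DC resistances below are the ones from the video, but the loop inductances are illustrative guesses, included only to show how the ranking flips at 1 MHz:

```python
import math

def impedance(R_ohm, L_H, f_Hz):
    """Series R-L impedance magnitude: |Z| = sqrt(R^2 + (2*pi*f*L)^2)."""
    X_L = 2 * math.pi * f_Hz * L_H
    return math.sqrt(R_ohm ** 2 + X_L ** 2)

f = 1e6  # the 1 MHz drive from the demo

# Resistances are from the video; the loop inductances are assumed values.
Z_short = impedance(0.44, 2.0e-6, f)  # short trace forms a big loop with the return
Z_long  = impedance(1.44, 0.1e-6, f)  # long trace hugs its return path on the back

print(f"|Z| short path: {Z_short:.1f} ohm")  # ~12.6 ohm
print(f"|Z| long path:  {Z_long:.1f} ohm")   # ~1.6 ohm
```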

When you are doing some impedance matching in your favorite EDA software while implementing an Ethernet RMII link or similar, this is basically part of the process, with higher frequencies requiring ever more stringent mechanisms to keep both sides happy. At some point any stray signals from nearby traces and components become a factor, never mind the properties of the PCB material.

Continue reading “When Electricity Doesn’t Take The Shortest Path”