Enhance VR Immersion By Shoehorning An Ambilight Into A Headset

Everyone wants a wider field of view in their VR headsets, but that’s not an easy nut to crack. [Statonwest] shows there’s a way to get at least some of the immersion benefits with a bit of simple hardware thanks to the VR Ambilight.

Continue reading “Enhance VR Immersion By Shoehorning An Ambilight Into A Headset”

A Commodore 128 with a video capture device attached

Hacking The Commodore 128 To Capture Almost Real-Time Video

Although watching and editing videos may be among the primary tasks of many PCs today, it wasn’t that long ago that working with video required powerful processors and expensive capture hardware. Even in the 1980s, home computer users were looking for ways to connect video sources to their Commodores and Ataris despite those machines’ hardware limitations. [Cameron Kaiser] has a mid-1980s consumer-grade video capture device, which he has managed to turn into an almost real-time video capture system.

A distorted video image on a C128's monitor
Allowing the graphics chip to interrupt the CPU mid-capture results in a severely distorted image

His work revolves around a device called “ComputerEyes”, a 1984-vintage hardware interface that made it possible to connect a composite video source to a home computer. The limitations of mid-1980s CPUs meant that it took around six seconds for the computer to do a quick scan of a single video frame, or a multiple of that if you wanted a higher-quality image. Another limitation, at least on Commodore machines, was that the screen had to be turned off during video capture – otherwise, the video chip would interrupt the CPU partway through the process, causing the capture to lose synchronization with the video source.
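
To put rough numbers on that six-second figure: if the hardware can only digitize about one column of the image per incoming video field, the scan time falls straight out of the field rate. Here’s a back-of-the-envelope model (ours, not [Cameron]’s code – the resolution and pass counts are assumptions):

```python
# Rough model of slow-scan capture timing: the interface samples about one
# column of the image per 60 Hz video field, so capture time scales with
# horizontal resolution and with the number of passes used for
# higher-quality (multi-level) scans. All figures are assumptions.

FIELD_RATE_HZ = 60      # NTSC field rate the hardware syncs to
COLUMNS = 320           # assumed horizontal capture resolution
PASSES_QUICK = 1        # single-threshold "quick scan"
PASSES_QUALITY = 4      # assumed multi-pass grayscale capture

def capture_time(columns: int, passes: int) -> float:
    """Seconds needed if one column is digitized per video field."""
    return columns * passes / FIELD_RATE_HZ

print(f"quick scan:   {capture_time(COLUMNS, PASSES_QUICK):.1f} s")    # ~5.3 s
print(f"quality scan: {capture_time(COLUMNS, PASSES_QUALITY):.1f} s")  # ~21.3 s
```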

[Cameron], however, plugged his ComputerEyes into a Commodore 128. This machine, largely designed by Hackaday contributor [Bil Herd], has an unusual hardware architecture consisting of two different CPUs and, crucially, two separate video chips. The primary 8564 “VIC-II” graphics chip is used to keep compatibility with existing Commodore 64 programs, while the secondary 8563 “VDC” is mainly aimed at newer high-resolution text-based software. The VDC is also much more independent of the main system bus than the VIC-II, allowing it to display an image without disturbing the CPU.
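
Part of that independence comes from the VDC’s unusual interface: the CPU never touches its video RAM directly, but talks to it through a two-register mailbox at $D600/$D601. Here’s a toy Python model of that handshake (the real thing would be 8502 assembly – peek() and poke() here are fake stand-ins for memory-mapped I/O so the sketch runs anywhere):

```python
# Toy model of the 8563 VDC's two-register mailbox. On real hardware the
# chip exposes just two bus locations: an address/status register at $D600
# (bit 7 = ready) and a data register at $D601. Its 16K of video RAM hides
# behind them, which is why it can keep drawing without bothering the CPU.

VDC_ADDR, VDC_DATA = 0xD600, 0xD601
REG_UPDATE_HI, REG_UPDATE_LO, REG_RAM = 18, 19, 31

_io = {}  # stand-in address space so this runs without a C128

def poke(addr, val):
    _io[addr] = val

def peek(addr):
    if addr == VDC_ADDR:
        return 0x80  # status read: pretend the VDC is always ready
    return _io.get(addr, 0)

def vdc_write_reg(reg, value):
    poke(VDC_ADDR, reg)                 # select an internal register...
    while peek(VDC_ADDR) & 0x80 == 0:   # ...wait for the ready bit...
        pass
    poke(VDC_DATA, value)               # ...then hand over the byte

def vdc_poke_vram(addr, value):
    """Write one byte into the VDC's private video RAM."""
    vdc_write_reg(REG_UPDATE_HI, addr >> 8)    # registers 18/19 set the
    vdc_write_reg(REG_UPDATE_LO, addr & 0xFF)  # update address...
    vdc_write_reg(REG_RAM, value)              # ...register 31 moves data

vdc_poke_vram(0x0000, 0x01)  # drop a character into the top-left cell
```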

More after the break.

Continue reading “Hacking The Commodore 128 To Capture Almost Real-Time Video”

Take A Ride In The Bathysphere

[Tom Scott] has traveled the world to see interesting things. So when he’s impressed by a DIY project, we sit up and listen. In this case, he’s visiting the Bathysphere, a project created by a couple of passionate hobbyists in Italy. The project is housed at Explorandia, which, judging by Google Translate, sounds like a pretty epic hackerspace.

The Bathysphere project itself is a simulation of a submarine. Sounds simple, but this project is anything but. There are no VR goggles involved. Budding captains who are up for the challenge find themselves inside the cockpit of a mini-submarine. The sub itself sits on a DIY motion platform, and strong electric motors move the system, causing riders to feel like they are truly underwater. Inside the cockpit, the detail is amazing: all sorts of switches, lights, and greebles make for a realistic experience. An electronic voice provides the ship’s status and lets the crew know of any emergencies. (Spoiler alert — there will be emergencies!)

The real gem is how this simulation operates. A Logitech webcam is mounted on an XY gantry. The camera is then dipped underwater in a small pond, and its video is sent to a large monitor which serves as the sub’s window. It’s all very 1960s simulator tech, but the effect works. The subtle movements of the simulator platform really make the users feel like they are 20,000 leagues under the sea.
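
For a flavor of how the trick could be wired up, here’s a hedged sketch that integrates helm inputs into an X/Y position and streams it to the gantry as G-code (purely illustrative – the port, feed rate, pond dimensions, and read_helm() are all made up):

```python
# Hypothetical glue between the cockpit and the camera gantry: the sub
# never moves through the pond, the camera does. Helm inputs nudge an X/Y
# position that's streamed out as G-code moves.
import time
import serial  # pyserial

POND_X_MM, POND_Y_MM = 2000.0, 1500.0  # assumed gantry travel limits
FEED_MM_MIN = 1200                     # assumed cruising feed rate

def read_helm():
    """Stand-in for the cockpit controls; returns (surge, sway) in -1..1."""
    return 0.3, -0.1

gantry = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
x, y = POND_X_MM / 2, POND_Y_MM / 2    # start in the middle of the pond

while True:
    surge, sway = read_helm()
    x = min(max(x + surge * 5.0, 0.0), POND_X_MM)  # 5 mm per tick, clamped
    y = min(max(y + sway * 5.0, 0.0), POND_Y_MM)
    gantry.write(f"G1 X{x:.1f} Y{y:.1f} F{FEED_MM_MIN}\n".encode())
    time.sleep(0.1)                    # ~10 position updates per second
```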

Check out the video after the break for more info!

Continue reading “Take A Ride In The Bathysphere”

Mangle Videos With RecurBOY And A Raspberry Pi Zero

You used to need a lot of equipment to be a video DJ. Now you can do it all with a Raspberry Pi Zero and [cyberboy666]’s recurBOY. And if you missed out on the 1970s video-editing psychedelia, now’s your chance to catch up – recurBOY is a modern video synth with all of the bells and whistles, and it’ll fit in your pocket. Check out [cyberboy666]’s demo video if you don’t yet know what you’re getting into. (Embedded below.)

RecurBOY has four modes: video, shader, effects, and external input, and each of these is significantly cooler than the last. Video mode plays videos straight off of the SD card through the recurBOY’s composite video out. Shader mode lets you program your own shaders in the GLES shader dialect for resource-constrained devices. And this is where the various knobs and buttons come in: you can program your shader routines to read any of the pots as input, allowing you to tweak the graphics demos on the fly.
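
Here’s a minimal sketch of that knob-to-uniform idea (not recurBOY’s actual code – it targets GLES on the Pi Zero, while this uses desktop GLSL through moderngl so it runs anywhere, and read_pot() stands in for whatever ADC the knobs hang off):

```python
# Render one frame offscreen with a pot value fed into the shader as a
# uniform; in a real video synth you'd re-read the pot and re-render
# every frame.
import moderngl
import numpy as np

def read_pot() -> float:
    return 0.7  # pretend the knob is turned to 70%

ctx = moderngl.create_standalone_context()
prog = ctx.program(
    vertex_shader="""
        #version 330
        in vec2 pos;
        out vec2 uv;
        void main() { uv = pos * 0.5 + 0.5; gl_Position = vec4(pos, 0.0, 1.0); }
    """,
    fragment_shader="""
        #version 330
        uniform float u_knob;   // driven by a physical pot
        in vec2 uv;
        out vec4 color;
        void main() {           // the knob sweeps a simple color gradient
            color = vec4(uv.x, u_knob, 1.0 - uv.y * u_knob, 1.0);
        }
    """,
)
quad = ctx.buffer(np.array([-1, -1, 1, -1, -1, 1, 1, 1], dtype="f4").tobytes())
vao = ctx.simple_vertex_array(prog, quad, "pos")
fbo = ctx.simple_framebuffer((320, 240))
fbo.use()

prog["u_knob"].value = read_pot()      # tweak the effect live
vao.render(moderngl.TRIANGLE_STRIP)    # draw a full-screen quad
print("rendered", len(fbo.read()), "bytes of RGB")
```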

Effects mode overlays your shaders on the video that’s playing, and external mode allows you to plug in a USB video capture card or a webcam so you can do all that same mangling with a live camera feed. And these two modes are where it gets awesome. The shader effects in the demo video cover all of the analog classics – including bloom and RGB separation – but also some distinctly digital effects. And again, you can tweak them all live with the knobs. Or plug in a MIDI controller and control it all externally. What hasn’t he thought of?
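
The MIDI hookup could be as simple as mapping CC messages onto the same 0..1 parameter range as the pots. A hedged sketch with mido (the CC numbers and parameter names are invented, not [cyberboy666]’s assignments):

```python
# Listen for MIDI control-change messages and map each controller number
# onto an effect parameter, normalized 0..1 like a pot reading.
import mido

params = {"bloom": 0.0, "rgb_split": 0.0}
CC_MAP = {20: "bloom", 21: "rgb_split"}  # assumed CC assignments

with mido.open_input() as port:          # first available MIDI input
    for msg in port:                     # blocks, yielding messages
        if msg.type == "control_change" and msg.control in CC_MAP:
            params[CC_MAP[msg.control]] = msg.value / 127.0
            print(params)
```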

Old school analog video effects are really fun, and recurBOY brings them to you with the flexibility of modern shader coding. What’s not to love? If you want to see the pinnacle of the pre-digital era, that would be the Scanimate. For a video synth that integrates with your audio synth, check out Hypno. And if glitching the video is more your style, you can hijack the RAM of a VGA/composite converter.

Trippy, man!

Continue reading “Mangle Videos With RecurBOY And A Raspberry Pi Zero”

NVIDIA Jetson Powers Real-Time Iron Man HUD

If you could recreate any of the capabilities of Tony Stark’s Iron Man suit in real life, it would probably be the ability to fly, the super strength, or maybe even the palm-mounted lasers that can cut through whatever obstacle is in your path. But let’s be real, all that stuff is way too hard to try and pull off. Plus you’ll probably just end up accidentally killing yourself in the backyard.

But judging by the videos he’s been posting, [Kris Kersey] is doing one hell of a job creating a functional heads-up display (HUD) similar to the one Tony uses in the films. He’s even building it into a 3D printed Iron Man helmet, with the NVIDIA Jetson board that’s powering the show inside a chest-mounted “Arc Reactor”. He goes into a bit more detail about the project and his goals in an interview recently published on NVIDIA’s own blog. Continue reading “NVIDIA Jetson Powers Real-Time Iron Man HUD”

Very Slow Movie Player Avoids E-Ink Ghosting With Machine Learning

[mat kelcey] was so impressed and inspired by the concept of a very slow movie player (a movie played back at a glacially slow rate on a kind of DIY photo frame) that he created his own with a high-resolution e-ink display. It shows high-definition frames from Alien (1979) at a rate of about one frame every 200 seconds, and a surprising amount of work went into making a color film intended for the big screen also look good on black & white e-ink.

The usual way to display images on a screen limited to black or white pixels is dithering: manipulating the relative density of white and black pixels to give the impression of a much richer image than one might otherwise expect. By itself, though, a dithering algorithm isn’t a cure-all, and [mat] does an excellent job of explaining why, complete with loads of visual examples.
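
For reference, the textbook error-diffusion algorithm such projects usually start from is Floyd–Steinberg: snap each pixel to black or white, then push the rounding error onto its unprocessed neighbors. A generic numpy implementation (not [mat]’s code):

```python
import numpy as np

def floyd_steinberg(gray: np.ndarray) -> np.ndarray:
    """gray: float image in 0..1; returns a 0/1 dithered image."""
    img = gray.astype(np.float64).copy()
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            err = img[y, x] - new        # quantization error for this pixel
            out[y, x] = new
            if x + 1 < w:                img[y, x + 1]     += err * 7 / 16
            if y + 1 < h and x > 0:      img[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:                img[y + 1, x]     += err * 5 / 16
            if y + 1 < h and x + 1 < w:  img[y + 1, x + 1] += err * 1 / 16
    return out

# A flat 50% gray comes out roughly half black, half white:
print(floyd_steinberg(np.full((4, 8), 0.5)))
```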

One consideration is the e-ink display itself. On these displays, changing the screen contents is where all the work happens, and that update can be a visually imperfect process, leaving ghosting behind. A very slow movie player aims to present each frame as cleanly as possible in an artful and stylish way, so rewriting the entire screen for every frame would mean uglier transitions, and that just wouldn’t do.

Delivering good dithering results despite sudden contrast shifts, while changing as few pixels as possible.

So the challenge [mat] faced was twofold: dither each frame so that it looks great, while also minimizing the number of pixels changed from the previous frame. All of a sudden, he had an interesting problem to solve, and he chose to solve it in an interesting way: training a GAN to generate the dithers, balancing image quality against the number of pixels flipped between frames. The results do a great job of delivering quality visuals even when there are sharp changes in scene contrast to deal with. Curious about the code? Here’s the GitHub repository.
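
Stripped of all the GAN machinery, the trade-off being optimized looks something like this sketch (our illustration of the objective, not [mat]’s training code – the loss terms and weighting are assumptions):

```python
# pred:   candidate dither for the current frame (0/1 array)
# target: grayscale frame it should resemble (0..1 array)
# prev:   dither already sitting on the e-ink screen (0/1 array)
import numpy as np

def slow_movie_loss(pred, target, prev, lam=0.1):
    fidelity = np.mean((pred - target) ** 2)  # look like the movie frame
    churn = np.mean(np.abs(pred - prev))      # but flip as few pixels as possible
    return fidelity + lam * churn             # lam trades quality for calm updates
```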

Here’s the original Very Slow Movie Player that so inspired [mat], and here’s a color version that helps make every frame a work of art. And as for dithering? It’s been around for ages, but that doesn’t mean there aren’t new problems to solve in that space. For example, making dithering look good in the game Return of the Obra Dinn required a custom algorithm.

A home-made tape robot that stores VHS tapes

VHS Robot Swaps Tapes, As Seen In Hackers

Tape robots are typically used in places that store vast amounts of data – think film studios and government archives. If you’ve seen the 1995 cult movie Hackers, you might remember a scene where the main character hacks into a TV station and reprograms their tape ‘bot to load a series he wanted to watch. It’s this scene that inspired [Nathan] over at [Midwest Cyberpunk] to make his own tape robot that loads VHS tapes.

[Nathan] has thousands of tapes in his collection, but the robot is not built to manage all of them. Instead, it’s meant to help him run his VHS streaming channel, saving him from having to physically go to his VCR every time a tape needs swapping. For that, a ten-tape storage capacity is plenty.

A custom cyberdeck used to drive a tape robot

The main parts of the tape robot are a grabber that holds the tape, an extender that moves it forward and backward, and a linear rail that moves it up and down. The vertical motion is generated by a hybrid stepper motor through a belt drive system, while the grabber and extender are operated pneumatically. Once the grabber reaches the VCR, a pneumatic pusher shoves the tape inside. All of this is nearly identical to the robot seen in the movie, which was most likely not a commercial machine but a custom-made prop.
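
For a taste of what one load cycle might look like from the cyberdeck’s side, here’s a hypothetical sketch (not [Nathan]’s code – FluidNC speaks plain G-code over serial, but the slot spacing, port, and valve-to-output mapping via M62/M63 are all assumptions):

```python
# Fetch tape #3 and feed it to the VCR: a Z move on the linear rail, then
# pneumatic valve pulses for the grabber, extender, and pusher.
import time
import serial  # pyserial

SLOT_PITCH_MM = 40.0  # assumed vertical spacing between tape slots

def send(port, line):
    port.write((line + "\n").encode())
    port.readline()  # wait for FluidNC's "ok" before the next command

with serial.Serial("/dev/ttyUSB0", 115200, timeout=2) as cnc:
    send(cnc, f"G0 Z{3 * SLOT_PITCH_MM:.1f}")  # ride the rail up to slot 3
    send(cnc, "M62 P0")   # extend the grabber toward the tape
    time.sleep(0.5)       # give the cylinder time to stroke
    send(cnc, "M62 P1")   # close the grabber on the cassette
    send(cnc, "M63 P0")   # retract, tape in hand
    send(cnc, "G0 Z0")    # back down to VCR height
    send(cnc, "M62 P2")   # pusher shoves the tape into the deck
```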

The whole system is controlled by an ESP32 running FluidNC inside the robot, along with a handmade cyberdeck next to it that manages the overall process of loading and storing tapes. Although [Nathan] is currently using the robot for his streaming channel, he’s planning to also use it for digitizing part of his massive tape collection, which contains a few titles that were never released on newer formats.

Working with old tapes can be tricky: some types of tape degrade over time, while others might come with primitive copy protection systems. But moving information over to newer media is a necessity if you don’t want to risk losing it forever.

Continue reading “VHS Robot Swaps Tapes, As Seen In Hackers”