No-Battery HD Video Streaming Does It with Backscatter

What if Google Glass didn’t have a battery? That’s not too far-fetched. This battery-free HD video streaming camera could be built into a pair of eyeglass frames to stream HD video to a nearby phone or other receiver, with no bulky batteries or external power source. Researchers at the University of Washington are using backscatter to pull this off.

The problem is that a camera which streams HD video wirelessly to a receiver consumes over 1 watt, due to the need for a digital processor and transmitter. The researchers have moved the processing hardware into the receiving unit, and instead send the analog pixels from the camera sensor directly to the backscatter hardware. Backscatter involves reflecting received waves back to where they came from. By adding the video signal to those reflected waves, they eliminated the need for the power-hungry transmitter. The full details are in their paper (PDF), but here are the highlights.

[Diagram: battery-free camera design approach]

On the camera side, the pixel voltages (CAM Out) are an analog signal which is fed into a comparator along with a triangular waveform. Whenever the triangle wave’s voltage is lower than the pixel voltage, the comparator outputs a 0; otherwise it outputs a 1. In this way, each pixel voltage is converted into a pulse width. The triangular waveform’s minimum and maximum voltages are chosen so that they cover the full possible range of the camera voltages.
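To get a feel for that conversion, here’s a minimal Python sketch of one comparator period. It’s purely illustrative, not the researchers’ hardware: the voltage range and sample count are made-up numbers, and a single rising ramp stands in for the triangle wave.

```python
import numpy as np

def pwm_encode(pixel_voltage, v_min=0.2, v_max=1.0, samples=100):
    """Model one comparator period: a ramp spanning the camera's output
    range is compared against the pixel voltage. Per the description
    above, the output is 0 while the ramp is below the pixel voltage
    and 1 once it rises above it, so the pixel value becomes a pulse width."""
    ramp = np.linspace(v_min, v_max, samples)        # one rising ramp of the triangle wave
    return (ramp >= pixel_voltage).astype(np.uint8)  # 0 below the pixel voltage, 1 above

# A brighter (higher-voltage) pixel holds the output low for longer than a dark one.
print(pwm_encode(0.9).mean(), pwm_encode(0.3).mean())
```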

The sub-carrier modulation with the XOR gate in the diagram is there to address the problem of self-interference: unwanted interference from the transmitter at the same frequency as the carrier. The PWM output is therefore shifted to a different frequency using a sub-carrier, and the receiver can then filter out the interference. The XOR gate is actually part of an FPGA, which also inserts frame and line synchronization patterns.
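A rough sketch of the sub-carrier stage, again just an illustration with an arbitrary sub-carrier rate rather than the actual FPGA logic: XOR-ing the PWM stream with a square wave shifts its energy away from the carrier frequency, where the self-interference sits.

```python
import numpy as np

def subcarrier_modulate(pwm_bits, samples_per_bit=8):
    """XOR the PWM stream with a square-wave sub-carrier so the
    backscattered signal lands away from the carrier frequency and can
    be separated from the transmitter's self-interference."""
    baseband = np.repeat(np.asarray(pwm_bits, dtype=np.uint8), samples_per_bit)
    subcarrier = (np.arange(baseband.size) % 2).astype(np.uint8)  # toggles every sample
    return baseband ^ subcarrier  # XOR acts as mixing for binary signals

shifted = subcarrier_modulate([0, 0, 1, 1, 1, 0])
```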

They tested two different implementations with this circuit design: a 112 x 112 grayscale one running at up to 13 frames per second (fps) and an HD one. Unfortunately, no HD camera on the market gives access to the raw analog pixel outputs, so they took HD video from a laptop over USB and ran it through a DAC and then into their PWM converter. The USB link limited it to 10 fps.

The result is that video streaming at 720p and 10 fps uses as little as 250 μW and can be backscattered up to sixteen feet away. They also simulated an ASIC, which achieved 720p and 1080p at 60 fps using 321 μW and 806 μW respectively. See the video below for an animated explanation and a demonstration. The resulting video is quite impressive for something running on passive power alone.

If the University of Washington seems familiar in the context of backscatter, that’s because we’ve previously covered their battery-free (almost) cell phone. Though they’re not the only ones experimenting with it. Here’s where backscatter is being used for a soil network. All of this involves power harvesting, and now’s a great time to start brushing up on these concepts and building your own prototypes. The Hackaday Prize includes a Power Harvesting Challenge this year.

Motion-Controlled KVM Switch

Once upon a time, [hardwarecoder] acquired a Gen8 HP microserver that he began to toy around with. It started with ‘trying out’ some virtualization before spiraling off the rails and fully setting up FreeBSD with ZFS as a QEMU-KVM virtual machine. While wondering what to do next, he happened to be lamenting how he couldn’t also fit his laptop on his desk, so he built himself a slick, motion-sensing KVM switch to solve his space problem.

At its heart, this device injects DDC commands via the I2C pins on his monitors’ VGA cables to swap inputs, while a relay ‘replugs’ the keyboard and mouse from the server to the laptop, and vice-versa, at the same time. On the completely custom PCB are a pair of infrared diodes and a receiver that detect Jedi-like hand waves, which activate the swap. It’s a little more complex than some methods, but arguably much cooler.
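For a flavour of what that looks like on the wire, here’s a minimal sketch of a DDC/CI ‘Set VCP Feature’ write on a Linux host with the monitor’s I2C bus exposed through i2c-dev. It isn’t [hardwarecoder]’s code; the bus number and input-source value are placeholders that vary from monitor to monitor.

```python
from smbus2 import SMBus, i2c_msg

DDC_ADDR = 0x37          # DDC/CI slave address on the monitor's I2C bus
VCP_INPUT_SOURCE = 0x60  # MCCS VCP code for input source selection

def set_monitor_input(bus_number, value):
    """Send a DDC/CI 'Set VCP Feature' packet asking the monitor to switch inputs."""
    payload = [0x51, 0x84, 0x03, VCP_INPUT_SOURCE, (value >> 8) & 0xFF, value & 0xFF]
    checksum = 0x6E  # checksum covers every byte on the wire, including the 8-bit address
    for b in payload:
        checksum ^= b
    with SMBus(bus_number) as bus:
        bus.i2c_rdwr(i2c_msg.write(DDC_ADDR, payload + [checksum]))

# set_monitor_input(3, 0x01)  # bus number and input value are monitor-specific placeholders
```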

Using an adapter, the PCB plugs into his keyboard, and the monitor data connections and keyboard/mouse outputs to the laptop and server run out from there. There is a slight potential issue with the cables putting torque on the PCB, but since everything sits so conveniently close, [hardwarecoder] doesn’t need to handle it much.

High-Effort Streaming Remote for Low-Effort Bingeing

There’s no limit to the amount of work some people will put into avoiding work. For instance, why bother to get up from your YouTube-induced vegetative state to adjust the volume when you can design and build a remote to do it for you?

Loath to interrupt his PC streaming binge sessions, [miroslavus] decided to take matters into his own hands. When a commercially available wireless keyboard proved simultaneously overkill for the job and comically non-ergonomic, he decided to build a custom streaming remote. His recent microswitch encoder is prominently featured, providing scrolling control for volume and menu functions, with dedicated buttons for play controls. The device reconfigures at the click of a switch to support Netflix, which, like YouTube, is controlled by sending keystrokes to the PC through a matching receiver. It’s a really thoughtful design, and we’re sure the effort [miroslavus] put into this will be well worth the dozens of calories it’ll save in the coming years.
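If you wanted to prototype the same idea purely in software before building any hardware, a few lines of Python on the PC would do the trick. This is just a stand-in for the real receiver, assuming YouTube’s stock shortcut keys:

```python
from pynput.keyboard import Controller, Key

keyboard = Controller()

def tap(key):
    """Press and release a single key, as the remote's receiver would."""
    keyboard.press(key)
    keyboard.release(key)

def play_pause():
    tap('k')        # YouTube: toggle playback

def volume_up():
    tap(Key.up)     # YouTube: nudge volume up (player must have focus)

def volume_down():
    tap(Key.down)   # YouTube: nudge volume down
```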

A 3D-printed DIY remote is neat, but don’t forget that printing can also save a dog-chewed remote and win the Repairs You Can Print contest.

Discrete Pong Project Goes Big, Adds a Player

Some projects just take on a life of their own. What started as a pleasant diversion or a simple challenge becomes an obsession, and the next thing you know you’ve built a two-player color Pong game with audio completely from discrete components.

If this one seems familiar, it’s because we were dazzled by its first incarnation last year. As impressive as version 1.0 was, all the more so since it was built using the Manhattan method and seemingly over the course of a weekend, it did have its limitations. [GK] has been refining his design ever since and keeping an accurate record of the process, to the tune of 22 pages on the EEVblog forum. We haven’t pored over it all yet, but the state of the project now is certainly worth a look. The original X-Y output to an oscilloscope has been swapped out for composite video to a monitor, in both mono and color. This version also allows two people to play head-to-head instead of just battling the machine. It looks like [GK] had to add a couple of blocks’ worth of real estate to his Manhattan board to accommodate the changes, and he tidied the wiring significantly while he was at it.

It’s a project that keeps on giving, so feast your eyes and learn. We suspect [GK] doesn’t have any plans to finish this soon, but if he does, we can’t wait to see what’s next.

Thanks to [David Gustafik] for reminding us to check back on this one.

More Than Just An Atari Look-Alike

The Raspberry Pi has been a boon for hackers with a penchant for retro gaming. Redditor [KaptinBadkruk] wanted to get on board the game train, and so built himself an Atari 2600-inspired Raspberry Pi 3 console!

A key goal was the option to play Nintendo 64 titles, so [KaptinBadkruk] had to overclock the Pi and then implement a cooling system. A heatsink, some copper pads, and a fan from an old 3D printer, all secured by a 3D-printed mount, worked perfectly once the heatsink got a quick trim. An old speaker and a mono amp from Adafruit (plus a few snags) took care of the sound, with the official RPi touchscreen as the display.

After settling on an Atari 2600-inspired look, [KaptinBadkruk] laboured through a few more obstacles to finish it off, namely power. He originally intended for this project to be portable, but power issues meant that idea had to be sidelined until the next version. That is arguably offset by [KaptinBadkruk]’s favourite part, though: a slick 3D-printed item box from Mario Kart, front and center, completes the visual styling in an appropriately old-meets-new way.

That item box isn’t the first time a lightshow has accompanied an Atari console, but don’t let that stop you from sticking one in your pocket.

[Via /r/DIY]

Know Your Video Waveform

When you acquired your first oscilloscope, what were the first waveforms you had a look at with it? The calibration output, and maybe your signal generator. Then if you are like me, you probably went hunting round your bench to find a more interesting waveform or two. In my case that led me to a TV tuner and IF strip, and my first glimpse of a video signal.

An analogue video signal may be a little less ubiquitous in these days of LCD screens and HDMI connectors, but it remains a fascinating subject, and one whose intricacies are still worth knowing. Perhaps your desktop computer no longer drives a composite monitor, but a video signal is still a handy way to add a display to many low-powered microcontroller boards. When you see Arduinos and ESP8266s producing colour composite video on hardware never intended for the purpose, you may begin to understand why an in-depth knowledge of the video waveform can be useful to have.

The purpose of a video signal is to convey both the picture information, in the form of luminance and chrominance (light & dark, and colour), and all the information required to keep the display in complete synchronisation with the source. It must do this with accurate and consistent timing, and because it is a technology with roots in the early 20th century, all the information it contains must be retrievable with the consumer electronic components of that time.
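Luminance is simply a weighted sum of the red, green, and blue signals, with colour carried as two colour-difference signals; a quick sketch using the usual standard-definition weightings shows the idea:

```python
def rgb_to_luma_chroma(r, g, b):
    """Split an RGB pixel (values 0..1) into the luminance and
    colour-difference signals carried by a composite waveform
    (nominal standard-definition / Rec. 601 weightings)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luminance: all a mono set needs
    u = 0.492 * (b - y)                    # scaled B-Y colour difference
    v = 0.877 * (r - y)                    # scaled R-Y colour difference
    return y, u, v
```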

We’ll now take a detailed look at the waveform, and in particular its timing, and try to convey some of its ways. You will be aware that there are different TV systems, such as PAL and NTSC, each with its own tightly-defined timings; however, for most of this article we will treat all systems as more-or-less identical, because they work in a sufficiently similar manner.
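As a handy reference for what follows, these are the nominal line-level numbers for the two big systems, collected here as a little Python table. Treat them as ballpark figures rather than a substitute for the specifications:

```python
# Nominal horizontal timings in microseconds, plus basic frame structure.
LINE_TIMING = {
    "PAL":  {"line_us": 64.0,   "h_sync_us": 4.7, "front_porch_us": 1.65,
             "back_porch_us": 5.7, "lines_per_frame": 625, "fields_per_second": 50.0},
    "NTSC": {"line_us": 63.556, "h_sync_us": 4.7, "front_porch_us": 1.5,
             "back_porch_us": 4.7, "lines_per_frame": 525, "fields_per_second": 60 / 1.001},
}
```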

Easy Time-lapse Video via Phone and Command Line

A good time-lapse video can be useful visual documentation, and since [Tommy]’s phone is the best camera he owns, he created two simple shell scripts to grab time-lapse images and assemble them into a video. [Tommy]’s work is just the glue between two other things: an app that turns the phone into an IP camera with a web server on the local network, and the ability to grab a still image from that server on demand.

The iPhone app he uses normally serves video, but an undocumented feature allows single frames to be downloaded by adding ‘/photo’ to the end of the URL. Grabbing a still image is a common feature of IP camera apps for smartphones, so his capture script (GitHub repository here) should need only minor changes to work with just about any of them.
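The same glue translates readily to Python if shell isn’t your thing: poll the app’s still-image URL on an interval, then let ffmpeg stitch the numbered frames together. This is a generic sketch; the URL, interval, and filenames are placeholders rather than [Tommy]’s actual script.

```python
import subprocess
import time
import urllib.request

CAMERA_URL = "http://192.168.1.50:8080/photo"  # placeholder: the camera app's still-image URL
INTERVAL_S = 30                                # placeholder: seconds between frames

def capture(n_frames):
    """Grab one still per interval and save numbered JPEG frames."""
    for i in range(n_frames):
        with urllib.request.urlopen(CAMERA_URL) as resp, open(f"frame_{i:05d}.jpg", "wb") as f:
            f.write(resp.read())
        time.sleep(INTERVAL_S)

def assemble(fps=24):
    """Stitch the frames into a video with ffmpeg."""
    subprocess.run(["ffmpeg", "-framerate", str(fps), "-i", "frame_%05d.jpg",
                    "-pix_fmt", "yuv420p", "timelapse.mp4"], check=True)
```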

Perching a phone over a workspace and using it to create a time-lapse with a couple of shell scripts is a great example of combining simple tools to get better functionality. It could be a good way to get additional use out of an older smartphone, too. Heck, even older dumbphones can still be put to use; Shmoocon 2017 brought us details on rolling your own 1G network.