Nvidia Transforms Standard Video Into Slow Motion Using AI

Nvidia is back at it again with another awesome demo of applied machine learning: artificially transforming standard video into slow motion – they’re so good at showing off what AI can do that anyone would think they were trying to sell hardware for it.

Though most modern phones and cameras have an option to record in slow motion, it often comes at the expense of resolution, and always at the expense of storage space. For really high frame rates you’ll need a specialist camera, and you often don’t know that you should be filming in slow motion until after an event has occurred. Wouldn’t it be nice if we could just convert standard video to slow motion after it was recorded?

That’s just what Nvidia has done, all nicely documented in a paper. At its heart, the algorithm must take two frames and artificially create one or more frames in between. This isn’t a hand-coded interpolation algorithm; it’s a fully fledged deep-learning system. The convolutional neural network (CNN) was trained on over a thousand videos – roughly 300k individual frames.

Since none of the parameters of the CNN are time-dependent, it can generate as many intermediate frames as required, something that sets this solution apart from previous approaches. In some of the shots in their demo video, 30 fps video is converted to 240 fps; that requires creating seven additional frames for every pair of consecutive frames.
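To get a feel for what “creating frames in between” means, here’s a minimal sketch in Python/NumPy. To be clear, this naive cross-fade is not Nvidia’s method – their network predicts bidirectional optical flow and warps the input frames – but it shows the shape of the problem: given two frames and a fractional time t, synthesize the frame at t.

```python
import numpy as np

def naive_inbetweens(f0, f1, n_mid):
    """Blend n_mid intermediate frames between frames f0 and f1.

    A plain cross-fade: moving objects will ghost instead of move,
    which is exactly the failure a learned, flow-based approach fixes.
    """
    out = []
    for i in range(1, n_mid + 1):
        t = i / (n_mid + 1)  # fractional time of this in-between frame
        out.append(((1.0 - t) * f0 + t * f1).astype(f0.dtype))
    return out

# Two stand-in 720p RGB frames (random noise, just for demonstration)
frame_a = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
frame_b = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)

# 30 fps -> 240 fps needs seven synthesized frames per consecutive pair
mids = naive_inbetweens(frame_a, frame_b, 7)
print(len(mids))  # 7
```

Swap the blend for CNN-predicted flow and warping and you have the gist of the paper; and because t is just a continuous parameter, the same network serves any target frame rate.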

The video after the break is seriously impressive, though if you look carefully you can see the odd imperfection, like the hockey player’s skate or the dancer’s arm. Deep learning is as much an art as a science, and if you understood all of the research paper then you’re doing pretty darn well. For the rest of us, get up to speed by wrapping your head around neural networks and trying out the simplest TensorFlow example.

Continue reading “Nvidia Transforms Standard Video Into Slow Motion Using AI”

360 Live VR Teleportation Uses Drones, Neural Networks, And Perseverance

This past semester I added research to my already full schedule of math and engineering classes, as any masochistic student eagerly would. Packed schedule aside, how do you pass up the chance to work on implementing 360° virtual teleportation to anywhere in the world, in real time? Yes, it is indeed the same concept as the cult-worshipped Star Trek transporter, minus the ability to physically be at the location. Perhaps we can add a “beam me up, Scotty” command when shutting down.

The research lab I was working with is the Laboratory for Immersive CommunicatiON (LION). It’s funded by the NSF, Microsoft, and Adobe, and has been in pursuit of VR teleportation for some time now. There are a lot of cool technologies at work here, like drones used as location-collection devices: a network of drones surveys the landscape anywhere in the world and builds the assets needed for recreating it in VR. Okay, so a swarm of drones might seem a little intimidating at first, but when has emerging technology not?

Continue reading “360 Live VR Teleportation Uses Drones, Neural Networks, And Perseverance”

Neural Network Learns SDR Ham Radio

Identifying ham radio signals used to be easy. Beeps were Morse code; voice was AM, unless it sounded like Donald Duck, in which case it was sideband. But there are dozens of modes in common use now, including TV, digital data, digital voice, and FM, with more coming online every day. [Randaller] used CUDA to build a neural network that interfaces with an RTL-SDR dongle and classifies the signals it hears. Since it is a neural network, it isn’t so much programmed as trained. The proof of concept is trained to distinguish FM, SECAM, and TETRA, but you can train it to recognize other modulation schemes if you want to invest the time.
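Just to make the pipeline concrete, here’s a hedged sketch of what a capture-and-classify loop might look like, using the pyrtlsdr bindings and a toy PyTorch network – both our assumptions, since [Randaller] built his with CUDA directly and we’re not reproducing his architecture. The idea: grab a block of IQ samples, boil it down to a log-power spectrum, and let a small classifier pick among the three modulation classes.

```python
import numpy as np
import torch
import torch.nn as nn
from rtlsdr import RtlSdr  # pip install pyrtlsdr

CLASSES = ["FM", "SECAM", "TETRA"]

def capture_features(freq_hz, n_samples=262144, n_bins=1024):
    """Grab IQ samples from the dongle and return a normalized log-power spectrum."""
    sdr = RtlSdr()
    sdr.sample_rate = 2.4e6
    sdr.center_freq = freq_hz
    sdr.gain = "auto"
    iq = sdr.read_samples(n_samples)
    sdr.close()
    spectrum = np.abs(np.fft.fft(iq, n_bins)) ** 2  # power per frequency bin
    feats = np.log10(spectrum + 1e-12)
    return torch.tensor((feats - feats.mean()) / feats.std(), dtype=torch.float32)

# A toy classifier -- a stand-in for the real (CUDA-trained) network
model = nn.Sequential(
    nn.Linear(1024, 128), nn.ReLU(),
    nn.Linear(128, len(CLASSES)),
)

with torch.no_grad():
    logits = model(capture_features(100.1e6))  # e.g. a broadcast FM station
    print(CLASSES[logits.argmax().item()])     # meaningless until trained!
```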

Continue reading “Neural Network Learns SDR Ham Radio”

Colorize Your Election Party

[Eric] has put together a simple Python script to scrape election results from CNN.com. It uses urllib2 to fetch the popular and electoral vote counts for each party, and raises an ElectionWon exception when CNN calls the race. He’s planning on hooking this to DMX-controlled RGB LED lighting that will shift to either blue or red as the night progresses. It’s a great starting point if you want to pull off something similar.
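For flavor, here’s a rough sketch of the idea in the same Python 2 / urllib2 vintage. The URL and scraping patterns are placeholders – CNN’s actual election-night markup is anyone’s guess – and the color-mixing logic is our own guess at how you’d drive the DMX rig, not [Eric]’s code:

```python
import re
import time
import urllib2

class ElectionWon(Exception):
    """Raised once one candidate reaches 270 electoral votes."""

RESULTS_URL = "http://www.cnn.com/ELECTION/results/"  # placeholder URL

def get_electoral_votes():
    html = urllib2.urlopen(RESULTS_URL).read()
    # Hypothetical markup -- adjust the patterns to the real page
    dem = int(re.search(r'dem_electoral">(\d+)<', html).group(1))
    rep = int(re.search(r'rep_electoral">(\d+)<', html).group(1))
    if dem >= 270 or rep >= 270:
        raise ElectionWon("dem" if dem >= 270 else "rep")
    return dem, rep

def vote_color(dem, rep):
    """Map the current split to an RGB triple for the lighting rig."""
    total = float(dem + rep) or 1.0
    blue = int(255 * dem / total)
    red = int(255 * rep / total)
    return (red, 0, blue)

while True:
    try:
        print vote_color(*get_electoral_votes())
        time.sleep(60)  # be polite to CNN's servers
    except ElectionWon as winner:
        print "Race called:", winner
        break
```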

You may remember [Eric] for building the IKEA MAME table and the TRS-80 wireless terminal.

[photo: skenmy]

UPDATE: [Garrett] of macetech is putting the finishing touches on his version, which uses 32 ShiftBrite modules and two 4-digit displays controlled by a CuBLOC.