Water Runner Robot


Researchers at Carnegie Mellon University’s NanoRobotics Lab have developed a robot capable of running on the surface of a pool of water. Like their wall-climbing Waalbot, the Water Runner was inspired by the abilities of a lizard, in this case the basilisk. The team studied the motions of the basilisk and identified morphological features and aspects of the lizard’s stride that make running on water possible. Both the lizard and the robot run on water by slapping the surface to create an air cavity like the one above, then pushing against the water to generate the necessary lift and thrust. Several prototypes have been built, with variants that have two or four legs and on- or off-board power sources. You can see a slow motion video of the robot’s movement below.
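The slap-and-push mechanism lends itself to a quick plausibility check. Here’s a back-of-the-envelope sketch (our own illustration with assumed numbers, not figures from the CMU paper) that treats the slap as a drag-type force on a flat foot pad:

```python
# Rough check: can a foot slap support a small robot's weight?
# Assumes a drag-type slap force F = 0.5 * rho * Cd * A * v^2
# acting during part of each stride. All values are assumptions.
RHO = 1000.0   # water density, kg/m^3
CD = 0.7       # assumed drag coefficient of a flat foot pad
A = 0.001      # assumed foot area, m^2 (10 cm^2)
V = 3.0        # assumed downward foot speed, m/s
DUTY = 0.4     # assumed fraction of the stride spent pushing water

slap_force = 0.5 * RHO * CD * A * V**2   # N, during the slap
avg_lift = slap_force * DUTY             # N, averaged over a stride
max_mass = avg_lift / 9.81               # kg the slap could support

print(f"slap force = {slap_force:.2f} N, supports about {max_mass*1000:.0f} g")
```

With these made-up numbers the slap supports roughly a tenth of a kilogram, which is why both the lizard and the robot have to stay light and slap fast.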

The purpose of their research is to create robots that can traverse any surface on Earth while losing less energy to viscous drag than a swimming robot would. Though another of the team’s goals is to further legged robot research, the Water Runner is not without potential practical applications. It could be used to collect water samples, monitor waterways with a camera, or even deliver small packages. Download the full abstract in PDF format for more information.

Continue reading “Water Runner Robot”

Singing Tesla Coils


The video above is ArcAttack! playing the classic “Popcorn” through their signature Tesla coils. Solid state Tesla coils (SSTCs) can generate sound using what [Ed Ward] calls pulse repetition frequency (PRF) modulation. The heat generated by the plasma flame causes rapid expansion of the surrounding air, producing a sound wave. An SSTC can be operated at just about any frequency, so you only need to build a controller to handle the modulation. The task is made more difficult because very few electronics remain stable in such an intense EM field. [Ed] constructed a small Faraday cage for his microcontroller and used optical interconnects to deliver the signals to the Tesla coils.
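The core idea of PRF modulation is simple: fire one short burst per period of the audio tone you want, so the pressure pulses from the spark repeat at the note’s frequency. A minimal sketch of the timing math (our illustration, not [Ed Ward]’s controller code):

```python
# PRF modulation sketch: one coil firing per period of the desired
# tone, so the spark's pressure pulses repeat at the note frequency.
def prf_pulse_times(note_hz, duration_s):
    """Return firing times (seconds) for one note: one pulse per period."""
    period = 1.0 / note_hz
    n_pulses = int(duration_s * note_hz)
    return [i * period for i in range(n_pulses)]

# A 440 Hz "A" for 10 ms -> a pulse every ~2.27 ms
times = prf_pulse_times(440.0, 0.010)
```

A real controller would also have to keep each burst short enough that the spark extinguishes between pulses, which is one reason the drive electronics are nontrivial.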

[via Laughing Squid]

Stabilized Video Collages


This is some beautiful work. The clip features multiple video streams stabilized and then assembled into a whole. First, [ibftp] used the “Stabilize” feature in Motion 3 (part of Apple’s Final Cut Studio 2) to remove the camera shake from the clips. Then he was able to blend the videos with “fusion” set to “multiplication”. If you’ve got access to the tools, this shouldn’t be too hard to do yourself. We’re certain someone at SIGGRAPH is already attempting to do the same thing live. If you want to see image stabilization really making a difference, have a look at the stabilized version of the Zapruder film embedded below.
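The “multiplication” blend mode is easy to reproduce outside of Motion: each output pixel is just the product of the corresponding input pixels, so darker areas dominate and overlapping layers stay readable. A minimal sketch with frames as nested lists of values normalized to [0, 1]:

```python
# Multiply blend: out = a * b per pixel, for values in [0, 1].
# Darker pixels win, which is why stacked video layers remain legible.
def multiply_blend(frame_a, frame_b):
    """Blend two equal-sized frames pixel by pixel."""
    return [[pa * pb for pa, pb in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

a = [[1.0, 0.5], [0.25, 0.0]]
b = [[0.5, 0.5], [1.0, 1.0]]
out = multiply_blend(a, b)   # [[0.5, 0.25], [0.25, 0.0]]
```

Real footage would use per-channel RGB values, but the arithmetic is identical.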

Continue reading “Stabilized Video Collages”

Perceptual Chronograph


All praise to [Limor] for uncovering this incredibly odd project. [magician]’s perceptual chronograph is designed to test whether time “slows down” in stressful situations. The device flashes a random number on the display very quickly so that it is impossible to perceive what is actually being displayed. If you can read the number while under stress, it means that your ability to perceive time has increased. It’s hard to believe, but check out the video embedded after the break that investigates the phenomenon. We can’t help but wonder how [magician] personally plans on testing this.
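One plausible way to build such a display (an assumption on our part; the post doesn’t detail [magician]’s exact design) is to alternate a random number with its inverse image faster than the eye’s flicker-fusion rate, so the two frames merge into an unreadable blur under normal perception:

```python
# Sketch of a flicker-fusion display schedule (our assumption about
# how a perceptual chronograph could work, not the actual design).
import random

FLICKER_HZ = 80                       # assumed rate, above ~60 Hz fusion
FRAME_MS = 1000 / (2 * FLICKER_HZ)    # time each of the 2 frames is shown

def next_stimulus():
    """Pick the random two-digit number the display will flash."""
    return random.randint(0, 99)
```

At 80 Hz each frame is on screen for only 6.25 ms, well under what a relaxed observer can resolve.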

Continue reading “Perceptual Chronograph”

BBtv: Playing The Building

Today’s episode of BoingBoing TV visits [David Byrne]’s Playing the Building installation, which we covered before. The video provides some insight into the artistic process: they wandered around and whacked things with mallets to see what sounded good. They use counterweighted motors to vibrate cast iron girders and columns, and many of the empty radiators are struck by solenoids. [Byrne] says the installation is very approachable because people realize that even a skilled musician wouldn’t be any better at playing the device.

The Chief Cook Robot


We feel the need to apologize immediately for the use of Yakety Sax in the preceding video and recommend you watch the longer, yak-free video below. It shows researchers at the Learning Algorithms and Systems Laboratory teaching a robot how to make a ham and cheese omelet. Each working area and food item is labeled with a machine recognizable tag. The researcher demonstrates the task by guiding the robot’s hand. The robot combines multiple demonstrations to generalize the skill, then adapts the learned skill to the specific task. You can see this in the video when the robot adjusts to the location of the bowl and cutting board when they’re moved around. Teaching through demonstration would make the use of robotics much easier for the general population.
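The generalize-then-adapt idea can be illustrated with a toy example (our own sketch, not the lab’s actual algorithm): average several demonstrated hand paths into one learned path, then shift the result so it ends at a new goal, the way the robot compensates when the bowl is moved:

```python
# Toy learning-from-demonstration sketch: average equal-length 1-D
# demonstrations into a generalized path, then retarget its endpoint.
def generalize(demos):
    """Average point-by-point across equal-length demonstrations."""
    return [sum(pts) / len(pts) for pts in zip(*demos)]

def adapt_to_goal(path, new_goal):
    """Translate the learned path so its endpoint lands on new_goal."""
    offset = new_goal - path[-1]
    return [p + offset for p in path]

demos = [[0.0, 0.5, 1.0], [0.0, 0.6, 1.2], [0.0, 0.4, 0.8]]
skill = generalize(demos)          # averaged path
moved = adapt_to_goal(skill, 2.0)  # same shape, new endpoint
```

Real systems use richer trajectory models in full 3-D with orientation, but the principle — extract the common structure, then warp it to the scene — is the same.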

Continue reading “The Chief Cook Robot”

AudioCubes By Percussa

[Peter Nyboer] has written an extensive post about his experience with AudioCubes from Percussa. Aside from their unique glowing exterior, these cubes are an innovative way to control and even produce audio tracks. Four faces of each cube are equipped with IR sensors to detect distance and communicate with other cubes. The cubes also have USB, a rechargeable battery, and audio in/out. Moving your hands around the sensors changes the MIDI output of the cube. Changing the cubes’ orientation and distance from each other also changes the signal. Max/MSP and Live are both supported out of the box, but that doesn’t mean it’ll be easy to get started. [Peter] makes an important point: unlike traditional instruments, there’s no obvious way to get started. At €400 for two cubes and €650 for four, these devices aren’t exactly being given away, but it’s great to see new interfaces being imagined. A video of [Peter]’s first experiments with the cubes is embedded below; read his full post to see more footage of the cubes in action… and naturally we’d love to see any DIY versions of this you can come up with.
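For anyone tempted by a DIY version, the sensor-to-MIDI part is the easy bit: clamp an IR distance reading into the sensor’s usable range and scale it to a 0–127 control-change value. A hypothetical sketch (the ranges and names are our assumptions; Percussa doesn’t publish this exact formula):

```python
# Hypothetical IR-distance-to-MIDI-CC mapping for a DIY AudioCube.
# Sensor range values are assumptions, not Percussa's specs.
def distance_to_cc(distance_cm, min_cm=2.0, max_cm=30.0):
    """Clamp a distance into the sensor's range, scale to MIDI 0-127."""
    clamped = max(min_cm, min(max_cm, distance_cm))
    norm = (clamped - min_cm) / (max_cm - min_cm)
    return round(norm * 127)

distance_to_cc(2.0)    # hand touching the face -> 0
distance_to_cc(30.0)   # hand out of range -> 127
```

Feed that value into a MIDI control-change message and any software that accepts CC input, Live included, can map it to a parameter.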

Continue reading “AudioCubes By Percussa”