Two-axis Panning Time Lapse Rig Built From Lego


[Jochem] wrote in to share a neat time lapse camera dolly he constructed out of Lego bricks. He is a big fan of the two-axis panning time lapse effect where the camera moves while recording images. He figured it would be easy enough to construct one of his own, so he dug out his pail of Lego and got to work.

The rig consists of a stationary motor platform which pulls a movable sled using a simple gear and string. The motor platform is controlled by an Arduino, which pulls the movable sled along every so often, snapping pictures along the way. [Jochem’s] Nikon D80 supports shutter release via IR, so he programmed the Arduino to send a quick IR pulse each time it has finished moving the dolly.
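The move-a-little, shoot-a-little loop is simple enough to sketch out in a short Arduino program. The code below is our own illustration, not [Jochem]'s sketch: it assumes a DC motor behind a driver on one pin and an IR LED on another, and uses the commonly published pulse timings for Nikon's IR shutter release (worth double-checking against your own camera).

```cpp
// Minimal sketch of a move-then-shoot time lapse loop. Pin numbers are
// hypothetical, and the Nikon IR timings are the commonly published ones,
// not pulled from [Jochem]'s code.

const int MOTOR_PIN = 9;   // hypothetical motor driver enable pin
const int IR_PIN    = 3;   // hypothetical IR LED pin

// Busy-wait that stays accurate for long gaps (delayMicroseconds() is not).
void pauseMicros(unsigned long usec) {
  unsigned long start = micros();
  while (micros() - start < usec) { }
}

// One burst of ~38 kHz carrier lasting 'usec' microseconds.
void irCarrier(unsigned long usec) {
  unsigned long start = micros();
  while (micros() - start < usec) {
    digitalWrite(IR_PIN, HIGH);
    delayMicroseconds(13);
    digitalWrite(IR_PIN, LOW);
    delayMicroseconds(13);
  }
}

// Commonly published Nikon IR remote pattern, sent twice.
void nikonShutter() {
  for (int i = 0; i < 2; i++) {
    irCarrier(2000); pauseMicros(27830);
    irCarrier(390);  pauseMicros(1580);
    irCarrier(410);  pauseMicros(3580);
    irCarrier(400);
    delay(63);       // gap before the repeated burst
  }
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(IR_PIN, OUTPUT);
}

void loop() {
  digitalWrite(MOTOR_PIN, HIGH);   // nudge the sled along the track
  delay(500);                      // per-step travel, tune to taste
  digitalWrite(MOTOR_PIN, LOW);

  delay(1000);                     // let the rig settle
  nikonShutter();                  // fire the D80
  delay(10000);                    // interval between frames
}
```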

The rig looks like it works pretty well, as you can see in the video below, but [Jochem] says that it still needs a bit of work. We just can’t wait to see what other time lapse movies he puts together once he finds an “interesting” time lapse subject.

Continue reading “Two-axis Panning Time Lapse Rig Built From Lego”

DIY Orb Display Puts The Earth In Your Hands


[Nirav] liked the idea of having his own personal Earth at the tip of his fingers, and since that’s not happening any time soon, he decided to build the next best thing. Sure, he could have simply gone out and purchased a globe, but there is no fun in that. Instead, he shows us how he put together an interactive spherical display that won’t break the bank.

The sphere uses a Microvision SHOWWX to drive its display, which projects an image inside of a frosted glass light fixture. The pico projector gets some help from a 180° fisheye lens along the way, enabling the picture to be stretched across the entire inner surface of the globe.

[Nirav] used his 3D extruder to build a base for the globe, which attaches to the projector via a printed mounting plate. A GorillaPod was used to keep things upright while he dusted off his trigonometry skills in order to figure out how to get the image just right.
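For anyone wanting to retrace that trigonometry, the heart of it is mapping the projector's flat image onto points on the sphere. With a typical equidistant 180° fisheye, a pixel's distance from the image center is proportional to the angle at which the light leaves the lens, which makes the warp compact enough to sketch. The snippet below is our own illustration of the general math (assuming an equidistant lens pointed at the globe's pole and an equirectangular Earth texture), not [Nirav]'s code.

```cpp
// Map a pixel of the fisheye output image back to a point on an
// equirectangular (longitude/latitude) Earth texture.
#include <cmath>

const double PI = 3.14159265358979;

struct TexCoord { double u, v; };   // texture coordinates in [0, 1]

// (x, y) are fisheye-image coordinates, normalized so the image circle is
// centered at (0, 0) with radius 1.
TexCoord fisheyeToTexture(double x, double y) {
  double r     = std::sqrt(x * x + y * y);  // distance from image center
  double phi   = std::atan2(y, x);          // azimuth around the lens axis
  double theta = r * (PI / 2.0);            // equidistant lens: angle grows
                                            // linearly with radius, reaching
                                            // 90 deg at the circle's edge
  double lat = PI / 2.0 - theta;            // lens axis treated as the pole
  double lon = phi;

  TexCoord t;
  t.u = (lon + PI) / (2.0 * PI);            // longitude -> [0, 1]
  t.v = (PI / 2.0 - lat) / PI;              // latitude  -> [0, 1]
  return t;
}
```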

We think that he did a great job – it definitely looks to be on par with (albeit a bit smaller than) the Eye of Sauron globe we saw a while back. We can’t wait to see a video of this thing in action once it’s completely finished!

Model Rocketry From The Rocket’s Point Of View


When someone writes in and says, “Hey, I strapped a camera to a rocket and took videos of it launching!” it’s really hard for us to not get suckered in. Try as we might, we just couldn’t resist taking a look at the videos [Vlad] recorded of his model rocketry “exploration”.

Inspired by our 4th of July post featuring POV videos of bottle rocket launches, he bought himself an 808 keychain camera and decided to try his hand at some high flying video. He strapped the camera to his 46” Estes rocket with a few pieces of scotch tape in an effort to keep weight down, and set off to his launch pad.

He used an Estes C6-5 engine for each launch, which he estimates took the rocket up to a height of 300 feet rather than the typical 500 feet, due to the added weight. While not particularly useful, the video is still awfully fun to watch, and perhaps it will inspire others to mount cameras on even larger, more powerful rockets.

We can only hope.

Continue reading to check out the videos [Vlad] shot, but be warned, the descent is vertigo-inducing.

Continue reading “Model Rocketry From The Rocket’s Point Of View”

NTSC Video Out With The Papilio One


[Ben Leperchey] is working on building a Sega Master System clone using the Papilio One FPGA board, and although his ultimate goal has yet to be reached, he’s bringing some great stuff to the table in the meantime.

One component that is necessary for any sort of game system clone is NTSC/PAL video output, naturally. Since no one had constructed a TV output “Wing” (the Papilio One’s version of a shield or breakout board), [Ben] went and did it on his own. Using only 14 resistors and a low-pass audio filter, he was able to get the video output he was looking for with relatively little trouble. His VHDL code running on the Papilio does all the hard work of creating the video signal, while the wing he designed mostly handles the connectivity.
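We don't know the exact values [Ben] settled on, but the principle behind a handful of resistors making video is a simple summing DAC: each FPGA pin drives the composite node through its own resistor, and the TV's 75 Ω input termination completes the divider, with the goal of landing sync near 0 V, blanking around 0.3 V, and peak white near 1 V. Here's a quick back-of-the-envelope check using placeholder resistor values (our own sketch, not [Ben]'s wing):

```cpp
// Sanity-check the output levels of a resistor-DAC video output. Each FPGA
// pin (0 V or 3.3 V) drives the composite node through its own resistor,
// and the TV's 75-ohm termination pulls the node toward ground. The resistor
// values below are placeholders.
#include <cstdio>

int main() {
  const double VCC   = 3.3;                    // FPGA I/O voltage
  const double RLOAD = 75.0;                   // termination inside the TV
  const double R[]   = {270.0, 560.0, 1100.0}; // hypothetical DAC resistors

  // Try every 3-bit output code and print the resulting node voltage,
  // using Millman's theorem: V = sum(Vi/Ri) / (1/Rload + sum(1/Ri)).
  for (int code = 0; code < 8; code++) {
    double isum = 0.0, gsum = 1.0 / RLOAD;
    for (int i = 0; i < 3; i++) {
      double v = ((code >> i) & 1) ? VCC : 0.0;
      isum += v / R[i];
      gsum += 1.0 / R[i];
    }
    printf("code %d -> %.3f V\n", code, isum / gsum);
  }
  return 0;
}
```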

This is one of the first few projects/components we’ve seen come out of the Papilio camp, and it looks like things are off to a good start. We can’t wait to see the Master System implementation once it has been wrapped up!

Continue reading to see a quick video demonstration of the Papilio One and [Ben’s] TV output wing.

Continue reading “NTSC Video Out With The Papilio One”

Adding Video Out To The Open Pandora

There are very few users out there who actually have their hands on an Open Pandora console, but the ones who do might find this hack useful for getting TV out up and running. It’s actually not hard at all, but if you don’t want to alter the hardware on the device, you’ll first have to find a plug that fits the EXT jack. That proved more difficult than it needed to be, since TI carries the connector but only sells it in quantities of 2200. A group buy was organized, and we’d bet you can still get in on that action.

The connector in question carries TVout1 and TVout2 conductors, which correspond to the luminance and chrominance signals needed for S-Video. But [MarkoeZ] wanted to use a composite connection. It turns out that’s not hard either: he hooked the ground from the plug to the ground of the RCA jack, then connected both video lines to the center conductor, making sure to add an inline 470 pF capacitor on the chrominance side. Check out the demo video embedded after the break to see the final product.
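For the curious, that 470 pF value makes back-of-the-envelope sense: treated as a simple RC high-pass against the TV's 75 Ω input termination, its corner frequency works out to 1/(2πRC) = 1/(2π × 75 Ω × 470 pF) ≈ 4.5 MHz, right in the neighborhood of the NTSC/PAL color subcarrier. The capacitor couples the high-frequency chroma onto the composite line while blocking DC, which is the same trick passive S-Video-to-composite adapters typically use.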

Continue reading “Adding Video Out To The Open Pandora”

LG TV Hacking Via Serial Connection Or IR Codes

[Brendan Robert] has been sending us forum thread links outlining the things he’s learned while hacking LG televisions. They were a bit hard to follow for the uninitiated, so we asked if he could give us an overview of what he’s been working on. Not only did he do that, but he made a little Hackaday shout-out seen above by adding the skull and cross-wrenches as one of the menu overlays.

He’s using a TV, picked up at a discount because it was a display model, as his computer monitor. Without the original remote, and wanting features like the power-saving mode that is standard on monitors but missing from this TV, he decided to see what he could accomplish. A couple of things made the job quite a bit easier. First, there’s an RS232 port built into the back, so there’s no need to hunt for a serial header and solder on your own connector. Second, since LG built the set’s firmware on the Linux kernel, some of the firmware sources can be downloaded from their website.

What he came up with is a script that will find and communicate with the TV over the serial connection. The test script used during development polled every possible command, looking for valid return values. Once [Brendan] established which commands work and what they do, he was able to take command of the unit, writing scripts to adjust brightness based on the ambient light in the room as seen from the computer’s webcam. Make sure you check out the sub-pages to his post that detail the brightness adjustments, stand-by functionality, custom overlay graphics, and the extra commands he uncovered.
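If you want to poke at a set of your own, the serial side takes very little code. The program below is a bare-bones illustration (not [Brendan]'s script), assuming a USB-serial adapter on /dev/ttyUSB0 and LG's usual 9600-8N1 port settings; the two-letter command codes vary between models, so treat the ones shown as placeholders and check your set's documentation.

```cpp
// Bare-bones LG RS-232 control sketch for Linux. Device path and the
// specific command codes ("ka" power, "kh" brightness) are assumptions;
// verify them against your model's RS-232 documentation.
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

int open_tv(const char *dev) {
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0) return -1;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B9600);       /* LG's control port runs at 9600 8N1 */
    cfsetospeed(&tio, B9600);
    tio.c_cc[VMIN]  = 0;
    tio.c_cc[VTIME] = 10;           /* 1 second read timeout */
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

/* Send a command like "kh 00 40" and print whatever the set answers.
 * A valid command is acknowledged with something like "h 00 OK40x";
 * a real script would keep reading until the trailing 'x'. */
void tv_command(int fd, const char *cmd) {
    char out[64], in[64];
    int len = snprintf(out, sizeof(out), "%s\r", cmd);
    write(fd, out, len);
    ssize_t n = read(fd, in, sizeof(in) - 1);
    if (n > 0) {
        in[n] = '\0';
        printf("%s -> %s\n", cmd, in);
    }
}

int main(void) {
    int fd = open_tv("/dev/ttyUSB0");   /* hypothetical serial adapter */
    if (fd < 0) { perror("open"); return 1; }

    tv_command(fd, "ka 00 01");         /* power on (assumed code)        */
    tv_command(fd, "kh 00 40");         /* brightness to 0x40 (assumed)   */

    close(fd);
    return 0;
}
```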


Analyzing TV’s Talking Heads With Processing

[Michael] from Nootropic Design wrote in to share an interesting and fun project he put together using one of the products his company sells. The gadget in question is their “Video Experimenter” shield, which was designed for the Arduino. It is typically used to manipulate composite video streams with overlays and the like, but it can serve as a video analyzer as well.

When used for video analysis, the board lets you decode closed captioning data, which is exactly what [Michael] did here. He decided it would be fun to scrape the closed captioning information from various shows and commercials to do a little bit of content analysis.

Feeding the closed captioning data decoded by the Arduino into a Processing sketch, he keeps a count of every word mentioned in the broadcast. As the show progresses, the sketch dynamically builds a word cloud of the most commonly used words in the video feed.
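The tallying itself is the easy part once the captions arrive as plain text. Below is the core of a word-frequency counter in a couple dozen lines (our own illustration reading from standard input, not [Michael]'s Processing sketch); a renderer would then scale each word's font size by its count to draw the cloud.

```cpp
// Word-frequency tally behind a word cloud, assuming decoded caption text
// streams in on stdin.
#include <algorithm>
#include <cctype>
#include <iostream>
#include <map>
#include <string>
#include <utility>
#include <vector>

int main() {
    std::map<std::string, int> counts;
    std::string word;

    while (std::cin >> word) {
        // Normalize: lowercase and strip punctuation so "News," == "news".
        std::string clean;
        for (char c : word)
            if (std::isalpha(static_cast<unsigned char>(c)))
                clean += static_cast<char>(std::tolower(static_cast<unsigned char>(c)));
        if (clean.size() > 3)          // crudely skip "the", "and", etc.
            counts[clean]++;
    }

    // Sort by frequency and print the top of the list.
    std::vector<std::pair<std::string, int>> sorted(counts.begin(), counts.end());
    std::sort(sorted.begin(), sorted.end(),
              [](const auto &a, const auto &b) { return a.second > b.second; });

    for (size_t i = 0; i < sorted.size() && i < 20; i++)
        std::cout << sorted[i].second << "  " << sorted[i].first << "\n";
    return 0;
}
```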

The results he gets are quite interesting, especially when he watches the nightly news, or some other broadcast with a specific target audience. We think it would be cool to run this application during a political debate or perhaps during a Hollywood awards ceremony to discover which set of speakers is the most vapid.

If you’re interested in learning more about the decoding process, [Michael] has put together a detailed explanation of how the closed captioning data can be pulled from a video stream. For those of you who just want to see the decoder in action, keep reading to see a quick video demonstration.

Continue reading “Analyzing TV’s Talking Heads With Processing”