Electronics for Aliens

We are surrounded by displays boasting “millions” of colors and hundreds of pixels per inch, and by “high fidelity” sound systems producing what we perceive to be realistic replicas of the real world.

Of course this is not the case, and we rarely stop to think about how our electronic systems have been crafted around the limitations of human perception. To explore this issue, in this article we ask the question: “What might an alien think of human technology?” We will assume a lifeform which senses the world around it much as we do, but with massively improved sensing abilities. In light of these abilities we will dub it the Oculako.

Let’s begin with the now mostly defunct CRT display and see what our hypothetical alien thinks of it. The video below shows a TV screen shot at 10,000 frames per second.

Continue reading “Electronics for Aliens”

Update: What You See Is What You Laser Cut

If there’s one thing that makes laser cutters a little difficult to use, it’s that it’s hard to interact with them directly, without a clunky computer in the middle of everything. Granted, that laser is a little dangerous, but it would be nice if there were a way to use a laser cutter without having to deal with a computer. Luckily, [Anirudh] and team have been working on solving this problem, creating a laser cutter that can interact directly with its user.

The laser cutter is tied to a vision system which watches for a number of cues. As we’ve featured before, this particular laser cutter can “see” pen strokes and will cut along them (once all fingers are away from the cutting area, of course). The update to this system is that a user can now import a drawing from a smartphone and manipulate it with a set of physical tokens that the camera watches: one token changes the location of the cut, and the other changes the scale. This extends the laser cutter from simply cutting along pen strokes to cutting around any user-manipulated image, all without touching a computer. Be sure to check out the video after the break for a demonstration of how this works.
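
We can only guess at the internals of [Anirudh]’s system, but the token-to-geometry mapping is easy to picture. Here is a minimal C++ sketch in which one tracked token sets the cut’s origin and the distance to a second token sets its scale; the function names, units, and reference distance are all our inventions, not his code:

```cpp
// Hypothetical sketch: map two camera-tracked tokens to the placement and
// scale of an imported drawing. Token positions are assumed to arrive from
// the vision pipeline in cutting-bed millimeters.
#include <cmath>
#include <cstdio>
#include <vector>

struct Point { double x, y; };

// Move the drawing's origin onto the "location" token and scale it by the
// distance between the two tokens relative to a fixed reference distance.
std::vector<Point> placeDrawing(const std::vector<Point>& drawing,
                                Point locationToken,
                                Point scaleToken,
                                double referenceDistanceMm) {
    double dx = scaleToken.x - locationToken.x;
    double dy = scaleToken.y - locationToken.y;
    double scale = std::sqrt(dx * dx + dy * dy) / referenceDistanceMm;
    std::vector<Point> placed;
    placed.reserve(drawing.size());
    for (const Point& p : drawing)
        placed.push_back({locationToken.x + p.x * scale,
                          locationToken.y + p.y * scale});
    return placed;
}

int main() {
    std::vector<Point> square = {{0, 0}, {10, 0}, {10, 10}, {0, 10}};
    // Tokens 100 mm apart with a 50 mm reference => drawing doubles in size.
    auto placed = placeDrawing(square, {30, 40}, {130, 40}, 50.0);
    for (const Point& p : placed) std::printf("(%.1f, %.1f)\n", p.x, p.y);
}
```

Sliding the second token toward or away from the first rescales the cut continuously, which is presumably why a simple distance ratio is attractive here: no calibration beyond one reference measurement.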

Continue reading “Update: What You See Is What You Laser Cut”

Mikey, the Robot That Charges Itself



Mikey is [Mike’s] autonomous robot. Like any good father, he’s given the robot his name. Mikey is an Arduino-based robot which uses a Pixy camera for vision.

[Mike] started with a common 4WD robot platform. He added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self charging.
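
We don’t have [Mike]’s firmware, but the stock Pixy (CMUcam5) Arduino library makes the vision side easy to picture. A minimal sketch, assuming the usual library over the SPI/ICSP link; the steering thresholds are our guesses:

```cpp
// Minimal Pixy-to-Arduino sketch (not [Mike]'s actual code): read the
// detected color blobs and turn the largest one's x position into a
// crude steering decision. Pixy block coordinates span 0-319 in x.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

void setup() {
  Serial.begin(9600);
  pixy.init();
}

void loop() {
  uint16_t blocks = pixy.getBlocks();  // number of detected color blobs
  if (blocks) {
    // Block 0 is the largest match for a trained color signature.
    int x = pixy.blocks[0].x;
    if (x < 140)      Serial.println("steer left");
    else if (x > 180) Serial.println("steer right");
    else              Serial.println("drive forward");
  }
}
```

In the real robot those three branches would drive the motor controller instead of printing, but the structure is the same: the Pixy does the heavy lifting and the Arduino only sees a short list of blobs per frame.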

One of the most basic features of life is eating. In the case of autonomous robots, that means self charging. [Mike] gave Mikey the ability to self charge by training the Pixy to detect a green square, which identifies Mikey’s charging station. Probes mounted on 3D-printed brackets hold the positive leads, while springs on the base of the station make contact with conductive tape on Mikey’s belly. Once the circuit is complete, Mikey stops moving and starts charging.
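
The “stop and charge” check is simple in principle: once the probes meet the conductive tape, station voltage appears on the robot’s side of the circuit. A hedged sketch, assuming a voltage divider on the belly contacts feeding an analog pin (the pin and threshold are our guesses, and the built-in LED stands in for the motor controller):

```cpp
// Hypothetical docking check: stop as soon as the charging contacts go live.
const int CHARGE_SENSE = A0;     // divider watching the belly contacts
const int THRESHOLD    = 400;    // ADC counts that mean "contacts are live"

bool docked() {
  // When the probes meet the conductive tape, station voltage appears here.
  return analogRead(CHARGE_SENSE) > THRESHOLD;
}

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (docked()) {
    digitalWrite(LED_BUILTIN, HIGH);  // stand-in for "motors off, charging"
  } else {
    digitalWrite(LED_BUILTIN, LOW);   // stand-in for "keep seeking the square"
  }
}
```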

Continue reading “Mikey, the Robot That Charges Itself”

We are the Borg. We will add heat and distance sensing to your vision.


[Gregory McRoberts] was born with reduced vision in one eye and has never experienced the three dimensional sight which most of us take for granted. Recently he was inspired by the concept of a hearing aid to build a device which can augment his vision. Behold, the very Borg-like eye-patch that he wears to add distance and heat to his palette of senses.

The hardware he chose is an Arduino-compatible LilyPad board, wired to an ultrasonic rangefinder and an infrared sensor which monitor the area in front of him. His right eye is still capable of seeing light and color, so a pair of LED boards are mounted on the inside of the patch. One is connected to the thermal sensor, displaying blue below eighty degrees Fahrenheit and red above. The other LED is green and flashes at a speed based on the range sensor’s reading.
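
The logic is simple enough to sketch. Below is a minimal Arduino-style version, assuming an HC-SR04-style rangefinder and a generic analog temperature sensor; the pins, the ADC-to-Fahrenheit scaling, and the flash-rate mapping are all our guesses, not [Gregory]’s actual firmware:

```cpp
// Hypothetical eye-patch sketch: blue/red LED for temperature around an
// 80 F threshold, green LED flashing faster as objects get closer.
const int TRIG = 3, ECHO = 4;           // ultrasonic rangefinder
const int TEMP_PIN = A0;                // analog IR temperature sensor
const int RED = 9, BLUE = 10, GREEN = 11;

float readDistanceCm() {
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);
  digitalWrite(TRIG, LOW);
  // Round-trip echo time in microseconds to centimeters (~343 m/s).
  return pulseIn(ECHO, HIGH, 30000) / 58.0;
}

void setup() {
  pinMode(TRIG, OUTPUT); pinMode(ECHO, INPUT);
  pinMode(RED, OUTPUT); pinMode(BLUE, OUTPUT); pinMode(GREEN, OUTPUT);
}

void loop() {
  // Made-up calibration: map raw ADC counts to degrees Fahrenheit.
  float tempF = analogRead(TEMP_PIN) * 0.2;
  digitalWrite(RED,  tempF >= 80);      // at or above 80 F: red
  digitalWrite(BLUE, tempF < 80);       // below 80 F: blue

  float cm = readDistanceCm();
  if (cm < 2 || cm > 200) cm = 200;     // clamp timeouts and outliers
  digitalWrite(GREEN, HIGH); delay(cm * 2);  // nearer object = faster flash
  digitalWrite(GREEN, LOW);  delay(cm * 2);
}
```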

The intensity of the LEDs makes the device distracting for a person with normal sight. We found [Gregory’s] explanation of this effect (called Helmet Fire) quite interesting.

[via Adafruit]

We don’t need to brainstorm projects; xkcd does that for us

[Randall Munroe], the guy behind our favorite web comic xkcd, gave us yet another great project idea that follows on the heels of securing our valuables and silencing loud car stereos. The xkcd forum has been talking about how to implement this, and we’d like to hear what Hack A Day readers think about this idea.

The project isn’t much different from 3D photography. [Carl Pisaturo] has done a lot of art and experimentation based on this idea, which basically amounted to largish binoculars. A poster on the xkcd forum has already built this using mirrors, but we’re wondering how much the parallax can be increased with that method. Two cameras and a smartphone would also allow automatic pan and tilt that corresponds to head movement.

We’re not quite sure if this idea can be applied to astronomy. The angular resolution of the human eye is around one arc minute, while every star except for the Sun has an annual parallax of less than one arc second. If anyone wants to try this out with a longer baseline (from Earth to Pluto, for example), we would suggest simulating it in Stellarium first. Seeing the Moon as a sphere, though, would be possible with a few hundred miles between cameras.
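
As a quick sanity check on that last claim (our arithmetic, not the original post’s), the small-angle parallax relation for a baseline b at distance d gives:

```latex
% Parallax angle for baseline b at distance d (small-angle approximation)
\theta \approx \frac{b}{d}
       = \frac{320\ \mathrm{km}}{384{,}400\ \mathrm{km}}
       \approx 8.3 \times 10^{-4}\ \mathrm{rad}
       \approx 2.9\ \mathrm{arcmin}
```

So a roughly 200-mile (320 km) baseline yields a lunar parallax of about three arc minutes, comfortably above the eye’s one-arc-minute resolution.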

Tell us how you would build this in the comments, and be sure to send in your write-up if you manage to build it. We’ll put it up right away.

Thanks to [Theon144] for sending this in.

EDIT: Because the comments are actually bearing fruit, check out the thread on the Hack A Day forums for this post: link.

Vintage Hack – Game Boy Camera


Back in 2005, a member of a French robotics team named [Laurent] wrote a wonderful how-to, which we somehow missed, on using the Game Boy Camera as a vision device for a robot. The images above are actual shots from his project. The Game Boy Camera features a stunning 128×123 pixel resolution in a gorgeous 4-color gray-scale palette. Possibly the most attractive feature of this hack is that it is still possible to get hold of these cameras for under ten dollars on eBay.

He connected the camera sensor to an Atmel AT90S4433 using a combination of digital and analog signals, then used the microcontroller to echo the data back to his PC. His write-up includes schematics for wiring up the sensor and microcontroller, the datasheet for the sensor, his C code for the whole project, and an easy-to-read pinout of the GBC connector. Although his project simply offloaded the image to a computer, it would be entirely possible to have the microcontroller respond to the image, or simply log and store it. It would also be just as easy to replace his Atmel chip with your own favorite microcontroller, as long as it has a few digital I/O pins and at least one analog input (or an external analog-to-digital converter).
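
For the curious, the sensor in the Game Boy Camera is Mitsubishi’s M64282FP, and the dance [Laurent] describes looks roughly like this on a modern Arduino. This is an illustrative reconstruction from the general shape of the protocol, not his code: the register values, pin choices, and timing below are guesses, so consult his write-up and the datasheet before wiring anything:

```cpp
// Rough sketch of driving the M64282FP: shift in register settings,
// trigger an exposure, then clock out 128x123 analog pixels through the ADC.
const int XCK = 2, XRST = 3, LOAD = 4, SIN = 5, START = 6, READ = 7;
const int VOUT = A0;  // analog pixel output into the Arduino's ADC

void clockPulse() {
  digitalWrite(XCK, HIGH);
  digitalWrite(XCK, LOW);
}

// Shift one 3-bit register address plus an 8-bit value in, MSB first.
void writeRegister(byte addr, byte value) {
  for (int i = 2; i >= 0; i--) { digitalWrite(SIN, (addr >> i) & 1); clockPulse(); }
  for (int i = 7; i >= 0; i--) { digitalWrite(SIN, (value >> i) & 1); clockPulse(); }
  digitalWrite(LOAD, HIGH); clockPulse(); digitalWrite(LOAD, LOW);
}

void setup() {
  for (int p = 2; p <= 6; p++) pinMode(p, OUTPUT);
  pinMode(READ, INPUT);
  Serial.begin(115200);

  digitalWrite(XRST, LOW); clockPulse(); digitalWrite(XRST, HIGH);  // reset
  writeRegister(2, 0x30);  // example exposure setting (made up)
}

void loop() {
  digitalWrite(START, HIGH); clockPulse(); digitalWrite(START, LOW);
  while (digitalRead(READ) == LOW) clockPulse();   // wait out the exposure

  for (long px = 0; px < 128L * 123L; px++) {      // one full 128x123 frame
    Serial.write((uint8_t)(analogRead(VOUT) >> 2)); // 8-bit pixel to the PC
    clockPulse();
  }
}
```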

UPDATE: Good catch r4v5, it would require an ADC, not a DAC.