Pick-And-Place Machine For Candy

Every December and May the senior design projects from engineering schools start to roll in. Since the students aren’t yet encumbered with real-world distractions (like management), the projects are often exceptional, unique, and solve problems we never even thought we had. Such is the case with [Mark] and [Peter]’s senior design project: a pick and place machine that promises to solve all of life’s problems.

Of course we’ve seen pick-and-place machines before, but this one is different. Rather than placing resistors and capacitors on a PCB, this machine identifies and sorts candies. The robot, a version of the MeArm, has three degrees of freedom and a computer vision system that tells the arm what it’s picking up and where to place it. A Raspberry Pi handles the computer vision and feeds data to a PIC32, which interfaces with the hardware.
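
The write-up doesn’t spell out the wire protocol between the Pi and the PIC32, but the idea is easy to sketch: the Pi sends a compact framed packet naming the candy and where to pick and place it. Everything below (the field names, sizes, and start byte) is our own invention for illustration:

```cpp
// Minimal sketch of a Pi-to-PIC32 message, assuming a simple framed packet
// over UART: start byte, candy class, pick/place coordinates, and an XOR
// checksum. The real protocol isn't documented; all fields are illustrative.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct CandyCommand {
    uint8_t candyClass;     // e.g. 0 = red, 1 = green, ... (hypothetical)
    uint8_t pickX, pickY;   // pick-up position, in camera grid cells
    uint8_t placeX, placeY; // drop-off bin position
};

// Frame the command: 0xA5 start byte, payload, XOR checksum over the payload.
std::vector<uint8_t> encode(const CandyCommand& c) {
    std::vector<uint8_t> f{0xA5, c.candyClass, c.pickX, c.pickY,
                           c.placeX, c.placeY};
    uint8_t sum = 0;
    for (std::size_t i = 1; i < f.size(); ++i) sum ^= f[i];
    f.push_back(sum);
    return f;
}

int main() {
    CandyCommand cmd{1, 120, 80, 10, 200};
    for (uint8_t b : encode(cmd)) std::printf("%02X ", b);
    std::printf("\n");  // the bytes a PIC32 would parse on its UART
}
```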

One of the requirements for the senior design class was to keep the budget under $100, which they accomplished by using pre-built solutions wherever possible. Robot arms with dependable precision can’t come close to that price constraint, but the project overcomes the MeArm’s lack of precision by taking small corrective steps until the arm reaches proper alignment. This is covered in the video demo below.
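
The exact algorithm isn’t given, but the approach amounts to closed-loop visual servoing: command a fraction of the camera-measured error, re-measure, and repeat until the error is small. Here’s a toy simulation of that loop, with made-up gains and tolerances:

```cpp
// Toy simulation of the incremental-correction idea: instead of trusting
// the arm's open-loop accuracy, measure the remaining error with the camera
// and command a fraction of it each step. All numbers are illustrative.
#include <cmath>
#include <cstdio>

int main() {
    double x = 0.0, y = 0.0;           // gripper position (arbitrary units)
    const double tx = 50.0, ty = 30.0; // target as seen by the camera
    const double gain = 0.5;           // take half the measured error per step
    const double tol = 0.5;            // "close enough" threshold

    for (int step = 1; step <= 20; ++step) {
        double ex = tx - x, ey = ty - y;  // camera-measured error
        if (std::hypot(ex, ey) < tol) {
            std::printf("aligned after %d steps\n", step - 1);
            break;
        }
        x += gain * ex;                   // small corrective move
        y += gain * ey;
        std::printf("step %d: (%.1f, %.1f)\n", step, x, y);
    }
}
```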

Senior design classes are a great way to teach students how to integrate all of their knowledge into a final project, and the professors often include constraints students will find in the real world (like the budget limit in this project). The requirement to thoroughly document the build process is also a lesson more people could stand to learn. Senior design classes have attempted to solve a lot of life’s other problems, too; from autonomous vehicles to bartenders, there’s been a solution for almost every problem.

Continue reading “Pick-And-Place Machine For Candy”

Hack Your Brain: The McCollough Effect

There is a fascinating brain reaction known as the McCollough Effect which is like side-loading malicious code through your eyeballs. Although this looks and smells like an optical illusion, the science would argue otherwise. What Celeste McCollough observed in 1965 can be described as a contingent aftereffect, although we call it “the McCollough Effect” because McCollough was the first to recognize the phenomenon. It’s something that can’t be unseen… sometimes affecting your vision for months!

I am not suggesting that you experience the McCollough Effect yourself. We’ll look at the phenomenon behind the McCollough Effect, which can be understood without subjecting yourself to it. If you must experience the McCollough Effect, you do so at your own risk (here it is presented as a video). But read on to understand what is happening before you take the plunge.

Continue reading “Hack Your Brain: The McCollough Effect”

Electronics For Aliens

We are surrounded by displays with “millions” of colors and hundreds of pixels per inch, and super “high fidelity” sound producing what we perceive to be realistic replicas of the real world.

Of course this is not the case; we rarely stop to think about how our electronic systems have been crafted around the limitations of human perception. To explore this issue, in this article we ask the question: “What might an alien think of human technology?” We will assume a lifeform which senses the world around it much as we do, but with massively improved sensing abilities. In light of these abilities we will dub it the Oculako.

Let’s begin with the now mostly defunct CRT display and see what our hypothetical alien would think of it. The video below shows a TV screen filmed at 10,000 frames per second.
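
Some quick arithmetic shows why that footage looks so strange. Assuming a ~60 Hz NTSC set, a single field is spread across roughly 167 camera frames at 10,000 frames per second, so the camera catches the electron beam mid-scan:

```cpp
// Back-of-the-envelope numbers for why 10,000 fps reveals the CRT beam.
// Assumes an NTSC set; other standards give similar orders of magnitude.
#include <cstdio>

int main() {
    const double fieldRate = 59.94;     // NTSC fields per second
    const double cameraFps = 10000.0;   // high-speed camera frame rate
    const double linesPerField = 262.5; // NTSC scan lines per field

    double framesPerField = cameraFps / fieldRate;
    double lineTimeUs = 1e6 / (fieldRate * linesPerField);

    std::printf("camera frames per field: %.0f\n", framesPerField); // ~167
    std::printf("time per scan line: %.1f us\n", lineTimeUs);       // ~63.6 us
}
```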

Continue reading “Electronics For Aliens”

Update: What You See Is What You Laser Cut

If there’s one thing that makes laser cutters a little difficult to use, it’s that it’s hard for a person to interact with them directly, without a clunky computer in the middle of everything. Granted, the laser is a little dangerous, but it would be nice if there were a way to use a laser cutter without having to deal with a computer. Luckily, [Anirudh] and team have been working on solving this problem, creating a laser cutter that can interact directly with its user.

The laser cutter is tied to a vision system which watches for a number of cues. As we’ve featured before, this particular laser cutter can “see” pen strokes and will cut along them (once all fingers are away from the cutting area, of course). The update is that a user can now import a drawing from a smartphone and manipulate it with a set of physical tokens that the camera watches: one token changes the location of the cut, and the other changes the scale. This extends the machine from simply cutting along pen strokes to cutting out any user-manipulated image, all without interacting directly with a computer. Be sure to check out the video after the break for a demonstration of how this works.
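
Under the hood, the token interaction boils down to a simple 2D transform: track the tokens with the camera, turn them into a translation and a scale factor, and apply those to the imported drawing before generating the cut path. A minimal sketch of that step; the token-tracking inputs here are invented:

```cpp
// Minimal sketch of the token idea: the vision system reports where the
// position token sits and what scale the scale token selects; we apply the
// resulting translate/scale to the drawing before cutting. The tracking
// inputs are invented for illustration.
#include <cstdio>
#include <vector>

struct Pt { double x, y; };

std::vector<Pt> transformDrawing(const std::vector<Pt>& drawing,
                                 Pt positionToken, double scaleFactor) {
    std::vector<Pt> out;
    out.reserve(drawing.size());
    for (const Pt& p : drawing) {
        // scale about the drawing origin, then translate to the token
        out.push_back({positionToken.x + p.x * scaleFactor,
                       positionToken.y + p.y * scaleFactor});
    }
    return out;
}

int main() {
    std::vector<Pt> square{{0, 0}, {10, 0}, {10, 10}, {0, 10}};
    // camera says: position token at (42, 17), scale token set to 1.5x
    for (const Pt& p : transformDrawing(square, {42, 17}, 1.5))
        std::printf("(%.1f, %.1f)\n", p.x, p.y);
}
```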

Continue reading “Update: What You See Is What You Laser Cut”

Mikey, The Robot That Charges Itself

Mikey is [Mike’s] autonomous robot. Like any good father, he’s given the robot his name. Mikey is an Arduino-based robot which uses a Pixy camera for vision.

[Mike] started with a common 4WD robot platform. He added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self-charging.
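
For reference, the stock Pixy Arduino library makes this kind of link only a few lines of code. A minimal sketch follows; the signature number and steering thresholds are our own, and the library defaults to SPI where [Mike] uses a serial link:

```cpp
// Minimal Pixy-to-Arduino sketch using the stock Pixy library (SPI is the
// library's default transport; [Mike] uses a serial link instead). The
// signature number and steering thresholds are made up for illustration.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;
const int TARGET_SIG = 1;  // hypothetical: signature trained on the target

void setup() {
    Serial.begin(9600);
    pixy.init();
}

void loop() {
    uint16_t n = pixy.getBlocks();  // number of detected objects this frame
    if (n == 0) return;
    if (pixy.blocks[0].signature != TARGET_SIG) return;
    int x = pixy.blocks[0].x;       // 0..319 across the Pixy's frame
    if (x < 140)      Serial.println("steer left");
    else if (x > 180) Serial.println("steer right");
    else              Serial.println("drive forward");
}
```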

One of the most basic features of life is eating; in the case of autonomous robots, that means self-charging. [Mike] gave Mikey the ability to charge itself by training the Pixy to detect a green square, which identifies Mikey’s charging station. Probes mounted on 3D-printed brackets hold the positive leads, while springs on the base of the station make contact with conductive tape on Mikey’s belly. Once the circuit is complete, Mikey stops moving and starts charging.
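
The docking endgame could be as simple as watching for current through those contacts. A hedged sketch of that idea; the pin numbers, threshold, and motor interface are all hypothetical, not [Mike’s] actual wiring:

```cpp
// Hedged sketch of the docking endgame: keep creeping toward the green
// square until the charge contacts mate, then stop. Pin numbers, the ADC
// threshold, and the motor interface are hypothetical.
const int CHARGE_SENSE = A0;        // reads voltage across the charge contacts
const int MOTOR_EN = 5;             // enable pin on the motor controller
const int CONTACT_THRESHOLD = 600;  // ADC level meaning "circuit complete"

void setup() {
    pinMode(MOTOR_EN, OUTPUT);
}

void loop() {
    if (analogRead(CHARGE_SENSE) > CONTACT_THRESHOLD) {
        digitalWrite(MOTOR_EN, LOW);   // contacts mated: stop and charge
    } else {
        digitalWrite(MOTOR_EN, HIGH);  // keep inching toward the station
    }
    delay(50);
}
```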

Continue reading “Mikey, The Robot That Charges Itself”

We Are The Borg. We Will Add Heat And Distance Sensing To Your Vision.

[Gregory McRoberts] was born with reduced vision in one eye and has never experienced the three-dimensional sight which most of us take for granted. Recently he was inspired by the concept of a hearing aid to build a device which can augment his vision. Behold the very Borg-like eye-patch he wears to add distance and heat to his palette of senses.

The hardware he chose is an Arduino-compatible LilyPad board, wired to an ultrasonic rangefinder and an infrared sensor which monitor the area in front of him. His right eye is still capable of seeing light and color, so a pair of LED boards is mounted on the inside. One is connected to the thermal sensor, displaying blue below eighty degrees Fahrenheit and red above. The other LED is green and flashes at a rate based on the range sensor’s reading.
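
That logic is simple enough to sketch. The following is our guess at its shape, not [Gregory’s] actual code; the sensor-scaling functions are placeholders, since the exact parts aren’t specified:

```cpp
// Hedged sketch of the eye-patch logic: blue/red LED on an 80 °F threshold,
// green LED blinking faster as objects get closer. Pin numbers and sensor
// scaling are placeholders; only the thresholds come from the build.
const int LED_BLUE = 9, LED_RED = 10, LED_GREEN = 11;

float readTempF()   { return analogRead(A0) * 0.322 - 50.0; }          // placeholder scaling
long  readRangeCm() { return map(analogRead(A1), 0, 1023, 2, 300); }   // placeholder scaling

void setup() {
    pinMode(LED_BLUE, OUTPUT);
    pinMode(LED_RED, OUTPUT);
    pinMode(LED_GREEN, OUTPUT);
}

void loop() {
    bool hot = readTempF() > 80.0;   // the 80 °F threshold from the build
    digitalWrite(LED_RED,  hot ? HIGH : LOW);
    digitalWrite(LED_BLUE, hot ? LOW  : HIGH);

    long range = readRangeCm();      // nearer object -> shorter blink period
    digitalWrite(LED_GREEN, HIGH);
    delay(range * 2);
    digitalWrite(LED_GREEN, LOW);
    delay(range * 2);
}
```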

Because of the intensity of the LEDs, this is distracting when worn by a person with normal sight. We found [Gregory’s] explanation of this effect (called Helmet Fire) quite interesting.

[via Adafruit]

We Don’t Need To Brainstorm Projects; xkcd Does That For Us

[Randall Munroe], the guy behind our favorite web comic xkcd, gave us yet another great project idea that follows on the heels of securing our valuables and silencing loud car stereos. The xkcd forum has been talking about how to implement this, and we’d like to hear what Hack A Day readers think about the idea.

The project isn’t much different from 3D photography. [Carl Pisaturo] has done a lot of art and experimentation based on this idea, which basically amounted to largish binoculars. A poster on the xkcd forum has already built a version using mirrors, but we’re wondering how much the parallax can be increased with this method. Two cameras and a smartphone would also allow automatic pan and tilt corresponding to head movement.

We’re not quite sure this idea can be applied to astronomy. The angular resolution of the human eye is around one arcminute, while every star except the Sun has an annual parallax of less than one arcsecond, so even the full diameter of Earth’s orbit is far too short a baseline. If anyone wants to try a longer baseline (from Earth to Pluto, for example), we would suggest simulating it in Stellarium. Seeing the moon as a sphere would be possible with a few hundred miles between cameras, though.
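
Those numbers are just small-angle parallax, where the angle in radians is the baseline divided by the distance. A quick sanity check:

```cpp
// Quick check of the parallax claims above, using the small-angle
// approximation: angle (radians) ≈ baseline / distance.
#include <cstdio>

int main() {
    const double PI = 3.141592653589793;
    const double ARCMIN = (PI / 180.0) / 60.0;  // one arcminute in radians

    // Moon: ~384,400 km away. Baseline needed for one arcminute of
    // parallax, roughly the eye's resolution limit:
    double moonKm = 384400.0;
    std::printf("moon baseline for 1': %.0f km (~70 miles)\n",
                moonKm * ARCMIN);  // ~112 km, so a few hundred miles is ample

    // Nearest star: Proxima Centauri's annual parallax is ~0.77 arcseconds,
    // or ~0.013 arcminutes -- nearly two orders of magnitude below what
    // the eye can resolve, even with Earth's whole orbit as a baseline.
    std::printf("stellar parallax: %.3f arcmin\n", 0.77 / 60.0);
}
```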

Tell us how you would build this in the comments, and be sure to send in your write-up if you manage to build it. We’ll put it up right away.

Thanks to [Theon144] for sending this in.

EDIT: Because the comments are actually bearing fruit, check out the thread on the Hack A Day forums for this post: link.