What Makes A Hacker

I think I can sum up the difference between those of us who regularly visit Hackaday and the world of non-hackers. As a case study, here is a story about how necessity is the mother of invention, and about the people who do the inventing.

Hackaday overlaps with sites like Pinterest and Instructables, but there is one vital difference: we choose to create something new and beautiful with the materials at hand. Often the tools and techniques involved are very simple. We look to make things elegant by removing unnecessary clutter, not by adding glitter. If something could be built with a 555 timer, we will let you know. If there is a better choice of processor, we will tell you.

My first real work commute was a forty-minute eastward drive every morning and a forty-minute westward drive every evening. This route pointed my car directly into the sun twice a day. Staring into a miasma of incandescent plasma for an hour and a half a day isn’t fun, and probably isn’t safe, but we can fix that.

Continue reading “What Makes A Hacker”

Arduino Video isn’t Quite 4K

Video resolution is always on the rise. The days of 640×480 video have given way to 720, 1080, and even 4K resolutions. There’s no end in sight. However, you need a lot of horsepower to process that many pixels. What if you have a small robot powered by a microcontroller (perhaps an Arduino) and you want it to have vision? You can’t realistically process HD video, or even low-grade video, with a small processor. CORTEX systems has an open-source solution: a 7-pixel camera with an I2C interface.

The files for SNAIL Vision include a bill of materials and the PCB layout. There’s software for the Vishay sensors used, along with provisions for gluing a lens holder to the PCB. The design is fairly simple: in addition to the array of sensors, there’s an I2C multiplexer (which also acts as a level shifter) and a handful of resistors and connectors.
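Talking to a sensor array like this from an Arduino is mostly a matter of driving the multiplexer over I2C. The sketch below is a minimal illustration rather than the SNAIL Vision firmware: it assumes a TCA9548A-style multiplexer at address 0x70, a placeholder sensor address, and a simplified readSensor() routine standing in for whatever register sequence the Vishay parts actually require.

```cpp
#include <Wire.h>

const uint8_t MUX_ADDR    = 0x70;  // assumed TCA9548A-style multiplexer address
const uint8_t SENSOR_ADDR = 0x29;  // placeholder light-sensor address
const uint8_t NUM_PIXELS  = 7;     // one sensor per "pixel"

// Select one multiplexer channel by writing a one-hot bitmask.
void selectMuxChannel(uint8_t channel) {
  Wire.beginTransmission(MUX_ADDR);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

// Simplified read: the real Vishay parts need their own init and register sequence.
uint16_t readSensor() {
  Wire.requestFrom(SENSOR_ADDR, (uint8_t)2);
  uint16_t value = Wire.read();
  value |= (uint16_t)Wire.read() << 8;
  return value;
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

void loop() {
  // Scan all seven sensors and print one "frame" per line.
  for (uint8_t px = 0; px < NUM_PIXELS; px++) {
    selectMuxChannel(px);
    Serial.print(readSensor());
    Serial.print(px < NUM_PIXELS - 1 ? '\t' : '\n');
  }
  delay(50);
}
```

Each pass through loop() selects a channel, reads one sensor, and prints a seven-value “frame” over serial, which is about all the bandwidth a 7-pixel camera needs.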

Continue reading “Arduino Video isn’t Quite 4K”

Pick-And-Place Machine for Candy

Every December and May the senior design projects from engineering schools start to roll in. Since the students aren’t yet encumbered with real-world detractors (like management) the projects are often exceptional, unique, and solve problems we never even thought we had. Such is the case with [Mark] and [Peter]’s senior design project: a pick and place machine that promises to solve all of life’s problems.

Of course we’ve seen pick-and-place machines before, but this one is different. Rather than identifying resistors and capacitors to set on a PCB, this machine is able to identify and sort candies. The robot, a version of the MeArm, has three degrees of freedom and a computer vision system to tell the arm what it’s picking up and where it should place it. A Raspberry Pi handles the computer vision and feeds data to a PIC32 which interfaces with the hardware.

One of the requirements for the senior design class was to keep the budget under $100, which they were able to accomplish by using pre-built solutions wherever possible. Robot arms with dependable precision can’t come anywhere close to that price constraint. But this project overcomes the MeArm’s lack of precision by using incremental correcting steps to reach proper alignment. This is covered in the video demo below.
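That incremental approach is essentially closed-loop visual servoing: rather than trusting the arm to hit the target in one move, the camera re-measures the error after every small step. The snippet below is a rough sketch of the idea, not [Mark] and [Peter]’s code; the vision and motion functions are simulated stand-ins for the Raspberry Pi’s camera output and the PIC32-driven servos.

```cpp
#include <cmath>
#include <cstdio>
#include <cstdlib>

// Simulated stand-ins for the Pi's vision output and the PIC32-driven MeArm.
struct Point { float x, y; };

static Point gripper = {0.0f, 0.0f};         // simulated gripper position (pixels)
static const Point candy = {120.0f, 80.0f};  // simulated candy position (pixels)

Point locateCandy()   { return candy; }
Point locateGripper() { return gripper; }

// Simulate an imprecise arm: every commanded move lands with up to 20% error.
void nudgeGripper(float dx, float dy) {
  float slop = 0.8f + 0.4f * (float)std::rand() / RAND_MAX;
  gripper.x += dx * slop;
  gripper.y += dy * slop;
}

// Step a fraction of the remaining error each pass until we are close enough.
bool alignGripper(float tolerance, float gain, int maxSteps) {
  for (int i = 0; i < maxSteps; i++) {
    Point t = locateCandy();
    Point g = locateGripper();
    float dx = t.x - g.x;
    float dy = t.y - g.y;
    if (std::hypot(dx, dy) < tolerance) return true;  // aligned: safe to pick
    nudgeGripper(gain * dx, gain * dy);               // partial step, then re-measure
  }
  return false;                                       // give up and flag a retry
}

int main() {
  bool ok = alignGripper(2.0f, 0.5f, 40);
  std::printf("aligned: %s, gripper at (%.1f, %.1f)\n",
              ok ? "yes" : "no", gripper.x, gripper.y);
  return 0;
}
```

Even with a deliberately sloppy arm in this simulation, each pass cuts the remaining error roughly in half, which is why cheap servos can still end up on target.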

Senior design classes are a great way to teach students how to integrate all of their knowledge into a final project, and the professors often include constraints they would find in the real world (like the budget limit in this project). The requirement to thoroughly document the build process is also a lesson that more people could stand to learn. Senior design classes have attempted to solve a lot of life’s other problems, too; from autonomous vehicles to bartenders, there’s been a solution for almost every problem.

Continue reading “Pick-And-Place Machine for Candy”

Hack Your Brain: the McCollough Effect

There is a fascinating brain reaction known as the McCollough Effect which is like side-loading malicious code through your eyeballs. Although this looks and smells like an optical illusion, the science would argue otherwise. What Celeste McCollough observed in 1965 can be described as a contingent aftereffect, although we call it “The McCollough Effect” because McCollough was the first to recognize the phenomenon. It’s something that can’t be unseen… sometimes affecting your vision for months!

I am not suggesting that you experience the McCollough Effect yourself. We’ll look at the phenomenon behind the McCollough Effect, and it can be understood without subjecting yourself to it. If you must experience the McCollough Effect, you do so at your own risk (here it is presented as a video). But read on to understand what is happening before you take the plunge.

Continue reading “Hack Your Brain: the McCollough Effect”

Electronics for Aliens

We are surrounded by displays with “millions” of colors and hundreds of pixels per inch, and by super “high fidelity” sound that produces what we perceive to be realistic replicas of the real world.

Of course this is not the case; we rarely stop to think about how our electronic systems have been crafted around the limitations of human perception. To explore this issue, in this article we ask the question: “What might an alien think of human technology?” We will assume a lifeform which senses the world around it much as we do, but with massively improved sensing abilities. In light of these abilities we will dub it the Oculako.

Let’s begin with the now mostly defunct CRT display and see what our hypothetical alien thinks of it. The video below shows a TV screen shot at 10,000 frames per second.

Continue reading “Electronics for Aliens”

Update: What You See Is What You Laser Cut

If there’s one thing about laser cutters that makes them a little difficult to use, it’s that it’s hard for a person to interact with them one-on-one without a clunky computer in the middle of everything. Granted, that laser is a little dangerous, but it would be nice if there were a way to use a laser cutter without having to deal with a computer. Luckily, [Anirudh] and team have been working on solving this problem, creating a laser cutter that can interact directly with its user.

The laser cutter is tied to a vision system which watches for a number of cues. As we’ve featured before, this particular laser cutter can “see” pen strokes and will cut along them (once all fingers are away from the cutting area, of course). The update to this system is that a user can now import a drawing from a smartphone and manipulate it with a set of physical tokens that the camera can watch: one token changes the location of the cut, and the other changes its scale. This extends the machine from simply cutting along pen strokes to cutting out any user-manipulated image, all without interacting directly with a computer. Be sure to check out the video after the break for a demonstration of how this works.
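Under the hood, the token step amounts to applying a simple 2D transform to the imported drawing before the cut path is generated. The snippet below is only a sketch of that idea, not the project’s code; the scale factor and offset are assumed to come from the same camera that tracks the tokens.

```cpp
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Scale a drawing about its own origin, then translate it onto the bed,
// the way a "scale" token and a "position" token might drive the cut path.
std::vector<Vec2> placeDrawing(const std::vector<Vec2>& drawing,
                               float scale, Vec2 offset) {
  std::vector<Vec2> placed;
  placed.reserve(drawing.size());
  for (const Vec2& p : drawing) {
    placed.push_back({p.x * scale + offset.x, p.y * scale + offset.y});
  }
  return placed;
}

int main() {
  // A 10 mm square standing in for the imported drawing.
  std::vector<Vec2> square = {{0, 0}, {10, 0}, {10, 10}, {0, 10}};

  // Pretend the camera saw the scale token at 1.5x and the position
  // token 40 mm right and 25 mm up on the cutting bed.
  std::vector<Vec2> cutPath = placeDrawing(square, 1.5f, {40.0f, 25.0f});

  for (const Vec2& p : cutPath) {
    std::printf("cut to X%.2f Y%.2f\n", p.x, p.y);
  }
  return 0;
}
```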

Continue reading “Update: What You See Is What You Laser Cut”

Mikey, the Robot That Charges Itself

Mikey is [Mike]’s autonomous robot. Like any good father, he’s given the robot his name. Mikey is built around an Arduino and uses a Pixy camera for vision.

[Mike] started with a common 4WD robot platform and added an Arduino Uno, a motor controller, and a Pixy. The Pixy sends directions to the Arduino via a serial link. Mikey’s original task was driving around and finding frogs on the floor. Since then, [Mike] has found a higher calling for Mikey: self-charging.

One of the most basic features of life is eating. In the case of autonomous robots, that means self-charging. [Mike] gave Mikey the ability to self-charge by training the Pixy to detect a green square, which marks Mikey’s charging station. Probes mounted on 3D-printed brackets hold the positive leads, while springs on the base of the station make contact with conductive tape on Mikey’s belly. Once the circuit is complete, Mikey stops moving and starts charging.
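The docking behavior itself doesn’t take much code. Here’s a rough sketch of the idea, assuming the stock Pixy Arduino library (over SPI here, although [Mike]’s build uses a serial link) with the green square trained as signature 1; the motor helpers and the charge-sense pin are hypothetical placeholders for Mikey’s motor controller and belly contacts.

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

const uint8_t DOCK_SIGNATURE = 1;    // the trained green square
const int     CHARGE_PIN     = 7;    // assumed input that goes HIGH when the belly contacts close
const int     FRAME_CENTER_X = 160;  // the Pixy frame is 320 pixels wide

// Hypothetical motor helpers; the real robot drives a motor controller board.
void driveForward(int steer) {
  // TODO: map 'steer' (negative = left, positive = right) onto the two motor speeds.
}

void stopMotors() {
  // TODO: cut power to both motors.
}

void setup() {
  pixy.init();
  pinMode(CHARGE_PIN, INPUT);
}

void loop() {
  if (digitalRead(CHARGE_PIN) == HIGH) {  // contacts closed: we're on the dock
    stopMotors();
    return;
  }

  uint16_t blocks = pixy.getBlocks();
  for (uint16_t i = 0; i < blocks; i++) {
    if (pixy.blocks[i].signature == DOCK_SIGNATURE) {
      // Steer to keep the green square centered, then creep onto the contacts.
      driveForward((int)pixy.blocks[i].x - FRAME_CENTER_X);
      return;
    }
  }
  stopMotors();  // dock not in view: sit still (or add a search behavior here)
}
```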

Continue reading “Mikey, the Robot That Charges Itself”