I bet the hand saw really changed some things. One day you’re hacking away at a log with an ax. It’s sweaty, awful work, and the results are never what you’d expect. The next day the clever new apprentice down at the blacksmith’s shop is demoing the beta of his new Saw invention and looking for testers, investors, and a girlfriend. From that day onward the work is never the same again. It’s not an incremental change, it’s a change. Pure and simple.
This is one of those moments. The world of tools is shifting again, and I think this is the first of many tools that will change the way we build.
Like most things that are a big change, the components to build them have been around for a while. In fact, most of the time, the actual object in question has existed in some form or another for years. Like a crack in a dam, eventually someone comes up with the variation on the idea that is just right. That actually does what everything else has been promising to do. It’s not new, but it’s the difference between crude and gasoline.
My poetic rasping aside, the Shaper Origin is the future of making things. It’s tempting to boil it down and say that it’s a CNC machine, or a router. It’s just… more than that. It makes us more. Suddenly complex cuts on any flat surface are easy. Really easy. There are no endless hours with the bandsaw and sander. There’s no need for a $25,000 gantry router to take up half a garage. No need for layout tools. No need to stress about alignment. There’s not even a real need to jump between the tool and a computer. It can be both the design tool and the production tool. It’s like a magic pencil that summons whatever it draws. But even I had to see it to believe it.
Continue reading “Hands-On the Shaper Origin: A Tool That Changes How We Build”
This Raspberry Pi 2 with computer vision and two solenoid “fingers” was getting absurdly high scores on a mobile game as of late 2015, but only recently has [Kristian] finished fleshing the project out with detailed documentation.
Developed for a course in image analysis and computer vision, this project wasn’t really about cheating at a mobile game. It wasn’t even about a robotic interface to a smartphone screen; it was a platform for developing and demonstrating the image analysis theory he was learning, and the computer vision portion is no hack job. OpenCV was used as a foundation for accessing the camera, but none of the built-in filters are used. All of the image analysis is implemented from scratch.
The game is simple. Humans and zombies move downward in two columns. Zombies (green) should get a screen tap but not humans. The Raspberry Pi camera takes pictures of the smartphone’s screen, to which an HSV filter is applied to filter out everything except green objects (zombies). That alone would be enough to get you some basic results, but not nearly good enough to be truly reliable and repeatable. Therefore, after picking out the green objects comes a whole chain of additional filtering. The details of that are covered on [Kristian]’s blog post, but the final report for the project (PDF) is where the real detail is.
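The first stage of that chain, converting to HSV and keeping only green-hued pixels, can be sketched from scratch in a few lines. This is an illustrative toy, not [Kristian]’s code, and the hue/saturation/value thresholds here are guesses, not his tuned values:

```python
def rgb_to_hsv(r, g, b):
    """Convert 8-bit RGB to (hue in degrees, saturation 0-1, value 0-1)."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    mx, mn = max(r, g, b), min(r, g, b)
    delta = mx - mn
    if delta == 0:
        h = 0.0
    elif mx == r:
        h = 60 * (((g - b) / delta) % 6)
    elif mx == g:
        h = 60 * (((b - r) / delta) + 2)
    else:
        h = 60 * (((r - g) / delta) + 4)
    s = 0.0 if mx == 0 else delta / mx
    return h, s, mx

def is_zombie_green(pixel, h_lo=80, h_hi=160, s_min=0.4, v_min=0.2):
    """True if a pixel falls in an (assumed) green hue window and is
    saturated/bright enough to be a zombie rather than background."""
    h, s, v = rgb_to_hsv(*pixel)
    return h_lo <= h <= h_hi and s >= s_min and v >= v_min

def green_mask(image):
    """Binary mask over a row-major list of RGB tuples."""
    return [[is_zombie_green(px) for px in row] for row in image]
```

As the article notes, a raw mask like this is only a starting point; the subsequent filtering stages are what make the detection repeatable.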
If you’re interested mainly in seeing a machine pound out flawless victories, the video below shows everything running smoothly. The pounding sounds make it seem like the screen is taking a lot of abuse, but [Kristian] mentions that’s actually noise from the solenoids and not a product of them battling the touchscreen. This setup can be easily adapted to test out apps on different models of phones — something that has historically cost quite a bit of dough.
If you’re interested in the nitty-gritty details of the reasons and methods used for the computer vision portions, be sure to go through [Kristian]’s github repository where everything about the project lives (including the aforementioned final report.)
Continue reading “Abusing a Cellphone Screen with Solenoids Posts High Score”
Want to get somewhere safely, but all you have is a Segway? An afternoon spent tinkering can turn your Segway into a lounging cruiser with this hoverseat attachment, just like YouTuber [Inflatable Boats]’s hot new ride.
The backbone of the cart is the Segway Mini Pro. An aluminium frame attaches to the Segway via an eye-bolt and two carabiners, the larger of which has some tape wrapped around it to reduce wear. A swivel caster is attached with U-bolts to support the weight of the rider along the middle of this makeshift go-cart. Pushing on a T-handle made of PVC — connected to the Segway’s knee brace with a simple strap — engages the motor in lieu of the normal lean-to-go-forward action. Turning is simply done by swinging the handle or pressing with your feet.
Continue reading “Hoverchair For Your Hoverboard Turns Your Segway into a Go-Kart”
Film photography began with a mercury-silver amalgam and ended with strips of nitrocellulose, silver iodide, and dyes. Along the way, there were some very odd chemistries going on in the world of photography, from ferric and silver salts to the Prussian blue found in cyanotypes and blueprints.
Metal salts are fun, and for his Hackaday Prize entry, [David Brown] is building a printer for these alternative photographic processes. It’s not a dark room — it’s a laser printer designed to reproduce images with weird, strange chemistries.
Cyanotypes are made by applying potassium ferricyanide and ferric ammonium citrate to some sort of medium, usually paper or cloth. This is then exposed via UV light (i.e. the sun), and whatever isn’t exposed is washed off. Instead of the sun, [David] is using a common UV laser diode to expose his photographs. He already has the mechanics of this printer designed, and he should be able to reach his goal of 750 dpi resolution and 8-bit monochrome.
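To put that 750 dpi goal in mechanical terms, a quick back-of-the-envelope calculation shows the spot pitch the laser head has to hit. The stepper and lead-screw figures below are invented for illustration, not taken from [David]’s build:

```python
DPI = 750
MM_PER_INCH = 25.4

dot_pitch_mm = MM_PER_INCH / DPI   # center-to-center dot spacing
dots_per_mm = DPI / MM_PER_INCH

print(f"dot pitch:   {dot_pitch_mm * 1000:.1f} um")  # ~33.9 um
print(f"dots per mm: {dots_per_mm:.1f}")             # ~29.5

# Assuming a 200-step motor, 16x microstepping, and an 8 mm lead screw
# (illustrative figures only), the available positioning grid is:
steps_per_mm = 200 * 16 / 8.0       # 400 steps/mm
steps_per_dot = steps_per_mm * dot_pitch_mm
print(f"steps per dot: {steps_per_dot:.1f}")         # ~13.5
```

In other words, 750 dpi means placing a roughly 34 µm spot, which is comfortably within reach of ordinary microstepped leadscrew motion.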
Digital photography will never go away, but there will always be a few people experimenting with light sensitive chemicals. We haven’t seen many people experiment with these strange alternative photographic processes, and anything that gets these really cool prints out into the world is great news for us.
If you need a truly random event generator, just wait till your next rainstorm. Whether any given spot on the ground is hit by a drop at a particular time is anyone’s guess, and such randomness is key to this simple rig that estimates the value of pi using raindrop sensors.
You may recall [AlphaPhoenix]’s recent electroshock Settlers of Catan expeditor. The idea with this less shocking build is to estimate the value of pi using the ratio of the area of a square sensor to a circular one. Simple piezo transducers serve as impact sensors that feed an Arduino and count the relative number of raindrops hitting the sensors. In the first video below, we see that as more data accumulates, the Arduino’s estimate of pi eventually converges on the well-known 3.14159 value. The second video has details of the math behind the method, plus a discussion of the real-world problems that cropped up during testing — turns out that waterproofing and grounding were both key to noise-free data from the sensor pads.
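The geometry behind the estimate fits in a few lines: if drops land uniformly at random, the fraction striking a circle inscribed in a square equals the ratio of their areas, π/4. This toy Monte Carlo simulation (not [AlphaPhoenix]’s code; his rig uses two physically separate square and circular sensor pads and compares their relative hit counts) shows the same convergence:

```python
import math
import random

def estimate_pi(n_drops):
    """Drop n_drops uniform 'raindrops' on a unit square and count how many
    also land in the inscribed circle (radius 0.5). The hit fraction
    estimates pi/4, since area_circle / area_square = pi * 0.5**2 / 1."""
    hits = 0
    for _ in range(n_drops):
        x, y = random.random(), random.random()
        if (x - 0.5) ** 2 + (y - 0.5) ** 2 <= 0.25:
            hits += 1
    return 4 * hits / n_drops

random.seed(1)
for n in (100, 10_000, 1_000_000):
    print(n, estimate_pi(n))  # estimates wander toward 3.14159...
```

Just as in the rain-powered version, the estimate is noisy for small counts and only slowly tightens up, which is why the video shows the value converging over a long accumulation.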
In the end, [AlphaPhoenix] isn’t proving anything new, but we like the method here and can see applications for it. What about using such sensors to detect individual popcorn kernels popping to demonstrate the Gaussian distribution? We also can’t help but think of other ways to measure raindrops; how about strain gauges that weigh the rainwater as it accumulates differentially in square and circular containers? Share your ideas in the comments below.
Continue reading “Rainy Day Fun by Calculating Pi”
Almost every big corporation has a research and development organization, so it came as no surprise when we found a tip about Disney Research in the Hackaday Tip Line. And that the project in question turned out to involve human-safe haptic telepresence robots makes perfect sense, especially when your business is keeping the Happiest Place on Earth running smoothly.
That Disney wants to make sure their Animatronics are safe is good news, but the Disney project is about more than keeping guests healthy. The video after the break and the accompanying paper (PDF link) describe a telepresence robot with a unique hydrostatic transmission coupling it to the operator. The actuators are based on a rolling-diaphragm design that limits hydraulic pressure. In a human-safe system that’s exactly what you want.
The system is a hybrid hydraulic-pneumatic design; two actuators, one powered by water pressure and the other with air, oppose each other in each joint. The air-charged actuators behave like a mass-efficient spring that preloads the hydraulic actuator. This increases safety by allowing the system to be de-energized instantly by venting the air lines. What’s more, the whole system presents very low mechanical impedance, allowing haptic feedback to the operator through the system fluid. This provides enough sensitivity to handle an egg, thread a needle — or even bop a kid’s face with impunity.
There are some great ideas here for robotics hackers, and you’ve got to admire the engineering that went into these actuators. For more research from the House of Mouse, check out this slightly creepy touch-sensitive smart watch, or this air-cannon haptic feedback generator.
Continue reading “Keeping Humanity Safe from Robots at Disney”
Do you always look at it encoded? – Well you have to. The image translators work for the construct program.
Word clocks are supposed to decode time into a more readable format. Luckily [Xose Pérez] managed to recover the encoded time signal of the simulation we are all living in with his word clock that displays time using a stylish Matrix code animation.
[Xose] already built his own versions of [Philippe Chrétien’s] Fibonacci Clock and [Jeremy Williams’s] Game Frame, and while doing so he designed a nice little PCB. It’s powered by an ATmega328p, features an RTC with backup battery and an SD-card socket, and it’s ready to drive a bunch of WS2812Bs, aka NeoPixels. Since he still had a few spare copies of his design in stock, his new word clock is also driven by this board.
Continue reading “Realize the Truth… There Is No Word Clock”