[Teaching Tech] has been interested in adding a tool changer to his 3D printer. E3D offers a system that allows you to switch print heads or even swap a hot end for a laser or a (probably light-duty) CNC head. The price of the entire device, though, is about $2,500, which put him off. But now he’s excited about a product from PrinterMods called XChange. This is a kit that will allow rapid tool changes on many existing printers and costs quite a bit less. The preorder price on Kickstarter is about $150, though that probably won’t be the final retail price.
Not all printers are compatible. It appears the unit attaches to printers that have linear rails, and there is an adapter for printers that use V rollers in extrusions. Supposedly, an adapter is in the works for printers that use rods and bearings.
A mechanical and manufacturing engineer by day, [Tyler Collins] taught himself electronics and firmware development in his spare time and created an open source Lego controller called Evlōno One. It is based on the ESP32 and Arduino ecosystems, and is compatible with an impressive variety of existing Lego controllers, sensors, and actuators. [Tyler] encountered Lego Mindstorms while helping in an after-school program, and got to wondering whether he could make a more flexible controller. We’d have to say he succeeded, and it’s amazing how much he has packed into this 4 x 4 single-height brick format.
The Evlōno One is based on an ESP32 dual-core MCU, and has WiFi, Bluetooth, and an IR transmitter for wireless connectivity. It also boasts USB-C power delivery, three motor controllers, speakers, LEDs, and a button. Dig through the Kickstarter page for more details on these interfaces and specifications. Both the firmware and the hardware will be published as open source on GitHub.
Although [Tyler] has the prototypes all running, he notes this is his first big production effort. FCC certification testing and production mold tooling are the two biggest items driving the scheduled February 2021 shipments. If computer-driven Lego modeling is one of your hobbies, definitely check out [Tyler]’s project. And if you missed [Daniel Pikora]’s FOSSCON 2018 presentation about the intersection (collision) of Legos and Open Source, our article on it is a must-read for you folks in the Adult Fan of Lego (AFOL) community.
The Japan Aerospace Exploration Agency (JAXA) recently contributed their Int-Ball technology to a Kickstarter campaign operated by the Japanese electronics manufacturer / distributor Bit Trade One (Japanese site). This technology is based on the Cubli project out of the Swiss Federal Institute of Technology in Zurich (ETH Zurich), which we covered back in 2013. The Cubli-based technology has been appearing in various projects since then, including the Nonlinear Mechatronic Cube in 2016. Alas, the current JAXA-based “3-Axis Attitude Control Module” project doesn’t have a catchy name — yet.
One interesting application of these jumping cubes, and presumably how JAXA got involved with these devices, is a floating video camera that was put to use on board the International Space Station (ISS) in 2017. The version being offered by the Kickstarter campaign doesn’t include the cameras, and you will need to provide your own gravity-free environment to duplicate that application. Instead, they seem to be marketing this for educational uses. You’d better dig deep in your wallet if you want one: a fully assembled unit requires a pledge of over $5,000 (there is a “some assembly required” kit that can save you about $1,000). Most of us won’t be backing this project for that reason alone, but it is nice to see the march of progress of such a cool technology: from inception to space applications to becoming available to the general public. Thanks to [Lincoln Uehara] for sending in this tip.
Vizy, a new machine vision camera from Charmed Labs, has blown through their crowdfunding goal on the promise of making machine vision projects both easier and simpler to deploy. The camera, which starts around $250, integrates a Raspberry Pi 4 with built-in power and shutdown management, and comes with a variety of pre-installed applications so one can dive right in.
The Sony IMX477 camera sensor is the same one found in the Raspberry Pi High Quality Camera, and supports capture rates of up to 300 frames per second (under the right conditions, anyway). Unlike the usual situation faced by most people when a Raspberry Pi is involved, there’s no need to worry about adding a real-time clock, an enclosure, or ensuring shutdowns happen properly; it’s all taken care of.
Charmed Labs are the same folks behind the Pixy and Pixy 2 cameras, and Vizy goes further: everything required for a machine vision project has been put onboard and made easy to use and deploy. Even the vision processing functions work locally, with no need for a wireless data connection (though one is needed for things like automatic uploading or sharing). For outdoor or remote applications, there’s a weatherproof enclosure option, and wireless connectivity in areas with no WiFi can be had by plugging in a USB cellular modem.
A few of the more hacker-friendly hardware features are things like a high-current I/O header and support for both C/CS and M12 lenses for maximum flexibility. The IR filter can also be enabled or disabled via software, so no more swapping camera modules for ones with the IR filter removed. On the software side, applications are all written in Python and use open software like Tensorflow and OpenCV for processing.
The feature list looks good, but Vizy also seems to have a clear focus. It looks best aimed at enabling projects with the following structure:
Detect Things (people, animals, cars, text, insects, and more) and/or Measure Things (size, speed, duration, color, count, angle, brightness, etc.)
Perform an Action (for example, push a notification or enable a high-current I/O) and/or Record (save images, video, or other data locally or remotely.)
A good example of this structure is the Birdfeeder application which comes pre-installed. With the camera pointed toward a birdfeeder, animals coming for a snack are detected. If the visitor is a bird, Vizy identifies the species and uploads an image. If the animal is not a bird (for example, a squirrel) then Vizy can detect that as well and, using the I/O header, could briefly turn on a sprinkler to repel the hungry party-crasher. A sample Birdfeeder photo stream is here on Google Photos.
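To make that detect-then-act structure concrete, here is a rough Python sketch of the same loop. To be clear, this is not Vizy’s actual application code or API; it is a generic OpenCV-plus-Raspberry-Pi-GPIO illustration, and the model files, class list, confidence threshold, and GPIO pin number are all stand-ins you would swap for your own.

```python
# Illustrative "detect, then act" loop in the spirit of the Birdfeeder app.
# Not Vizy's code: generic OpenCV DNN detection plus RPi.GPIO for the action.
import time

import cv2
import RPi.GPIO as GPIO

SPRINKLER_PIN = 17                      # hypothetical high-current output pin
GPIO.setmode(GPIO.BCM)
GPIO.setup(SPRINKLER_PIN, GPIO.OUT)

# MobileNet-SSD trained on PASCAL VOC; "bird", "cat", and "dog" are among its
# classes (no squirrel, so furry visitors are approximated here). The model
# file paths are placeholders -- download the files yourself.
net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")
CLASSES = ["background", "aeroplane", "bicycle", "bird", "boat", "bottle",
           "bus", "car", "cat", "chair", "cow", "diningtable", "dog",
           "horse", "motorbike", "person", "pottedplant", "sheep", "sofa",
           "train", "tvmonitor"]

cap = cv2.VideoCapture(0)               # camera exposed as /dev/video0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()
    for i in range(detections.shape[2]):
        confidence = detections[0, 0, i, 2]
        label = CLASSES[int(detections[0, 0, i, 1])]
        if confidence < 0.5:
            continue
        if label == "bird":
            # Record: save a snapshot (uploading is left as an exercise).
            cv2.imwrite(f"bird_{int(time.time())}.jpg", frame)
        elif label in ("cat", "dog"):
            # Act: pulse the sprinkler to discourage the party-crasher.
            GPIO.output(SPRINKLER_PIN, GPIO.HIGH)
            time.sleep(2)
            GPIO.output(SPRINKLER_PIN, GPIO.LOW)
```

The point is how short the “business logic” gets once detection, power, and I/O are already handled for you, which is exactly the pitch Vizy is making.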
Motionscope is a more unusual but very interesting-looking application; its purpose is to capture moving objects and measure the position, velocity, and acceleration of each. A picture does a far better job of explaining what Motionscope does, so here is a screenshot of it watching some billiard balls, which shows what it can do.
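The math behind a measurement like that is straightforward once you have tracked positions and a known frame rate: velocity and acceleration fall out of finite differences. Here is a minimal NumPy sketch of the idea; the numbers are made up, the positions are assumed to already be in real-world units, and this is not Charmed Labs’ code.

```python
# Estimate per-frame velocity and acceleration from a tracked (x, y) path.
import numpy as np

fps = 300.0                 # capture rate in frames per second
dt = 1.0 / fps

# Hypothetical track of one billiard ball: (x, y) in metres, one row per frame.
positions = np.array([[0.000, 0.000],
                      [0.004, 0.001],
                      [0.008, 0.002],
                      [0.012, 0.003]])

velocity = np.gradient(positions, dt, axis=0)      # m/s per frame
acceleration = np.gradient(velocity, dt, axis=0)   # m/s^2 per frame
speed = np.linalg.norm(velocity, axis=1)

print("speed per frame (m/s):", speed)
```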
Cameras are getting smarter and more capable than ever, able to run embedded machine vision algorithms and pull off tricks far beyond what something like a serial camera and microcontroller board could manage, and the upcoming Vizy aims to be smarter and easier to use yet. Vizy is the work of Charmed Labs, the same folks behind the Pixy and Pixy 2 cameras, so this isn’t their first foray into accessible machine vision. Vizy’s main goal is to make object detection and classification easy, with thoughtful hardware features and a browser-based interface.
The usual way to do machine vision is to get a USB camera and run something like OpenCV on a desktop machine to handle the processing. But Vizy leverages a Raspberry Pi 4 to provide a tightly-integrated unit in a small package with a variety of ready-to-run applications. For example, the “Birdfeeder” application comes ready to take snapshots of and identify common species of bird, while also identifying party-crashers like squirrels.
The demonstration video on their page shows off using the built-in high-current I/O header to control a sprinkler, repelling non-bird intruders with a splash of water while uploading pictures and video clips. The hardware design also looks well thought out; not only is there a safe shutdown and low-power mode for the Raspberry Pi-based hardware, but the lens can be swapped and the camera unit itself even contains an electrically-switched IR filter.
Vizy has a Kickstarter campaign planned, but like many others, Charmed Labs is still adjusting to the changes the COVID-19 pandemic has brought. You can sign up to be notified when Vizy launches; we know we’ll be keen for a closer look once it does. Easier machine vision is always a good thing, because it helps free people to focus on clever ideas like machine vision-based tool alignment.
OpenCV is an open source library of computer vision algorithms whose power and flexibility have made many machine vision projects possible. But even with code highly optimized for maximum performance, we always wish for more, which is why our ears perk up whenever we hear about a hardware-accelerated vision module. The latest buzz is coming out of the OpenCV AI Kit (OAK) Kickstarter campaign.
There are two vision modules launched with this campaign: the OAK-1, with a single color camera for two-dimensional vision applications, and the OAK-D, which adds stereo cameras for that third dimension. The onboard brain is a Movidius Myriad X processor which, according to team members who have dug through its datasheet, has been massively underutilized in other products. They believe OAK modules will help the chip fulfill its potential for vision applications, delivering high performance while consuming low power in a small form factor. Reading over the spec sheet, we think it’s fair to call these “Ultimate Myriad X Dev Boards,” but we must concede “OpenCV AI Kit” sounds better. It does not provide hardware acceleration for the entire OpenCV library (likely an impossible task), but it does cover the highly demanding subset suited to Myriad X acceleration.
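To give a flavor of what that “highly demanding subset” usually means, deep-network inference is the typical candidate. OpenCV’s existing dnn module can already push inference onto a Myriad VPU (for example, an Intel Neural Compute Stick 2) when OpenCV is built against Intel’s Inference Engine; the snippet below shows that path. This is just an illustration of Myriad offload through plain OpenCV, not the OAK boards’ own DepthAI API, and the model filenames are placeholders.

```python
# Offload DNN inference to a Myriad VPU via OpenCV's dnn module.
# Requires an OpenCV build with Intel Inference Engine / OpenVINO support.
import cv2

# Placeholder OpenVINO IR model files -- substitute a real detector.
net = cv2.dnn.readNet("face-detection.xml", "face-detection.bin")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)   # run on the VPU, not the CPU

frame = cv2.imread("test.jpg")
blob = cv2.dnn.blobFromImage(frame, size=(300, 300))
net.setInput(blob)
detections = net.forward()
print(detections.shape)
```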
Since the campaign launched a few weeks ago, some additional information has been released to help assure backers that this project has real substance. It turns out OAK is an evolution of a project we covered almost exactly one year ago that became a real product, DepthAI, so at least this is not their first rodeo. It is also encouraging that their invitation to the open hardware community has already borne fruit. Check out this thread discussing OAK for robot vision, where a question was met with an honest “we don’t have expertise there” from the OAK team, but ArduCam then pitched in with their camera module experience to help.
We wish them success for their planned December 2020 delivery. They have already far surpassed their funding goals, they’ve shipped hardware before, and we see a good start to a development community. We look forward to the OAK-1 and OAK-D joining the ranks of other hacking friendly vision modules like OpenMV, JeVois, StereoPi, and AIY Vision.
Making something that has to get into others’ hands involves solving a lot of different problems, many of which have nothing at all to do with actually building the dang things. [Conor Patrick] encountered them when he ran a successful Kickstarter campaign for an open-source USB security key that was not only shipped to backers, but also made available as an ongoing product for sale. There was a lot of manual and tedious work that could have been avoided, and so [Conor] laid out all the things he wishes he had done when first setting up a product line.
If the whole process is a river, then the more “upstream” an issue is, the bigger its potential impact on everything that comes afterwards. One example is the product itself: the simplest and most easily managed product line is one that has only one product with no variations. That not only minimizes errors but makes supply, production, and shipping more straightforward. Striving for a minimum number of products and variations is also an example of something [Conor] didn’t do. In their crowdfunding campaign they offered the SoloKeys USB device — an implementation of the FIDO2 authentication token — as either USB-A or USB-C. There were also two types of key: NFC-capable (for tapping to a smartphone) and USB only. That is four products so far.
Offering keys in an unlocked state for those who want to tamper makes it eight different products. On top of that, they offered color choices which not only adds complexity to production, but also makes it harder to keep track of what everyone ordered. [Conor] also observed that the Kickstarter platform and back end are really not set up like a store, and it is clunky at best to try to offer (and manage) different products and variations from within it.
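If you want to see how quickly those options snowball, a few lines of Python make the arithmetic plain. The option lists below mirror the SoloKeys campaign as described above; the color palette is invented purely for illustration.

```python
# Every independent option multiplies the number of SKUs you must track.
from itertools import product

connectors = ["USB-A", "USB-C"]
key_types  = ["NFC", "USB-only"]
states     = ["locked", "unlocked"]
colors     = ["black", "red", "clear"]   # hypothetical palette

skus = list(product(connectors, key_types, states))
print(len(skus))                  # 2 * 2 * 2 = 8 distinct products
print(len(skus) * len(colors))    # 24 once color variants are counted
```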
Another major point is fulfillment. In [Conor]’s opinion, unless the quantities are small, an order fulfillment company is worth partnering with. He says there are a lot of such companies out there, and it can be very time consuming to find the right one, but that effort is nothing compared to the time needed to handle, package, address, and ship hundreds (or thousands!) of orders personally. His team did their own fulfillment for a total of over 2,000 units, and found it a long and tedious process filled with hidden costs and challenges.
There’s good advice and background in [Conor]’s writeup, and this isn’t his first rodeo. He also shared his thoughts on taking electronics from design to production and the more general advice remains the same for it all: be honest and be open. Under-promise and over-deliver, especially when it comes to time estimates.