Bolt-On Stepper Motor Driver For The Raspberry Pi

For his entry into the 2019 Hackaday Prize, [Tobius Daichi] is working on adding some motion control capabilities to everyone’s favorite Linux SBC. His 3+Pi board attaches to the Raspberry Pi’s GPIO header and gives you a convenient way to control four individual stepper motors. Perfect for a 3D printer, laser cutter, CNC, or anything else you can think of that needs to move in a few dimensions.

But such a simplistic description of the 3+Pi might be underselling it a bit. While [Tobius] says he was inspired by the classic Arduino CNC Shield that powers countless DIY 3D printers, he’s managed to improve on the concept. Rather than having the host Pi communicate directly with the stepper drivers, the 3+Pi features an onboard STM32F302CBT6 that handles the actual motor control. The Pi just needs to tell it what to do over UART.
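
While the firmware defines the actual command set, the host side of that arrangement is simple. Here's a hedged Python sketch using pyserial; the "M&lt;motor&gt; S&lt;steps&gt;" protocol is invented for illustration, not taken from [Tobius]'s firmware:

```python
# Hypothetical sketch of the Pi commanding the 3+Pi's STM32 over UART.
# The "M<motor> S<steps>" protocol below is invented for illustration;
# the real command set is whatever the board's firmware defines.
import serial

def move_motor(uart, motor, steps):
    """Ask the onboard MCU to move one of the four steppers."""
    uart.write(f"M{motor} S{steps}\n".encode("ascii"))
    return uart.readline().decode("ascii").strip()  # e.g. an "OK" acknowledgement

if __name__ == "__main__":
    # /dev/serial0 is the primary UART on the Pi's GPIO header
    with serial.Serial("/dev/serial0", baudrate=115200, timeout=1.0) as uart:
        print(move_motor(uart, motor=1, steps=200))  # one turn of a 200-step motor
```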

If you’re looking to do things in real time, having an onboard microcontroller handle the low-level aspects of talking to the stepper drivers can be a big help. A natural extension for this board could be support for the Klipper firmware, which leverages the fact that the Raspberry Pi is many times more powerful than your average 3D printer control board. With the Pi handling the math and feeding the microcontroller precomputed timing instructions, Klipper allows for faster and more accurate printing than the microcontroller alone could accomplish.

As for the stepper drivers themselves, [Tobius] has decided to go with the Trinamic TMC2041-LA-T. This chip is notable as it puts two drivers in a single 48-QFN package, which is great if you’re looking to save space on your board. Some might complain that the 3+Pi doesn’t allow for easily swapping out the stepper drivers if you manage to cook one, as you can on the Arduino CNC shield, but realistically you could say the same about many purpose-built stepper control boards.

[Tobius] is tackling this project by himself currently, but does mention that he’s open to teaming up with anyone who’s got an interest in this sort of thing. There have been previous attempts at creating Linux-powered 3D printer controllers in the past, but we think this approach holds particular promise if for no other reason than the Raspberry Pi’s popularity.

660 FPS Raspberry Pi Video Captures The Moment In Extreme Slo-Mo

Filming in slow motion has long since become a standard feature on the higher end of the smartphone spectrum, and can turn the most trivial physical activity into a majestic action shot to share on social media. It also unveils some little wonders of nature that are otherwise hidden to our eyes: the formation of a lightning flash during a thunderstorm, a hummingbird flapping its wings, or an avocado reaching that perfect moment of ripeness. Altogether, it’s a fun way of recording videos, and as [Robert Elder] shows, something you can do with a few dollars’ worth of Raspberry Pi equipment at a whopping rate of 660 FPS, if you can live with some limitations.

Played back at the classic 24 FPS, one second of capture becomes a nearly half-minute-long slo-mo-fest. To achieve such a frame rate in the first place, [Robert] uses [Hermann-SW]’s modified version of raspiraw to get raw image data straight from the camera sensor into the Pi’s memory, leaving all the heavy lifting of processing it into an actual video until after all the frames are retrieved. RAM size is of course one limiting factor for recording length, but memory bandwidth is the bigger problem, restricting the resolution to 64×640 pixels on the cheaper $6 camera model he uses. Yes, sixty-four pixels high — but hey, look at that super wide-screen aspect ratio!
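
The numbers behind those limitations are easy to check. A back-of-the-envelope Python sketch (assuming roughly 10-bit packed raw Bayer data, an assumption on our part rather than a figure from [Robert]'s write-up) shows the memory bandwidth involved and where the half-minute figure comes from:

```python
# Back-of-the-envelope numbers for the 660 FPS capture. The 1.25 bytes/pixel
# figure assumes 10-bit packed raw Bayer output, which is an assumption here.
WIDTH, HEIGHT = 640, 64      # the bandwidth-restricted frame size
FPS_CAPTURE = 660            # capture rate
FPS_PLAYBACK = 24            # classic cinema playback rate
BYTES_PER_PIXEL = 1.25       # 10 bits per pixel, packed

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
rate_mb_s = frame_bytes * FPS_CAPTURE / 1e6
slowdown = FPS_CAPTURE / FPS_PLAYBACK

print(f"{frame_bytes:.0f} bytes per frame, {rate_mb_s:.1f} MB/s into RAM")
print(f"1 s of capture plays back over {slowdown:.1f} s")  # ~27.5 s, the 'half minute'
```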

While you won’t get the highest quality out of this, it’s still an exciting and inexpensive way to play around with slow motion. You can always step up your game though, and have a look at this DIY high-speed camera instead. And well, here’s one mounted on a lawnmower blade destroying anything but a printer.


Designing An Advanced Autonomous Robot: Goose

Robotics is hard, maybe not quite as difficult as astrophysics or understanding human relationships, but designing a competition-winning bot from scratch was never going to be easy. OK, so [Paul Bupe, Jr’s] robot, named ‘Goose’, did not quite win the competition, but we’re very interested to learn what golden eggs it might lay in the aftermath.

The mechanics of the bot are based on a fairly standard dual tracked drive system that makes controlling a turn much easier than if it used wheels. Why make life more difficult than it already is? But what we’re really interested in is the design of the control system and the rationale behind those design choices.

The diagram on the left might look complicated, but essentially the system is based on two ‘brains’, a Teensy microcontroller (MCU) and a Raspberry Pi, though most of the grind is performed by the MCU. Running at 96 MHz, the MCU is fast enough to process data from the encoders and IMU in real time, thus enabling the bot to respond quickly and smoothly to its sensors. More complicated and ‘heavier’ tasks such as LIDAR and computer vision (CV) are performed on the Pi, which runs the Robot Operating System (ROS), communicating with the MCU by means of a couple of ‘nodes’.
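
To make that split concrete, here is a minimal sketch of what the Pi-side ‘brain’ could look like as a ROS node. The topic names and the simple stop-or-go logic are our invention for illustration; the real node layout is whatever [Paul]'s launch files define:

```python
#!/usr/bin/env python3
# Minimal sketch of the Pi-side "brain" as a ROS node. Topic names and the
# stop-or-go logic are hypothetical; they stand in for the project's real nodes.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

def scan_callback(scan):
    # Heavy sensor processing lives on the Pi: find the nearest obstacle...
    closest = min((r for r in scan.ranges if r > scan.range_min),
                  default=scan.range_max)
    cmd = Twist()
    cmd.linear.x = 0.0 if closest < 0.3 else 0.2  # ...and stop if it's too close
    cmd_pub.publish(cmd)  # the MCU-side node turns this into track speeds

if __name__ == "__main__":
    rospy.init_node("goose_brain")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()  # hand control to ROS; callbacks fire as scans arrive
```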

The competition itself dictated that the bot should travel in large circles within the walls of a large box, whilst avoiding particular objects. Obviously, GPS was not an option indoors, and dead reckoning alone was not going to keep the machine on track, so it relied heavily on LiDAR point cloud data to effectively pinpoint the location of the robot at all times. Now we really get to the crux of the design, where all the available sensors are combined and fed into a particle filter algorithm.
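
The idea behind a particle filter is simple enough to fit in a toy example: keep a cloud of candidate poses, nudge them along with the motion estimate, weight each one by how well it explains the latest range measurement, and resample. Here's a one-dimensional Python sketch with an invented sensor model; Goose's real filter works on 2D LiDAR point clouds, but the structure is the same:

```python
# Toy 1-D particle filter in the spirit of Goose's localization. The motion
# noise, sensor sigma, and wall position are invented for illustration.
import math
import random

N = 500
particles = [random.uniform(0.0, 10.0) for _ in range(N)]  # candidate positions (m)

def step(move, measured_dist_to_wall, wall=10.0, sigma=0.2):
    global particles
    # 1. Motion update: shift every particle by the odometry estimate, plus noise.
    particles = [p + move + random.gauss(0, 0.05) for p in particles]
    # 2. Weighting: how well does each particle explain the range measurement?
    weights = [math.exp(-((wall - p) - measured_dist_to_wall) ** 2 / (2 * sigma**2))
               for p in particles]
    # 3. Resampling: draw a fresh cloud, favouring the well-matching particles.
    particles = random.choices(particles, weights=weights, k=N)

step(move=0.5, measured_dist_to_wall=7.0)  # moved 0.5 m, wall now reads 7.0 m away
print(f"estimated position: {sum(particles) / N:.2f} m")  # clusters near 3.0 m
```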

What we particularly love about this project is how clearly everything is explained, without too many fancy terms or acronyms. [Paul Bupe, Jr] has obviously taken the time to reduce the overall complexity to more manageable concepts that encourage us to explore further. Maybe [Paul] himself might have the time to produce individual tutorials for each system of the robot?

We could well be reading far too much into the name of the robot, ‘Goose’ being Captain Marvel’s bizarre ‘trans-species’ cat that ends up laying a whole load of eggs. But could this robot help establish a de facto standard for small robots?

We’ve seen other competition robots on Hackaday, and hope to see a whole lot more!

Video after the break.

A SuperCap UPS

If you treat your Pi as a wearable or a tablet, you will already have a battery. If you treat your Pi as a desktop, you will already have a plug-in power supply, but what about if you live where mains power is unreliable? Like [jwhart1], you may consider building an uninterruptible power supply into a USB cable. UPSs became a staple of office life when one too many IT headaches were traced back to power outages. The idea is that a battery will keep your computer running while the power gets its legs back. Most commercial UPSs generate an AC waveform which your computer’s power supply converts back to DC, but if you can supply the right DC voltage directly to the board, you skip the inverting and converting steps.

Cheap batteries develop a memory if they’re drained often, but if you have enough space, consider supercapacitors, which can take that abuse. They have a lower energy density than lithium batteries, but that should not be an issue for short power losses. According to [jwhart1], this quick-and-dirty approach will power a full-sized Pi, keyboard, and mouse for over a minute. If power is restored, you get to keep on trucking. If your power doesn’t come back, you have time to save your work and shut down. Spending an afternoon on a power cable could save a weekend’s worth of work; not a bad time-gamble.
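
That minute-or-so figure is easy to sanity-check with the capacitor energy equation E = ½CV². The component values below are our assumptions, not [jwhart1]'s actual parts list: two 50 F caps in series for 25 F effective, charged to 5.4 V, usable down to a hypothetical regulator cutoff around 4 V, feeding a light desktop load:

```python
# Sanity check of the "over a minute" claim using E = 1/2 * C * V^2.
# All values are assumptions for illustration, not the project's actual parts.
C = 25.0      # farads: two 50 F supercaps in series
V_FULL = 5.4  # volts when fully charged
V_MIN = 4.0   # volts where a hypothetical boost regulator drops out
LOAD_W = 2.5  # watts: a full-sized Pi, keyboard, and mouse at light load

usable_joules = 0.5 * C * (V_FULL**2 - V_MIN**2)  # energy between the two voltages
runtime_s = usable_joules / LOAD_W
print(f"{usable_joules:.0f} J usable, about {runtime_s:.0f} s of runtime")  # ~66 s
```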

We see what a supercap UPS looks like, but what about one built into a lightbulb or a feature-rich programmable UPS?

A Tiny Train Departure Board, Just Like The Real Thing

If you travel on the British rail system, you’ll be familiar with the ubiquitous orange dot-matrix departure boards. At a glance they tell you the expected arrival times of the next few trains, where they are headed, and, at the bottom, the current time. [Chris Crocker-White] was inspired by a tweet to recreate one of these displays in miniature and hang it under his monitor.

The hardware is a Raspberry Pi Zero with an OLED screen, in a custom 3D-printed case. A soldered USB cable takes power from the monitor’s USB ports. Software-wise, it’s a demonstration vehicle for the Balena cloud service that pulls its data from their transport API, but the choice of dot-matrix typeface is perfect and absolutely looks the part.
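
The basic loop is short enough to sketch in Python. The endpoint URL and JSON shape below are hypothetical stand-ins for the transport API the project actually uses, and the luma.oled calls assume a 128×64 I2C SSD1306 panel:

```python
# Sketch of a departure-board loop: fetch departures, paint them on an OLED.
# The URL and JSON fields are hypothetical; swap in a real transport API.
import time
import requests
from luma.core.interface.serial import i2c
from luma.core.render import canvas
from luma.oled.device import ssd1306

DEPARTURES_URL = "https://example.com/api/departures?station=PAD"  # hypothetical

device = ssd1306(i2c(port=1, address=0x3C))  # 128x64 panel on the Pi's I2C bus

while True:
    rows = requests.get(DEPARTURES_URL, timeout=10).json()["departures"]
    with canvas(device) as draw:
        for i, train in enumerate(rows[:3]):  # the next three trains
            draw.text((0, i * 16), f"{train['expected']} {train['destination']}",
                      fill="white")
        draw.text((0, 48), time.strftime("%H:%M"), fill="white")  # clock at the bottom
    time.sleep(30)  # be gentle on the API
```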

There is some question as to whether a project such as this one should need a cloud service as its backend, and of course it serves as a demonstration piece rather than a definitive way to implement a departure board. It does, however, bring a ready-packaged API for transport data, which, given that many data sources can be opaque, is a useful feature.

Train time displays seem to be a popular choice on the Eastern side of the Atlantic, here’s another British one, and one from Ireland.

Thanks [Pyrofer] for the tip.

High Performance Stereo Computer Vision For The Raspberry Pi

Up until now, running any kind of computer vision system on the Raspberry Pi has been rather underwhelming, even with the addition of products such as the Movidius Neural Compute Stick. Looking to improve on the performance situation while still enjoying the benefits of the Raspberry Pi community, [Brandon] and his team have been working on Luxonis DepthAI. The project uses a carrier board to mate a Myriad X VPU and a suite of cameras to the Raspberry Pi Compute Module, and the performance gains so far have been very promising.

So how does it work? Twin grayscale cameras allow the system to perceive depth, or distance, which is used to produce a “heat map” that’s ideal for tasks such as obstacle avoidance. At the same time, the high-resolution color camera can be used for object detection and tracking. According to [Brandon], bypassing the Pi’s CPU and sending all processed data via USB gives a roughly 5x performance boost, enabling the full potential of the Intel Myriad X chip to be unleashed.
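
The depth half of that pipeline is classic stereo disparity: features on nearby objects shift more between the two camera views than distant ones do. Stripped of the Myriad X acceleration, the same idea fits in a few lines of OpenCV; the block-matcher parameters here are generic defaults, not DepthAI's tuning:

```python
# Stereo depth, minus the Myriad X: compute a disparity map from the two
# grayscale frames and colour it as a heat map. Parameters are generic
# OpenCV defaults, not whatever tuning DepthAI ships with.
import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matcher: nearby objects shift more between views, so they get
# larger disparity values.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)

# Normalize to 8 bits and apply a colormap: close reads warm, far reads cool.
disp8 = cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8")
cv2.imwrite("depth_heatmap.png", cv2.applyColorMap(disp8, cv2.COLORMAP_JET))
```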

For detecting standard objects like people or faces, it will be fairly easy to get up and running with software such as OpenVINO, which is already quite mature on the Raspberry Pi. We’re curious about how the system will handle custom models, but no doubt [Brandon’s] team will help improve this situation in the future.

The project is very much in an active state of development, which is exactly what we’d expect for an entry into the 2019 Hackaday Prize. Right now the cameras aren’t necessarily ideal; for example, the depth sensors are a bit too close together to be very effective, but the team is still fine-tuning their hardware selection. Ultimately the goal is to make a device that helps bikers avoid dangerous collisions, and we’re very interested to watch the project evolve.

The video after the break shows the stereoscopic heat map in action. The hand is displayed as a warm yellow as it’s relatively close compared to the blue background. We’ve covered the combination of the Raspberry Pi and the Movidius USB stick in the past, but the stereo vision performance of Luxonis DepthAI really takes it to another level.


Jazzberry Bakes The Pi Into A Mechanical Keyboard

If you hang around Hackaday long enough, pretty soon you’ll start to see some patterns emerging. As the nexus of all things awesome in the hacking world, our front page offers a unique vantage point by which you can see what’s getting folks excited this particular month, year, or decade. Right now we can tell you hackers love the Raspberry Pi, 3D printing, and perhaps above all, they can’t get enough mechanical keyboards.

So that makes the Jazzberry by [Mattis Folkestad] something of a perfect storm in the hacker world. The project uses a 3D-printed enclosure to combine a Raspberry Pi 3B+ and an Ajazz AK33 mechanical keyboard into a single unit, like the home computers of old. Honestly, we’re just glad he didn’t sneak an ESP8266 in there, as the resulting combination might have been enough to crash the site.

That being said, we can’t help but notice there’s a lot of open space inside the 3D printed enclosure. Right now there’s nothing inside but the Raspberry Pi, which only takes up a fraction of the internal volume. Adding a battery and hard drive would be the logical next steps, but it could also be outfitted with a suite of radios and various other hacking and security research accoutrements. We’ve seen an influx of such builds over the last few months, and the Jazzberry seems like it could make a very slick entry into this burgeoning category of mobile pentesting devices.

The STL files are designed specifically for the combination of hardware that [Mattis] used, but it shouldn’t be too difficult to modify them for your own purposes. Even if you stick with the same AK33 keyboard, an upgrade to the impressively powerful Raspberry Pi 4 would be more than worth the time fiddling with the STLs in your CAD tool of choice. If you really want to go all in, add a display and you’re well on the way to that cyberdeck you’ve always wanted.