Sensing, Connected, Utility Transport Taxi For Level Environments

If that sounds like a mouthful, just call it SCUTTLE – the open-source mobile robot designed at Texas A&M University. SCUTTLE is a low-cost (under $350) robot designed for teaching Aggies in the Multidisciplinary Engineering Technology (MXET) program, where it is used for in-lab lessons and semester projects in the MXET 300 – Mobile Robotics undergraduate course. Since it is designed for academic purposes, the robot is very well documented, making it easy to replicate when you follow the instructions. In fact, the team is looking for others to build SCUTTLEs and provide feedback to help improve the design.

Available on the SCUTTLE website is a large collection of videos that walk you through fabrication, electronics setup, robot assembly, programming, and robot operation. They are designed to help students build and operate the mobile robot within one semester. Most of the mechanical and electronic parts needed for the robot are off-the-shelf and easy to procure, and the remaining custom parts can be easily 3D printed. Its modular design gives you the freedom to try different options, features, and upgrades. SCUTTLE is powerful enough to carry a payload of up to 9 kg (20 pounds), allowing additional hardware to be added. To keep cost low and construction easy, the robot uses a simple two-wheel drive system with a pair of geared motors. This forces the robot to literally scuttle in a "non-holonomic" fashion, moving from origin to destination through a sequence of left/right turns and forward moves, so motion planning is interestingly tricky.
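To illustrate that turn-then-drive style of motion, here is a minimal sketch under assumed motor commands – not SCUTTLE's actual code; rotate_in_place() and drive_straight() are hypothetical stand-ins for whatever motion primitives your build exposes:

```python
import math

def rotate_in_place(angle):
    """Hypothetical motor command: spin the wheels in opposite directions."""
    print(f"rotate {math.degrees(angle):.0f} degrees")

def drive_straight(distance):
    """Hypothetical motor command: drive both wheels forward equally."""
    print(f"drive {distance:.2f} m")

def go_to(x, y, heading, goal_x, goal_y):
    """Turn in place toward the goal, then drive the straight-line distance."""
    dx, dy = goal_x - x, goal_y - y
    target = math.atan2(dy, dx)                       # bearing to the goal
    turn = (target - heading + math.pi) % (2 * math.pi) - math.pi
    rotate_in_place(turn)                             # the left/right turn
    drive_straight(math.hypot(dx, dy))                # the forward move
    return goal_x, goal_y, target                     # new pose estimate

go_to(0.0, 0.0, 0.0, 1.0, 1.0)   # turn 45 degrees, then drive ~1.41 m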

The SCUTTLE robot is programmed in Python 3 running under Linux, and has been tested on both the BeagleBone Blue and the Raspberry Pi. The SCUTTLE software guide is a good place to get acquainted with the system architecture.

The standard configuration uses ultrasonic sensors for collision avoidance, a standard USB camera for vision, and encoders coupled to the wheel drive pulleys for determining position with respect to the starting origin. An optional USB LiDAR can be added for area mapping. The additional payload capability allows adding on extra sensors, actuators or battery packs.
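As a sketch of how those encoder counts become a position estimate, here is a standard differential-drive dead-reckoning update – illustrative only; the wheel radius, wheelbase, and encoder resolution below are placeholders, not SCUTTLE's real numbers:

```python
import math

WHEEL_RADIUS = 0.04      # meters, placeholder value
WHEEL_BASE = 0.35        # distance between the wheels, placeholder
TICKS_PER_REV = 2048     # encoder resolution, placeholder

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon the new pose from encoder tick deltas since the last update."""
    per_tick = 2 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2          # distance traveled by the midpoint
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

print(update_pose(0.0, 0.0, 0.0, 512, 512))    # both wheels forward: straight line
```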

To complement the information on the website, additional resources are posted on GitHub, GrabCAD, and YouTube. Building a SCUTTLE robot ought to be a great group project at makerspaces wanting to get hackers started with robotics. We have covered many educational robot projects in the past, but SCUTTLE really shines with its ability to carry a pretty decent payload at a low cost.


Faux Cow Munches Faux Grass On A Faux Roomba

Out in the countryside, having a cow or two wouldn't be a big deal. You can have a cattle shed full of them, and no one will bat an eyelid. But what if you live in the big city and have no need of pet dogs or cats, but want a pet cow? It wouldn't be easy getting it to ride in the elevator, and you'd stand a high chance of being very, very unpopular in the neighbourhood. [Dane & Nicole], aka [8 Bits and a Byte], were undaunted though, and built the Moomba – the Cow Roomba – to keep them company in their small city apartment.

The main platform is built from a few pieces of lumber, cut in a circular shape since it needs to look like a Roomba. Locomotion comes from two geared DC motors and a third free-swiveling caster wheel, all attached directly to the wooden frame. The motors get their 12 V juice from eight "AA" batteries. The free-range bovine also needs some smarts to let it roam at will. For this, it uses a Raspberry Pi powered by a power bank. The Pi drives a 2-channel relay board which switches the voltage applied to the two motors. Since the relays can only switch the motors on and off, not reverse them, the Moomba cannot back out if it gets stuck at a dead end. For anyone else trying to build this, it should be easy enough to fix with an electronic speed controller, or even a second 2-channel relay board wired to reverse the voltage applied to the motors. The Moomba also needs to "Moo" when it feels like it, so the Raspberry Pi plays a prerecorded MP3 audio clip through a pair of USB speakers.

If you watch the video after the break, you'll notice that making the Moomba sentient is a simple matter of "Ctrl+C" and "Ctrl+V" and you're good to go. The Python code is straightforward, performing one of four actions – move forward, turn left, turn right, or play audio. The code picks a random number from 0 to 3 and performs the action associated with that number. Finally, as an added bonus, the Moomba gets a lush carpet of artificial green grass and is free to roam the range.
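The whole behavior boils down to a loop along these lines – a hedged reconstruction rather than the project's actual code; the relay pins, timings, and the choice of mpg123 as the audio player are all placeholders:

```python
import random
import subprocess
import time

import RPi.GPIO as GPIO

LEFT_RELAY, RIGHT_RELAY = 17, 27   # placeholder BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup([LEFT_RELAY, RIGHT_RELAY], GPIO.OUT, initial=GPIO.LOW)

def drive(left, right, seconds):
    """Energize the relay feeding each motor for a fixed time, then stop."""
    GPIO.output(LEFT_RELAY, left)
    GPIO.output(RIGHT_RELAY, right)
    time.sleep(seconds)
    GPIO.output([LEFT_RELAY, RIGHT_RELAY], GPIO.LOW)

actions = [
    lambda: drive(GPIO.HIGH, GPIO.HIGH, 2.0),        # both motors: forward
    lambda: drive(GPIO.HIGH, GPIO.LOW, 0.5),         # left motor only: veer right
    lambda: drive(GPIO.LOW, GPIO.HIGH, 0.5),         # right motor only: veer left
    lambda: subprocess.run(["mpg123", "moo.mp3"]),   # moo over the USB speakers
]

while True:
    actions[random.randint(0, 3)]()   # pick one of the four actions at random
```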

At first sight, many may quip, "where's the hack?" But simple, easy-to-execute projects like these are ideal for getting younglings started down the path to hacking, with adult supervision. The final result may appear frivolous, but it'll excite young minds as they learn from watching.


Robot Allows Remote Colleagues To Enjoy Office Shenanigans

[Esther Rietmann] and colleagues built a telepresence robot to give work-from-home teammates a virtual, but physical, presence in the office. A telepresence robot is like a tablet mounted on a Roomba, providing motion capability in addition to an audio/video connection. Built during a 48-hour hackathon, it is a bit crude under the hood and misses out on some features, such as a bidirectional video feed, but overall it does pretty much what is expected of such a device.

The main structure is built from cheap aluminium profiles and sheets. A Raspberry Pi is at the heart of the electronics, with a servo-mounted Pi camera and a speaker-microphone pair taking care of video and audio. The two DC motors are driven by H-bridges controlled from the Pi, and an idle swivel caster is attached as the third wheel. The whole thing is powered by a power bank. The one important thing missing is an HDMI display to show the video feed from the remote laptop's camera. That may have been due to time constraints, but this feature should not be too difficult to add as a future upgrade. It's important for both sides to be able to see each other.

The software is built around the WebRTC protocol, with the WebRTC extension for UV4L doing most of the heavy lifting. The UV4L Streaming Server not only provides its own built-in set of web applications and services, but also embeds a general-purpose web server on another port, allowing the user to run and deploy their own custom web apps. This allowed [Esther Rietmann]'s team to build a basic but functional front-end that transmits control data from the remote interface to the robot. On the robot, a Python control script runs as a system service to drive the motors and camera servo.
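The control side could be as simple as a little service that maps incoming commands from the web front-end to GPIO states, something like this speculative sketch – not the team's actual script; the pin numbers, port, and command set are invented for illustration:

```python
import socket

import RPi.GPIO as GPIO

# Placeholder BCM pins for the two H-bridge direction inputs per motor
PINS = {"left_fwd": 5, "left_rev": 6, "right_fwd": 13, "right_rev": 19}

GPIO.setmode(GPIO.BCM)
for pin in PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

# Which H-bridge inputs each command should energize
COMMANDS = {
    "forward": ("left_fwd", "right_fwd"),
    "left":    ("right_fwd",),
    "right":   ("left_fwd",),
    "back":    ("left_rev", "right_rev"),
    "stop":    (),
}

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 9000))   # placeholder port
server.listen(1)

while True:
    conn, _ = server.accept()
    with conn:
        command = conn.recv(64).decode().strip()
        active = COMMANDS.get(command, ())        # unknown command = stop
        for name, pin in PINS.items():
            GPIO.output(pin, name in active)      # energize only the named pins
```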

The team also played with adding basic object, gesture, and action recognition features. This was done using PoseNet – a machine learning model that performs real-time human pose estimation in the browser using TensorFlow.js – allowing them to demonstrate some pose detection capability. This could be useful as a "follow me" feature for the robot.

Another missing feature, which most commercial telepresence robots have, is a sensor suite for collision avoidance, object detection, and awareness: microswitches, IR or ultrasonic detectors, time-of-flight cameras, or LiDARs. It would be relatively easy to add one or several such sensors to the robot.

If you'd like to build one for yourself, check out their code repository on GitHub and the videos below.


DIY Video Microscopy

Owning a microscope is great fun as a hobby in general, but for hackers it is a particularly useful instrument for assembly and inspection, now that we are building hardware with "grain of sand" sized components in our basements and garages. [voidnill] was given an Eduval 4 microscope by a well-meaning friend during a holiday trip. This model is pretty old, but it's a Carl Zeiss after all, made in Jena in the erstwhile GDR. Since a purely optical microscope was of limited use to him, [voidnill] set about digitizing it.

He settled on the Raspberry Pi route. The Pi and a hard disk were attached directly to the frame of the microscope, and a VGA display connected via a converter. Finally, the Pi camera was jury-rigged to one of the eyepieces using some foam. It's a quick and dirty hack, and not the best solution, but it works well for [voidnill] since he wanted to keep the original microscope intact.

The standard Pi camera has a wide-angle lens, designed to capture a large scene and converge it onto the small sensor area. Converting it to macro mode is possible, but requires a hack: the lens is removed, "flipped over", and fixed at a distance from the sensor – usually with the help of an extension tube. This allows the lens to image a very small area and focus it on the (relatively) large sensor. This hack is used in the "OpenFlexure" microscope project, which you can read about in the post we wrote earlier this year or at this updated link. If you want even higher magnification and image quality, OpenFlexure provides a design to mate the camera sensor directly to an RMS-threaded microscope objective. Since earlier this year, this open-source microscope project has made a lot of progress, and many folks around the world have successfully built their own versions. It offers a lot of customisation options, such as basic or high-resolution optics and manual or motorised stages, which makes it a great project to try out.
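For a rough feel of the numbers – a thin-lens back-of-the-envelope with illustrative figures, not anything measured from [voidnill]'s setup – the magnification of such an extension-tube arrangement follows from:

$$\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}, \qquad m = \frac{d_i}{d_o}$$

With the sensor at distance $d_i$ behind the lens and the subject at $d_o$ in front of it, a lens of roughly 3 mm focal length pushed out to $d_i = 9$ mm puts the subject at $d_o = 4.5$ mm and images at 2:1. More extension means more magnification, at the cost of working distance and light.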

If the OpenFlexure project proves to be an intimidating build, you can try something easier. Head over to PublicLab, where [partsandcrafts] shows you how to "Build a Basic Microscope with Raspberry Pi". It borrows from other open-source projects but keeps things simpler, making it much easier to build.

In the video embedded below, [voidnill] gives a brief overview (in German) of his quick hack. If you've got some microscope hacks, or have built one of your own, let us know in the comments section.


What The Scale? Mouse Teardown Throws Up A Few Surprises

[Eric Weinhoffer] and his colleagues did a great comparative teardown of the MX Master 3 and the MX Master 2S mice from Logitech. Teardowns are great fun and often end up teaching us a lot. Looking at the insides of a product can tell us a great deal about how to solve certain problems, or avoid pitfalls. Opening up two versions of the same product provides an even greater wealth of useful information on how product design evolves based on lessons learned from earlier versions. Logitech is no greenhorn when it comes to mouse design, so the MX Master 2S was already almost perfect. But looking at the MX Master 3 shows where the earlier version fell short of expectations and how it could be improved upon.

These mice have intelligent scroll wheels which can rotate in either "detent" or "freewheel" modes. Detent mode allows slower, precise scrolling, while freewheeling allows rapid scrolling. The two models have completely different, and interesting, methods of achieving these actions. The older version has a rubber-coated wheel and uses a motor which turns a cam, forcing a detent ball against the inside of the wheel for detent mode and releasing it for free mode. Once the rubber wears off, the mouse is pretty much headed for the dumpster. The new metal wheel does away with the rubber coating as well as the noisy, slow, and wear-prone motor assembly. The actuation is now done using a bi-stable electromagnet: a 25 V pulse magnetizes a coil which sits inside the wheel and pulls on little metal teeth on the inside rim of the wheel. This gives a noiseless detent feel without any physical contact. A second 25 V spike de-magnetizes the coil, allowing the scroll wheel to spin freely.

[Eric] points out several incremental changes in design which have resulted in improved ergonomics. He also uncovers a few nuggets of useful information. The use of interchangeable mold inserts helps make molds last longer while still offering the flexibility to make changes in the molded part. It's interesting to see special components being used to withstand vibration and high-G forces. Some of these insights can be useful for those moving from prototyping to production. There's one puzzling feature on the new PCB that [Eric] cannot figure out: a 15 mm scale screen-printed over the Bluetooth antenna. If you have an answer on its purpose, let us know in the comments below.

If you are left-handed (as are about 10% of us), you're out of luck with these right-handed mice, and might like to sign one of the several online petitions demanding lefty versions.

Python Script Sends Each Speaker Its Own Sound File

When it comes to audio, the number of speakers you want is usually governed by the number of tracks or channels your signal has: one for mono, two for stereo, four for quadraphonic, five or more for surround sound, and so on. But all of those speakers are essentially playing different tracks from a "single" audio signal. What if you wanted a single audio device to play eight different songs simultaneously, with each song piped to its own speaker? That's the job [Devon Bray] was tasked with by interdisciplinary artist [Sara Dittrich] for her "Giant Talking Ear" installation project. He built a device to play multiple sound files on multiple output devices using off-the-shelf hardware and software.

A hack like this could be useful in many applications beyond art installations. It could be used in an escape room, where you may want various audio streams to start in sync; as part of a DJ console, sending one stream to the speakers and another to the headphones; or in a game where you have to run around a room full of speakers in the right sequence, and at the right speed, to piece together a full sentence of clues.

His blog post lists links for the various pieces of hardware required, although all of it is pretty generic, and the GitHub repository hosts the code. At the heart of the project is the sounddevice library for Python. The documentation for the library is sparse, so [Bray]'s instructions are handy. His code lets you "take a directory with .wav files named in numeric order and play them over USB sound devices attached to the host computer over and over forever, looping all files once the longest one finishes". As a bonus, he shows how to load and play sound files automatically from an attached USB drive. This lets you swap out your playlist on the Raspberry Pi without having to use a keyboard/mouse, SSH, or RDP.
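The core trick is opening one output stream per USB sound device and feeding each one its own file, roughly like this minimal sketch – not [Devon Bray]'s exact code; the device indices and file names are placeholders, and this version loops each file independently instead of resynchronizing on the longest one. Run `python -m sounddevice` to list the devices on your own system:

```python
import threading

import sounddevice as sd
import soundfile as sf

def loop_on_device(wav_path, device_index):
    """Stream one wav file to one output device, looping forever."""
    data, samplerate = sf.read(wav_path, dtype="float32", always_2d=True)
    with sd.OutputStream(device=device_index,
                         samplerate=samplerate,
                         channels=data.shape[1]) as stream:
        while True:
            stream.write(data)

# Placeholder mapping of numerically named files to sound device indices
tracks = {"1.wav": 1, "2.wav": 2, "3.wav": 3}

for path, device in tracks.items():
    threading.Thread(target=loop_on_device,
                     args=(path, device), daemon=True).start()

threading.Event().wait()   # keep the main thread alive while audio plays
```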

Check the video after the break for a quick roundup of the project.


New Part Day: A Sensor Chip For 3D Color X-Ray Imaging

We all know CERN as that cool place where physicists play with massive, superconducting rings to smash atoms and subatomic particles together to uncover the secrets of matter in the Universe. To achieve this aim, they need to do a ton of research in other areas, such as the development of special particle detectors.

While such developments are essential to the core research needs of the Centre, they also lead to spinoff applications for the benefit of society at large. One such outcome has been the Medipix Collaborations – a family of read-out chips for particle imaging and detection that can count single photons, allowing X-rays and gamma rays to be converted to electrical signals. It may not be possible for us hackers to get our hands on these esoteric sensors, but the devices are pretty interesting and deserve a closer look. A Medipix sensor works like a camera, detecting and counting each individual particle hitting the pixels while its electronic shutter is open. This enables high-resolution, high-contrast images free of noise hits – making it unique for imaging applications.

Some months back, CERN announced the first 3D color X-ray of a human, made possible using the Medipix devices. The result is a high-resolution, 3D, color image showing not just living structures like bones, muscular tissues, and vessels, but metal objects too, like the wristwatch seen in the accompanying photograph. The Medipix sensors have been in development since the 1990s and are presently in their fourth generation. Each chip consists of a top semiconducting sensor array, made from gallium arsenide or cadmium telluride. The charge collected by each pixel is transported to the CMOS ASIC electronics via "bump bonds". The integration is vertical, with each sensing pixel connected via its bump bond to an analog section followed by a digital processing layer. Earlier versions were limited, by technology, in their ability to tile multiple sensors into larger matrices: they could be abutted on three sides only, with the fourth side used for on-chip peripheral logic and the wire-bond pads that permit electronic read-out. The latest Medipix4 Collaboration, still under development, eliminates this shortcoming. Through-silicon-via (TSV) technology provides the possibility of reading the chips through copper-filled holes that bring the signals from the front side of the chip to its rear. All communication with the pixel matrix flows through the rear of the chip – the peripheral logic and control elements are integrated inside the pixel matrix.

The analog front end consists of a pre-amplifier followed by a window discriminator with upper and lower threshold levels. The discriminator has four bits for threshold adjustment as well as polarity sensing, allowing the capture window to be set precisely. The rest of the digital electronics – multiplexers, shift registers, shutter and logic control – extracts the data.

Further development of the Medipix (Tech Brief, PDF) devices led to a separate version called Timepix (Tech Brief, PDF). These devices, besides counting photons, are capable of two additional modes. The first, "Time-Over-Threshold", provides rough analog information about the energy of the photon by counting clock pulses for as long as the signal stays above the discrimination level. The other mode, "Time of Arrival", measures the arrival time of the first particle to impinge on the pixel: the counters record the time between a trigger and the detection of a radiation quantum with energy above the discrimination level, enabling time-of-flight imaging applications.
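To make those modes concrete, here's a toy software model of a single pixel – purely illustrative; the real chip does all of this in per-pixel hardware, and the sampled waveform below is invented:

```python
def pixel_response(samples, threshold, trigger_index=0):
    """Toy model of one Timepix-style pixel processing a sampled analog signal.

    samples: pre-amplifier output sampled once per clock tick.
    Returns (hit count, time-over-threshold, time of arrival).
    """
    hits = 0        # photon-counting mode: rising edges above the threshold
    tot = 0         # Time-Over-Threshold: clock ticks spent above the threshold
    toa = None      # Time of Arrival: first tick above threshold, after trigger
    above = False
    for tick, level in enumerate(samples):
        if level > threshold:
            tot += 1
            if not above:                # a rising edge counts as one hit
                hits += 1
                if toa is None:
                    toa = tick - trigger_index
            above = True
        else:
            above = False
    return hits, tot, toa

# Two pulses; the second is taller, so it stays above threshold longer.
print(pixel_response([0, 3, 7, 4, 0, 0, 9, 9, 8, 2], threshold=5))  # (2, 4, 2)
```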

Besides medical imaging, the devices have applications in space, material analysis, education and, of course, high-energy physics. Hopefully, in a few years, hackers will be able to lay their hands on these interesting devices and get to know them better. In the meantime, the Medipix website has more details and datasheets if you would like to dig deeper. For an overview of the development of such single-photon-counting detectors, check out this presentation from CERN – "Single X-Ray Photon Counting Systems: Existing Systems, Systems Under Development And Future Trends" (PDF).