Adafruit AVRProg Grows UPDI Interface Support

Making a small number of things with an embedded application is pretty straightforward: you usually just plug a programmer or debugger dongle (such as an AVRISP2) into your board with an appropriate adaptor cable, load your code into whatever IDE is appropriate for the device, and hit the program button. But when you scale up a bit to hundreds or thousands of units, this way of working just won’t cut it. Add in any functional or defect-oriented testing you need, and you’re going to need a custom programming rig.

Adafruit have a fair bit of experience with building embedded boards and dealing with the appropriate testing and programming, and now they’ve updated their AVR Programming library to support the latest devices, which have moved to the UPDI (Unified Program and Debug Interface) programming interface. UPDI is a single-wire, bidirectional, asynchronous serial interface that enables programming and debugging of embedded applications on a slew of the new AVR-branded devices from Microchip. An example would be the AVR128DAxx, which this scribe has been tinkering with lately because it is cheap, has excellent capacitive touch support, and is available in a prototype-friendly 28-pin SOIC package, making it easy peasy to solder.
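
Under the hood, UPDI rides on an ordinary UART frame: one start bit, eight data bits, even parity, and two stop bits, with every instruction preceded by a 0x55 SYNC character, which is why a spare hardware serial port and a resistor are enough to start a conversation. As a very rough illustration of the idea (this is not the Adafruit library’s code; the pin assignment, baud rate, break timing, and the assumption that the STATUSA control/status register holds the UPDI revision should all be checked against the datasheet), a bare-bones “is anybody there?” probe from an Arduino-class board might look something like this, with TX tied to the target’s UPDI pin through a small series resistor and RX tied directly to the same wire:

    // Minimal UPDI "is anybody there?" probe. Illustration only, not the
    // Adafruit_AVRProg implementation. Assumes a board with a spare hardware
    // UART (Serial1) wired SerialUPDI-style: TX -> series resistor -> UPDI pin,
    // RX tied to the same wire, grounds common.

    const uint8_t UPDI_SYNC       = 0x55;  // SYNCH character sent before every instruction
    const uint8_t UPDI_LDCS       = 0x80;  // Load Control/Status, low nibble = register address
    const uint8_t UPDI_CS_STATUSA = 0x00;  // assumption: STATUSA holds the UPDI revision

    void updiBreak() {
      // A long break (~25 ms covers the slowest case) resets the UPDI state machine.
      Serial1.end();
      pinMode(1, OUTPUT);                  // assumption: pin 1 is Serial1 TX on this board
      digitalWrite(1, LOW);
      delay(25);
      digitalWrite(1, HIGH);
      Serial1.begin(115200, SERIAL_8E2);   // UPDI frames: 8 data bits, even parity, 2 stop bits
    }

    int updiReadStatusA() {
      while (Serial1.available()) Serial1.read();       // drop any stale bytes
      Serial1.write(UPDI_SYNC);
      Serial1.write(UPDI_LDCS | UPDI_CS_STATUSA);
      Serial1.flush();

      uint8_t seen = 0;
      uint32_t start = millis();
      while (millis() - start < 20) {                   // small timeout for the reply
        if (Serial1.available()) {
          int b = Serial1.read();
          if (++seen > 2) return b;                     // bytes 1 and 2 are our own echo
        }
      }
      return -1;                                        // no answer: check wiring and timing
    }

    void setup() {
      Serial.begin(115200);
      updiBreak();
      int status = updiReadStatusA();
      if (status >= 0) {
        Serial.print("UPDI answered, STATUSA = 0x");
        Serial.println(status, HEX);
      } else {
        Serial.println("No UPDI response");
      }
    }

    void loop() {}

A real programmer layers the enable sequence, error handling, NVM commands, and fuse and flash writes on top of this, which is exactly the drudgery the Adafruit library takes care of for you.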

The library is intended for use with the Arduino platform, so it should run on a vast array of hardware without any special requirements, which makes building a custom programming jig out of hardware lots of us have lying around much less of a hassle.

Adafruit provide a few application examples in the project’s GitHub repository to get you going, such as this ATtiny817 example that wipes the flash memory, sets the appropriate fuses, and drops in a bootloader.

The UPDI code was adapted from [brandanlane]’s PortaProg, which runs on the TTGO T-Display ESP32 board from Chinese outfit LilyGo and is also worth checking out.

A little while ago we saw how the AVR Multitool, the AVRGPP, learned to speak UPDI, and since we’re on the subject of programming interfaces, it’s possible to get the cheap-as-chips USBasp to speak TPI as well.

Continue reading “Adafruit AVRProg Grows UPDI Interface Support”

Interconnected CPU nodes forming a system-wide network

With Luos Rapid Embedded Deployment Is Simplified

Those of us tasked with developing firmware for embedded systems have quite a few hurdles to jump through compared to those writing for desktop or mobile platforms. Solved problems such as code reuse or portability are simply harder. It was with considerable interest that we learnt of another approach to hardware abstraction, called Luos, which describes itself as micro-services for embedded systems.

This open source project enables the deployment of distributed architectures composed of collaborating micro-services. By containerizing applications and hardware drivers, the interfaces to the various components are hidden behind a consistent API. It doesn’t even matter where a resource is located: multiple services may be running on the same microcontroller or on separate ones, yet they communicate in the same way.
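
The micro-service flavour is easiest to see in code. The sketch below follows the general shape of Luos’s C API as we understand it (a service is created with a type, an alias, and a message callback, then the engine is pumped from the main loop); the header name, type constant, and exact signatures here are assumptions that may differ between releases, so treat it as a sketch and check the Luos documentation for the real thing:

    #include "luos_engine.h"   // assumption: engine header name may differ by release

    // A tiny "blinker" service: any other service, on this node or another,
    // can message it by alias without knowing which MCU it lives on.
    static void Blinker_MsgHandler(service_t *service, const msg_t *msg)
    {
        // React to incoming commands here (toggle a pin, forward to a driver, etc.)
    }

    void Blinker_Init(void)
    {
        revision_t revision = {1, 0, 0};
        // The type, alias, and callback are what the rest of the network sees;
        // the routing layer hides where the service physically runs.
        Luos_CreateService(Blinker_MsgHandler, STATE_TYPE, "blinker", revision);
    }

    int main(void)
    {
        Luos_Init();
        Blinker_Init();
        while (1)
        {
            Luos_Loop();   // the engine dispatches messages cooperatively
        }
    }

The appeal is that swapping the microcontroller underneath, or splitting services across two boards, shouldn’t require changing this application code at all.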

By following hardware and software design rules, it’s possible to create an architecture of cooperating computing units that’s completely agnostic of the actual hardware. Microcontrollers talk at the hardware level with a pair of bidirectional signals, so the hardware cost is very low. It even integrates with ROS, so making robots is even easier.

Luos architecture

By integrating a special block referred to as a Gate, it is possible to connect to the architecture in real time from a host computer via USB, WiFi, or a serial port, and stream data out, feed data in, or deploy new software. The host software stack is based around Python, running under Jupyter Notebook, which we absolutely love.

Current compatibility covers many STM32 and ATSAMD21 micros, so chances are good you can use it with whatever you have lying around, and more platforms are promised for the future.

Now yes, we’re aware of CMSIS and the idea of Hardware Abstraction Layers (HALs) shipped as part of platform-specific software kits; this is nothing new. But different platforms work quite differently, and porting code from one to another just because you can no longer get your preferred microcontroller is a real drag we could all do without, so why not clone the GitHub repository and have a look for yourselves?

Continue reading “With Luos Rapid Embedded Deployment Is Simplified”

Embedded Rust Hack Chat

Join us on Wednesday, May 12 at noon Pacific for the Embedded Rust Hack Chat with James Munns!

Programming languages, like fashion, are very much a matter of personal taste. Professional developers often don’t have much say in which language they’ll use for a given project, either for legacy or team reasons, but if they did have a choice, they’d probably choose the language that works best with the way they think. Some languages just “fit” different brains better than others, and when everything is in sync between language and developer, code just seems to flow effortlessly through the keyboard and onto the screen.

One language that consistently scores at the top of developers’ “most loved” lists is Rust. For a language that started as a personal project and has only existed for a little more than a decade, that’s really saying something. The emphasis Rust puts on safety and performance probably has a lot to do with that. And thanks to its safe concurrency, its memory safety, and its interoperability with C and other languages, Rust has made considerable inroads with the embedded development community.

To learn more about Rust in embedded systems, James Munns will stop by the Hack Chat. James is an embedded systems engineer with a history of working on software for a wide range of systems, including safety-critical avionics and rapidly prototyped IoT systems. He’s a founding member of the Rust Embedded Working Group, as well as a founder of Ferrous Systems, a consultancy focused on systems development in Rust with a specialty in embedded systems. James also used to write for Hackaday, so he must be a pretty cool guy. So swing by the Hack Chat and find out where Rust might be able to help you out with your next embedded project.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, May 12 at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.
Continue reading “Embedded Rust Hack Chat”

DOOM On A Bootloader Is The Ultimate Cheat Code

Porting DOOM to run on hardware never meant to run it is a tradition as old as time. Ports to embedded devices, ancient computers, virtual computers, and antique video game consoles are all classic hacks, but what DOOM ports have been waiting for is something with universal applicability, something that doesn’t need a bespoke solution for each piece of hardware. Something like DOOM running within a bootloader.

The bootloader that [Ahmad] works with is called Barebox, and it’s focused on embedded systems, often those running Linux. This is the perfect environment for direct hardware access, since the bootloader doubles as a bare-metal hardware bring-up toolkit. Now that DOOM runs on this bootloader, it can effectively run anywhere from embedded devices to laptops with minimal work. Running it in a bootloader takes away a lot of the hard work that would normally need to be done during a port, although it may still need some tweaking for specific hardware that isn’t otherwise supported.

For those already running Barebox, the bareDOOM code can be found on [Ahmad]’s GitHub page. For those not running Barebox, it does have a number of benefits compared to other bootloaders, even apart from its new ability to play classic FPS games. For those who prefer a more custom DOOM setup, though, we are always fans of DOOM running within an NES cartridge.

Photo: AntonioMDA, CC BY-SA 4.0 via Wikimedia Commons

FreeRTOS

Real-Time OS Basics: Picking The Right RTOS When You Need One

When do you need to use a real-time operating system (RTOS) for an embedded project? What does it bring to the table, and what are the costs? Fortunately there are strict technical definitions, which can also help one figure out whether an RTOS is the right choice for a project.

The “real-time” part of the name covers the basic premise of an RTOS: the guarantee that certain types of operations will complete within a predefined, deterministic time span. Within “real time” we find distinct categories: hard, firm, and soft real-time, with progressively less severe penalties for missing a deadline. As an example of a hard real-time scenario, imagine a system where the embedded controller has to respond to incoming sensor data within a specific timespan. If missing that deadline breaks downstream components of the system, figuratively or literally, the deadline is hard.

In comparison, soft real-time covers the kind of operation where it would be great if the controller responded within that timespan, but nothing terrible happens if it takes a bit longer. Some operating systems are capable of hard real-time whereas others are not, and this is mostly a factor of their fundamental design, especially the scheduler.
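
The scheduler is where that determinism comes from in practice. In a preemptive, priority-based RTOS such as FreeRTOS, for instance, a time-critical task can be released on a fixed period and will preempt anything less important the instant its release time arrives. A minimal sketch of that pattern (hardware and clock setup omitted, and the task body is only a placeholder) looks like this:

    #include "FreeRTOS.h"
    #include "task.h"

    // Poll a sensor every 10 ms, come what may. Because this task runs at a
    // high priority, the scheduler preempts lower-priority work to run it on time.
    static void vSensorTask(void *pvParameters)
    {
        (void) pvParameters;
        const TickType_t xPeriod = pdMS_TO_TICKS(10);
        TickType_t xLastWakeTime = xTaskGetTickCount();

        for (;;)
        {
            // read_sensor_and_react();                 // placeholder for the time-critical work
            vTaskDelayUntil(&xLastWakeTime, xPeriod);   // absolute, drift-free periodic release
        }
    }

    int main(void)
    {
        xTaskCreate(vSensorTask, "sensor", configMINIMAL_STACK_SIZE,
                    NULL, configMAX_PRIORITIES - 1, NULL);

        vTaskStartScheduler();   // never returns once the scheduler is running
        for (;;) {}
    }

Whether the deadline is actually met still comes down to the worst-case execution time of the work inside the loop and to interrupt latency; the scheduler only guarantees that nothing less important gets in the way.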

In this article we’ll take a look at a variety of operating systems, to see where they fit into these definitions, and when you’d want to use them in a project.

Continue reading “Real-Time OS Basics: Picking The Right RTOS When You Need One”

Pi Compute Module Is Love-child Of Raspberry And Arduino

The Raspberry Pi Compute Module is a powerful piece of hardware, especially for the price. With it, you get more IO than a normal Pi, plus the ability to design hardware around it that’s specifically tailored to your needs rather than to general-purpose consumers. However, this comes at the cost of needing a way to interface with it, since the Compute Module doesn’t have the normal IO pins or ports. [Timon] has come up with a handy development board for the module, called the Piunora, which solves a lot of these prototyping issues.

The development board expands the Compute Module to the familiar Arduino-like form factor, complete with IO headers, USB ports, and HDMI output. It doesn’t stop there, though: it has an M.2 connector, some built-in LEDs, a camera connector, and a few other features. It also opens up possibilities that would be difficult or impossible with a standard Pi 4, such as running the Pi as a USB gadget rather than as a host device, which simplifies certain types of development and is [Timon]’s intended use.

As a development board, this project has a lot of potential for the niche uses of the Compute Module when compared to the standard Raspberry Pi. For embedded applications it’s much easier to deploy, with increased development costs as the tradeoff. If you’re still unsure what to do with the Compute Module 4, we have some reading for you. And [Timon]’s previous project is a great springboard.

Vizy The AI Camera Aims To Ease Machine Vision

Cameras are getting smarter and more capable than ever, able to run embedded machine vision algorithms and pull off tricks far beyond what something like a serial camera and a microcontroller board could manage, and the upcoming Vizy aims to be smarter and easier to use still. Vizy is the work of Charmed Labs, and this isn’t their first foray into accessible machine vision: they’re the same folks behind the Pixy and Pixy 2 cameras. Vizy’s main goal is to make object detection and classification easy, with thoughtful hardware features and a browser-based interface.

Vizy can identify common birds with “Birdfeeder”, one of the several built-in applications that uses local processing only.

The usual way to do machine vision is to get a USB camera and run something like OpenCV on a desktop machine to handle the processing. But Vizy leverages a Raspberry Pi 4 to provide a tightly-integrated unit in a small package with a variety of ready-to-run applications. For example, the “Birdfeeder” application comes ready to take snapshots of and identify common species of bird, while also identifying party-crashers like squirrels.
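
For contrast, that “usual way” is typically a short OpenCV program on a PC: grab frames from a USB camera, push each one through whatever detection model you fancy, and show or log the result. A minimal capture loop (in C++, with the actual processing left as a placeholder) looks something like this:

    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);                  // open the first USB camera
        if (!cap.isOpened()) return 1;

        cv::Mat frame;
        while (cap.read(frame))                   // grab frames until the stream ends
        {
            // ... run detection / classification on `frame` here ...
            cv::imshow("preview", frame);
            if (cv::waitKey(1) == 27) break;      // ESC quits
        }
        return 0;
    }

Vizy’s pitch is that the camera, the Pi 4 doing that processing, and the surrounding I/O all live in one tightly-integrated unit with a browser front end, so there’s no desktop rig to babysit.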

The demonstration video on their page shows off using the built-in high-current I/O header to control a sprinkler, repelling non-bird intruders with a splash of water while uploading pictures and video clips. The hardware design also looks well thought out; not only is there a safe shutdown and low-power mode for the Raspberry Pi-based hardware, but the lens can be swapped and the camera unit itself even contains an electrically-switched IR filter.

Vizy has a Kickstarter campaign planned, but like many others, Charmed Labs is still adjusting to the changes the COVID-19 pandemic has brought. You can sign up to be notified when Vizy launches; we know we’ll be keen for a closer look once it does. Easier machine vision is always a good thing, because it helps free people to focus on clever ideas like machine vision-based tool alignment.