OAK-D Depth Sensing AI Camera Gets Smaller And Lighter

The OAK-D is an open-source, full-color depth sensing camera with embedded AI capabilities, and there is now a crowdfunding campaign for a newer, lighter version called the OAK-D Lite. The new model does everything the previous one could do, combining machine vision with stereo depth sensing and the ability to run highly complex image processing tasks entirely on-board, freeing the host from that overhead.

An example of real-time feature tracking, now in 3D thanks to integrated depth sensing.

The OAK-D Lite camera is actually several elements together in one package: a full-color 4K camera, two greyscale cameras for stereo depth sensing, and onboard AI machine vision processing with Intel’s Movidius Myriad X processor. Tying it all together is an open-source software platform called DepthAI that wraps the camera’s functions and capabilities together into a unified whole.

The goal is to give embedded systems access to human-like visual perception in real-time, which at its core means detecting things, and identifying where they are in physical space. It does this with a combination of traditional machine vision functions (like edge detection and perspective correction), depth sensing, and the ability to plug in pre-trained convolutional neural network (CNN) models for complex tasks like object classification, pose estimation, or hand tracking in real-time.

So how is it used? Practically speaking, the OAK-D Lite is a USB device intended to be plugged into a host (running any OS), and the team has put a lot of work into making it as easy as possible. With the help of a downloadable application, the hardware can be up and running with examples in about half a minute. Integrating the device into other projects or products can be done in Python with the help of the DepthAI SDK, which provides functionality with minimal coding and configuration (and for more advanced users, there is also a full API for low-level access). Since the vision processing is all done on-board, even a Raspberry Pi Zero can be used effectively as a host.
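The DepthAI workflow revolves around declaring a pipeline of camera and neural-network nodes that then runs entirely on the device, with the host just draining result queues. As a rough sketch of what that can look like from the host side (the model blob path is a placeholder, node names can vary between depthai releases, and an OAK-D must actually be attached over USB for this to run):

```python
# Hedged sketch of a minimal DepthAI pipeline: color camera feeding an
# on-device object detector, results streamed back to the host.
import depthai as dai

pipeline = dai.Pipeline()

# Full-color camera node, previewed at the detector's input size
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)

# A pre-trained MobileNet-SSD detector runs on the Myriad X itself;
# the blob path below is a placeholder for a compiled model file
nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath("mobilenet-ssd.blob")
cam.preview.link(nn.input)

# Stream detection results back to the host
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("detections")
    while True:
        for det in q.get().detections:
            print(det.label, det.confidence)
```

Since capture, inference, and even the detection parsing all happen on the camera, the host's job reduces to reading a queue, which is why something as modest as a Raspberry Pi Zero can keep up.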

There’s one more thing that improves the ease-of-use situation, and that’s the fact that support for the OAK-D Lite (as well as the previous OAK-D) has been added to a software suite called the Cortic Edge Platform (CEP). CEP is a block-based visual coding system that runs on a Raspberry Pi, and is aimed at anyone who wants to rapidly prototype with AI tools in a primarily visual interface, providing yet another way to glue a project together.

Earlier this year we saw the OAK-D used in a system to visually identify weeds and estimate biomass in agriculture, and it’s exciting to see a new model being released. If you’re interested, the OAK-D Lite is available at a considerable discount during the Kickstarter campaign.

Here’s A 100 MHz Pin-Compatible 6502 Replacement

The MOS Technology 6502 CPU was a popular part in its day. In various modified versions, it powered everything from the Commodore 64 to the Nintendo Entertainment System, and showed up in a million other applications too. A popular variant is the 65C02, and [Jürgen] decided to whip up a pin-compatible FPGA version that runs at a blazing 100 MHz.

The CPU core was borrowed from work by [Arlet Ottens] and extended with 65C02 functionality by [Ed Spittles] and [David Banks]. [Jürgen] then packaged that core in a Spartan-6 FPGA and placed it on a small PCB the size of the original 65C02’s 40-pin dual inline package.

The FPGA is set up to access the external CPU bus with timing matched to the clock of the host machine. Internally, however, the CPU core runs at 100 MHz. It copies RAM and ROM from the host machine into its own internal 64 KB of RAM, minus the areas the host uses for memory-mapped I/O. The CPU then runs at the full 100 MHz except when it needs to talk to those I/O addresses.

This allows the chip to accelerate plenty of tasks without completely flipping out when used with older hardware that can’t run at anywhere near 100 MHz. The pin-compatible design has been tested successfully in an Apple II and a Commodore 8032, as well as a variety of vintage chess computers.

We’ve seen the opposite before too, with a real 6502 paired with an FPGA acting as the rest of the computer. If you’ve got any cutting-edge 6502 hacks of your own (not a misprint!), let us know!

[Thanks to David Palmer for the tip]

Hackaday Remoticon: Call For Proposals Extended To October 20th

In just a few short weeks, we’ll all be meeting up online for the second Hackaday Remoticon on November 19th and 20th. This is the year of the Talk, and who better than you, dear reader, to give one? Good news — we’ve extended the deadline for proposals to Wednesday, October 20th. We’ve all got at least one or two subjects that we could happily bloviate about for hours, be it hardware, software, skill sets, or the stuff that inspires you to stop dreaming and start doing. Why not share your wit and wisdom with the rest of the community?

So, what are you waiting for? Submit your talk proposal today! We’re not looking for you to pack the whole talk into the description box, but we would like to know what your talk will be about, and why it’s relevant to an audience of geeks, hackers, and engineers. Talks are typically 30 minutes in length, but we can likely accommodate shorter or longer talks if needed.

Everyone has something worth sharing, and the fact is, we are always looking for first-time speakers to showcase. Just share the things you’re doing that you’re passionate about, and you’re bound to have a great talk that generates excitement all around.

So grab some go-juice and start brainstorming the outline of your talk — give us enough information that we’ll be thirsty for more. Have you got terrible stage fright? Then encourage your outgoing hackerspace buddy to give one and cheer from the sidelines. Although we would rather see all of you in person, moving this conference online comes with the flexibility to hear from hackers all over the world, and no one has to leave home.

Electroplating Carbon Fibers Can Have Interesting Results

Typically, electroplating is used to put coatings of one metal upon another, often for corrosion protection or to reduce wear. However, other conductive materials can be electroplated too, as demonstrated by [Michał Baran].

Finer details are sparse, but [Michał’s] images show the basic concept behind producing a composite metal hand sculpture. The initial steps involve 3D printing a perforated plastic shell of a hand and stuffing it with carbon fibers. It appears some kind of plastic balls are also used to help fill out the space inside the hand mold.

Then it’s a simple matter of dunking the plastic hand in what appears to be a copper electroplating solution, with the carbon fiber hooked up as one of the electrodes. The fibers are knitted together by the copper deposited during the electroplating process. The mold can then be cut away and the plastic filling removed, leaving nothing but a composite metal hand.

[Michał] has experimented with other forms too, but the basic concept is that these conductive fibers can readily be stuffed into molds or held in various shapes, and then coated with metal. We’d love to see the results more closely to determine the strength and usefulness of the material.

Similar techniques can be used to strengthen 3D printed parts, too. If you’ve got your own ideas on how to best use this technique, sound off below. If you’ve already done it, though, do drop us a line!

[Thanks to Krzysztof for the tip]

Carbon Sequestration As A Service Doesn’t Quite Add Up

Burning fossil fuels releases carbon dioxide into the atmosphere. While most attempts to reduce greenhouse-gas emissions focus on reducing the amount of CO2 output, there are other alternatives. Carbon capture and sequestration has been an active area of research for quite some time. Being able to take carbon dioxide straight out of the air and store it in a stable manner would allow us to reduce levels in the atmosphere and could make a big difference when it comes to climate change.

A recent project by a company called Climeworks claims to be doing just that, and is running it as a subscription service. The company has just opened up its latest plant in Iceland, and hopes to literally suck greenhouse gases out of the air. Today, we’ll examine whether or not this technology is a viable tool in the fight against climate change.

Continue reading “Carbon Sequestration As A Service Doesn’t Quite Add Up”

Hackaday Podcast 140: Aqua Battery, IBM Cheese Cutter, Waiting For USB-C, And Digging ADCs

Hackaday editors Elliot Williams and Mike Szczys chew the fat over the coolest of hacks. It’s hard to beat two fascinating old-tech demonstrators; one is a mechanical IBM computer for accurate cheese apportionment, the other an Analog-to-Digital Converter (ADC) built from logic chips. We gawk at two very different uses of propeller-based vehicles; one a flying walker, the other a ground-effect coaster. Big news shared at the top of the show is that Keith Thorne of LIGO is going to present a keynote at Hackaday Remoticon. And we wrap the episode talking about brighter skies from a glut of satellites and what the world would look like if one charging cable truly ruled all smartphones.

Take a look at the links below if you want to follow along, and as always, tell us what you think about this episode in the comments!

Direct download (55 MB)

Continue reading “Hackaday Podcast 140: Aqua Battery, IBM Cheese Cutter, Waiting For USB-C, And Digging ADCs”

3D Print A Custom T-Shirt Design, Step-by-Step

Want to make a t-shirt with a custom design printed on it? It’s possible to use a 3D printer, and Prusa Research have a well-documented blog post and video detailing two different ways to use 3D printing to create colorful t-shirt designs. One method uses a thin 3D print as an iron-on, the other prints directly onto the fabric. It turns out that a very thin PLA print makes a dandy iron-on that can survive a few washes before peeling, but printing flexible filament directly onto the fabric — while more complicated — yields a much more permanent result. Not sure how to turn a graphic into a 3D printable model in the first place? No problem, they cover that as well.

Making an iron-on is fairly straightforward, and the method can be adapted to just about any printer type. One simply secures a sheet of baking paper (better known as parchment paper in North America) to the print bed with some binder clips, then applies glue stick so that the print can adhere. A one- or two-layer thick 3D print will stick to the sheet, which can then be laid print-side down onto a t-shirt and transferred to the fabric by ironing it at maximum temperature. PLA seems to work best for iron-ons, as it preserves details better. The results look good, and the method is fairly simple.

Direct printing to the fabric with flexible filament can yield much better (and more permanent) results, but the process is more involved, requiring a 3D-printed raised bed adapter for a Prusa printer and fiddling with quite a few print settings. But the results speak for themselves: printed designs look sharp and won’t come loose even after multiple washings. So be certain to have a few old shirts around for practice, because mistakes can’t be undone.

That 3D printers can be used to embed designs directly onto fabric is something many have known for years, but it’s always nice to see a process not just demonstrated as a concept, but documented as a step-by-step workflow. Everything, from turning a graphic into a 3D model to printing on a t-shirt with both methods, is demonstrated in the short video embedded below, so give it a watch.

Continue reading “3D Print A Custom T-Shirt Design, Step-by-Step”