If you’ve done anything with modern lighting effects, you’ve probably heard of DMX, also known as DMX512. Ever wonder what’s really happening under the hood? If so, then you should have a look at [EEForEveryone’s] video on the topic, which you can see below.
At its core, DMX512 runs on RS-485, but adds protocol layers and features on top. The video walks through the OSI model to show how the system fits together.
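To make that layering concrete, here's a minimal Python sketch (ours, not from the video) of the data portion of a DMX512 frame: a start code followed by up to 512 channel slots. On the wire, this rides on RS-485 at 250 kbaud (8N2) and is preceded by a Break and Mark-After-Break, which are signaling conditions rather than data bytes, so a real transmitter generates those in hardware.

```python
# Minimal sketch of the slot bytes in a DMX512 frame. This covers only the
# data layer; the Break/Mark-After-Break and RS-485 signaling sit below it.

def build_dmx_frame(channels):
    """Return start code + up to 512 channel slots as raw bytes."""
    if len(channels) > 512:
        raise ValueError("DMX512 carries at most 512 channel slots")
    START_CODE = 0x00  # 0x00 marks standard dimmer/channel data
    return bytes([START_CODE]) + bytes(channels)

# Example: channel 1 at full, channel 2 at half intensity
frame = build_dmx_frame([255, 128])
print(frame.hex())  # -> 00ff80
```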
Even if you aren’t a giant history buff, you probably know that the French royal family had some difficulties in the late 1700s. The end of the story saw the King beheaded, and a bit later his wife, the famous Marie Antoinette, suffered the same fate. Marie wrote many letters to her confidant, and probable lover, Swedish count Axel von Fersen. Some of those letters have survived to the present day — sort of. An unknown person saw fit to blot out parts of the surviving letters with ink, rendering them illegible. Well, that is, until now, thanks to modern X-ray technology.
Anne Michelin of the French National Museum of Natural History and her colleagues were able to foil the censor, and they even have a theory as to the ink blots’ origin: von Fersen himself! The technique, published in the journal Science Advances, may enable the recovery of lost portions of other historical documents.
The OAK-D is an open-source, full-color depth-sensing camera with embedded AI capabilities, and there is now a crowdfunding campaign for a newer, lighter version called the OAK-D Lite. The new model does everything the previous one could do, combining machine vision with stereo depth sensing and the ability to run highly complex image processing tasks entirely on-board, freeing the host from the overhead involved.
The OAK-D Lite camera actually packs several elements into one package: a full-color 4K camera, two greyscale cameras for stereo depth sensing, and on-board AI machine vision processing with Intel’s Movidius Myriad X processor. Tying it all together is an open-source software platform called DepthAI that wraps the camera’s functions and capabilities into a unified whole.
The goal is to give embedded systems access to human-like visual perception in real time, which at its core means detecting things and identifying where they are in physical space. It does this with a combination of traditional machine vision functions (like edge detection and perspective correction), depth sensing, and the ability to plug in pre-trained convolutional neural network (CNN) models for complex tasks like object classification, pose estimation, or hand tracking.
So how is it used? Practically speaking, the OAK-D Lite is a USB device intended to be plugged into a host (running any OS), and the team has put a lot of work into making it as easy as possible. With the help of a downloadable application, the hardware can be up and running with examples in about half a minute. Integrating the device into other projects or products can be done in Python with the help of the DepthAI SDK, which provides functionality with minimal coding and configuration (and for more advanced users, there is also a full API for low-level access). Since the vision processing is all done on-board, even a Raspberry Pi Zero can be used effectively as a host.
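As a rough illustration of what that minimal coding looks like, here's a short sketch using the DepthAI Python API to stream color preview frames from the camera to the host. It reflects the depthai 2.x API, so exact node and method names may vary between releases.

```python
# Sketch: stream color preview frames from an OAK-D Lite via DepthAI.
# Requires `pip install depthai opencv-python` and a connected camera.
import cv2
import depthai as dai

pipeline = dai.Pipeline()

# Color camera node, emitting small preview frames
cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)
cam.setInterleaved(False)

# XLink output node streams the frames back to the host over USB
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("preview")
cam.preview.link(xout.input)

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue(name="preview", maxSize=4, blocking=False)
    while True:
        frame = queue.get().getCvFrame()
        cv2.imshow("OAK-D Lite", frame)
        if cv2.waitKey(1) == ord("q"):
            break
```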
There’s one more thing that improves the ease-of-use situation, and that’s the fact that support for the OAK-D Lite (as well as the previous OAK-D) has been added to a software suite called the Cortic Edge Platform (CEP). CEP is a block-based visual coding system that runs on a Raspberry Pi, and is aimed at anyone who wants to rapidly prototype with AI tools in a primarily visual interface, providing yet another way to glue a project together.
The MOS Technology 6502 CPU was a popular part in its day. In various modified versions, it powered everything from the Commodore 64 to the Nintendo Entertainment System, and showed up in a million other applications too. A popular variant is the 65C02, and [Jürgen] decided to whip up a pin-compatible FPGA version that runs at a blazing 100 MHz.
The CPU core was borrowed from work by [Arlet Ottens] and extended with 65C02 functionality by [Ed Spittles] and [David Banks]. [Jürgen] then packaged that core in a Spartan-6 FPGA and placed it on a small PCB the size of the original 65C02’s 40-pin dual inline package.
The FPGA is set up to access the external CPU bus with timing matched to the clock of the host machine. Internally, however, the CPU core runs at 100 MHz. It copies the host machine’s RAM and ROM into its own internal 64 kilobytes of RAM, minus the areas the host uses for memory-mapped I/O. The CPU then runs at the full 100 MHz except when it needs to talk to those I/O addresses.
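As a thought experiment, the dispatch logic might look something like this deliberately simplified Python model (our illustration, not [Jürgen]’s design, which also has to handle details like keeping the host’s video memory in sync):

```python
# Toy model of the shadow-RAM scheme: accesses hit the fast internal copy
# unless the address falls in a host I/O window, in which case the access is
# forwarded to the slow host bus. The I/O window below is a made-up example.
IO_WINDOWS = [(0xC000, 0xC0FF)]      # hypothetical memory-mapped I/O range

shadow_ram = bytearray(64 * 1024)    # internal 64 KB copy of host RAM/ROM

def is_io(addr):
    return any(lo <= addr <= hi for lo, hi in IO_WINDOWS)

def read(addr, host_read):
    if is_io(addr):
        return host_read(addr)       # slow cycle, timed to the host clock
    return shadow_ram[addr]          # fast cycle at the internal 100 MHz

def write(addr, value, host_write):
    if is_io(addr):
        host_write(addr, value)      # I/O writes must reach the real hardware
    else:
        shadow_ram[addr] = value
```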
This scheme lets the chip accelerate plenty of tasks without completely flipping out when used with older hardware that can’t run at anywhere near 100 MHz. The pin-compatible design has been tested successfully in an Apple II and a Commodore 8032, as well as a variety of vintage chess computers.
We’ve seen the opposite before too, with a real 6502 paired with an FPGA acting as the rest of the computer. If you’ve got any cutting-edge 6502 hacks of your own (not a misprint!), let us know!
In just a few short weeks, we’ll all be meeting up online for the second Hackaday Remoticon on November 19th and 20th. This is the year of the Talk, and who better than you, dear reader, to give one? Good news — we’ve extended the deadline for proposals to Wednesday, October 20th. We’ve all got at least one or two subjects that we could happily bloviate about for hours, be it hardware, software, skill sets, or the stuff that inspires you to stop dreaming and start doing. Why not share your wit and wisdom with the rest of the community?
So, what are you waiting for? Submit your talk proposal today! We’re not looking for you to pack the whole talk into the description box, but we would like to know what your talk will be about, and why it’s relevant to an audience of geeks, hackers, and engineers. Talks are typically 30 minutes in length, but we can likely accommodate shorter or longer talks if needed.
Everyone has something worth sharing, and the fact is, we are always looking for first-time speakers to showcase. Just share the things you’re doing that you’re passionate about, and you’re bound to have a great talk that generates excitement all around.
So grab some go-juice and start brainstorming the outline of your talk — give us enough information that we’ll be thirsty for more. Have you got terrible stage fright? Then encourage your outgoing hackerspace buddy to give one and cheer from the sidelines. Although we would rather see all of you in person, moving this conference online comes with the flexibility to hear from hackers all over the world, and no one has to leave home.
Typically, electroplating is used to deposit a coating of one metal onto another, often for corrosion protection or to reduce wear. However, other conductive materials can be electroplated too, as demonstrated by [Michał Baran].
Finer details are sparse, but [Michał’s] images show the basic concept behind producing a composite metal hand sculpture. The initial steps involve 3D printing a perforated plastic shell of a hand and stuffing it with carbon fibers. It appears some kind of plastic balls are also used to help fill out the space inside the hand mold.
Then, it’s a simple matter of dunking the plastic hand in what appears to be a copper electroplating solution, with the carbon fiber hooked up as one of the electrodes. The copper deposited during electroplating knits the carbon fibers together. The mold can then be cut away and the plastic filling removed, leaving behind a composite metal hand.
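For anyone curious how much metal such a dunk actually deposits, Faraday’s law of electrolysis gives a quick estimate. The current and plating time below are made-up example values, not figures from [Michał]’s build.

```python
# Back-of-the-envelope estimate of copper deposited during electroplating,
# using Faraday's law of electrolysis. Current and time are illustrative.
FARADAY = 96485          # C/mol, Faraday constant
M_CU = 63.55             # g/mol, molar mass of copper
N_ELECTRONS = 2          # Cu2+ + 2e- -> Cu

current_a = 1.0          # plating current in amperes (assumption)
time_s = 4 * 3600        # four hours of plating (assumption)

charge = current_a * time_s                       # total charge in coulombs
mass_g = charge * M_CU / (N_ELECTRONS * FARADAY)  # deposited copper mass

print(f"Deposited about {mass_g:.1f} g of copper")  # ~4.7 g for these values
```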
[Michał] has experimented with other forms too, but the basic concept is that these conductive fibers can readily be stuffed into molds or held in various shapes, and then coated with metal. We’d love a closer look at the results to gauge the strength and usefulness of the material.
Similar techniques can be used to strengthen 3D printed parts, too. If you’ve got your own ideas on how to best use this technique, sound off below. If you’ve already done it, though, do drop us a line!
Burning fossil fuels releases carbon dioxide into the atmosphere. While most attempts to reduce greenhouse-gas emissions focus on reducing the amount of CO2 output, there are other alternatives. Carbon capture and sequestration has been an active area of research for quite some time. Being able to take carbon dioxide straight out of the air and store it in a stable manner would allow us to reduce levels in the atmosphere and could make a big difference when it comes to climate change.
A recent project by a company called Climeworks claims to be doing just that, and is even running it as a subscription service. The company has just opened its latest plant in Iceland, and hopes to literally suck greenhouse gases out of the air. Today, we’ll examine whether or not this technology is a viable tool in the fight against climate change.