Hackaday Remoticon: Our 2020 Conference Is Packed With Workshops And We’re Calling For Proposals

We’re proud to announce the Hackaday Remoticon, taking place everywhere November 6th – 8th, 2020. It’s a weekend packed with workshops about hardware creation, held virtually for all to enjoy.

Update: Tickets are now available for the 2020 Remoticon!

But we can’t do it without you. We need you to host a workshop on that skill, technique, or special know-how that you acquired through hard work over too many hours to count. Send in your workshop proposal now!

What is a Remoticon?

The Hackaday Remoticon achieves something that we just couldn’t do at the Hackaday Superconference: host more workshops that involve more people. Anyone who’s been to Supercon over the past six years can tell you it’s space-limited and, although we do our best to host a handful of workshops each day, those available seats are always in high demand.

We’re sad that we can’t get together in person for Supercon this year, but now we have an opportunity to host more workshops, engaging more live instructors and participants, because they will be held virtually. This also means that we can make recordings of them available so that more people can learn from the experience. This is something that we tried way back during the first Supercon with Mike Ossmann’s RF Circuit Design workshop, and 140,000 people have watched that video. (By the way, that link is worth clicking just to see Joe Kim’s excellent art.) Continue reading “Hackaday Remoticon: Our 2020 Conference Is Packed With Workshops And We’re Calling For Proposals”

Don’t Let Endianness Flip You Around

Most of the processor architectures we come into contact with today are little-endian systems, meaning that they store and address multi-byte values with the least-significant byte (LSB) first. Big-endian architectures, including the Motorola 68000 and PowerPC, were more common in the past, but today one can often just assume that the binary data read from files and via communication protocols is in little-endian order. This will often work fine.

The problem comes with, for example, image formats that use big-endian integers, such as PNG and many TIFF files. When dealing directly with protocols in so-called ‘network order’, one also deals with big-endian data. Trying to use these formats and protocol data verbatim on a little-endian system will obviously not work.
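To make the problem and the fix concrete, here is a quick sketch in Python (our illustration, not code from the article) that decodes the same four big-endian bytes both the wrong way and the right way:

```python
import struct

# Four bytes as they might arrive from a PNG header or a network packet:
# the most significant byte comes first (big-endian).
raw = bytes([0x00, 0x00, 0x01, 0x40])  # the value 320

# Treating them as little-endian, the way a naive read on most modern
# machines would, produces a nonsensical number...
wrong = int.from_bytes(raw, "little")   # 0x40010000 == 1073807360

# ...while explicitly decoding them as big-endian gives the real value.
right = int.from_bytes(raw, "big")      # 320

# struct does the same job: '>' (or '!', for "network order") forces
# big-endian, while '<' forces little-endian.
(value,) = struct.unpack(">I", raw)

assert right == value == 320
print(wrong, right)
```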

Fortunately, it is very easy to swap the endianness of any data which we handle. Continue reading “Don’t Let Endianness Flip You Around”

[NileRed] Makes Superconductors

We always enjoy [NileRed]’s videos. His latest shows how he made some relatively high-temperature superconducting ceramic. He followed what appeared to be some really good instructions from the Internet, but found that some things in the paper didn’t make sense. You can watch the video below.

The superconductor was YBCO, sometimes known as 123 because of the ratio of its components. Turns out that most of the materials were available online, except for one exotic chemical that he had to buy from a more conventional source.

Continue reading “[NileRed] Makes Superconductors”

Facial Detection With Pi + MATLAB

[Monica] wanted to try a bit of facial detection with her Raspberry Pi, and she found some pretty handy packages in MATLAB to help her do just that. The packages are based on the Viola-Jones algorithm, the first object detection framework to offer real-time face detection.

She had to download MATLAB’s Raspbian image to allow the Pi to interpret MATLAB commands over a custom server. The setup is mostly painless, and she does a good job walking you through it on her project page.

With that done, she can control the Pi from MATLAB: configure the camera, toggle GPIO, and so on. The real fun comes with the facial detection program. In addition to opening up a live video feed of the Pi camera, the program outputs pixel data. [Monica] was mostly just testing the stock capabilities, but she wants to try detecting other objects next. We’ll see what cool modifications she’s able to come up with.

If MATLAB doesn’t quite fit your taste, we have a slew of facial detection projects on Hackaday.
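For a rough idea of what the same Viola-Jones cascade approach looks like outside of MATLAB, here is a minimal Python/OpenCV sketch. It is not part of [Monica]’s project, and the bundled Haar cascade file and camera index 0 are assumptions about your setup:

```python
import cv2

# Haar cascade trained for frontal faces; opencv-python ships a copy,
# but the exact path is an assumption about your installation.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # first attached camera, adjust as needed

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Viola-Jones cascades operate on grayscale images.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Draw a box around each detected face.
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```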

IoT Pinball Puts Oktoberfest Fun On Tap

We don’t really miss going out to bars all that much, unless you’re talking about the one downtown with all the pinball machines. Don’t get us wrong — pinball emulators have gotten crazy good, and you can find exact digital replicas of most machines to play on your phone or whatever. But it just doesn’t compare to the thrill of playing a real cabinet.

Don’t despair, because for the next couple of weeks, you can queue up to play on a real Oktoberfest pinball machine that’s sitting in Espoo, Finland. The controls are hooked up to a Raspberry Pi 4 through a custom HAT, along with a camera pointed at the playfield and another focused on the backglass screen. The game development/video streaming company Surrogate is hosting a tournament over the internet, and will be giving prizes to the top ten high rollers.

We usually have to wait until the holiday season to come across these remote-reality gaming opportunities. Having played it several times now, we recommend spamming the flippers until you get a feel for the lag. Also, just holding the flippers up while the ball is in the upper half of the playfield will catch a lot of balls that you might otherwise lose due to flipper lag, and sometimes they end up back in front of the launcher to shoot again. After the break, check out a brief but amusing video of setting up the cameras and Pi that includes a taste of the Oktoberfest music.

The tournament runs until the end of August, which should be enough time for somebody to set up CV and a keyboard to play this automatically. Need inspiration? Here’s an open-source pinball machine that can play itself.

Continue reading “IoT Pinball Puts Oktoberfest Fun On Tap”

Optimizing GIF Playback For Microcontrollers

Despite being cooked up by CompuServe back in the late 1980s, GIFs have seen a resurgence on the modern internet, mostly because they’re fun. However, all our small embedded systems are getting color screens these days, and they’d love to join in the party. [Larry Bank] has whipped up a solution for just that reason, letting embedded systems play back short animated GIFs with limited resources.

[Larry] does a great job of explaining how the GIF format works, using LZW compression and variable-length codes. He talks about how the design of the format presents challenges, particularly when working with microcontrollers. Despite this, the final code works well and handles most animated GIFs of the right dimensions and construction. 24K of RAM is required, and image width is limited to 320 pixels. Images can be loaded from flash, memory, or SD cards, and he notes that the best performance comes from a microcontroller with a fast SPI bus for pushing pixels to the screen quickly.
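As a small taste of the format itself (this is our illustration, not [Larry]’s code), the first thing any decoder reads is the fixed-size header and logical screen descriptor, where the width and height are stored as 16-bit little-endian values. A quick Python sketch, using a hypothetical filename, shows where a 320-pixel width limit would be checked:

```python
import struct

def gif_dimensions(path):
    """Read only the GIF signature and logical screen descriptor."""
    with open(path, "rb") as f:
        header = f.read(13)

    if header[:6] not in (b"GIF87a", b"GIF89a"):
        raise ValueError("not a GIF file")

    # Width and height are 16-bit little-endian integers that follow
    # the six-byte signature.
    width, height = struct.unpack("<HH", header[6:10])
    return width, height

if __name__ == "__main__":
    w, h = gif_dimensions("animation.gif")  # hypothetical file
    print(f"{w} x {h}")
    if w > 320:
        print("too wide for the 320-pixel limit mentioned above")
```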

It’s a great piece of software that promises to add a lot of charm, or silliness, to microcontroller projects. It also simplifies the use of animations, which can now be designed on computers rather than by using onboard graphics libraries. GIF really is the format that never seems to die; we’ve featured cameras dedicated to the form before. Video after the break.

Continue reading “Optimizing GIF Playback For Microcontrollers”

William English, Computer Mouse Co-Creator, Has Passed

We are saddened to report that William English, co-inventor of the computer mouse, died July 26 in San Rafael, California. He was 91 years old.

Bill at the controls at Stanford Research Institute. Image via MSN

Every piece of technology starts with a vision, a vague notion of how a thing could or should be. The computer mouse is no different. In fact, the mouse was built to be an integral part of the future of personal computing — a shift away from punch cards and mystery toward a more accessible and user-friendly system of windowed data display, hyperlinks, videoconferencing, and more. And all of it would be commanded by a dot on the screen moving in sync with the operator’s intent, using a piece of hardware controlled by the hand.

The stuff of science fiction becomes fact anytime someone has the means to make it so. Oftentimes the means include another human being, an intellectual complement who can conjure the same rough vision and fill in the gaps. For Douglas Engelbart’s vision of the now-ubiquitous computer mouse, that person was William English.

William English was born January 27, 1929 in Lexington, Kentucky. His father was an electrical engineer and William followed this same path after graduating from a ranch-focused boarding school in Arizona. After a stint in the Navy, he took a position at Stanford Research Institute in California, where he met Douglas Engelbart.

The first computer mouse, built by William English in the 1960s. Image via Wikipedia

Engelbart showed William his notes and drawings, and English built the input device that Engelbart envisioned — one that could select characters and words on the screen and revolutionize text editing. The X/Y Position Indicator, soon and ever after called the mouse: a sort of rough-yet-sleek pinewood derby car of an input device headed into the future of personal computing.

William’s mouse was utilitarian: a wooden block with two perpendicular wheels on the bottom and a pair of potentiometers inside to track the wheels’ X and Y positions. The analog inputs were converted to digital and represented on the screen. The first mouse had a single button, and the cord was designed to run out the bottom, not the top.

Continue reading “William English, Computer Mouse Co-Creator, Has Passed”