While we’d love to have you all join us in Pasadena, the next best thing is to connect up to the festivities through the magic of the Internet. As always, the main stage talks will be streamed live to our YouTube channel, while the talks taking place in the DesignLab will be recorded and posted afterwards.
Though it’s not quite as immersive as being in the alleyway and listening to the dogs bark (if you know, you know), you can also join the #supercon-chat channel in the official Hackaday Discord server if you want to virtually rub shoulders with some of our favorite people in the world.
The basic principles of a motion picture film camera should be well understood by most readers — after all, it’s been well over a hundred years since the Lumière brothers wowed 19th-century Paris with their first films. But making one yourself is another matter entirely, as they are surprisingly complex, high-precision devices. This hasn’t stopped [Henry Kidman] from giving it a go, though, and what makes his camera all the more remarkable is that it’s 3D printed.
The problem facing a 16mm movie camera designer lies in precisely advancing the film by one frame at the correct rate while filming, something done in the past with a small metal claw that grabs each successive sprocket hole. His design eschews that for a sprocket driven by a stepper motor under Arduino control. His rotary shutter is driven by another stepper motor, and with that he has the basis of a good camera.
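The video doesn’t spell out the firmware, but the transport’s core job (advance exactly one frame, hold the film still for exposure, repeat 24 times a second) is simple enough to sketch. Here’s a minimal Arduino-style example; the pin numbers and steps-per-frame figure are placeholder assumptions, not values from [Henry]’s build:

```c
// Hypothetical 16 mm film transport sketch -- illustrative only.
// Assumes a STEP/DIR stepper driver (e.g. an A4988) turning a sprocket
// geared so that STEPS_PER_FRAME advances the film exactly one frame.

const int STEP_PIN = 2;                // placeholder pin
const int DIR_PIN  = 3;                // placeholder pin
const int STEPS_PER_FRAME = 50;        // placeholder: depends on sprocket and gearing
const unsigned long FRAME_US = 41667;  // one frame period at 24 fps (1/24 s)

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  digitalWrite(DIR_PIN, HIGH);         // film only ever moves forward
}

void loop() {
  unsigned long start = micros();

  // Pull the film down one frame while the rotary shutter covers the gate.
  for (int i = 0; i < STEPS_PER_FRAME; i++) {
    digitalWrite(STEP_PIN, HIGH);
    delayMicroseconds(400);
    digitalWrite(STEP_PIN, LOW);
    delayMicroseconds(400);
  }

  // Hold the film stationary for the rest of the frame period (exposure).
  while (micros() - start < FRAME_US) { }
}
```

The tricky part in a real camera is keeping this loop phased with the rotary shutter so the film only moves while the gate is dark, which is presumably why [Henry] drives the shutter from a second stepper rather than gearing it to the transport.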
The tests show promise, but he runs into a stability problem: as it turns out, it’s difficult to print a 16mm sprocket in plastic without it warping. He solves this by aligning the frames in post-processing. After fixing a range of smaller problems, though, he has a camera that delivers very good picture quality, and that makes us envious.
There was a bit of a kerfuffle this week with the news that an airliner had been hit by space junk. The plane, a United Airlines 737, was operating at 36,000 feet on a flight between Denver and Los Angeles when the right windscreen was completely shattered by the impact, peppering the arm of one pilot with bits of glass. Luckily, the heavily reinforced laminated glass stayed intact, but the flight immediately diverted to Salt Lake City and landed safely with no further injuries. The “space junk” report apparently got started by the captain, who reported that they saw what hit them and that “it looked like space debris.”
We were a little skeptical of this initial assessment, mainly because the pilots and everyone aboard the flight were still alive, which we’d assume would be spectacularly untrue had the plane been hit by anything beyond the smallest bit of space junk. As it turns out, our suspicions were justified when Silicon Valley startup WindBorne Systems admitted that one of its high-altitude balloons hit the flight. The company, which uses HABs to gather weather data for paying customers, seems to have complied with all the pertinent regulations, like filing a NOTAM, so why the collision happened is a bit of a mystery.
First person view (FPV) quadcopter drones have become increasingly capable over the years, as well as much smaller. The popular 65 mm format, as measured from hub to hub, is often considered to be about the smallest you can make an FPV drone without serious compromises. Which is exactly why [Hoarder Sam] decided to make a smaller version that can fit inside a Pringles can, based on the electronics used in the popular Air65 quadcopter from BetaFPV.
The 22 mm FPV drone with camera installed and looking all cute. (Credit: Hoarder Sam)
The basic concept for this design is actually based on an older compact FPV drone design called the ‘bone drone’, so called for having two overlapping propellers on each end of the frame, thus creating a bone-like shape. The total hub-to-hub size of the converted Air65 drone ends up at a cool 22 mm, merely requiring a lot of fiddly assembly before the first test flights can commence. Which raises the question of just how cursed this design is when you actually try to fly with it.
Obviously the standard BetaFPV firmware wasn’t going to fly, so the next step was to modify many parameters using the Betaflight Configurator software, which unsurprisingly took a few tries. After this, the fully loaded drone with camera and battery pack, coming in at a whopping 25 grams, turns out to actually be very capable. Surprisingly, it flies not unlike an Air65 and has a similar flight time, losing only about 30 seconds of the typical three minutes.
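The video doesn’t list the exact changes, but reconfiguration like this typically happens in the Betaflight Configurator’s CLI tab. Purely as an illustration of the sort of commands involved (these values are our guesses, not [Sam]’s actual settings):

```
# Illustrative only -- not the settings from this build.
# Remap motor outputs after physically rearranging the motors:
set motor_output_reordering = 3,2,1,0
# Tell the mixer the props spin opposite to a stock layout:
set yaw_motors_reversed = ON
save
```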
With propellers sticking out at the top and bottom – and no propeller guards – it’s obviously a bit of a pain to launch and land. But considering what the donor Air65 went through to get to this stage, it’s honestly quite impressive that this extreme modification seems to have altered little besides its dimensions.
At 5:20 PM on November 9, 1965, the Tuesday rush hour was in full swing outside the studios of WABC on Manhattan’s Upper West Side. The drive-time DJ was Big Dan Ingram, who had just dropped the needle on Jonathan King’s “Everyone’s Gone to the Moon.” To Dan’s trained ear, something was off about the sound, as if the turntable speed was drifting: sometimes running at the usual speed, sometimes running slow. But being a pro, he carried on with his show, injecting practiced patter between ad reads and Top 40 songs, cracking a few jokes about the sound quality along the way.
Within a few minutes, with the studio cart machines now suffering a similar fate and the lights in the studio flickering, it became obvious that something was wrong. Big Dan and the rest of New York City were about to learn that they were on the tail end of a cascading wave of power outages that started minutes before at Niagara Falls before sweeping south and east. The warbling turntable and cartridge machines were just a leading indicator of what was to come, their synchronous motors keeping time with the ever-widening gyrations in power line frequency as grid operators scattered across six states and one Canadian province fought to keep the lights on.
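To put a number on that warble (our own back-of-the-envelope figure, not one from contemporary reports): a synchronous motor’s shaft speed is locked to line frequency as N = 120f/P, with N in RPM, f the line frequency in hertz, and P the number of motor poles. A four-pole turntable motor that spins at 1,800 RPM on a healthy 60 Hz feed thus drops to 1,740 RPM if the grid sags to 58 Hz. That 3.3% slowdown amounts to over half a semitone of pitch, plenty for a radio pro to notice.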
They would fail, of course, leaving 30 million people across 80,000 square miles (207,000 km²) plunged into darkness. The Great Northeast Blackout of 1965 was underway, and when it wrapped up a mere thirteen hours later, it left plenty of lessons about how to engineer a safe and reliable grid, lessons that still echo through the power engineering community 60 years later.
Back in 1966, a suitable toy for a geeky kid was a radio kit. You could find simple crystal radio sets or some more advanced ones. But some lucky kids got the Philips Electronic Engineer EE8 Kit on Christmas morning. [Anthony Francis-Jones] shows us how to build a two-transistor AM radio from one of these kits.
According to [The Radar Room], the kit wasn’t just an AM radio. It had multiple circuits to make (one at a time, of course), ranging from a code oscillator to a “wetness detector.”
The kit came with a breadboard and some overlays for the various circuits, along with the required components. It relied on springs, friction, and gravity to hold most of the components to the breadboard. A little wire is used, but mostly the components are connected to each other with their leads and spring terminals.
[Inkbox] briefly explains what the BIOS is, then covers how UEFI replaces it. He talks about the genesis of UEFI at Intel in the late ’90s. After Intel’s implementation of UEFI was made open source, it was picked up by the TianoCore community, which maintains tools such as the TianoCore EDK II reference implementation.
[Inkbox] explains that a UEFI implementation provides boot services and runtime services. Boot services include things such as memory management and loading and running other UEFI applications, while runtime services include things like system clock access and system reset. On top of these services, many more UEFI protocols are available.
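To make the split concrete, here’s a minimal EDK II-style UEFI application that touches both kinds of services: it allocates a buffer through the boot services table and reads the hardware clock through the runtime services table. This is our own illustrative sketch, not code from [Inkbox]’s video:

```c
#include <Uefi.h>
#include <Library/UefiLib.h>                      // Print()
#include <Library/UefiBootServicesTableLib.h>     // gBS (boot services)
#include <Library/UefiRuntimeServicesTableLib.h>  // gRT (runtime services)

EFI_STATUS
EFIAPI
UefiMain (
  IN EFI_HANDLE        ImageHandle,
  IN EFI_SYSTEM_TABLE  *SystemTable
  )
{
  EFI_STATUS  Status;
  VOID        *Buffer;
  EFI_TIME    Time;

  // Boot service: grab a small scratch buffer from the firmware heap.
  Status = gBS->AllocatePool (EfiBootServicesData, 64, &Buffer);
  if (EFI_ERROR (Status)) {
    return Status;
  }

  // Runtime service: read the hardware real-time clock.
  Status = gRT->GetTime (&Time, NULL);
  if (!EFI_ERROR (Status)) {
    Print (L"RTC: %04d-%02d-%02d %02d:%02d:%02d\n",
           Time.Year, Time.Month, Time.Day,
           Time.Hour, Time.Minute, Time.Second);
  }

  gBS->FreePool (Buffer);
  return EFI_SUCCESS;
}
```

Boot services like AllocatePool() vanish once the OS loader calls ExitBootServices(), while runtime services such as GetTime() remain callable by the running operating system, which is the whole point of the distinction.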
Using an Apple II home computer for digital photography may seem somewhat daft, considering it’s a purpose the machine was never designed for, yet that is exactly the goal [Colin Leroy-Mira] set himself, and it took some serious image decoder optimizations to get there. That said, it’s less crazy than one might assume at first glance, considering that the Apple II was manufactured until 1993, while the Apple QuickTake digital cameras that [Colin] wanted to use for his nefarious purposes saw their first release in 1994.
These QuickTake cameras feature an astounding image resolution of up to 640×480 in 24-bit color. Using the official QuickTake software on a Macintosh running System 7 through 9, photographs in the proprietary QTK format could be fetched for display and processing. Doing the same on an Apple II obviously requires a bit more work, not to mention adapting the images to the limitations of the 8-bit Apple II compared to the Motorola 68K and PowerPC-based Macs that the QuickTake was designed to be used with.
Targeting the typical ~1 MHz 6502 CPU in an Apple II, [Colin] based his initial decoder on the QTK decoder from dcraw. Many memory and buffer optimizations, an early conversion to monochrome, and various other tweaks later – including a rewrite in 6502 assembly for speed – the decoder as it stands today manages to decode and render a QTK image in about a minute, compared to well over an hour previously.
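That early monochrome conversion is a classic 8-bit trick: it collapses three bytes of color math per pixel into one, and it can be done entirely with shifts and adds, which matters on a CPU with no hardware multiply. Here’s a sketch in C of the general idea; it’s our illustration, not [Colin]’s actual code:

```c
#include <stdint.h>

/* Illustrative only: fold a 24-bit RGB pixel down to one 8-bit luma
 * value using nothing but shifts and adds, since the 6502 has no
 * hardware multiply instruction. The weights approximate the classic
 * 0.30R + 0.59G + 0.11B luma formula as (5R + 9G + 2B) / 16. */
static uint8_t rgb_to_luma(uint8_t r, uint8_t g, uint8_t b)
{
    uint16_t acc = ((uint16_t)r << 2) + r      /* 5R */
                 + ((uint16_t)g << 3) + g      /* 9G */
                 + ((uint16_t)b << 1);         /* 2B */
    return (uint8_t)(acc >> 4);                /* divide by 16 */
}
```

Doing this at the front of the pipeline means every later buffer and loop only touches a third of the data, which is plausibly a big chunk of the hour-to-minute speedup.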
Considering how anemic the Apple II is compared to even a budget Macintosh Classic II system, it’s amazing that displaying bitmap images works at all, though [Colin] reckons that more optimizations are possible.