Designing Sci-Fi Hack Chat

Join us on Wednesday, October 9 at noon Pacific for the Designing Sci-Fi Hack Chat with Seth Molson!

We all know the feeling of watching a movie set in a galaxy far, far away and seeing something that makes us say, “That’s not realistic at all!” The irony of expecting realism from human actors dressed up as alien creatures prancing across a fantasy landscape is lost on us as we willingly suspend disbelief to get into the story. Seeing something in that artificial world that looks cheesy or goofy can shock us out of that state and ruin the compact between filmmaker and audience.

Perhaps nowhere do things get riskier for filmmakers than in the design of the user interfaces of sci-fi and fantasy sets. Be they the control panels of spacecraft, consoles for futuristic computers, or even simply the screens of phones that are yet to be, sci-fi UI design can make or break a movie. The job of designing a sci-fi set used to be as simple as wiring up strings of blinkenlights; now the job falls to a dedicated artist called a Playback Designer, who must create something that looks fresh and new but still plausible to audiences used to interacting with technology that earlier generations couldn’t have dreamed of.

Seth Molson is one such artist, and you’ve probably seen some of his work on shows such as Timeless, Stargate Universe, and recently Netflix’s reboot of Lost in Space. When tasked with delivering control panels for spacecraft and systems that exist only in a writer’s mind, Seth sits down with graphics and animation software to make it happen.

Join us as we take a look behind the scenes with Seth and find out exactly what it’s like to be a Playback Designer. Find out what Seth’s toolchain looks like, how he interacts with the rest of the production design crew to come up with a consistent and believable look and feel for interfaces, and what it’s like to design futures that only exist — for now — in someone’s imagination.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, October 9 at 12:00 PM Pacific time. If time zones have got you down, we have a handy time zone converter.

Head over to the Hack Chat group on Hackaday.io to get in on the conversation. You don’t have to wait until Wednesday; join whenever you want and see what the community is talking about.

 

Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer

The first LED digital wristwatches hit the market in the 1970s. They required a button push to turn the display on, prompting one comedian to quip that giving one to a one-armed man would be in poor taste. While the UIs of watches and other wearables have improved since then, smartphones still present some usability challenges. Some of the touch screen gestures needed to operate a phone, like pinching, are nigh impossible when one-handing the phone, and woe unto those with stubby thumbs when trying to take a selfie.

You’d think that the fleet of sensors and the raw computing power on board would afford better ways to control phones. And you’d be right, if the modular mechanical input widgets described in a paper from Columbia University catch on. Dubbed “Vidgets” by [Chang Xiao] et al, the passive mechanical widgets are designed to create characteristic acceleration profiles on a phone’s inertial measurement unit (IMU) when actuated. Vidgets take various forms, from push buttons to scroll wheels, each of a similar size and shape and designed to dock into one of eight positions on the back of a 3D-printed phone case. Once trained, the recognition algorithm watches for the acceleration signature caused by actuating a Vidget and sends commands to the phone to mimic the corresponding gestures. The video below demonstrates a couple of use cases, of which the virtual saxophone is our favorite.
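The paper’s recognizer is more sophisticated than this, but the core idea of matching a short window of accelerometer samples against per-widget templates can be sketched in a few lines of Python. Everything here, from the window length and threshold to the correlation-based matcher and all the names, is an illustrative assumption rather than the authors’ implementation:

```python
# Illustrative sketch: classify which mechanical widget was actuated by
# matching a short IMU window against templates recorded during training.
import numpy as np

WINDOW = 64        # accelerometer samples per candidate event (assumed)
THRESHOLD = 0.8    # minimum correlation to accept a match (assumed)

def normalize(sig):
    """Zero-mean, unit-energy, so matching ignores DC offset and strength."""
    sig = sig - sig.mean(axis=0)
    norm = np.linalg.norm(sig)
    return sig / norm if norm > 0 else sig

class VidgetMatcher:
    def __init__(self):
        self.templates = {}  # widget label -> normalized template window

    def train(self, label, recordings):
        """Average several recorded actuations, shape (n, WINDOW, 3)."""
        self.templates[label] = normalize(np.mean(recordings, axis=0))

    def classify(self, window):
        """Return the best-matching widget for one (WINDOW, 3) IMU window."""
        window = normalize(window)
        best_label, best_score = None, THRESHOLD
        for label, template in self.templates.items():
            # Correlation of two unit-energy signals: 1.0 means same shape.
            score = float(np.sum(window * template))
            if score > best_score:
                best_label, best_score = label, score
        return best_label  # None means "no widget recognized"
```

In a real pipeline an event detector would first spot a jolt on the IMU, grab the surrounding window, and hand it to something like classify(); the phone would then inject the touch gesture mapped to that widget.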

This is really clever stuff, and ventures deep into “Why didn’t I think of that?” territory. Need to get ahead of the curve on IMUs to capitalize on what they can do? You could start with [Al Williams]’ primer on micro-electromechanical systems, or MEMS.

Continue reading “Add Scroll Wheels And Buttons To Smartphones With 3D-Printed Widgets Read By Accelerometer”

Prusa Printer Gets An LCD-ectomy, Gains A VFD

What’s wrong with the OEM display on a Prusa i3 MK3? Nothing at all. Then why replace the stock LCD with a vacuum fluorescent display? Because VFDs are much, much cooler than LCDs.

(Pedantic Editor’s Note: VFDs actually run a little warm.)

At least that’s the reasoning [Scott M. Baker] applied to his Prusa upgrade. We have to admit to a certain affection for retro displays of every glowing kind: Nixies, Numitrons, and even the lowly neon pilot light all have a certain charm of their own, but by our reckoning the VFD leads the pack. [Scott] chose a high-quality Noritake 4×20 alphanumeric display module for his upgrade, thriftily watching eBay for bargains rather than buying from the big distributors. The module has a pinout that’s compatible with the OEM LCD, so replacing it is a snap. [Scott] simplified that further by buying a replacement Prusa control board with no display, to which he soldered the Noritake module. Back inside the bezel, the VFD is bright and crisp. We like the blue-green digits against the Prusa red-orange, but [Scott] has an orange filter on order for the VFD to make everything monochromatic. That’ll be a nice look too.

A completely non-functional hack, to be sure, but sometimes aesthetics need attention too. And it’s possible that a display swap would help the colorblind use the UI better, like this oscilloscope mod aims to do.

Continue reading “Prusa Printer Gets An LCD-ectomy, Gains A VFD”

Finger Recognition On The Kinect

The Kinect is awesome, but if you want to detect anything at a finer resolution than a person’s limbs, you’re out of luck. [Chris McCormick] over at CogniMem has a great solution to this problem: use a neural network on a chip to recognize fingers with hardware already connected to your Xbox.

The build uses the very cool CogniMem CM1K neural network on a chip, trained to tell the difference between counting from one to four on a single hand, as well as an ‘a-okay’ sign, a Vulcan greeting, and rocking out at a [Dio] concert. As [Chris] shows us in the video, these finger gestures can be used to draw on a screen and move objects using only an open palm and closed fist, not too far off from the Minority Report and Iron Man UIs.
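The CM1K stores up to 1,024 pattern-matching “neurons” and evaluates them all in parallel, using Manhattan distance in its RBF and nearest-neighbor modes. A rough software analog of that idea, with the Kinect feature extraction hand-waved away and every name and parameter invented for illustration, might look like this:

```python
# Rough software analog of a restricted-Coulomb-energy (RCE) classifier,
# the scheme the CM1K implements in silicon: each neuron holds a prototype
# feature vector, a label, and an influence radius; an unknown vector takes
# the label of the nearest prototype whose field it falls inside. Turning a
# Kinect depth image into a fixed-length hand feature vector is assumed.
import numpy as np

class RCEClassifier:
    def __init__(self, max_radius=5000.0):
        self.max_radius = max_radius  # widest allowed field (assumed value)
        self.neurons = []             # (prototype, label, radius) triples

    def learn(self, vector, label):
        """Commit a neuron, shrinking its field to exclude rival classes."""
        radius = self.max_radius
        for proto, proto_label, _ in self.neurons:
            if proto_label != label:
                # Manhattan (L1) distance, as the CM1K uses internally
                radius = min(radius, float(np.abs(vector - proto).sum()))
        self.neurons.append((vector.astype(float), label, radius))

    def classify(self, vector):
        """Return the label of the closest firing neuron, or None."""
        best_dist, best_label = None, None
        for proto, label, radius in self.neurons:
            d = float(np.abs(vector - proto).sum())
            if d <= radius and (best_dist is None or d < best_dist):
                best_dist, best_label = d, label
        return best_label
```

Train it on a handful of examples per gesture (counting one through four, the okay sign, the Vulcan salute) and classify each frame’s hand features; the chip’s trick is running that comparison across every committed neuron at once, which is what makes it fast enough for live video.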

If you’d like to duplicate this build, we found the CM1K neural network chip available here for a bit more than we’d be willing to pay. A neural net on a chip is an exceedingly cool device, but it looks like this build will have to wait for the Kinect 2 to make it down to the consumer and hobbyist arena.

You can check out the videos of Kinect finger recognition in action after the break with World of Goo and Google Maps.

Continue reading “Finger Recognition On The Kinect”