Apple II Talks To 3D Printer With A Little Modern Help

Controlling most desktop 3D printers is as easy as sending them G-code commands over a serial connection. As you might expect, it takes a relatively quick machine to fire off the commands fast enough for a good-quality print. But what if you weren’t so picky? If speed isn’t a concern, what’s the practical limit on the type of computer you could use?

In an effort to answer that question, [Max Piantoni] set out to control his Ender 3 printer with an authentic Apple IIc. Things were made a bit easier by the fact that he really only wanted to use the printer as a 2D plotter, so he could ignore the third dimension in his code. All he needed to do was come up with a BASIC program that let him create some simple geometric artwork on the Apple and convert it into commands that could be sent out over the computer’s serial port.
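
The conversion itself doesn’t take much. [Max]’s program is written in BASIC on the Apple, but the general idea, sketched here in C#, boils down to emitting one movement command per line segment. The Z hop used to lift the pen is one common way to do it when a printer plays plotter, not necessarily how [Max] handles it:

```csharp
using System.Collections.Generic;
using System.Globalization;

// Minimal sketch: turn a 2D polyline into plotter-style G-code.
static class GcodePlotter
{
    public static IEnumerable<string> PolylineToGcode(IReadOnlyList<(float X, float Y)> pts)
    {
        var inv = CultureInfo.InvariantCulture;
        yield return "G21";    // millimeter units
        yield return "G90";    // absolute positioning
        yield return "G0 Z5";  // "pen up": raise the head clear of the paper

        // Travel to the first point with the pen raised, then drop it...
        yield return string.Format(inv, "G0 X{0:F2} Y{1:F2}", pts[0].X, pts[0].Y);
        yield return "G0 Z0";  // "pen down"

        // ...and draw each segment as one linear move.
        for (int i = 1; i < pts.Count; i++)
            yield return string.Format(inv, "G1 X{0:F2} Y{1:F2} F1500", pts[i].X, pts[i].Y);

        yield return "G0 Z5";  // pen up when the shape is finished
    }
}
```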

Unity controlling the Ender 3

Unfortunately, [Max] ran into something of a language barrier. While the Apple had no problem generating G-code the Ender’s controller would understand, the two machines couldn’t agree on a data rate. The 3D printer likes to zip along at 115,200 baud, while the Apple was plodding along at 300. Clearly, something would have to stand in as an interpreter.

The solution [Max] came up with certainly wouldn’t be our first choice, but there’s something to be said for working with what you know. He quickly whipped up a program in Unity on his MacBook that would accept incoming commands from the Apple II at 300 baud, build up a healthy buffer, and then send them off to the Ender 3. As you can see in the video after the break, this Mac-in-the-middle approach got these unlikely friends talking at last.
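
The heart of such a bridge is only a few lines. Here’s a minimal C# sketch of the idea, with made-up serial port names and a hypothetical end-of-program marker standing in for whatever [Max]’s actual protocol looks like:

```csharp
using System.Collections.Generic;
using System.IO.Ports;

// Sketch of the Mac-in-the-middle bridge: drink slowly from the Apple II,
// buffer everything, then replay it to the printer at full speed.
class SerialBridge
{
    public static void Run()
    {
        using var apple   = new SerialPort("/dev/tty.usbserial-apple", 300);    // placeholder name
        using var printer = new SerialPort("/dev/tty.usbserial-ender", 115200); // placeholder name
        apple.NewLine = printer.NewLine = "\n";
        apple.Open();
        printer.Open();

        // Phase 1: let the slow machine dictate the pace.
        var buffer = new Queue<string>();
        string line;
        while ((line = apple.ReadLine()) != "M2") // hypothetical end-of-program marker
            buffer.Enqueue(line);

        // Phase 2: drain the buffer, waiting for the firmware's "ok"
        // after each command so its small queue never overflows.
        while (buffer.Count > 0)
        {
            printer.WriteLine(buffer.Dequeue());
            while (printer.ReadLine().Trim() != "ok") { /* skip status chatter */ }
        }
    }
}
```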

We’re reminded of a project from a few years back that aimed to build a fully functional 3D printer with 1980s technology. It was to be controlled by a Commodore PET, which also struggled to communicate quickly enough with the printer’s electronics. Bringing a modern laptop into the mix is probably cheating a bit, but at least it shows the concept is sound.

Continue reading “Apple II Talks To 3D Printer With A Little Modern Help”

Ubuntu Update Hack Chat

Join us on Wednesday, July 22 at noon Pacific for the Ubuntu Update Hack Chat with Rhys Davies and Alan Pope!

Everyone has their favorite brands, covering everything from the clothes they wear to the cars they drive. We see brand loyalty informing all sorts of acquisition decisions, not only in regular consumer life but in technology, too. Brand decisions sort people into broad categories like Mac versus PC, or iPhone versus Android, and can result in spirited discussions of the relative merits of one choice over the others. It’s generally well-intentioned, even if it gets a bit personal sometimes.

Perhaps no choice is more personal in hacker circles than which Linux distribution to use. There are tons to choose from, each with its own features and particular pros and cons. Ubuntu has become a very popular choice for Linux aficionados, attracting more than a third of the market. Canonical is the company behind the Debian-based distro, providing editions that run on the desktop, on servers, and on a variety of IoT devices, as well as support and services for large-scale users.

To fill us in on what’s new in the world of Ubuntu, Canonical product manager Rhys Davies and developer advocate Alan Pope will stop by the Hack Chat this week. They’ll be ready to answer all your questions about the interesting stuff that’s going on with Ubuntu, including the recently announced Ubuntu Appliances: easy-to-install, low-maintenance images for Raspberry Pis and PCs, built for security and simplicity. We’ll also talk about snaps, desktops, and whatever else crops up.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, July 22 at 12:00 PM Pacific time. If time zones have you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about. Continue reading “Ubuntu Update Hack Chat”

Toilet Paper Chase And Indoor Cycling Race With Unity And Arduino

While we’re still far away from returning to a pre-Corona everyday life, people seem to have accepted that toilet paper will neither magically cease to exist, nor become our new global currency. But back at the height of the madness, like most of us, [Jelle Vermandere] found himself in front of empty shelves, and the solution seemed obvious to him: create a lifelike toilet-paper-chasing game in hopes of distracting the competition.

Using Unity, [Jelle] created a game world of an empty supermarket, with the goal of chasing after distribution tubes and collecting toilet paper packs into a virtual cart. Inspired by the Wii Wheel, he built a game controller imitating a shopping cart handle from what appears to be a sunshade pole, with an Arduino and accelerometer housed in a 3D-printed case. For an even more realistic feel, he added a sound sensor to the controller, and competing carts to the game, which can be pushed out of the way by simply yelling loud enough. You can witness all of this delightful absurdity in his build video after the break.
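
The Unity half of a controller like this doesn’t need much code. As a rough sketch, assuming the Arduino streams one comma-separated steering/volume pair per line (the real wire format isn’t documented here), it might look something like this (note that System.IO.Ports needs Unity’s .NET 4.x API compatibility level):

```csharp
using System.IO.Ports;
using UnityEngine;

// Unity side of the cart controller. Assumes the Arduino prints one
// "steer,volume" pair per line, e.g. "-0.42,0.87" (an invented format).
public class CartController : MonoBehaviour
{
    public float steerSpeed = 90f;    // degrees per second at full handle tilt
    public float yellThreshold = 0.8f;

    SerialPort port;

    void Start()
    {
        port = new SerialPort("/dev/tty.usbmodem-cart", 115200) { ReadTimeout = 10 };
        port.Open();
    }

    void Update()
    {
        string line;
        try { line = port.ReadLine(); } catch (System.TimeoutException) { return; }

        var parts = line.Split(',');
        if (parts.Length != 2 ||
            !float.TryParse(parts[0], out var steer) ||
            !float.TryParse(parts[1], out var volume)) return;

        // Tilting the handle steers the cart...
        transform.Rotate(0f, steer * steerSpeed * Time.deltaTime, 0f);

        // ...and a loud enough yell shoves the competing carts aside.
        if (volume > yellThreshold)
            BroadcastMessage("OnYell", SendMessageOptions.DontRequireReceiver);
    }

    void OnDestroy() => port?.Close();
}
```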

From racing shopping carts to racing bicycles

But that’s not all. With the toilet paper situation sorted out, [Jelle] found himself in a different dilemma: a cloud foiled his plans to go for a bicycle ride. In the same manner, he ended up building a bicycle racing game, once again with Unity and Arduino. From a 3D-scanned model of himself and his bicycle, to automatically generating tracks on the fly and teaching an AI to ride a bike, [Jelle] clearly doesn’t joke around while he’s joking around.

However, the best part about the game has to be the controller, which is his actual bicycle. Using a magnetic door sensor to detect the speed, and a potentiometer mounted to the handlebar with a quirky Lego construction, it’s at least on par with the shopping cart handle, but judge for yourself in another build video, also attached after the break. The only thing missing now is to level up the difficulty by powering the Arduino with the bicycle itself.
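
The speed math behind that door sensor is pleasantly simple: one reed-switch pulse per wheel revolution means speed is just circumference divided by the pulse interval. A quick sketch with guessed-at constants, since [Jelle]’s actual values aren’t published:

```csharp
// Speed and steering math on the game side, with guessed-at constants.
public static class BikeMath
{
    const float WheelCircumferenceM = 2.1f; // roughly a 700c road wheel

    // One reed-switch pulse per wheel revolution.
    public static float MetersPerSecond(float msBetweenPulses) =>
        msBetweenPulses <= 0f ? 0f : WheelCircumferenceM / (msBetweenPulses / 1000f);

    // Map the handlebar potentiometer's raw ADC value (0..1023,
    // assumed centered at 512) to a steering angle.
    public static float SteeringDegrees(int adc, float maxAngle = 45f) =>
        (adc - 512) / 512f * maxAngle;
}
```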

Continue reading “Toilet Paper Chase And Indoor Cycling Race With Unity And Arduino”

Automate Your Xbox

First the robots took our jobs, then they came for our video games. This dystopian future is brought to you by [Little French Kev], who designed this adorable 3D-printed robot arm to interface with an Xbox One controller joystick. He shows it off in the video after the break, controlling a ball-balancing physics demonstration written in Unity.

Hats off to him on the quality of the design. Two printed parts nestle the knob of the thumbstick from either side and are mated together with screws, firmly hugging the stick. Bearings at the joints ensure smooth action of the two servo motors that control the arm. The base of the robotic appendage is zip-tied to the controller itself.

The build targets experimentation with machine learning. Since the computer can control the arm via an Arduino, and the computer has access to metrics of what’s happening in the virtual environment, it’s a perfect setup for training a neural network. Are you thinking what we’re thinking? This is the beginning of hardware speed-running your favorite video games, like [SethBling] did for Super Mario World half a decade ago. It would be all the more impressive, since it automates the mechanical part of the controller rather than operating purely in the software realm. You’ll just need to do your own hack to implement button control.
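
The output stage of such a setup only has to turn a desired stick deflection into two servo angles. Here’s a hedged C# sketch of that step, with an invented one-line serial protocol between Unity and the Arduino:

```csharp
using System.IO.Ports;
using UnityEngine;

// Output stage: turn a desired thumbstick deflection (x, y in -1..1)
// into two servo angles for the Arduino. The "S<a>,<b>" line protocol
// is invented for illustration.
public class StickActuator : MonoBehaviour
{
    public float servoCenter = 90f; // degrees
    public float servoRange  = 25f; // mechanical throw per axis, a guess

    SerialPort port;

    void Start()
    {
        port = new SerialPort("/dev/tty.usbmodem-arm", 115200);
        port.Open();
    }

    // Called by whatever drives the demo: a human, a PID loop, or an
    // eventual neural network.
    public void SetStick(Vector2 deflection)
    {
        deflection = Vector2.ClampMagnitude(deflection, 1f);
        int a = Mathf.RoundToInt(servoCenter + deflection.x * servoRange);
        int b = Mathf.RoundToInt(servoCenter + deflection.y * servoRange);
        port.WriteLine($"S{a},{b}");
    }

    void OnDestroy() => port?.Close();
}
```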

Continue reading “Automate Your Xbox”

Tricking The Brain Into Seeing Boosted Contrast In Stereo Imagery

Last year a team of researchers published a paper detailing a method of boosting visual contrast and image quality in stereoscopic displays. The method is called Dichoptic Contrast Enhancement (DiCE) and works by showing each eye a slightly different version of an image, tricking the brain into fusing the two views together in a way that boosts perceived image quality. This only works on stereoscopic displays like VR headsets, but it’s computationally simple and easily implemented. This trick could be used to offset some of the limitations of the displays used in headsets, for example making them appear capable of deeper contrast levels than they can physically deliver. This is good, because higher contrast is generally perceived as more realistic and three-dimensional, both important factors in VR headsets and other stereoscopic displays.

Stereoscopic vision works by having the brain fuse together what both eyes see, a process called binocular fusion. The small differences between what each eye sees mostly convey a sense of depth, but DiCE uses some of the quirks of binocular fusion to trick the brain into perceiving enhanced contrast in the visuals. This perceived higher contrast in turn leads to a stronger sense of depth and overall image quality.

Example of DiCE-processed images, showing each eye a different dynamic contrast range. The result is greater perceived contrast and image quality when the brain fuses the two together.

To pull off this trick, DiCE displays a different contrast level to each eye in a way designed to encourage the brain to fuse them together in a positive way. In short, using a separate and different dynamic contrast range for each eye yields an overall greater perceived contrast range in the fused image. That’s simple in theory, but in practice there were a number of problems to solve. Chief among them was the fact that if the difference between what each eye sees is too great, the result is discomfort due to binocular rivalry. The hard scientific work behind DiCE came from experimentally determining the sweet spots, and pre-computing filters independent of viewer and content so that the method could be applied in real time for a consistent result.
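
To make the idea concrete, here’s a toy C# sketch of the concept. To be clear, these are not the published DiCE filters, which were fit to perceptual data; this version just bends each eye’s tone curve in opposite directions by a small, fusion-safe amount:

```csharp
using UnityEngine;

// Toy illustration of the dichoptic idea, NOT the published DiCE filters:
// bend each eye's tone curve in opposite directions, keeping the split
// small enough that the eyes still fuse instead of fighting (rivalry).
public static class DichopticContrast
{
    const float Split = 0.15f; // too large and binocular fusion breaks down

    public static float LeftEye(float luma)  => Mathf.Pow(luma, 1f - Split); // lifted curve
    public static float RightEye(float luma) => Mathf.Pow(luma, 1f + Split); // deepened curve

    // Crude proxy for the binocular percept; in reality fusion is a
    // perceptual process, which is exactly what DiCE exploits.
    public static float FusedApproximation(float luma) =>
        0.5f * (LeftEye(luma) + RightEye(luma));
}
```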

Things like this are a reminder that we experience the world only through the filter of our senses, and that our perception of reality has quirks, demonstrated by projects like this one and other “sensory fusion” edge cases like the Thermal Grill Illusion, which we saw used as the basis for a replica of the Pain Box from Dune.

A short video overview of the method is embedded below, and a PDF of the publication can be downloaded for further reading. Want a more hands-on approach? The team even made a DiCE plugin freely available on the Unity Asset Store.

Continue reading “Tricking The Brain Into Seeing Boosted Contrast In Stereo Imagery”

Millennium Falcon HID: Get Unity To Talk To Teensy

Here’s one that proves a hardware project can go beyond blinking LEDs and dumping massive chunks of data onto a serial console. Those practices are fine for some, but [dimtass] has found a more elegant hack for a more civilized age. His 3D Millennium Falcon model gets orientation data from an IMU acting as a USB HID device.

The hardware involved is an MPU6050 6-axis sensor interfaced with a Teensy 3.2 board. [dimtass] documents his approach to calibrating the IMU, going a bit further by using a Python script to generate offsets. We’ve advocated using Jupyter notebooks in the past, and this is a good example: Jupyter plots the data and visualizes the effect of the offsets in a second pass.

When in action, the Teensy reads IMU data and sends it over a USB RAW HID interface. For the uninitiated, HID transfers are more reliable than USB CDC transfers (virtual serial port) because they use smaller data chunks per event/transaction and usually don’t require special drivers. On the computer side, [dimtass] has written a small application that gets the IMU values over RAW HID and then provides them to the visualization application.

A 3D Millennium Falcon model is rendered in Unity, the popular game development engine. Even though Unity has an API, this particular approach is more OS-specific, using a shared-memory technique: the HID application writes to a file (/tmp/hid-shared-buffer) which is then read by Unity to make orientation changes to the rendered model.
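
The Unity side of that handoff can be as simple as polling the file every frame. A sketch, assuming the helper writes a plain-text yaw/pitch/roll triple (the actual buffer layout may well differ):

```csharp
using System.IO;
using UnityEngine;

// Unity side of the shared-memory handoff. Assumes the helper app
// writes a plain-text "yaw pitch roll" triple to the buffer file.
public class FalconOrientation : MonoBehaviour
{
    const string BufferPath = "/tmp/hid-shared-buffer";

    void Update()
    {
        string text;
        try { text = File.ReadAllText(BufferPath); }
        catch (IOException) { return; } // helper may be mid-write

        var parts = text.Split(' ');
        if (parts.Length < 3) return;
        if (float.TryParse(parts[0], out var yaw) &&
            float.TryParse(parts[1], out var pitch) &&
            float.TryParse(parts[2], out var roll))
        {
            // Drive the rendered model straight from the IMU angles.
            transform.rotation = Quaternion.Euler(pitch, yaw, roll);
        }
    }
}
```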

[dimtass] provides lots of details on the tools used to bring his project to life, and it can be a great starting point for other projects that need to interface sensors with a visualization system. We have seen ways to turn a person’s head into a joystick, and if you need a deeper dive into Unity, look no further.

Continue reading “Millennium Falcon HID: Get Unity To Talk To Teensy”

Screen Shake In VR, Minus The Throwing Up

In first-person games, an effective way to heighten immersion is to give the player a sense of impact and force by shaking the camera. That’s a tried and true practice for FPS games played on a monitor, but to [Zulubo]’s knowledge, no one has implemented traditional screen shake in a VR title because it would be a sure way to trigger motion sickness. Unsatisfied with that limitation, some clever experimentation led [Zulubo] to a method of doing screen shake in VR that doesn’t cause any of the usual problems.

Screen shake doesn’t translate well to VR because the traditional method is to shake the player’s entire view. This works fine when viewed on a monitor, but in VR the brain interprets the visual cue as evidence that one’s head and eyeballs are physically shaking while the vestibular system is reporting nothing of the sort. This kind of sensory mismatch leads to motion sickness in most people.

The key to getting the essence of a screen shake without any of the motion sickness baggage turned out to be a mix of two things. First, the shake is restricted to peripheral vision only. Second, it is restricted to an “in and out” motion, with no tilting or twisting. The result is a conveyance of concussion and impact that doesn’t rely on shaking the player’s view, at least not in a way that leads to motion sickness. It’s the product of some clever experimentation to solve a problem, and freely downloadable for use by anyone who may be interested.
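
In Unity terms, one way to structure an effect like this (not necessarily how [Zulubo] did it) is a full-screen shader that radially displaces only the outer portion of the image, driven by a small script that pulses its strength on impact. A sketch of that driver, with a hypothetical shader property:

```csharp
using System.Collections;
using UnityEngine;

// Driver for a peripheral-only "in and out" shake. The radial push
// itself would live in a full-screen shader; "_ShakeStrength" and the
// material are placeholders, not [Zulubo]'s released implementation.
public class PeripheralShakeDriver : MonoBehaviour
{
    public Material peripheralShake; // material using the hypothetical shader
    public float amplitude = 0.05f;  // maximum radial push, in UV units
    public float duration  = 0.2f;   // seconds

    public void Kick() => StartCoroutine(Pulse());

    IEnumerator Pulse()
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // Decaying oscillation: the periphery breathes in and out
            // while the center of vision holds perfectly still.
            float falloff = 1f - t / duration;
            peripheralShake.SetFloat("_ShakeStrength",
                Mathf.Sin(t * 50f) * amplitude * falloff);
            yield return null;
        }
        peripheralShake.SetFloat("_ShakeStrength", 0f);
    }
}
```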

Speaking of fooling one’s senses in VR environments, here is a fascinating method of simulating zero gravity: waterproof the VR headset and go underwater.

[via Reddit]