Driving A Controllerless LCD With The Humble Arduino Uno

These days, you could be forgiven for thinking driving an LCD from a microcontroller is easy. Cheap displays have proliferated, ready to go on breakout boards with controllers already baked in. Load up the right libraries and you’re up and running in a matter of minutes. However, turn your attention to trying to drive a random LCD you’ve yanked out of a piece of old equipment, and suddenly things get harder. [Ivan Kostoski] was in just such a position and decided to get down to work.

[Ivan]’s LCD was a 320×240 STN device salvaged from an old tape library. The display featured no onboard controller, and the original driver wasn’t easily repurposed. Instead, [Ivan] decided to drive it directly from an Arduino Uno.

This is easier said than done. There are stringent timing requirements that push the limits of the 8-bit platform, to say nothing of the need for a negative voltage to drive the screen and further hardware to drive the backlight. These are all tackled in turn, with [Ivan] sharing his tips to get the most flexibility out of the display. Graphics and text modes are discussed, along with optimizations that might be possible through varied use of the available RAM and flash.
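
For a feel of what direct drive involves, here is a hedged sketch of the inner scan loop for a typical controllerless STN panel. The usual interface signals are a 4-bit data bus (D0–D3), a pixel clock (CL2), a line latch (CL1), and a first-line marker (FLM); the pin mapping and test pattern below are illustrative assumptions, not [Ivan]'s actual wiring.

```cpp
// Hedged sketch of a controllerless STN scan-out loop on an Uno.
// Assumed mapping: D0-D3 on PD0-PD3, CL2 on PB0, CL1 on PB1, FLM on PB2.
#include <Arduino.h>

// Return 4 pixels (one bus transfer) for a given row/column group.
// A full 320x240 1bpp framebuffer is 9600 bytes, far more than the
// Uno's 2KB of SRAM, so pixel data has to come from flash or be
// generated on the fly -- one of the constraints [Ivan] works around.
static inline uint8_t pixelNibble(uint16_t row, uint16_t group) {
  return (row ^ group) & 0x0F;  // placeholder test pattern
}

void scanFrame() {
  for (uint16_t row = 0; row < 240; row++) {
    if (row == 0) PORTB |= _BV(PB2);        // FLM marks the first line
    for (uint16_t g = 0; g < 320 / 4; g++) {
      PORTD = (PORTD & 0xF0) | pixelNibble(row, g);  // 4 pixels on the bus
      PORTB |= _BV(PB0);                    // CL2 rising edge clocks them in
      PORTB &= ~_BV(PB0);
    }
    PORTB |= _BV(PB1);                      // CL1 pulse latches the row
    PORTB &= ~_BV(PB1);
    if (row == 0) PORTB &= ~_BV(PB2);       // drop FLM after the first line
  }
}
```

The real squeeze is that this loop has to repaint the entire panel continuously at a decent refresh rate, or the STN image fades and flickers, which is where the Uno's cycle budget gets tight.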

The code is available on GitHub if you need inspiration for your own controllerless LCD driver. [Ben Heck] has done similar work too, using FPGA grunt to get the job done.

Component Video For The Commodore 64

Of all the retro systems, the Commodore 64 had the best video system. The VIC-II chip was a big part of why, but in terms of video output, the C64 was still a consumer device: the only outputs were S-video or composite, or something like them. The professional stuff uses YPbPr component video, which splits the picture into a luminance signal and two color-difference signals. On a modern LCD, the difference between composite and YPbPr is noticeable, and if you’re going to run your C64 on the big screen, it would be very helpful to use a professional video standard.

In an effort to bring the C64 into the future, [c0pperdragon] created an FPGA-based modification for the VIC-II chip. The end result is getting YPbPr signals directly from the computer and outputting them to a TV in glorious 480p.

Inside the Commodore 64, the VIC-II creates the chrominance signal in a way that makes it impossible to convert back to any form of RGB. The solution is to listen in on 22 pins of the VIC-II to determine what signals it intends to generate. This is done with a smallish Altera FPGA connected to the VIC-II through a ribbon cable. On the FPGA, the luminance and all the color information are regenerated, then converted into true YPbPr. For the complete mod, the RF modulator is removed, while the original A/V jack remains functional. This is effectively a very in-depth mod that rids the C64 of the TV connector and channel selector (which no one uses anymore) and replaces them with a professional-grade video output.
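
The color math at the end of that chain is the standard component-video transform. Here is an illustrative C++ version using the usual BT.601 coefficients; [c0pperdragon]'s FPGA performs an equivalent conversion in hardware on the recovered color indices, and the RGB value in the example is only an approximation of a C64 palette entry.

```cpp
// Illustrative RGB -> YPbPr conversion with BT.601 coefficients.
#include <cstdio>

struct YPbPr { float y, pb, pr; };

YPbPr rgbToYPbPr(float r, float g, float b) {   // inputs in 0..1
  float y = 0.299f * r + 0.587f * g + 0.114f * b;  // luminance
  return { y,
           0.564f * (b - y),    // Pb: blue color-difference channel
           0.713f * (r - y) };  // Pr: red color-difference channel
}

int main() {
  // Roughly the C64's light blue text color, as an example input.
  YPbPr p = rgbToYPbPr(0.42f, 0.45f, 0.78f);
  std::printf("Y=%.3f Pb=%.3f Pr=%.3f\n", p.y, p.pb, p.pr);
}
```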

When it comes to C64 mods, we thought we’d seen it all. We’ve seen C64s resurrected from the dead, and we’ve seen drop-in replacements for the SID that still don’t have working filters (oh my god). This is on another level. This is using FPGAs to drag the C64 into the modern era, and if you don’t care about the rusting RF box, it’s a reversible mod.

This Cardboard Box Can Tell You What It Sees

It wasn’t that long ago that talking to computers was the preserve of movies and science fiction. Slowly, voice recognition improved, and these days it’s getting to be pretty usable. The technology has moved beyond basic keywords, and can now parse sentences in natural language. [Liz Meyers] has been working with the technology, creating WhatIsThat – an AI that can tell you what it’s looking at.

Adding a camera to Google’s AIY Voice Kit makes for a versatile object identification system.

The device is built around Google’s AIY Voice Kit, which consists of a Raspberry Pi with some additional hardware and software to enable it to process voice queries. [Liz] combined this with a Raspberry Pi camera and the Google Cloud Vision API. This allows WhatIsThat to respond to users asking questions by taking a photo, and then identifying what it sees in the frame.
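
The project itself runs on the Pi with Google's own tooling, but the heart of the pipeline is a single Cloud Vision REST call. This C++/libcurl sketch shows the shape of that documented request; API_KEY and the base64 image payload are placeholders.

```cpp
// Hedged sketch of a Cloud Vision label-detection request over REST.
#include <curl/curl.h>
#include <string>

int main() {
  const std::string url =
      "https://vision.googleapis.com/v1/images:annotate?key=API_KEY";
  const std::string body = R"({
    "requests": [{
      "image": { "content": "<base64-encoded JPEG from the Pi camera>" },
      "features": [{ "type": "LABEL_DETECTION", "maxResults": 5 }]
    }]
  })";

  CURL *curl = curl_easy_init();
  if (!curl) return 1;
  curl_slist *hdrs =
      curl_slist_append(nullptr, "Content-Type: application/json");
  curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
  curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
  curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
  CURLcode res = curl_easy_perform(curl);  // response JSON lists labels/scores
  curl_slist_free_all(hdrs);
  curl_easy_cleanup(curl);
  return res == CURLE_OK ? 0 : 1;
}
```

The labels that come back in the response are what the device reads aloud to the user.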

It may seem like a frivolous project to those with working vision, but there is serious potential for this technology in the accessibility space. The device can not only describe things like animals or other objects, it can also read text aloud and even identify logos. The software’s ability to go beyond simple object labels is impressive – a video demonstration shows the AI correctly identifying a Boston Terrier, and attributing a quote to Albert Einstein.

Artificial intelligence has made a huge difference to the viability of voice recognition – because it’s one thing to understand the words, and another to understand what they mean when strung together. Video after the break.

[Thanks to Baldpower for the tip!]

Car Alarm Hacks 3 Million Vehicles

Pen testing isn’t about evaluating inks. It is short for penetration testing — someone ensuring a system’s security by trying to break in or otherwise attack it. A company called Pen Test Partners made the news last week by announcing that high-end car alarm systems made by several vendors have a critical security flaw that could leave the vehicles less secure than they would be without the alarm. They claim about three million vehicles are affected.

The video below shows how alarms from Viper/Clifford and Pandora give an attacker a simple way to hijack a user’s account in the companion application. Once they have access, they can track the car in real time, control the door locks, and start or stop the engine. The researchers speculate a hacker could set off the alarm from a nearby chase car; you’d probably pull over if your alarm started going off. They could then lock you in the car, approach, and force you out.

A Garbage Bag Skirt Is Fit For A Hovercraft

The hovercraft is an entertaining but much-maligned form of transport. While they have military applications and at times have even run as ferries across the English Channel, fundamental issues with steering and braking have prevented us all from driving them to work on a regular basis. They do make great toys, however, and [HowToMechatronics] has built an excellent example.

The build is primarily a 3D-printed affair, with the hull, ducting, and even the propellers made this way. The craft is sized to be readily printable on a 30 cm square build platform, making it accessible to most printer owners. Drive is via brushless motors, and control is achieved using their previously featured self-built NRF24L01 radio control transmitter, as sketched below.
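
For flavor, this is the typical shape of an NRF24L01 transmitter sketch using the common RF24 Arduino library. The pin numbers, pipe address, and packet layout are generic assumptions for illustration, not [HowToMechatronics]'s exact controller code.

```cpp
// Generic NRF24L01 control transmitter using the RF24 library.
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                 // CE, CSN pins (assumed wiring)
const byte address[6] = "craft";   // 5-byte pipe address

struct ControlPacket { int16_t throttle; int16_t steering; };

void setup() {
  radio.begin();
  radio.openWritingPipe(address);
  radio.stopListening();           // configure as a transmitter
}

void loop() {
  ControlPacket pkt = {
    (int16_t)analogRead(A0),       // thrust stick
    (int16_t)analogRead(A1)        // rudder servo stick
  };
  radio.write(&pkt, sizeof(pkt));  // send one control frame
  delay(20);                       // roughly 50 updates per second
}
```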

What stands out from most other hovercraft builds we see here is the functioning skirt. It’s constructed from a garbage bag and held onto the hull with a 3D-printed clamping ring. Most quick builds omit a skirt and make up for it with light weight and high power, so it’s nice to see one implemented here. We’d love to see how well the craft works on water, though it holds up well on concrete.

Finished in a camouflage paint scheme, the craft looks the part, and handles well too. We’d consider a small correction to the center of gravity, but it’s nothing a little ballast wouldn’t fix. Video after the break.

RemoteDebug For ESP Platforms

Debugging tools are critical to quick and effective development. Without being able to peek under the hood at what’s really going on, it can be difficult to understand and solve problems. Those who live on the Arduino platform are probably well acquainted with using the serial port to debug, but it’s far from the only way. [JoaoLopesF] has coded the RemoteDebug tool for ESP platforms, and the results are impressive.

RemoteDebug does away with the serial interface entirely, instead using the ESP’s native wireless interface to send debug data over TCP/IP. It’s all handled over telnet, making it completely platform agnostic. By handling things over the WiFi connection, it negates issues with physical access, as well as hassles with cables and limited serial ports. It’s also of benefit to robotics projects, which no longer need a tether when debugging.

It comes with a similar set of features to [JoaoLopesF]’s earlier work, SerialDebug. Things like verbosity and timestamps are all built in, making it easy to get high-quality debug data without having to reinvent the wheel yourself. Video after the break.
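
A minimal sketch of the usage pattern, based on the library's documented API: begin() starts the telnet server, the leveled debug macros stand in for Serial.print, and handle() services connections in the loop. The SSID, password, and hostname are placeholders.

```cpp
// Minimal RemoteDebug usage sketch for an ESP32.
#include <WiFi.h>            // use ESP8266WiFi.h on ESP8266 instead
#include "RemoteDebug.h"

RemoteDebug Debug;

void setup() {
  WiFi.begin("my-ssid", "my-password");
  while (WiFi.status() != WL_CONNECTED) delay(100);
  Debug.begin("esp32-robot");                    // hostname for telnet/mDNS
}

void loop() {
  static uint32_t last = 0;
  if (millis() - last > 1000) {
    last = millis();
    debugV("heap free: %u", ESP.getFreeHeap());  // verbose-level message
  }
  Debug.handle();                                // service telnet clients
}
```

From a machine on the same network, something like `telnet esp32-robot.local` (or the device's IP) should then show the timestamped stream, with the verbosity level adjustable from the telnet session.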

ESP32 Drives Controllerless Display Using I2S Hack

It’s possible to find surplus LCDs in all kinds of old hardware. Photocopiers, printers – you name it, there’s old junk out there with displays going to waste. Unfortunately, unlike the displays on sale at your favourite maker website, these often lack a controller and can be quite difficult to drive. [pataga] took on the challenge of driving an LCD of unknown provenance, using the power of the ESP32.

The LCD in question is a 240×160 monochrome device that was initially driven successfully by a Microchip PIC24 with a dedicated LCD driver peripheral. This allowed [pataga] to study the display interface under working conditions with the help of an oscilloscope. Inspiration was then taken from a project by [Sprite_tm], which used the I2S peripheral to drive a small LED display without placing load on the CPU.

Using the ESP32’s I2S peripheral in parallel mode makes it possible to shift data out in the correct format to drive the LCD without bit-banging IO pins and using up precious CPU time. This leaves processor cycles free to do interesting things, like generating 3D images using [cnlohr]’s routines from the Channel 3 project. There’s a little extra work to be done, with the frame signal generated by an external flip-flop and some fudging in the arrangement of various registers, but it’s a remarkably tidy repurposing of the I2S hardware, which seems to be the gift that keeps on giving. (Here it is spitting out VGA video through a resistor DAC.)
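
To show the general idea, here is a hedged sketch of setting up the peripheral with the i2s_parallel component from [Sprite_tm]'s earlier hack. The function and field names follow that component as published, but exact details vary between forks, and the GPIO numbers, clock rate, and buffer handling here are illustrative assumptions rather than [pataga]'s configuration.

```c
// Hedged sketch: pointing the I2S DMA engine at a 1bpp framebuffer.
#include "i2s_parallel.h"
#include "soc/i2s_struct.h"   // for the I2S1 device struct

#define LCD_W 240
#define LCD_H 160

static uint8_t framebuf[LCD_W * LCD_H / 8];  // 1bpp, in DMA-capable RAM

void lcd_init(void) {
  i2s_parallel_buffer_desc_t bufdesc = {
    .memory = framebuf,
    .size   = sizeof(framebuf),
  };
  i2s_parallel_config_t cfg = {
    .gpio_bus    = {4, 5, 18, 19, 21, 23, 25, 26},  // 8 data lanes (assumed)
    .gpio_clk    = 22,                   // pixel clock out to the panel
    .clkspeed_hz = 2 * 1000 * 1000,      // slow enough for STN timing
    .bits        = I2S_PARALLEL_BITS_8,  // 8-bit parallel mode
    .bufa        = &bufdesc,
    .bufb        = &bufdesc,             // single buffer, looped by DMA
  };
  i2s_parallel_setup(&I2S1, &cfg);       // DMA now streams the framebuffer
}
```

Once set up, the CPU only ever touches the framebuffer; the DMA engine keeps the panel refreshed on its own.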

Code is available on GitHub for those looking to get at the nuts and bolts of the hack. It’s another build that goes to show it’s not the parts in your junk box that count, but how you use them.