Finding that his recently purchased LED Christmas lights defaulted to an annoying blinking pattern that took a ridiculous seven button presses to disable each time they were powered up, [Matthew Millman] decided to build a new power supply that keeps things nice and simple. In his words, the goal was to enable “all lights on, no blinking or patterns of any sort”.
Connecting the existing power supply to his oscilloscope, [Matthew] found the stock “steady on” setting was a 72 VAC peak-to-peak square wave at about 500 Hz. To recreate this, he essentially needed to find a 36 VDC power supply and swap the polarity back and forth at the same frequency. In the end, the closest thing he could find in the parts bin was an HP printer power supply that put out 30 volts, so the lights aren’t quite as bright as they were before, but at least they aren’t blinking.
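The arithmetic here is worth spelling out: an H-bridge alternately connects the full supply across the load in each direction, so the output swings between +Vdc and -Vdc, and the peak-to-peak voltage is simply twice the DC rail. A quick sketch of the numbers from the write-up:

```python
# An H-bridge alternates the full supply voltage across the load, so the
# output swings between +Vdc and -Vdc: peak-to-peak is 2 * Vdc.
def peak_to_peak(vdc):
    return 2 * vdc

print(peak_to_peak(36))  # 72 -- matches the 72 V p-p the scope showed
print(peak_to_peak(30))  # 60 -- what the 30 V printer brick can manage
```

Which is why the lights run a bit dimmer on the printer supply: 60 V peak-to-peak instead of the original 72.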
To turn that DC supply into an AC square wave, it’s connected to a common L298 H-bridge module. You might expect a microcontroller to show up at this point, but [Matthew] went old school, generating his two alternating 500 Hz drive signals with a 555 timer and a 74HC74D dual flip-flop.
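The classic way to get two complementary square waves from a D flip-flop is to wire /Q back to D, so Q toggles on every clock edge; that divides the 555’s clock by two and hands you Q and /Q as opposite-phase drive signals for the two halves of the bridge. The exact wiring in [Matthew]’s circuit isn’t documented here, so treat this as a sketch of the standard arrangement:

```python
# Sketch of a D flip-flop in toggle configuration (/Q fed back to D):
# Q flips on every rising clock edge, so a ~1 kHz 555 output becomes a
# 500 Hz square wave, with Q and /Q as complementary H-bridge drives.
def toggle_ff(clock_edges):
    q = 0
    outputs = []
    for _ in range(clock_edges):
        q ^= 1                       # D = /Q, so Q toggles each edge
        outputs.append((q, 1 - q))   # (Q, /Q) drive opposite bridge halves
    return outputs

print(toggle_ff(4))  # [(1, 0), (0, 1), (1, 0), (0, 1)]
```

Note that because the flip-flop halves the frequency, the 555 would need to free-run at roughly 1 kHz to land on the 500 Hz the lights expect.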
Unfortunately, he didn’t have the time to get a custom PCB made before Santa’s big night. Though as he points out, since legitimate L298s are backordered well into next year anyway, having the board in hand wouldn’t have helped much. The end result is that the circuit has to live on a breadboard for the current holiday season, but hopefully around this time next year we’ll get a chance to see the final product.
Speech commands are all the rage on everything from digital assistants to cars. Adding them to your own projects is a lot of work, right? Maybe not. [Electronoobs] shows a speech board that lets you easily integrate 255 voice commands via serial communications with a host computer. You can see the review in the video below.
He had actually used a similar board before, but that version was a few years ago, and the new module has, of course, many new features. As of version 3.1, the board can handle 255 commands in a more flexible way than the older versions.
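Since the board hands recognized commands to a host over serial, the host-side logic can be as simple as a lookup table keyed on a command ID. The byte values and command names below are entirely hypothetical, just to illustrate the dispatch pattern; the board’s actual protocol is in its documentation:

```python
# Hypothetical sketch: dispatch on a one-byte command ID (0-254) received
# from the recognition module over serial. The IDs and actions here are
# made up for illustration, not the board's documented protocol.
COMMANDS = {
    0: "lights on",
    1: "lights off",
    2: "play music",
}

def dispatch(command_byte):
    return COMMANDS.get(command_byte, "unknown command")

print(dispatch(1))  # lights off
```

With 255 slots available, the same table just grows to match however many phrases you train.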
Culminating a year-long project, [Usagi Electric] aka [David] has just wrapped up his single-bit vacuum tube computer. It is based on the Motorola MC14500 1-bit industrial controller, but since [David] changed the basic logic unit into an arithmetic-logic unit, he’s dubbing it the UE14500. It’s built on a wooden panel about 2.5 x 3 rabbit lengths, excluding the power supply. [David] admits he has cheated a little bit, in that he’s using two silicon diodes instead of a 6AL5 dual diode tube in his universal NOR gate on which the computer is based — but in his defense he notes that plenty of vacuum tube computers of the era used silicon diodes.
The tube he uses in the NOR gates is the 6AU6 miniature pentode, which he selected because of its availability, price, and suitability for low voltage. [David] runs this computer with two power supplies of +24 and -12 VDC, rather than the hundreds of volts typically used in vacuum tube designs. The modules are constructed on single-sided copper-clad PCB panels etched using a milling machine. The video below the break wraps up the 22-part series, where he fixes a few power supply issues, builds a remote front panel for I/O, and gives a demo of the computer in operation. Alas, this only completes one fourth of the project, as there are three more building blocks to go before the whole system is complete — program control (magnetic tape), a RAM bank, and a serial input/output module. We look forward to seeing the whole system up and running in the future.
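Basing the whole machine on a single universal NOR gate is less limiting than it sounds: NOR is functionally complete, so NOT, OR, and AND (and from there everything else) can be composed from it. A quick sketch of those compositions:

```python
# NOR is a universal gate: NOT, OR, and AND can all be built from it,
# which is why a computer can be constructed from one gate design.
def nor(a, b):
    return int(not (a or b))

def not_(a):
    return nor(a, a)             # NOT a = a NOR a

def or_(a, b):
    return not_(nor(a, b))       # OR = NOT(NOR)

def and_(a, b):
    return nor(not_(a), not_(b)) # AND a b = (NOT a) NOR (NOT b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nor(a, b), or_(a, b), and_(a, b))
```

Every gate in the table above maps to one 6AU6 pentode stage (plus those slightly-cheaty silicon diodes) on [David]’s panels.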
Associate Professor [Homei Miyashita] from Meiji University’s School of Science and Technology in Tokyo has developed a new technology for reproducing taste on a television or monitor, a system called Taste the TV (TTTV). The team of researchers used taste sensors to sample a variety of foods, and came up with a palette of 10 different aerosol flavors which can be combined in various ratios. The flavor mix is sprayed onto a thin plastic film in a staging area at the top of the screen, which is then scrolled down into position.
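Mixing a target taste from a fixed palette amounts to scaling a ratio vector so the sprayed amounts add up to one dose. The details of TTTV’s actual mixing scheme aren’t public here, so this is just a back-of-the-envelope sketch of the idea, with made-up numbers:

```python
# Hypothetical sketch of blending a fixed flavor palette: a target taste
# is a ratio across flavor channels, normalized so the per-channel spray
# amounts sum to one fixed dose. Ratios and dose are illustrative only.
def spray_amounts(ratios, total_dose=1.0):
    total = sum(ratios)
    return [total_dose * r / total for r in ratios]

# e.g. a blend that is mostly one flavor with a dash of two others
print(spray_amounts([8, 1, 1]))  # [0.8, 0.1, 0.1]
```

With a 10-channel palette the ratio vector just has 10 entries, one per aerosol.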
Possible applications shown in the video below the break include cooking programs, restaurant menus, and wine tasting events. We’re not quite sure how popular this would be with consumers. Tele-tasting a cooking show with friends would be inconvenient, if not unsanitary. We’re also not aware that current video interface protocols such as HDMI or ATSC include any provisions for senses other than sight and sound. If you have access to scholarly journals, [Prof Miyashita]’s research paper on TTTV is available in the 34th Annual ACM Symposium on User Interface Software and Technology.
We’ve written about a couple of taste-generating projects before, see here and here.
When you hear raytracing, you might think of dark, complex algorithms whose source code invites the beginning of madness if you stare at it too long. You’re technically not far off from the truth, but [h3r2tic] put a small open-source ray tracing game demo up on GitHub. The actual Rust code powering the game is relatively short (just four files), with the longest being the physics file. But, of course, there is a small mountain of code under this sample in the form of libraries.
Kajiya, physx-rs, and dolly are the three libraries that make this little demo possible. Kajiya, in particular, is what makes raytracing possible, as it uses the newer RTX features (so only more recent Nvidia and AMD cards are supported) and Vulkan bindings. But, of course, it isn’t wholly ray-traced, as we are still several years out from proper fully real-time raytracing. Nevertheless, the blend between raytracing and traditional rasterization looks incredible. The most important thing about this tiny sample isn’t the game itself but what it stands for: it shows how easy it has become to create a demo like this. Even just five years ago, it would have required massive effort and expertise.
Visually, it is stunning. While the reflections are the most apparent, the real takeaway is the ease that real-time global illumination brings. A quick look through the code shows that there are very few lights in the scene, despite it looking well lit, with soft shadows. Traditional video games spend a significant amount of development time lighting a scene, placing additional lights, and tweaking them to make up for all the shortcuts that lighting has to take in a rasterized environment. As more and more games are built with raytracing in mind rather than tacked on at the end, we can ditch the crumbling mountain of hacks that games are forced to use today and just rely on the rays to accurately light a scene.
Because the ink is alive, it is technically programmable in the sense that it can self-assemble proteins into nanofibers, and further assemble those into nanofiber networks that comprise hydrogels.
One of the researchers compared the ink to a seed, which has everything it needs to eventually grow into a glorious tree. In this way, the ink could be used as a renewable building material both on Earth and in space. Though the ink does not continue to grow after being printed, the resulting structure would be a living system that could theoretically heal itself.
The ink creation process begins with genetically-engineered bacterial cultures, which the researchers induce to produce the ink, itself made of living cells. The ink is then harvested and becomes gelatin-like, holding its shape well enough to go through a 3D printer. It even passes the bridging test, supporting its own weight between pillars placed up to 16 mm apart. (We’d like to see a Benchy.)
The annual meeting of the Chaos Computer Club, Germany’s giant hacker group, is online again this year. While those of us here are sad that we don’t get to see our hacker friends in person, our loss is your gain — the whole thing is online for the entire world to enjoy.
This year’s Congress has gone entirely decentralized, with many local clubs hosting their own video streams and “stages”. Instead of four tracks, there are now six or seven tracks of talks going on simultaneously, so prepare to be overwhelmed by choice. You can find the overall schedule here, so if you see anything you’d like to watch, you’ll know when to tune in.
Like last year, there is also a parallel 2D simulation world, like Zelda with videoconferencing, but you’ll need a ticket for that, and they’re sold out. (Check out the demo video if you want to see what that’s about.) And what would a conference be without t-shirts, armbands, and even a sticker exchange? Of course, it all has to be done by mail, but you do what you can.
We’ll be keeping our eyes on the talks, and let you know if we see anything good. If you do the same, let us know in the comments!