Figuring out what the Earth’s climate is going to do at any given point is a difficult task. To know how it will react to given events, you need to know what you’re working with. That requires an accurate model of everything from ocean currents to atmospheric heat absorption, along with the chemical and physical behavior of everything from cattle to humans to trees.
These days, just about anyone with a pulse can fall on a keyboard and make an AI image generator spurt out some kind of vaguely visual content. A lot of it is crap. Some of it’s confusing. But most of all, creators hate it when their hand-crafted works are compared with these digital extrusions from mathematical slop. Enter the “not by AI” badge.
Basically, it’s exactly what it sounds like. A sleek, modern badge that you slap on your artwork to tell people that you did this, not an AI. There are pre-baked versions for writers (“written by human”), visual artists (“painted by human”), and musicians (“produced by human”). The idea is that these badges would help people identify human-generated content and steer away from AI content if they’re trying to avoid it.
It’s not just intended to be added to individual artworks. Websites where “at least 90%” of the content is created by humans are invited to host the badge, as are apps. This directive reveals an immediate flaw: a reader could easily be misled after stumbling onto the other 10% of AI-generated content on a site wearing the badge. There’s also nothing stopping people from slapping the badge on AI-generated content and simply lying to people.
You might take a more cynical view if you dig deeper, though. The company is charging for various things, such as a monthly fee for businesses that want to display the badges.
It’s more of an inspired build than a screen-accurate one, but they’re still pretty neat. A Bell & Howell camera was the basis for the binoculars used in the film, in fact, and this build starts with the same tri-lens model. Found vintage objects are often used in sci-fi with some modifications, though more commonly in lower-budget productions. Star Wars can do it too, it seems.
Turning them into binoculars requires the construction of a viewfinder, which was made out of hand-cut Sintra PVC foam board. Lots of leather wrap had to be removed from the camera, too, which offered a happy accident—it left a heavily-weathered aluminum surface that looked great for a Star Wars prop. A few random controls were then added to disguise the camera as an advanced pair of futuristic binoculars. LED lighting was also installed internally to make the build glow as if it actually contained some powered sci-fi optics. It also got a hand-made leather strap for that rugged aesthetic so fitting for the film.
It’s not a functional build; we’d love to see someone build a set of AR or rangefinder binoculars that still look the part. However, this would be a great addition to any Poe Dameron costume you might have planned for the next Comic Con.
Here’s our question, though. Does it suck you out of your suspension of disbelief when filmmakers use found objects as the basis for props? Or is it a neat thing when you spot such an example? Video after the break.
Simon was a cutting-edge “computer controlled game” when it launched back in 1978. It would flash out a pattern of ever-increasing length and you had to copy it if you didn’t want to lose. The name was obviously inspired by the traditional folk game of Simon Says. [Robert] recently found an original vintage Simon game, but it had been non-functional for many years. However, with some astute analysis and repair, he was able to get it working again.
Upon powering the unit up, the best [Robert] could get out of it was some flickering of the lights, nothing more. It wouldn’t start a game or respond to button presses. Eventually, probing around showed [Robert] that the TMS1000 microcontroller wasn’t running properly. The problem seemed to concern the connection to the “Game Mode” selector switch. Thanks to a fault and the multiplexed layout of the controls, it appeared to the microcontroller that a button was being pressed at all times.
The solution [Robert] landed on was to separate out the signal from the Game Mode switch by socketing the TMS1000 and lifting the relevant pin. The signal was then wired back to the chip via diodes so that it wouldn’t interfere with the other inputs and outputs used to read the remaining buttons. This meant the unit was locked into the single main game mode, but it did get it operational again.
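To see why a single faulty line can masquerade as a held button, here’s a toy Python sketch of the kind of matrix scan a simple controller performs. The 2×4 matrix size and the fault model are invented for illustration; this is not the TMS1000’s actual firmware.

```python
def scan_matrix(pressed, stuck_low_col=None):
    """Scan a multiplexed key matrix the way simple firmware would:
    strobe each row, read each column, report the first key seen active.

    pressed:        set of (row, col) keys actually held down
    stuck_low_col:  column line shorted active (models the fault), or None
    Returns (row, col) of the first "pressed" reading, or None if idle.
    """
    for row in range(2):
        for col in range(4):
            # A stuck column reads as pressed on *every* row strobe,
            # so the firmware never sees the all-released idle state.
            if (row, col) in pressed or col == stuck_low_col:
                return (row, col)
    return None
```

With no fault, an idle matrix scans as nothing pressed; short one column and every scan pass reports a phantom key, which is exactly the symptom [Robert] observed.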
Outside of the depths of the ocean, or cartoons, we’re not typically accustomed to plant life glowing or otherwise generating its own light. However, science is helping to change all that. Now, you can order some bioluminescent plants of your very own from Light Bio.
Light Bio is a startup company working in the synthetic biology space. It’s not content to simply pursue research behind closed doors, and is now sharing its work with the public. It has announced it plans to start selling petunias to U.S. customers which literally glow with the magic of bioluminescence.
Petunias don’t normally glow, but with some modifications, it turns out they can be convinced to. It took a large team of 26 scientists to figure out how to boost bioluminescence in plants, by isolating and optimizing genes sourced from various glowing mushroom species.
The plants will be available from April, with Light Bio planning to sell them as “Firefly Petunias.” It might sound like scary sci-fi tech, but the USDA has apparently already signed off on Light Bio selling these to the public.
Something’s been bothering me, though. It’s at the edge of my memory… I think my old housemate played bass for Glowing Petunias back in 2015. Something like that, anyway… video after the break.
Sonolithography is a method of patterning materials on to a surface using finely-controlled sound waves. To achieve this, [Oliver] created a circular array of sixteen ultrasonic transducers controlled via shift registers and gate driver ICs, under the command of a Raspberry Pi Pico. He then created an app for controlling the transducer array via an attached computer with a GUI interface. It allows the phase and amplitude of each element of the array to be controlled to create different patterns.
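As a rough illustration of what those phase controls are doing, here’s a minimal Python sketch that computes per-element phase offsets to bring all sixteen waves into phase at a chosen focal point. The array radius, drive frequency, and function names are assumptions for the example, not values taken from [Oliver]’s project.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, a common ultrasonic transducer frequency

def circular_array(n=16, radius=0.05):
    """(x, y, 0) positions for an n-element ring of the given radius (m)."""
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n),
             0.0) for i in range(n)]

def focus_phases(elements, target):
    """Per-element phase offsets (radians) so all waves arrive at `target`
    in phase: each element is advanced by its extra path length."""
    wavelength = SPEED_OF_SOUND / FREQ
    dists = [math.dist(e, target) for e in elements]
    ref = min(dists)
    return [(2 * math.pi * (d - ref) / wavelength) % (2 * math.pi)
            for d in dists]
```

Aiming at a point on the array’s axis gives identical path lengths, so all offsets come out to zero; moving the target off-axis produces the phase gradient that steers the acoustic pattern.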
Creating a pattern is then a simple matter of placing the array on a surface, firing it up in a given drive mode, and then atomising some kind of dye or other material to visualize the pattern of the acoustic waves.
It could be a useful tool for studying the interactions of ultrasonic waves, or it could just be a way to make neat patterns in ink and dye if that’s what you’re into. [Oliver] notes the techniques of sonolithography could also have implications for biology and fabrication in the future. If you found this interesting, you might like to study up on ultrasonic levitation, too!
Lots of things beep these days. Washing machines, microwaves, fridges, even drill battery chargers. If you’re on Team Makita, it turns out you can actually change the melody of your charger’s beep, thanks to a project from [Real-Time-Kodi].
The hack is for the Makita DR18RC charger, and its implementation is kind of amusing. [Real-Time-Kodi] starts by cutting the trace to the buzzer inside the charger. Then, an Arduino is installed inside the charger, hooked up to the buzzer itself and the original line that controlled it. When it detects the charger trying to activate the buzzer, it uses this as a trigger to play its own melody instead. The Arduino also monitors the charger’s LEDs to determine the current charge state and play the appropriate jingle for the situation.
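The control flow amounts to edge-detecting the buzzer line and picking a tune from the LED state. Here’s a minimal Python sketch of that logic; the LED-to-state mapping, jingle names, and note frequencies are invented for illustration and aren’t taken from [Real-Time-Kodi]’s firmware.

```python
# Hypothetical jingles keyed by charge state; notes are frequencies in Hz.
JINGLES = {
    "charging": [440, 494, 523],
    "done":     [523, 659, 784],
    "error":    [220, 220, 220],
}

def led_state_to_jingle(red_on, green_on):
    """Map the charger's two status LEDs to a jingle name (assumed mapping)."""
    if red_on and green_on:
        return "error"
    return "done" if green_on else "charging"

def intercept(buzzer_samples, red_on, green_on):
    """Watch samples of the (cut) buzzer control line; on a rising edge,
    return the notes the Arduino should play instead of the stock beep."""
    for prev, cur in zip(buzzer_samples, buzzer_samples[1:]):
        if not prev and cur:  # the charger just tried to fire the buzzer
            return JINGLES[led_state_to_jingle(red_on, green_on)]
    return []  # no beep requested, stay silent
```

On real hardware the sampling and playback would be GPIO reads and `tone()`-style output, but the decision logic is this simple: the stock beep request becomes a trigger, and the LEDs select the melody.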
It’s an amusing hack, and one that could certainly confuse the heck out of anyone expecting the regular tones out of their Makita charger. It also shows that the simple ways work, too — there was no need to dump any firmware or decompile any code.