Want to build your own espresso machine, complete with open-source software to drive it? The diyPresso might be right up your alley.
It might not be the cheapest route to an espresso machine, but it’s probably the most economical way to turn high-quality components (including a custom-designed boiler) and sensors into a machine of proven design.
Coffee and the machines that turn it into a delicious beverage are fertile ground for the type of folk who like to measure, modify, and optimize. We’ve seen DIY roasters, grinders, and even a manual lever espresso machine. There are also many efforts at modifying existing machines with improved software-driven controls, but this is the first time we’ve seen such a focused effort at bringing the DIY angle to a ground-up espresso machine specifically offered as a kit.
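Boiler temperature regulation is the heart of that kind of software-driven control. As a rough illustration of what such firmware typically does (a minimal sketch with a toy thermal model standing in for real sensor and heater I/O, not the diyPresso’s actual code):

```python
# A minimal PID boiler-temperature loop -- an illustrative sketch only, not the
# diyPresso firmware. The "boiler" is a crude first-order thermal model so the
# example runs anywhere; real firmware would read a temperature sensor and
# drive a heater SSR instead.

KP, KI, KD = 4.0, 0.08, 1.5   # gains picked for the toy model; a real boiler needs tuning
SETPOINT_C = 93.0             # a typical espresso brew temperature
DT = 0.5                      # control period, seconds

temp_c = 20.0                 # simulated boiler temperature

def read_temp_c():
    return temp_c

def apply_heater(duty):
    # duty in percent; nudges the toy thermal model toward equilibrium
    global temp_c
    temp_c += (0.05 * duty - 0.02 * (temp_c - 20.0)) * DT

integral, prev_error = 0.0, 0.0
for _ in range(600):          # about five simulated minutes
    error = SETPOINT_C - read_temp_c()
    integral = max(min(integral + error * DT, 500.0), -500.0)  # anti-windup clamp
    derivative = (error - prev_error) / DT
    duty = max(min(KP * error + KI * integral + KD * derivative, 100.0), 0.0)
    apply_heater(duty)
    prev_error = error

print(f"settled near {read_temp_c():.1f} °C")
```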
Curious to know more? Browse the assembly manual or take a peek at the software’s GitHub repository. You might feel some ideas start to flow for your next coffee hack.
NASA’s ACS3 (Advanced Composite Solar Sail System) is currently fully deployed in low Earth orbit, and stargazers can spot it if they know what to look for. It’s actually one of the brightest things in the night sky. When the conditions are right, anyway.
What conditions are those? Orientation, mostly. ACS3 is currently tumbling across the sky while NASA measures how it moves and behaves. Once that’s done, the spacecraft will be stabilized. For now, visibility depends on ACS3’s orientation relative to an observer on the ground. At its brightest, it appears as bright as Sirius, the brightest star in the night sky.
ACS3 is part of NASA’s analysis and testing of solar sail technology for use in future missions. Solar sails represent a way of using reflected photons (from sunlight, but also possibly from a giant laser) for propulsion.
Solar sails can’t come close to traditional thrusters in raw thrust, but they offer low cost and high efficiency (not to mention considerably lower complexity and weight) compared to propellant-based solutions. That makes them very worth investigating. There are even proposals that aim to use solar sails to send a probe to Alpha Centauri within the next twenty years.
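The numbers behind that trade-off are easy to check: a perfectly reflective sail doubles the momentum transfer of the photons hitting it, so thrust is F = 2PA/c. A back-of-the-envelope sketch, assuming ACS3’s roughly 80 square meter sail at Earth’s distance from the Sun:

```python
# Back-of-the-envelope thrust on a solar sail at 1 AU.
# F = 2 * P * A / c for a perfectly reflective sail facing the Sun head-on.

P = 1361.0    # solar irradiance at 1 AU, W/m^2
c = 2.998e8   # speed of light, m/s
A = 80.0      # ACS3's sail area, roughly 80 square meters

force_n = 2 * P * A / c
print(f"thrust ~ {force_n * 1e3:.2f} mN")   # about 0.7 mN -- tiny, but it never runs out
```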
Want to try to spot ACS3 with your own eyes? There’s a NASA app that can alert you to sighting opportunities for your local time and region, and even guide you toward the right part of the sky to look. Check it out!
The Voynich Manuscript is a medieval codex written in an unknown alphabet and is replete with fantastic illustrations as unusual and bizarre as they are esoteric. It has captured interest for hundreds of years, and expert [Lisa Fagin Davis] shared interesting results from using multispectral imaging on some pages of this highly unusual document.
We should make it clear up front that the imaging results have not yielded a decryption key (nor a secret map, or anything of the sort), but the detailed write-up and freely downloadable imaging results are fascinating reading for anyone interested in either the manuscript itself or in how exactly multispectral imaging is applied to rare documents. Modern imaging techniques might get leveraged into things like authenticating sealed packs of Pokémon cards, but that’s not all they can do.
Because multispectral imaging involves things outside our normal perception, the results require careful analysis rather than intuitive interpretation. Here is one example: multispectral imaging may reveal faded text “between the lines” of other text and invite leaping to conclusions about hidden or erased content. But the faded text could just as well be show-through (content from the opposite side of the page being picked up) or an offset (ink and pigment transferred from the facing page after the book sat closed for centuries).
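One standard trick imaging specialists use to tease apart overlapping inks is to treat each pixel’s response across the captured bands as a vector and run a principal component analysis on the stack; faint residues often separate into their own components. Here’s a sketch of that general idea (the band filenames are hypothetical placeholders, and this is not a claim about [Lisa Fagin Davis]’s exact workflow):

```python
import numpy as np
from PIL import Image

# PCA across a registered multispectral band stack -- a sketch of one standard
# analysis step. The filenames below are hypothetical placeholders for
# grayscale captures of a single page at different wavelengths.
band_files = ["page_365nm.tif", "page_450nm.tif", "page_650nm.tif", "page_940nm.tif"]
bands = [np.asarray(Image.open(f), dtype=np.float64) for f in band_files]
stack = np.stack(bands, axis=-1)          # shape: height x width x n_bands

h, w, n = stack.shape
pixels = stack.reshape(-1, n)
pixels -= pixels.mean(axis=0)             # center each band

# Eigendecomposition of the band covariance gives the principal components.
cov = np.cov(pixels, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]         # strongest component first
components = (pixels @ eigvecs[:, order]).reshape(h, w, n)

# Later components often isolate faint content the eye can't separate.
for i in range(n):
    img = components[..., i]
    img = 255 * (img - img.min()) / (np.ptp(img) + 1e-9)
    Image.fromarray(img.astype(np.uint8)).save(f"component_{i}.png")
```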
[Lisa] provides a highly detailed analysis of specific pages, and explains the kind of historical context and evidence this approach yields. Make some time to give it a read if you’re at all interested; we promise it’s worth your while.
What happens when an unfortunate bug ends up in a spider’s web? It gets bitten and wrapped in silk, and becomes a meal. But if the web belongs to an orb-weaver and the bug is a male firefly, it seems the trapped firefly — once bitten — ends up imitating a female’s flash pattern and luring other males to their doom.
Fireflies communicate with flash patterns (something you can experiment with yourself using nothing more than a green LED) and males looking to mate will fly around flashing a multi-pulse pattern with their two light-emitting lanterns. Females will tend to remain in one place and flash single-pulse patterns on their one lantern.
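If you want to play along with that green LED, the single-pulse “female” pattern is easy to fake. A rough sketch for a Raspberry Pi, assuming an LED on GPIO 17; the timing values are illustrative guesses, not calibrated to any particular species:

```python
from time import sleep
from gpiozero import LED   # assumes a Raspberry Pi with a green LED on GPIO 17

# Imitate a single-pulse "female" flash pattern. The timings below are
# illustrative guesses, not calibrated to any particular firefly species.
led = LED(17)

FLASH_S = 0.3   # duration of each single pulse
GAP_S = 2.5     # dark interval between pulses

while True:
    led.on()
    sleep(FLASH_S)
    led.off()
    sleep(GAP_S)
```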
When a male spots a female, he swoops in to mate. Spiders have somehow figured out a way to take advantage of this, not just inserting themselves into the process but masterfully manipulating male fireflies into behaving in a way they never normally would, all to subvert firefly signaling for the spider’s benefit.
It all started with the observation that almost all fireflies found in webs were male, and careful investigation revealed it’s not just some odd coincidence. When no spider is present, ensnared males don’t act any differently. But when a spider detects a male firefly in its web, it wraps and bites the firefly differently than it does other insects. Exactly what the bite does is unknown, but the male firefly somehow ends up imitating a female’s flash pattern. Other males see this and swoop in to mate, but with a rather different outcome than expected.
The research paper contains further details, but it’s clear that there is more going on in this process than meets the eye. Spiders are already fascinating creatures (we’ve seen an amazing eye-tracking experiment on jumping spiders) and it’s remarkable to see this sort of bio-hacking going on under our very noses.
Ever needed a strong yet adhesive-free way to really stick PLA to glass? Neither have we, but nevertheless there’s a way to use aluminum foil and an IR fiber laser to get a solid bond with a little laser welding between the dissimilar materials.
It turns out that aluminum can be joined to glass with a pulsed laser process, and PLA can be joined to aluminum with a continuous-wave laser process. Researchers combined the two and managed to reliably do both at once with a single industrial laser.
By putting a sacrificial sheet of thin aluminum foil between 3D printed PLA and glass, then sending the laser through the glass into the aluminum, researchers were able to bond it all together in an adhesive-free manner with precise control, and very little heat to dissipate. No surface treatment of any kind required. The bond is at least as strong as any adhesive-based solution, so there’s no compromising on strength.
When it comes to fabrication, having to apply and manage adhesives is one of the least preferable ways of sticking two things together, so there’s real value in an idea like this.
Still, it’s certainly a niche application and we’ll likely stick to good old superglue, but we honestly didn’t know laser welding could bond aluminum to glass or to PLA, let alone both at once like this.
Researchers at the University of British Columbia have leveraged an unusual discovery into an ultra-black material made from wood. The deep, dark black is not the result of any sort of dye or surface coating; it’s a structural change to the wood itself that causes it to swallow up at least 99% of incoming light.
The discovery was partly accidental: researchers happened upon it while exploring high-energy plasma etching as a way to machine the surface of wood and improve its water resistance. In the process, they discovered that with the right process applied to the right thickness and orientation of wood grain, the plasma treatment produced a surprisingly dark result. Fresh from the plasma chamber, a wood sample has a thin coating of white powder that, once removed, reveals an ultra-black surface.
The resulting material has been dubbed Nxylon (a mashup of Nyx, the Greek goddess of darkness, and xylon, the Greek word for wood) and has been prototyped into watch faces and jewelry. It’s made from natural materials, the treatment doesn’t create or involve nasty waste, and the process is economical. For more information, check out UBC’s press release.
You have probably heard about Vantablack (and how you can’t buy any) and artist Stuart Semple’s ongoing efforts at making ever-darker and accessible black paint. Blacker than black has applications in optical instruments and is a compelling thing in the art world. It’s also very unusual to see an ultra-black anything that isn’t the result of a pigment or surface coating.
Meta’s Quest VR headset recently got the ability to accept and display video over USB-C, and it’s started some gears turning in folks’ heads. [Ian Hamilton] put together a quick concept machine consisting of a Raspberry Pi 400 that uses a VR headset as its monitor, which sure seems like the bones of a new breed of cyberdeck.
The computer-in-a-keyboard nature of the Pi 400 means that little more than a mouse and the VR headset are needed to get a functional computing environment. Well, that and some cables and adapters.
What’s compelling about this is that the VR headset is much more than just a glorified monitor. In the VR environment, the external video source (in this case, the Raspberry Pi) is displayed in a window just like any other application. Pass-through can also be turned on, so that the headset’s external cameras display one’s surroundings as background. This means there’s no loss of environmental awareness while using the rig.
[Note: the following has been updated for clarity and after some hands-on testing]
Here’s how it works: the Quest has a single USB-C port on the side, and an app running on the headset (somewhat oddly named “Meta Quest HDMI link”) takes care of accepting video over USB and displaying it in a window within the headset. The headset expects a UVC (USB Video Class) device, the same class most USB webcams and other video capture hardware present. (There is another way to do video over USB-C, DisplayPort alt mode, which both the video source and the USB-C cable must support. That is not what’s being used here; the Quest doesn’t support it, and neither is it accepting HDMI directly.) In [Ian]’s case, the Raspberry Pi 400 outputs HDMI, and a Shadowcast 2 capture card accepts that HDMI and outputs UVC video, which is then fed to the Quest over a USB-C cable.
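Because the capture card enumerates as a plain UVC device, anything that can read a webcam can read it, which is exactly why the headset app can display it. For instance, you could sanity-check the card’s output from a desktop with a few lines of OpenCV (the device index is an assumption and may differ on your system):

```python
import cv2

# Sanity-check a UVC capture card from a desktop: the card shows up as an
# ordinary webcam. Device index 0 is an assumption; yours may differ.
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise SystemExit("no UVC device found at index 0")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("capture card", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```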
As a concept it’s an interesting one for sure. Perhaps we’ll see decks of this nature in our next cyberdeck contest?