NASA is going back to the Moon! We’ll follow the crew of Artemis II every step of the way.
Continue reading “Following Artemis II’s Journey Around The Moon”
On Friday, Reuters reported that Amazon is going to try to get into the smartphone game…again. The Fire Phone was perhaps Amazon’s biggest commercial misstep, and was only on the market for about a year before it was discontinued in the summer of 2015. But now industry sources are saying that a new phone code-named “Transformer” is in the works from the e-commerce giant.
At this point, there’s no word on how much the phone would cost or when it would hit the market. The only information Reuters was able to squeeze out of their contacts was that the device would feature AI heavily. Real shocker there — anyone with an Echo device in their kitchen could tell you that Amazon is desperate to get you talking to their gadgets, presumably so they can convince you to buy something. While a smartphone with even more AI features we didn’t ask for certainly won’t be on our Wish List, if history is any indicator, we might be able to pick these things up cheap on the second-hand market.
On the subject of AI screwing everything up, earlier this week, the Electronic Frontier Foundation reported that The New York Times had started blocking the Internet Archive’s crawlers, citing concerns over their content being scraped up by bots for training data. The EFF likens this to a newspaper asking libraries to stop storing copies of their old editions, and warns that in an era where most people get their news via the Internet, not having an archived copy of sites like The Times will put holes in the digital record. They also point out that mirroring web pages for the purposes of making them more easily searchable is a widely accepted practice (ask Google) and has been legally recognized as fair use in court.
Assuming we take the NYT’s side of the story at face value, there’s a tiny part of our cold robotic heart that feels some sympathy for them. Over the last year or so, we’ve noticed some suspicious activity that we believe to be bots siphoning up content from the blog and Hackaday.io, and it’s resulted in a few technical headaches for us. On the other hand, what’s Hackaday here for if not to share information? Surely the same could be said for any newspaper, be it the local rag or The New York Times. If a chatbot learning some new phrases from us is the cost of doing business in 2026, so be it. Can’t stop the signal.
We’ll start things off this week with some breaking news from NASA: just days after the space agency announced the Artemis II crew was preparing to blast off towards the Moon as soon as March 6th, a new problem with the Space Launch System rocket has pushed the launch back indefinitely. According to NASA Administrator Jared Isaacman, problems encountered while loading helium into the Interim Cryogenic Propulsion Stage (ICPS) necessitate rolling the massive rocket back to the Vehicle Assembly Building (VAB) for diagnosis and repair.
The logistics of shuffling the vehicle 6.8 kilometers (4.2 miles) from the pad to the VAB is going to eat up at least a week, and sending it back the other way is naturally just as much of a production. Add in the time they’ll need to actually figure out what’s wrong with the ICPS and make the necessary repairs, and it’s easy to see why a March launch is almost certainly off the table. It’s frustrating to see the Artemis II mission get delayed this close to launch, but sending humans into space isn’t the sort of thing you can cut corners on.
Espressif has unveiled its latest major chip in the form of the ESP32-E22. Officially referred to as a Radio Co-Processor (RCP), it’s intended to be used via its PCIe 2.1 or SDIO 3.0 host interface to provide wireless communications to an SoC or similar.
This wireless functionality includes full WiFi 6E support across all three bands, 160 MHz channel bandwidth, and 2×2 MU-MIMO, making it quite a leap from the basic WiFi provided by e.g. the ESP32-S* and -C* series. There is also Bluetooth Classic and BLE 5.4 support, which is a relief for those who missed Bluetooth Classic (absent from everything but the original ESP32) for uses like A2DP sinks and sources.
The ESP32-E22's processing grunt is provided by two proprietary Espressif RISC-V CPU cores that can run at 500 MHz. At this point no details appear to be available about whether a low-power core is also present, nor about any additional peripherals. Since the graphics in the Espressif PR article appear to be generic, machine-generated images that randomly switch the chip's appearance between a BGA and an LQFP package, there's little more we can glean from there either.
Currently Espressif is making engineering samples available to interested parties after presumed vetting, which suggests that any kind of public release is still a ways off. Whether this chip could make for an interesting stand-alone MCU or SoC along the lines of the -S3 or -P4 will remain a mystery for a while longer.
Thanks to [Rogan] for the tip.
The concept of remote video calls has been worked on since Bell's phone company began pitching the upgrade from telegrams to real-time voice calls. It wasn't until the era of digital video and real-time video compression that commercial solutions became feasible, with the 1985 Image Data Corporation Photophone CP220 being an early example. The CP220 is also exceedingly rare, having cost around $25,000 USD when adjusted for inflation. This makes the teardown and repair on the [SpaceTime Junction] channel a rather unique experience.
Perhaps the coolest part of the device is that the manual is integrated into the firmware, allowing you to browse through it on the monochrome CRT. Unfortunately, after working fine for a while the device released the magic smoke, courtesy of the usual Rifa capacitors doing their thing. This is why a full teardown was necessary, with the PSU dug out and said capacitors swapped.
After this repair the device powered on again, happily accepting a video input and saving screenshots to the floppy drive before the drive was replaced with an FDD emulator running FlashFloppy firmware. Unfortunately no video call was attempted, probably because of the missing camera and the hassle of setting up a suitable POTS landline for the built-in modem. Hopefully that will come in a future video, showing what we common folk were missing out on back in the day.
Continue reading “Powering On A 1985 Photophone CP220 Videoconference System”
Everyone loves a full-wave bridge rectifier, but there's no denying that they aren't 100% efficient due to the diode voltage drop. That's not to say that with some effort we cannot create an ideal bridge rectifier using active components, as demonstrated by [Mousa] with an active bridge circuit. It uses the NXP TEA2208T active bridge rectifier controller, along with the requisite four MOSFETs.

Taking the circuit from the datasheet, a PCB was created featuring four FDD8N50NZ MOSFETs in addition to the controller IC. These were then compared to a diode-based bridge rectifier, showing the imperfections with the latter when analyzing the output using an oscilloscope.
As expected, the active rectifier's output was also about one volt higher than that of the diode bridge rectifier, which is another small boost to overall efficiency. According to NXP's product page, there's about a 1.4% efficiency gain at 90 VAC, with the chip being promoted for high-efficiency operations. When you consider that many designs like computer PSUs feature one or more diode bridge rectifiers, often strapped to heatsinks, the appeal becomes apparent. As for [Mousa], he put this particular board in his laboratory PSU instead of the diode bridge rectifier, because why not.
Perhaps the biggest impediment to using an active rectifier is the cost, with the TEA2208T coming in at $4 on DigiKey in quantities of 100, in addition to the MOSFETs, PCB, etc. If power efficiency isn't the goal, then some wasted power and an aluminium heatsink are definitely cheaper.
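The efficiency argument boils down to conduction losses: a diode bridge always drops two forward voltages in series with the load, while an active bridge only dissipates I²R in the on-resistance of two conducting MOSFETs. Here's a back-of-the-envelope sketch of that comparison; the current, forward voltage, and on-resistance below are illustrative assumptions, not values measured from [Mousa]'s build or the TEA2208T datasheet:

```python
# Conduction-loss comparison: diode bridge vs. active MOSFET bridge.
# All component values are illustrative assumptions for this sketch.

I_LOAD = 1.0    # load current in amps (assumed)
V_F = 0.7       # forward drop per silicon diode in volts (typical figure)
R_DS_ON = 0.3   # on-resistance per MOSFET in ohms (assumed)

# In a full bridge, current always flows through two devices at a time.
p_diode = 2 * V_F * I_LOAD              # diode bridge: 2 * Vf * I
p_active = 2 * I_LOAD ** 2 * R_DS_ON    # active bridge: 2 * I^2 * Rds(on)

print(f"Diode bridge loss:  {p_diode:.2f} W")
print(f"Active bridge loss: {p_active:.2f} W")
```

Note that the diode loss scales linearly with current while the MOSFET loss scales with its square, so the active bridge's advantage depends on picking MOSFETs with low enough on-resistance for the expected load.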
Continue reading “Active Ideal Full Bridge Rectifier Using TEA2208T”
Neutrinos are exceedingly common in the Universe, with billions of them zipping around us throughout the day from a variety of sources. Due to their extremely low mass and lack of electric charge, they barely ever interact with other particles, making these so-called 'ghost particles' very hard to detect. That said, when they do interact the result is rather spectacular, as they impart significant kinetic energy. Neutrino detectors look for the resulting flash of energy, with detected neutrinos generally topping out at around 10 petaelectronvolt (PeV), except for a 2023 event.
This neutrino event, which occurred on February 13th back in 2023, was detected by the KM3NeT/ARCA detector and has now been classified as an ultra-high-energy neutrino event at 220 PeV, suggesting that it was likely a cosmogenic neutrino. When we originally reported on this KM3-230213A event, the data was still being analyzed based on a muon detected from the neutrino interaction event, with the researchers also having to exclude the possibility of a sensor glitch.
By comparing the KM3-230213A event data with data from other events at other detectors, it was possible to deduce that the most likely explanation was one of these ultra-high-energy neutrinos. Since these are relatively rare compared to neutrinos that originate within or near our solar system, it will likely take a while before more of them are detected. As the KM3NeT/ARCA detector grid is still being expanded, we may yet see many more of them in Earth's oceans. After all, if a neutrino hits a particle but there's no sensor around to detect it, we'd never know it happened.
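For a sense of just how much energy 220 PeV is for a single subatomic particle, converting it to everyday units is a quick exercise (our own arithmetic, not a figure from the KM3NeT collaboration):

```python
# Convert the reported KM3-230213A neutrino energy from PeV to joules.
EV_TO_JOULE = 1.602176634e-19  # exact, per the SI definition of the electronvolt

energy_pev = 220               # reported event energy in PeV
energy_ev = energy_pev * 1e15  # 1 PeV = 1e15 eV
energy_j = energy_ev * EV_TO_JOULE

print(f"{energy_j:.3f} J")
```

That works out to roughly 0.035 J, a macroscopic amount of energy packed into a single particle, which is why the resulting cascade in the detector is so conspicuous.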
Top image: One of the photo-detector spheres of ARCA (Credit: KM3NeT)
Hold onto your hats, everyone — there’s stunning news afoot. It’s hard to believe, but it looks like over-reliance on chatbots to do your homework can turn your brain into pudding. At least that seems to be the conclusion of a preprint paper out of the MIT Media Lab, which looked at 54 adults between the ages of 18 and 39, who were tasked with writing a series of essays. They divided participants into three groups — one that used ChatGPT to help write the essays, one that was limited to using only Google search, and one that had to do everything the old-fashioned way. They recorded the brain activity of writers using EEG, in order to get an idea of brain engagement with the task. The brain-only group had the greatest engagement, which stayed consistently high throughout the series, while the ChatGPT group had the least. More alarmingly, the engagement for the chatbot group went down even further with each essay written. The ChatGPT group produced essays that were very similar between writers and were judged “soulless” by two English teachers. Go figure.