Reverse-Engineering Makita Batteries To Revive Them

Modern lithium-ion battery packs for cordless power tools contain an incredible amount of energy, which necessitates that they come with a range of safeties. Although it's good when the battery management system (BMS) detects a fault and cuts power to prevent issues, there is always the possibility of a false positive. Having an expensive battery pack brick itself for no good reason is rather annoying, as is being unable to reuse a BMS in, for example, a remanufactured battery pack. This was the reasoning that led [Martin Jansson] down the path of reverse-engineering Makita batteries, for starters.

After that initial reverse-engineering attempt involving a firmware dump of the NEC (Renesas) F0513 MCU, [Martin] didn't get back to the project until recently, when he was contacted by [Romain], who donated a few BMS boards to the cause. One of these features an STM32 MCU, which made the task much easier. Ultimately [Martin] was able to determine the command set for the Maxim OneWire-based communication protocol, and also uncovered a hidden UART mode.

Due to the critical timing required, off-the-shelf programmers didn't work, so an Arduino Uno-based programmer (ArduinoOBI) was created instead. It can be found on GitHub alongside the Open Battery Information desktop application, which provides access to these BMS features once connected to the battery pack. Although only Makita is supported right now, [Martin] would like to see support for other brands added as well.
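To get a feel for why a bare-metal microcontroller beats a generic adapter here, consider the timing a standard Maxim 1-Wire bus demands: the reset, presence, and bit slots are all measured in tens of microseconds. The sketch below is not [Martin]'s ArduinoOBI firmware, just a minimal illustration of the generic 1-Wire primitives on an Arduino Uno; the pin choice is arbitrary, and Makita's actual command set (and any deviations from stock timing) sits on top of this layer.

// Minimal sketch of standard Maxim 1-Wire reset/presence and bit slots,
// NOT the ArduinoOBI firmware. Pin choice is an assumption; a real bus
// also wants an external pull-up resistor rather than INPUT_PULLUP alone.
const uint8_t OW_PIN = 2;

void owDriveLow() { pinMode(OW_PIN, OUTPUT); digitalWrite(OW_PIN, LOW); }
void owRelease()  { pinMode(OW_PIN, INPUT_PULLUP); }  // let the pull-up raise the line

bool owReset() {
  noInterrupts();                         // slots are tens of microseconds, keep ISRs out
  owDriveLow();  delayMicroseconds(480);  // reset pulse
  owRelease();   delayMicroseconds(70);   // wait for the slave's presence window
  bool present = (digitalRead(OW_PIN) == LOW);
  interrupts();
  delayMicroseconds(410);                 // finish out the reset slot
  return present;
}

void owWriteBit(bool bit) {
  noInterrupts();
  owDriveLow();  delayMicroseconds(bit ? 6 : 60);   // short low = 1, long low = 0
  owRelease();   delayMicroseconds(bit ? 64 : 10);  // pad the slot to roughly 70 us
  interrupts();
}

bool owReadBit() {
  noInterrupts();
  owDriveLow();  delayMicroseconds(6);    // start the read slot
  owRelease();   delayMicroseconds(9);    // sample near the 15 us mark
  bool bit = digitalRead(OW_PIN);
  interrupts();
  delayMicroseconds(55);
  return bit;
}

void setup() {
  Serial.begin(115200);
  Serial.println(owReset() ? "BMS responded" : "no presence pulse");
}

void loop() {}

With interrupts disabled around each slot, even an 8-bit AVR can hit these windows reliably, which is presumably why a dedicated Uno-based dongle works where more abstracted off-the-shelf programmers struggle.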

Voyager 1 Once Again Returning Science Data From All Four Instruments

As humanity's furthest reach into the Universe so far, the two Voyager spacecraft and their well-being are of utmost importance to many. Although we know that every science mission must end one day, the recent near-death experience of Voyager 1 still came as a shock. Now it seems that things have more or less returned to normal, with all four remaining scientific instruments back online and returning science data.

Since the completion of Voyager 1's primary mission over 43 years ago, five of its instruments (including the cameras) have been disabled to cope with its diminishing power reserves, and two more instruments have failed. This leaves the magnetometer (MAG), low-energy charged particle (LECP) and cosmic ray (CRS) instruments, as well as the plasma wave subsystem (PWS). All four are now back in operation, judging by the returned science data, after the Voyager team had previously confirmed that it was once again receiving engineering data.

With Voyager 1 now mostly back to normal, some housekeeping is in order: resynchronizing the onboard clock, as well as maintenance on the digital tape recorder. This should ensure that the venerable spacecraft is ready for its 47th anniversary this fall.

Thanks to [Mark Stevens] for the tip.

Easy Retro 3D Look With Voxel Displacement Renderer

Voxels are effectively 3D pixels, and they form an integral part of what is commonly referred to as a 'retro 3D' look, with pixelated edges sharp enough to cut your retinas on. The problems with modeling a scene using voxels come in the form of creating the geometry and somehow making a physics engine work with voxels rather than conventional triangle (or quad) meshes.

The same scene in Blender (above) and in the voxel-based renderer (below). (Credit: Daniel Schroeder)

The approach demonstrated by [Daniel Schroeder] comes in the form of a Voxel Displacement Renderer, implemented in C++ and using the Vulkan API. The best part? It requires only standard meshes along with albedo and displacement maps.

These inputs are processed by the C++-based tools, which generate the voxels to be rendered along with their properties, while a GLSL-based shader handles the GPU-side rendering step. The pre-processing required makes it a good idea to bake these resources rather than try to generate them in real time. With that done, [Daniel]'s demo was able to sustain a solid 100+ FPS on a Radeon RX 5700 XT GPU at 1440p, and 60+ FPS on a Steam Deck OLED.
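As a rough mental model of that pre-processing step, imagine walking across each triangle of the base mesh, sampling the displacement map, pushing the sample point out along the surface normal, and snapping the result to a voxel grid. The C++ sketch below illustrates just that idea; the structure names, the single flat-shaded triangle, and the procedural stand-in for a displacement map are our own simplifications, not code from [Daniel]'s tools.

// Hypothetical sketch of the bake step: rasterize a triangle in barycentric
// steps, displace each surface point along the normal, quantize to a grid.
#include <cmath>
#include <cstdio>
#include <unordered_set>

struct Vec3 { float x, y, z; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct VoxelKey {
  int x, y, z;
  bool operator==(const VoxelKey& o) const { return x == o.x && y == o.y && z == o.z; }
};
struct VoxelHash {
  size_t operator()(const VoxelKey& k) const {
    return (size_t)k.x * 73856093u ^ (size_t)k.y * 19349663u ^ (size_t)k.z * 83492791u;
  }
};

// Placeholder for a real texture fetch; here it is just a ripple pattern.
float sampleDisplacement(float u, float v) {
  return 0.05f * std::sin(20.0f * u) * std::cos(20.0f * v);
}

int main() {
  const float voxelSize = 0.01f;
  Vec3 a{0, 0, 0}, b{1, 0, 0}, c{0, 1, 0};  // one triangle of the base mesh
  Vec3 n{0, 0, 1};                          // its flat-shaded normal
  std::unordered_set<VoxelKey, VoxelHash> voxels;

  // Step across the triangle roughly one voxel at a time.
  for (float u = 0; u <= 1; u += voxelSize)
    for (float v = 0; u + v <= 1; v += voxelSize) {
      Vec3 p = a * (1 - u - v) + b * u + c * v;  // point on the base surface
      p = p + n * sampleDisplacement(u, v);      // push it out along the normal
      voxels.insert({(int)std::floor(p.x / voxelSize),
                     (int)std::floor(p.y / voxelSize),
                     (int)std::floor(p.z / voxelSize)});
    }

  std::printf("emitted %zu voxels\n", voxels.size());
}

Baking this ahead of time, as the write-up suggests, keeps the per-frame cost down to rendering the already-generated voxels on the GPU.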

In a second blog post, [Daniel] goes through his motivation for the project: it was originally intended as a showpiece for his resume, but he can imagine it being integrated into a game engine.

There are still questions to be resolved, such as how to integrate this technique for in-scene characters and other dynamic elements (i.e. non-static scenery), but in terms of easing voxel-based rendering by supporting a standard mesh-based workflow it’s an intriguing demonstration.

Continue reading “Easy Retro 3D Look With Voxel Displacement Renderer”

Forsp: A Forth & Lisp Hybrid Lambda Calculus Language

In the world of lambda calculus programming languages, there are many ways to express terms, which is why we ended up with such an amazing range of programming languages, even if most trace their roots back to ALGOL. Of the more unique (and practical) languages, Lisp and Forth probably rank near the top, but what if you were to smudge the two together? That's what [xorvoid] did, and the result is the gracefully titled Forsp programming language. Unsurprisingly, it got a very warm and enthusiastic reception over at Hacker News.

While it keeps many Lisp-isms, the Forth side shows primarily in Forsp being very small and easy to implement, as demonstrated by the C-based reference implementation, as well as in its Forth-like value/operand stack and function application. Also interesting is that Forsp uses call-by-push-value (CBPV), which is quite different from call-by-value (CBV) and call-by-name (CBN) and may offer some advantages, if you can wrap your mind around the concept.
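To give a very loose feel for that evaluation style, the C++ sketch below pushes an argument onto an explicit operand stack as a suspended computation (a thunk), which the callee forces whenever it chooses. This is our own simplified illustration of the general push/force idea, not Forsp's syntax, semantics, or implementation.

// Loose illustration of the call-by-push-value flavor: arguments sit on an
// explicit operand stack as thunks, and the callee decides when to force them.
#include <functional>
#include <iostream>
#include <vector>

using Thunk = std::function<int()>;

struct OperandStack {
  std::vector<Thunk> items;
  void push(Thunk t) { items.push_back(std::move(t)); }
  Thunk pop() { Thunk t = std::move(items.back()); items.pop_back(); return t; }
};

void callee(OperandStack& s) {
  Thunk arg = s.pop();                       // receive the suspended computation
  std::cout << "callee got a suspended computation\n";
  std::cout << "forcing it: " << arg() << "\n";  // only now does the work run
}

int main() {
  OperandStack s;
  // Under call-by-value this expression would be evaluated right here, at the
  // call site; in this style we merely push it, unevaluated.
  s.push([] { std::cout << "evaluating the argument\n"; return 6 * 7; });
  callee(s);
}

The payoff of making the push explicit is that both eager (CBV-like) and lazy (CBN-like) behavior fall out of when you choose to force, which is exactly the kind of flexibility CBPV is known for.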

Even if its practicality is debatable, Forsp is another delightful addition to the list of interesting lambda calculus demonstrations which show that the field is anything but static or boring.

TDS 744A Scope Teardown Fixes Dodgy Channel

There are a lot of oscilloscopes from around the 1990s which are still very much desirable today, such as the Tektronix TDS 744A which [DiodesGoneWild] got his grubby mitts on. This is a 500 MHz, 4-channel scope, with a capture rate of 500 MS/s (4 channels) to 2 GS/s (1 channel). It also has a color display and even comes with a high-density (1.44 MB) floppy drive. Unfortunately this particular unit was having trouble with its fourth channel, and its NuColor display had degraded, something that’s all too common with this type of hybrid CRT/LCD (LCCS) technology.

A teardown of the unit to inspect its guts revealed no obvious damage on the PCBs, nor anything on the acquisition board that would explain the weird DC offset on the fourth channel. After cleaning and inspecting the capture module and putting the unit back together, the bias on channel four seemed to have disappeared. A reminder that the best problems are the ones that solve themselves. As for the NuColor display, this uses a monochrome CRT (which works fine) and an LCD with color filters. It's the latter that has degraded on this unit, with a repair still being planned.

We have covered NuColor-based devices before; they offer super-sharp details that are hard to match even with modern-day LCDs, never mind the ones from the 90s. Fixing these NuColor displays can be easy-ish sometimes, as [JVG] found when tearing apart a very similar Tektronix TDS 524A, which required a power supply fix and the removal of the goopy gel between the CRT and LCD to restore it.

Continue reading “TDS 744A Scope Teardown Fixes Dodgy Channel”

From Nissan ICE Pickup To BEV With Nissan Leaf Heart

First run of the motor with battery pack still externally connected.

Last year [Jimmy] got a request from a customer ([Dave]) to help convert a 1998 Nissan Frontier pickup into an electric vehicle, with a crashed 2019 Nissan Leaf providing the battery and electric motor for the conversion. He has documented the months-long journey with plenty of photos, as well as a series of videos over at the [EVSwap Conversions] YouTube channel. While the idea sounds easy enough, there's a lot more to it than swapping out the ICE for an electric motor and sticking some batteries to the bottom of the car with double-sided tape. The pickup truck was effectively stripped down and gutted before the 110 kW (150 HP) motor was installed using an adapter plate.

The donor Leaf's battery pack came in at a decently sized 40 kWh, which should give the converted Nissan Frontier BEV a range of easily 100 miles. The pack was split in two, with each half placed in a custom aluminium battery box mounted on either side of the driveshaft. The charging port was installed on the front of the car, next to the logo, discreetly behind a panel. Much of the front opening that used to feed the ICE's radiator was sealed up to reduce aerodynamic drag, complementing the new low-friction tires that were fitted. Although the converted car still has a radiator, it only needs to assist in cooling the motor stack (including inverter and charger) when driving slowly or when charging, a far less demanding job that allows for a sleeker front end.

As a bonus, the car retains the manual 5-speed shifter, just without a clutch, and the pickup bed can now also tilt, albeit with hydraulics (so far). Considering that it started with a decent 1998 pickup and a totaled Nissan Leaf, this is among the cleanest conversions we have seen, not to mention a good use of a crashed BEV.

Thanks to [JohnU] for the tip.

Continue reading “From Nissan ICE Pickup To BEV With Nissan Leaf Heart”

EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator

Alibaba's EMO (or Emote Portrait Alive) framework is a recent entry in a series of attempts to generate a talking head from existing audio (spoken word or vocals) and a reference portrait image. At its core it uses a diffusion model trained on 250 hours of video footage and over 150 million images. But unlike previous attempts, it adds what the researchers call a speed controller and a face region controller, which serve to stabilize the generated frames, along with an additional module that stops the diffusion model from outputting frames that stray too far from the reference image used as input.

In the related paper, [Linrui Tian] and colleagues show a number of comparisons between EMO and other frameworks, claiming significant improvements over them. The researchers also provide a number of examples of talking and singing heads generated with the framework, which gives some idea of what are probably the 'best case' outputs. In some examples, like [Leslie Cheung Kwok Wing] singing 'Unconditional', big glitches are obvious and there's a definite mismatch between the vocal track and the facial motions. Despite this, it's quite impressive, especially with the fairly realistic movement of the head, including blinking of the eyes.

Meanwhile some seem extremely impressed, such as [Matthew Berman] in a recent video on EMO, where he states that Alibaba releasing this framework to the public might be 'too dangerous'. The level-headed folks over at PetaPixel, however, also note the obvious visual imperfections that are a dead giveaway for this kind of generative technology. Much like other diffusion model-based generators, it would seem that EMO is still very much stuck in the uncanny valley, with no clear path to becoming a real human yet.

Continue reading “EMO: Alibaba’s Diffusion Model-Based Talking Portrait Generator”