Running A Minecraft Server On A WiFi Light Bulb

WiFi-enabled ‘smart’ light bulbs are everywhere these days, and each one of them has a microcontroller inside that’s capable enough to run all sorts of interesting software. For example, [vimpo] decided to get one running a minimal Minecraft server.

The BL602-equipped board inside the LED light bulb. (Credit: vimpo, YouTube)

Inside the target bulb is a BL602 MCU from Bouffalo Lab, which features not only a radio supporting 2.4 GHz WiFi and Bluetooth LE 5, but also a single-core RISC-V CPU running at 192 MHz, backed by 276 kB of RAM and 128 kB of flash.

This was plenty of room for the minimalist Minecraft server [vimpo] wrote several years ago. The project description says it was designed for “machines with limited resources”, but you’ve still got to wonder whether anyone expected it to end up running on a literal light bulb.

It should be noted, of course, that this is not a full Minecraft server, and it’s best suited to smaller games like the demonstrated TNT Run minigame.
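
To get a feel for what even a stripped-down server has to speak, here is a rough sketch in C of decoding the Minecraft protocol's VarInt framing, the variable-length integer that prefixes every packet. It's purely illustrative and not taken from [vimpo]'s code.

    #include <stdint.h>
    #include <stddef.h>

    /* Decode a Minecraft-protocol VarInt: 7 data bits per byte, a set high
     * bit means another byte follows, and a 32-bit value takes at most
     * 5 bytes. Returns the number of bytes consumed, or -1 on truncated or
     * overlong input. Illustrative sketch only, not [vimpo]'s implementation. */
    int read_varint(const uint8_t *buf, size_t len, int32_t *out)
    {
        uint32_t value = 0;   /* accumulate unsigned to avoid shift overflow */
        int shift = 0;

        for (size_t i = 0; i < len && i < 5; i++) {
            value |= (uint32_t)(buf[i] & 0x7F) << shift;
            if ((buf[i] & 0x80) == 0) {   /* high bit clear: last byte */
                *out = (int32_t)value;
                return (int)(i + 1);
            }
            shift += 7;
        }
        return -1;
    }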

Perhaps the next challenge will be to combine a large set of these light bulbs into a distributed computing cluster and run a full-fat Minecraft server? It seems like a waste to leave the BL602s and Espressif MCUs that are in these IoT devices condemned to a life of merely turning the lights on or off when we could have them do so much more.

17 thoughts on “Running A Minecraft Server On A WiFi Light Bulb”

    1. 9 out of 10 times when they break, the light bulb itself is fine; it’s just the power supply that’s burnt. You can still hook up 5/3.3 V to the board and it might work just fine … I’d call that “free”.

        1. It should last for years, but it depends on usage and external factors; heat kills the electronics.
          I had 4 or 5 smart lights burn out after maybe a year of use. On the other hand, I’ve had the exact same bulbs work for years in other fixtures.
          I guess it’s a combo of QC and heat dissipation.

          One thing is certain: it was the little transformer every time.

      1. ” It seems like a waste to leave the BL602s and Espressif MCUs that are in these IoT devices condemned to a life of merely turning the lights on or off when we could have them do so much more.”

        Sounds like you’re supposed to buy the bulb to save the MCU from a boring life instead of just buying the MCU?

  1. With that much communications and compute horsepower this sounds like an ideal setup for a distributed LiFi-like broadcast setup.

    Also a great plot element for an infiltration story.

    1. That’s an interesting idea. Do the phosphors in white LEDs react fast enough to make that invisible?
      On a related note, I recently visited a restaurant that was mainly illuminated by strings of half-wave-rectified LEDs; it made me feel nauseous before the food ever came.

      1. I measured “normal” white (i.e. red+yellow) phosphors to have time constants in the 1-2 microsecond range, so they could support symbol rates on the order of 100 kbaud. Actual bit rates might be 2-5x that, depending on coding complexity.

        But the underlying blue LED can be modulated much faster, and even with relatively mundane drivers could support 10 Mbps outbound bandwidth, and the narrowband blue light would afford better SNR through filtering out other ambient light.

        The return channel, if needed, could be a plain old near-infrared receiver. Bit rates vary with range, SNR, etc., but existing COTS solutions support everything from 4 Mbps to 9600 bps, or even longer range/poor SNR but slower solutions with IR remote-control signalling protocols.
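
        A back-of-the-envelope check, treating the phosphor as a single-pole low-pass filter with time constant τ:

        \[ f_{-3\,\mathrm{dB}} = \frac{1}{2\pi\tau} \approx 80\text{–}160\ \mathrm{kHz} \quad \text{for } \tau = 1\text{–}2\,\mu\mathrm{s}, \]

        which lines up with the roughly 100 kbaud figure above.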

  2. People of the past had such great imagination and plans for what would happen when electronics got so cheap and small that we could have the computing power of a whole desktop computer, by 1980s standards, in a light bulb.

    What we actually do with it is just turning the lights on and off.

    Imagine that and extrapolate 50 years into the future, when we will be using the equivalent of a 64 GB Ryzen 9 setup running at 5.4 GHz to ring the doorbell.

    1. And the funny part will be that it makes sense to do so.

      Because manufacturing a simpler microchip to make the “ding dong” sound at the press of a button would cost more and have a longer lead time, and besides there’s nobody left who can design such archaic stuff anyhow. If it doesn’t run a live LLM interpreter then nobody knows how to program it.

      How you operate it: you tell the LLM what you want the device to do through a speech recognition program and the LLM takes the description of the task and turns it into lower level instructions like Python on the fly, which then gets interpreted into machine instructions and executed. The reason why you need such a powerful CPU to run it is because the program flow is handled at the highest level by the LLM which monitors the user inputs and keeps re-generating the commands over and over.

      After all, the user might decide that the doorbell should make a “bong bong” sound instead, so they need to be able to talk to the doorbell and tell it to do that instead.
