Blender Builds LEGO Models

Blender is a free and open source computer graphics package that’s used in the production of everything from video games to feature films. Now, as demonstrated by [Joey Carlino], the popular program can even be used to convert models into LEGO.

This new feature in Blender 3.4 allows instance attributes to be used in a way that lets a large number of points be generated on a model without putting undue strain on (or outright crashing) the software. Essentially, an existing model is sampled into discrete points at specific intervals. The spacing of those intervals is set to match the dimensions of LEGO bricks, which gives the model the low-resolution look of a real LEGO set. From there, a model of a brick is instanced at each of these points, and colors can then be transferred to the bricks individually.
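
We haven’t reproduced [Joey]’s node tree here, but the core of the trick is easy to sketch in ordinary Python: quantize sample points to a grid with LEGO proportions (the standard 8 mm stud pitch and 9.6 mm brick height), collapse everything that lands in the same cell into a single brick, and carry a color along with it. This is only an illustration of the idea, not the actual node setup from the video.

```python
# Rough sketch of the brick-conversion idea: snap sample points from a model
# onto a LEGO-sized grid, keep one 1x1 brick per occupied cell, and average
# the colors that fall into that cell.
from collections import defaultdict

PITCH = 8.0    # horizontal stud spacing of a LEGO brick, in mm
HEIGHT = 9.6   # height of a standard (non-plate) brick, in mm

def to_bricks(samples):
    """samples: iterable of ((x, y, z), (r, g, b)) points taken from the model."""
    cells = defaultdict(list)
    for (x, y, z), color in samples:
        # Quantize each point to the brick grid; everything that falls into
        # the same cell becomes a single 1x1 brick.
        key = (round(x / PITCH), round(y / PITCH), round(z / HEIGHT))
        cells[key].append(color)

    bricks = []
    for (i, j, k), colors in cells.items():
        # Average the colors that landed in this cell (nearest-sample color
        # transfer would work just as well).
        avg = tuple(sum(channel) / len(colors) for channel in zip(*colors))
        bricks.append(((i * PITCH, j * PITCH, k * HEIGHT), avg))
    return bricks

# Three sample points collapse into two bricks, one red and one green.
print(to_bricks([((0.0, 0.0, 0.0), (1, 0, 0)),
                 ((7.9, 0.2, 0.1), (0, 1, 0)),
                 ((0.1, 0.3, 0.2), (1, 0, 0))]))
```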

The demonstration [Joey] uses is converting a beach ball model to LEGO, but applying these tools to other models delivers some striking results. He goes over a lot of the details of how to create these conversions, and it would only be a short step from there to ordering the bricks themselves. Or you could skip the bricks entirely and send the model to a 3D printer straight from Blender. Not bad for free software!
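
If you’d rather script that last step than click through the export menus, something along these lines should work from Blender 3.x’s Python console. The object and modifier names here are placeholders, and it leans on the STL exporter add-on that ships enabled with Blender 3.x:

```python
import bpy

# Assumed names: "LegoBricks" for the generated object and "GeometryNodes"
# for its modifier; substitute whatever your scene actually uses.
obj = bpy.data.objects["LegoBricks"]
bpy.context.view_layer.objects.active = obj
obj.select_set(True)

# Bake the node tree down to real geometry (the node tree needs to realize
# its instances for this to leave printable mesh behind).
bpy.ops.object.modifier_apply(modifier="GeometryNodes")

# Export just the selected object as an STL, ready for the slicer.
bpy.ops.export_mesh.stl(filepath="/tmp/lego_model.stl", use_selection=True)
```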

13 thoughts on “Blender Builds LEGO Models”

  1. It looks neat, but unless I’m missing something – and from the obvious seam lines it doesn’t look like I am – I can’t imagine the person who did this has ever put together a real LEGO model before.

    Something made entirely out of 1×1 bricks is going to have no stability at all. The author also seems to be under the misapprehension that LEGO models are only made out of that singular type of brick.

    It’s a neat demo of the instancing feature of Blender, but based on the headline, I was under the impression that this was going to be something that turns Blender into a contender against programs like MLCad or LDraw. Nice clickbait, I guess.

    1. Hacks, by their very nature, are not necessarily prepackaged solutions, but rather “building blocks” (see what I did there) for further hacks.

      Aggregation of Lego pieces into larger shapes is a separate and very difficult project. What if you are out of the pieces you need? What if there is a six-month lead time for the pieces you need? Can you actually get all of the pieces it says you need? Do they even exist? The software would need an automatically updating list of what parts are actually available today. Can you tell it to try to make the thing using only the pieces you have? That is what you would expect, right? It sounds like a hard AI project on the order of the travelling salesman problem.

    2. To add to what X said: slow your roll – this is a demo of Blender’s very new geometry nodes feature, a visual-programming-based toolkit for parametrically manipulating or generating 3D model geometry. A basic programmatic method isn’t going to match the intent behind a human-designed model. With some more work I think this could grow into a more informed and lifelike tool – it would probably require a custom script, given that most visual programming tools don’t handle recursion or decision trees all that well. And this kind of exposure might inspire someone to take the initiative to do just that.

      1. More than this! Group adjacent same-color voxels, arrange them into solid blocks of the same color, then use a sorting algorithm that refers to a database of coherent stripes and blocks to split everything into the needed pieces.

        1. You are basically asking for a software equivalent to a full team of Lego master builders that can produce weeks’ worth of work in seconds. I’m not here to say it’s impossible, but this is “robots will steal your job” thinking.
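
For the curious, the simplest version of the merging idea floated above isn’t hard to sketch: walk each row of the voxel grid and greedily cover runs of same-color studs with the longest 1×N brick on hand. Stability, interlocking between layers, and real-world parts availability are exactly the hard parts a toy like this ignores (the parts list below is made up):

```python
# Minimal sketch of greedy row merging: collapse runs of same-color 1x1
# bricks along a row into the longest available 1xN brick.
AVAILABLE_LENGTHS = [8, 6, 4, 3, 2, 1]  # assumed 1xN bricks in the parts bin

def merge_row(row):
    """row: list of colors for consecutive studs, e.g. ['red', 'red', 'blue'].
    Returns a list of (start_index, length, color) bricks."""
    bricks = []
    i = 0
    while i < len(row):
        # Find the run of identical colors starting here.
        run = 1
        while i + run < len(row) and row[i + run] == row[i]:
            run += 1
        # Cover the run with the largest bricks that fit.
        offset = 0
        while offset < run:
            length = next(l for l in AVAILABLE_LENGTHS if l <= run - offset)
            bricks.append((i + offset, length, row[i]))
            offset += length
        i += run
    return bricks

print(merge_row(['red'] * 5 + ['blue'] * 2 + ['red']))
# -> [(0, 4, 'red'), (4, 1, 'red'), (5, 2, 'blue'), (7, 1, 'red')]
```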

  2. Nice! Looking forward to watching this tomorrow (it’s late and I’m running a fever, blah). I’ve been working on incorporating geometry nodes into a sculpting workflow, and it will be very interesting to see what I can extract and learn from this example, because the documentation is still awfully thin in this area. It’s really fascinating how procedural the entire system can become, including the logic-branching options: even without having seen exactly the route this specific workflow took, it should be entirely possible to build on it to do more than just 1×1 bricks, based on adjacent node color. What gets really neat about the geometry nodes system (which, btw, predates 3.4) is that unlike shaders, which only create visual artifacts for fun renders and are of no use for something like 3D printing (at least not without substantial work or specialized tooling that I’m unaware of), geometry nodes really do create an actual 3D mesh that can be acted upon in all the expected ways.

    I was torn between learning sculpting in ZBrush or Blender, but after a few days working in Core I’m actually starting to lean back towards Blender. As nice as the insert mesh brush seems to be in ZBrush, I feel like the geometry nodes system in Blender 3 has a lot more power and potential, at least for someone who is primarily sculpting with the intent to 3D print (and who has both a software engineering and a fine arts background, rather than just fine arts). For all the hype around ZBrush, I haven’t been nearly as impressed as I expected to be. Of course, the part where my older SpaceExplorer still works in Blender 3 but not in ZBrush may have biased me a little too, lol.

  3. Looks lovely. I might need to look into this to visualise what 3D RGBCMYKWT dithering looks like.
    (string[] RGBCMYKWT = { "red", "green", "blue", "cyan", "magenta", "yellow", "black", "white", "transparent" };)
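
A toy version of that dithering idea, for anyone who wants to play along: quantize each voxel to the nearest color in the palette and push the quantization error to the next voxel along the row. Real 3D error diffusion would spread the error to more neighbors, and the RGB value standing in for “transparent” here is just a guess:

```python
# Toy 1D error-diffusion dither against the RGBCMYKWT palette.
PALETTE = {
    "red": (255, 0, 0),      "green": (0, 255, 0),       "blue": (0, 0, 255),
    "cyan": (0, 255, 255),   "magenta": (255, 0, 255),   "yellow": (255, 255, 0),
    "black": (0, 0, 0),      "white": (255, 255, 255),   "transparent": (128, 128, 128),
}

def nearest(color):
    """Name of the palette entry closest to an (r, g, b) color."""
    return min(PALETTE, key=lambda name: sum((a - b) ** 2
                                             for a, b in zip(color, PALETTE[name])))

def dither_row(row):
    """row: list of (r, g, b) voxel colors along one axis of the grid."""
    out, error = [], (0, 0, 0)
    for color in row:
        adjusted = tuple(c + e for c, e in zip(color, error))
        name = nearest(adjusted)
        # Carry the quantization error forward to the next voxel.
        error = tuple(a - p for a, p in zip(adjusted, PALETTE[name]))
        out.append(name)
    return out

print(dither_row([(200, 40, 40), (180, 60, 60), (160, 80, 80)]))
```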
