G-code Goes Binary With Proposed New Format

Art of 3D printer in the middle of printing a Hackaday Jolly Wrencher logo

G-code is effective, easily edited, and nearly ubiquitous when it comes to anything CNC. The format has many strengths, but space efficiency isn’t one of them. In fact, when it comes to 3D printing in particular, file sizes can get awfully large. Partly to address this, Prusa have proposed a new .bgcode binary G-code format. You can read the specification of the new (and optional) format here.

The newest version of PrusaSlicer has support for .bgcode, and a utility to convert ASCII G-code to binary (and back) is in the File menu. Want to code an interface of your own? The libbgcode repository provides everything needed to flip .gcode to .bgcode (with a huge file size savings in the process) and vice versa in a way that preserves all aspects of the data. Need to hand-edit a binary G-code file? Convert it to ASCII G-code, make your changes, then flip it right back.

Prusa are not the only ones to notice that the space inefficiency of the G-code file format is not ideal in all situations. Heatshrink and MeatPack are two other solutions in this space with their own strong points. Handily, the command-line tool in libbgcode can optionally apply Heatshrink compression or MeatPack encoding in the conversion process.
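
For a feel of why something like MeatPack helps, here is a toy Python sketch of the underlying idea: the most common G-code characters get 4-bit codes, so two of them fit in a single byte. The character table and padding below are invented for illustration and do not match the real MeatPack specification.

    # Toy nibble-packing demo; the real MeatPack table, escape handling and framing differ.
    COMMON = "0123456789. GXY"               # hypothetical 15-character table
    CODE = {c: i for i, c in enumerate(COMMON)}
    ESCAPE = 0xF                             # reserved nibble, used here only as padding

    def pack(line: str) -> bytes:
        codes = [CODE[c] for c in line]      # assumes every character is in the table
        if len(codes) % 2:
            codes.append(ESCAPE)             # pad the final odd nibble
        return bytes((a << 4) | b for a, b in zip(codes[::2], codes[1::2]))

    line = "G1 X10.25 Y30.75"
    print(len(line), "->", len(pack(line)))  # 16 characters -> 8 bytes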

In a way, G-code is the assembly language of 3D printers. G-code files are normally created when slicing software processes a 3D model, but there are some interesting tricks to be done when G-code is created directly.

66 thoughts on “G-code Goes Binary With Proposed New Format”

    1. Well, the spec on GitHub is using MeatPack, just adding data for thumbnails and the like ahead of the G-code.
      I don’t connect over serial nor produce giant high-detail G-code files to print, since the Duet handles everything fine on-device (same with Klipper, since it’s sending motor commands instead of G-code being interpreted on the MCU), but this seems to be more an issue with OctoPrint+Marlin/Prusa printers being bottlenecked by collecting/processing G-code at a high throughput.
      My old controller had this sort of baud rate lag, which is why I moved to klipper and eventually the duet, so a solution would be nice.

      If you could process it like a gcode.gz file or open it in NP++ with a plugin natively, I think a lot of the issues would be resolved

    2. I did that at Ultimaker, just simple gzip on both sides. Saves about 30% if I remember right and only took a few lines of code. All optional and easy for anyone not using 8bit micros to implement.
      Heavier compression saved a bit more space but at the cost of way more cpu time, so was not really worth it.
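
      For reference, the whole trick really is just a streaming gzip pass on one side and a streaming gunzip on the other; a minimal Python sketch (file name made up) looks something like this:

      import gzip, os, shutil

      src = "model.gcode"                     # hypothetical input file
      with open(src, "rb") as fin, gzip.open(src + ".gz", "wb") as fout:
          shutil.copyfileobj(fin, fout)       # compress for transfer/storage
      print(os.path.getsize(src), "->", os.path.getsize(src + ".gz"), "bytes")

      # The receiving side can decompress in small chunks while printing:
      with gzip.open(src + ".gz", "rb") as fin:
          for chunk in iter(lambda: fin.read(64 * 1024), b""):
              pass                            # hand each chunk to the G-code interpreter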

    3. This has been the case with STL files for years. ASCII STL is hardly found anymore; the binary format simply maps the “commands” byte by byte. The files are smaller and machine-readable line by line, or better, byte by byte, i.e. sequentially, and anyone who wants to can build their own converter from the description. GitHub or Matlab offer enough source code. A simple tetrahedron is reduced from 700 to 290 bytes without changing anything structurally in the code. Try it in OpenSCAD: cylinder(h=sqrt(6)/3, r1=sqrt(2)/2, r2=0, $fn=3);

      A binarisation of GCode would not do more – but also not less.

      Older boards in particular should be able to cope better with this, because parsing takes just as long in principle (it is the same content in almost identical order) but all strings are smaller or do not have to be converted or parsed as strings in the first place. The only person who has a problem is the person who wants to look at the code.

      And if you want to customise the GCode manually or using a script, just convert it first.

      Zipping is not a comparable process here. It only ensures smaller transport files. Unpacking produces the same GCode as before and requires additional resources for unpacking.

      1. And by the way: it makes sense to clean up the G-code for certain boards beforehand by removing all comments and the thumbnail (one line of sed is enough). This alone can reduce the code by half. Modern printers such as the Pollem AM (still a hobbyist product) will thank you for it.
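
        One possible shape for that sed one-liner (file name invented; it strips trailing comments and then deletes the lines that end up empty, which also removes the embedded thumbnail):

        sed -i 's/;.*$//; /^[[:space:]]*$/d' model.gcode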

      2. ASCII STL is still quite common, and a huge mess; actually, STL in general is quite a mess.

        FYI, I’ve made the STL reader in Cura.

        ASCII STL, I’ve seen:
        * all 3 variations of line endings, including just \r (OpenSCAD did this for quite a while on Mac)
        * various number formats, including commas instead of dots and scientific notation
        * different whitespace variants

        What works best is to just read line by line (accounting for all the line-ending variants), see if the text “vertex” is in a line, and then try to parse numbers after that, ignoring everything that isn’t part of a number. Just replacing commas with dots isn’t 100% reliable, as a comma can be a thousands separator as well. SO MUCH FUN.
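
        In Python terms that strategy comes down to something like this sketch (the function name and regex are mine, and the comma handling is exactly the naive “not 100% reliable” kind):

        import re

        NUMBER = re.compile(r"[-+]?\d+(?:[.,]\d+)?(?:[eE][-+]?\d+)?")

        def ascii_stl_vertices(path):
            with open(path, "rb") as f:
                text = f.read().decode("latin-1")
            # Normalise \r\n, \n\r and bare \r line endings to plain \n.
            for ending in ("\r\n", "\n\r", "\r"):
                text = text.replace(ending, "\n")
            vertices = []
            for line in text.split("\n"):
                if "vertex" not in line:
                    continue
                nums = [float(m.group(0).replace(",", "."))   # naive comma handling
                        for m in NUMBER.finditer(line)][:3]
                if len(nums) == 3:
                    vertices.append(tuple(nums))
            return vertices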

        Now, next, we have binary, which is generally not a huge issue. Except, how do you tell whether a file is binary? Well, an ASCII file should start with “solid [name]”. But binary starts with 80 bytes of free-form header. Guess what SolidWorks puts in the header. Yes, “solidworks”; both start with “solid”. There goes your binary/ASCII detection. It’s better to go by the file size and triangle count in the file. But that’s also not 100% reliable. So you’ll end up trying both binary and ASCII and seeing which fits best.
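
        A sketch of that size check (a binary STL is an 80-byte header, a uint32 triangle count, then 50 bytes per triangle, so 84 + 50 * count bytes in total):

        import os, struct

        def looks_like_binary_stl(path):
            size = os.path.getsize(path)
            if size < 84:
                return False                      # too small even for zero triangles
            with open(path, "rb") as f:
                f.seek(80)                        # skip the free-form header
                (count,) = struct.unpack("<I", f.read(4))
            return size == 84 + 50 * count        # exact match only for binary files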

        In all cases, the normal vectors and “attribute” data are unreliable and should be ignored.
        And finally you have degenerate triangles. CAD software somehow seems to love those.

        1. I agree 100%. STL is not a good format and certainly not as an import format for CAD programs – except for the otherwise rather harmless 3D-Builder – and yes, the import works by guessing which format is available.

          But it is enough as an example to show that binarisation does not have to be worse than the ASCII format. What I mentioned above still applies. And nobody is saying that G-code, which unfortunately is just as scattered across dialects, has to be implemented just as badly.

        2. All *three* line endings?
          But there are four – there’s also ‘reverse DOS’, which is \n\r instead of \r\n. I have lab equipment that connects over RS-232 and does this, which is actually nice, since it also mixes ASCII data in almost-JCAMP-DX format with a straight-up burst of packed little-endian 24-bit integers. I’m okay with that though, because the preceding ASCII part had the decency to actually tell you (every time!) how many points were coming in the burst. And JCAMP-DX’s number formatting method is a janky mess at best. Even having to deal with packed 24-bit data in a language that doesn’t want to acknowledge that the format exists is still far easier than dealing with JCAMP-DX’s data formatting.

          Thus the traffic was self-documenting in that unique way that only plaintext encoding gives you.

          What we’ve learnt over the history of computing is that plaintext is unreasonably effective, and binary formats are only good for lock-in, except where they are necessary for hardware support. (FPGAs compress their bitfiles too, and decompress them on the fly whilst they autoconfigure themselves on powerup.)

          In my experience, binary is useful – but apart from hardware interfacing, best kept to plain old PCM (just an array of numbers with no struct-like layout, although it might be 2D as in a multi-channel ‘sound’ file).

          Best answer I think would be to have a header-section in plain text, and then put the data in a burst of binary of pre-signalled fixed length, before going back to ascii for a bit. This way, the format retains the self-documented nature of plaintext and thus its unreasonable effectiveness by way of modern unix tools like git and ripgrep (and the rest of unix). And you still get the efficiency / embedded use simplicity of raw binary data.

          Their format looks generally okay, but IMHO everything that is in a struct needs to go back to a simple ASCII key-value system. You still get to have the binary blocks, optionally even compressed. So you get the vast majority of the benefits of going binary, whilst retaining the future-proofness of plain text.
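
          As a rough sketch of what I mean (names invented, nothing to do with the actual bgcode layout): an ASCII key-value header announces how many raw bytes follow, then comes the binary burst, then back to text:

          def write_block(f, fields, payload: bytes):
              for key, value in fields.items():
                  f.write(f"{key}: {value}\n".encode())
              f.write(f"payload-bytes: {len(payload)}\n\n".encode())
              f.write(payload)                      # pre-signalled binary burst

          def read_block(f):
              fields = {}
              while True:
                  line = f.readline().decode()
                  if line in ("\n", ""):            # blank line ends the text header
                      break
                  key, _, value = line.rstrip("\n").partition(": ")
                  fields[key] = value
              payload = f.read(int(fields["payload-bytes"]))
              return fields, payload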

          The unreasonable effectiveness of plain text for storing data is because it makes archaeology (aka ‘reverse engineering’, if you’re trying to make it sound dirty and not like the science it is) *infinitely* simpler, and massively more efficient. Raw binary encoding as a data-storage method is generally pretty hostile and untrustworthy behaviour, especially if not documented. Boy, I hope they update the amazing Unix ‘file’ utility to recognise it!

    4. Hear, hear! Long ago Microsoft Office used proprietary binary file formats that were hard for them to maintain and for 3rd parties to import and export. In 2007 MSFT wisely shifted to Zipped XML and never looked back.

      For g-code, it is even more important that it be an open, easy-to-write, easy-to-read format. All my homemade design software outputs timeless ASCII g-code. It makes weekend projects possible. As for “on the fly,” yes, but here incremental DE-compression is most important. Unlike Word documents, g-code files often stay open for hours and hours during printing or machining, so any compression should be done in “bite-sized” chunks.

  1. G-code is the most hideous programming language ever invented. If you don’t believe me, try writing it. As just a list of movement operations it’s adequate but once you start looking at the control structures it will make you weep. Trying to improve it by reducing the amount of space is lipstick on a pig or worse.

    What CNC needs is a proper control language, something that can be streamed and compacted – and yes, even coded in binary, though proper tokenizing would be better. If you want to use an existing language, Postscript is sensible and Python would be outright insane due to the unfortunate choice of whitespace for control structure.

      1. It’s not perfect for what it *became*.

        G-code is not just a sequence of instructions for CNC machines to follow anymore. Nowadays you need to read G-code just as much as (maybe even more than?) you write it, because it’s used as initialization sequences for 3D printers. Not happy with your Bambu initialization sequence? Good luck tweaking it!

        G-code is equivalent to machine code, and we lack an ‘assembly language’ to go with it. There needs to be a language that’s made not for machines but for programmers. A language is only good if the next programmer can understand it without much effort. Are we living in the 60s? And no, comments are not the solution.

        A couple of days ago I had this exact thought, after I bought my first Bambu printer and found out half the internet complains about it pooping when it shouldn’t during initialization, and almost no one seems to have a satisfactory solution. I had to search for what M106 does and what its parameters do, only to find out the literature on G-codes is very scarce and spread out. The only thing I found useful was someone on some forum citing complete M106 lines that happened to have some comments in the code.

    1. There are divisions between concepts that come into play, from description to language to markup to literal specification, each with its own level of abstraction. It is not a linear progression, and breakpoints will vary from context to context, but gcode is, by design, intended to be lowest level. Not a language, no abstraction, but a literal description of a physical embodiment of the process to produce an object. It has worked well for the purpose for many decades after its introduction.

      There are higher level tools that are available and widely used (such as solidworks or Inventor, which are very high level parametric languages for describing physical objects through explicit constraints as well as relationships), but gcode is intended for the lowest level description at the level of a specific machine.

      gcode is NOT a programming language in the sense software engineers think of one, nor is it intended to be. It is the machine code for a specific machine, produced from a high-level language that is machine agnostic.

        1. Doesn’t work on a smaller microcontroller. DEFLATE is meant for archival compression, operating across the whole file. A complex print can have a G-code file hundreds of megabytes big, which an ATmega would just choke on decompressing on the fly. You could use it to shorten transfer times, but at the cost of having to wait even longer for the file to be decompressed on the device. By contrast, a more machine-code-like approach would give many of DEFLATE’s benefits while the printer can read the file with no preprocessing step.

          1. Should we be engineering the file formats of the future for the microcontrollers of yesterday? Today you can buy the affordable RP2040 with 256k of RAM on-chip and plenty of CPU power to run a decompression algorithm.

    2. “As just a list of movement operations it’s adequate”
      – I would say that’s the key point. Every CNC pipeline involves a lot more than just sending movement commands, and they all have widely varying requirements, but they do all send movement commands, and it is helpful that there is a (sort of) standard language for that one specific task. In most cases, a G-code program serves as the equivalent of a BMP or STL file, i.e. an interchange format that’s somewhat foolproof precisely because it’s so limited.

      It’s true that even simple movement commands are woefully unstandardised, but if you’re writing software that just outputs a toolpath and maybe some cooling commands, it’s fairly simple to account for dialect variations. If you had to generate NURBS paths for one machine, and polylines for another, and a set of parametric path routines for a third machine… that would get ugly.

      It’s a little janky how slicers need hundreds of machine-specific config files, but at least they HAVE those configs, because they’re easy to create and test, because no version of G-code does anything complicated or exotic.

      We went through all this with 2D printers, where people kept making and improving “standards” like PS and HPGL, and all that happened was that a fairly commodified category of device became a nightmare to support, with no real benefit.

    3. It seems to me that if one is thinking of G-Code as a *programming* language, you’re already off the track. It is a descriptive or command language, but definitely not a programming language.

      Where this line is blurred a bit is that many machine controls have interpreter extensions that DO allow some, more or less limited, “programming” functionality.
      At the most basic level this involves some kind of conditional branching and variable use, along with basic math functions. Fanuc Macro B is the industrial poster child for this sort of extension.

      But, even this admittedly limited capability offers a ton of functionality to make the machine do things outside of a defined, repetitive sequence. I once wrote and proved out an engraving macro, using only Fanuc Macro B code, that could do nearly anything you would want to do with engraved text – auto-incrementing serial numbers, automatic dates with multiple formats, multiple fonts, dynamic sizing/scaling/aspect ratio – I was working on text on arcs and rotary axis functions before I lost interest. Just because it’s limited doesn’t mean it’s terrible, though I agree that some of the limitations can become annoying (only 33 local variables? What?)

      Even with the inherent limitations of the language, it’s sufficient for the majority of things that need to be done *dynamically at runtime*. If your needs are beyond that kind of capability, it’s probably best to carefully evaluate what you’re trying to do and why.

      And then there are the macro functions in the more recent Siemens controls, which I admittedly really enjoy. (Arrays! Typed variables! String manipulation! Sigh… CNC bliss)

      1. I didn’t do the formats and fonts, but I had mine on arcs, and it could also project it onto a cylindrical surface along the x or y axis.

        One client had a machine that didn’t support the arctan function, so I had to have a version calculate it the long way. It was a pretty old machine, so it just crawled. But it worked!

    4. “G-code is the most hideous programming language ever invented.”
      Someone has never seen BrainF*ck (or its equally ugly cousin, JSF*ck).

      “If you don’t believe me, try writing it.”
      I have. It’s absurdly simple. It’s literally just a list of crude instructions.

      “As just a list of movement operations it’s adequate but once you start looking at the control structures it will make you weep.”
      If you’re looking at control-structures, you’re looking at a firmware-specific (proprietary) extension to G-Code.
      G-Code is not a programming language as it has no control structures. It’s literally just a series of “go there and do this while moving” instructions with a bunch of machine-specific switches (turn on the spindle, turn on coolant) tossed in.

      Funny that you compare it with PostScript. PostScript is actually Turing-complete. People used to tie up printers for hours playing with tiny postscript code that contained a ray-tracer. Writing these tiny ray-tracers used to be a competition in the ’90s.
      HPGL would be a much better comparison (created for pen plotters). The funny thing is, G-Code is easier to read than HPGL (too), as HPGL’s X & Y coordinates are purely positional (they aren’t preceded by “X” or “Y” the way G-Code’s are).
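
      For anyone who hasn’t seen HPGL, here are the same two moves side by side (coordinates invented, and real plotters use their own units):

      HPGL:    PU0,0;PD100,50;      (pen up, go to 0,0; pen down, draw to 100,50)
      G-code:  G0 X0 Y0
               G1 X100 Y50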

      Saying G-Code is a bad programming language is like saying your camera is a bad CAD system.
      It’s not supposed to be one.

      1. I’m not sure where you’re getting that idea.
        The differences are only in the switches to the hardware and the proprietary additions made by the firmware authors. The same is true for laser engravers.
        The vast majority of any of them are single-digit (movement) G-codes that they share in common (I’ve worked with all three).

        This gives a great comparison between the implementations in the most common 3D printer firmwares, but also includes all of the original G-Code definition:
        https://reprap.org/wiki/G-code

        A better comparison would be Midwestern American English vs Southern American English. They’re nearly exactly the same with only slight variations here and there.

    5. G-code is not hideous at all. It sure has its limitations and shortcomings, but at the time of its invention it was a quite remarkable accomplishment. G-code was invented in the early 1950s and further standardized in the 1960s, and that makes it probably the only programming language from that era which is still widely used. Languages such as Fortran and Cobol are slightly newer; maybe they are still used in some niche applications, but with the abundance of 3D printers and other CNC machines the use of G-code will be several orders of magnitude bigger.

      This article is also funny, because it is clear that g-code was designed from the ground up to have a small memory footprint. Take for example the single letter-and-number format. There are no long names for identifiers, and the way “canned cycles” and repetition are set up (in an inherently compressed way) also clearly shows it was written for machines with a small amount of memory and for tape storage and reading.

      But the world has changed a lot, and the use of G-code too. 3D printers can not make use of a lot of the features of G-code, and they do have an extremely large amount of vector data. G-code has also changed from being a hand written language to being mostly used in M2M communication.

      There is also “TPL”, which stands for “Tool Path Language”; it is a Java library for writing CNC programs. I do not care for Java though, but a few years ago I was interested in Python, so I wrote a Python library for translating between Python and G-code. It works quite well. You can use loops, arrays, functions and other Python constructs to write a CNC program, run it, and it outputs a G-code file that can be read by many CNC programs. If you want to use, for example, furniture / claw nuts in very hard materials (such as HPL), you can write a subroutine for the center hole, for the holes for the claws, and maybe a disk for the flange too if you want that, and then simply create an array of them to make a mounting plate for a CNC machine. Things like milling out a rectangle are also easy to add as (parametric) subroutines. It is a bit like OpenSCAD, but instead of “programming” a drawing, you build up the CNC vectors in a program.
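
      A stripped-down, hypothetical example of that style (none of these names are from my actual library): ordinary Python loops and functions emit the G-code text, so the array of claw-nut holes is just a nested loop:

      def circle_pass(cx, cy, dia, depth, step=0.5, feed=300):
          # Trace the hole outline, stepping down in Z (no tool-radius compensation here).
          r = dia / 2
          lines = [f"G0 X{cx:.3f} Y{cy - r:.3f}", "G0 Z1.000"]
          z = 0.0
          while z > -depth:
              z = max(z - step, -depth)
              lines.append(f"G1 Z{z:.3f} F{feed}")
              lines.append(f"G2 X{cx:.3f} Y{cy - r:.3f} I0 J{r:.3f} F{feed}")   # full circle
          lines.append("G0 Z5.000")
          return lines

      program = ["G21", "G90"]                  # millimetres, absolute positioning
      for ix in range(4):                       # a 4 x 2 array of centre holes
          for iy in range(2):
              program += circle_pass(20 + ix * 40, 20 + iy * 40, dia=8, depth=6)
      program.append("M2")
      print("\n".join(program))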

      How big do files for 3D printers get? Even if they are 100MiB, does that matter? Even an 8-bitter can read 4GiB files with FatFs and a µSD card. Maybe communication speed is a notable limit, but it would take many small CNC vectors over a slow serial channel for this to be an issue. Still, I am interested enough to have a look into it. Bit of a shame that Hackaday did not give an estimate of how much space this saves.

      It also reminds me of a bug report for KiCad. KiCad outputs .SVG files with nanometer resolution, and that is a lot of ASCII (or whatever it is called these days). Apparently even a bunch of other software got stuck on the big numbers KiCad puts in .SVG files.

      But overall, if this bgcode gets adopted by projects such as GRBL and LinuxCNC, then I may some day modify my Python library so it can write bgcode too.

  2. What about things like Arc Welder in firmwares that support it?

    It can significantly reduce file sizes of intricate models by letting the firmware do the arc segmenting work itself.

    Or fixing up firmware that was built for 8-bit machines but now runs on 32-bit machines capable of doing a lot more?

    It seems like the manufacturers could help a lot by supporting marlin/klipper to iron out the wrinkles and give everyone a fuller feature set. Time and 3rd party firmware march forwards, it’d be nice if the mfrs could be there too?

  3. Size isn’t the only shortcoming of gcode. Big machines with servo motors should have an idea of how much force a particular cut should take and error out if it’s too high. This would have to take into account the initial spike on cut entry, and discern that from too heavy of a cut indicating a bad offset, bad tool, fixture in the way, or other error conditions. It’s good people want to make gcode better, but it really needs to be so very much better.

    1. Here’s the most interesting comment of the bunch. Add fundamental advantages, hopefully in ways that apply to broad classes of tools, and you might build enough momentum to create a change…

      1. But ensure the code itself is a backwards-compatible variety, with codes hidden from older systems. It would be up to the operator to know that the file they are using contains such control codes, or up to the device executing the instructions to know that certain features are incompatible with the machine. Minor firmware upgrades to pull that off.

        So you’d have the same G-Code that would cut or print the same product, but the motor control feedback data may not work on an older machine lacking those sensors.

        As long as it’s backwards compatible or has easy conversion it’s an idea in any case. If they abandon the current easy to read text nature of the code, then you’ve lost my support. I can’t tell you how many issues I have fixed simply by reading the code on a text editor.

      2. Ok, two things. First, this actually exists in the industrial world, and has been used for at least a decade at this point. It’s most commonly known as adaptive control. Some machines include this from the factory (the Makino T-series comes to mind), and retrofit options such as Caron Engineering’s TMAC are available.
        https://www.caroneng.com/products/tmac/

        The second thing is that this is way, way outside the scope of G-Code. Adaptive control has to be done in real time, with requisite high-speed signal processing. With the interpreted and motion-based nature of G-Code, timing is… uncertain, to say the least. It’s really in the realm of software/firmware updates to the motion control system, plus usually some added hardware. All the G-Code does is turn adaptive control on or off, and maybe set a couple of parameters.

    2. If you have software that’s able to simulate this, there’s nothing preventing you from extending G-Code to specify the expected force (for ex. by adding a “RF 0.34 0.12 0 S” line before the sequence of moves). Unknown G-code commands are ignored by G-code parsers, so it will work everywhere.

  4. The language snobs come out with sharp knives when someone mentions GCode. But what’s the problem it’s trying to solve? When you think about that, it is perfectly fine for what it does. Perhaps it shouldn’t be called a language, maybe…

    As to the need for more compact representation – meh. Even a super complex job doesn’t need THAT much GCode. I think a single smart phone photo of a favorite dog takes up more space than even really big GCode files. I agree with the earlier post that suggested using commonplace text compression. Maybe just zip it up – easy, peazy.

    1. I’ve easily had gcode files in the hundreds of megabytes for large or complex prints. Microcontrollers would choke on running DEFLATE over a file that big. Not everyone is running Klipper and can throw a decent-sized multicore ARM Cortex and a couple GB of RAM at a print job. Knock it down to a cheap STM32 or an ATMega, and all of a sudden that decompression overhead is daunting.

  5. It would be great to add separate controls such as what happens during a power outage, out of material, etc. Those functions could be standardized through an industrial committee. The benefits to leveling up to these functions are obvious.

  6. Honest question: what’s the advantage of saving file space?

    Storage currently sells for something like 10c per gigabyte on the consumer market. Terabyte drives small enough to fit in a pocket cost less than $100. It’s getting hard to find USB drives or SD cards with less than 32GB. As technical problems go, “running out of storage” is on the list that can be solved in hardware without significant cost.

    Gigabit Ethernet is commonplace, and wireless protocols are getting faster. Meanwhile the speeds and feeds to cut material are pretty much the same as they’ve been for the last hundred years. Okay, the proliferation of carbide tooling got us a 10x increase, but the bottleneck is still “waiting for the cutter to traverse the path”, not “waiting for the next instruction to arrive”. Making the transmission another order of magnitude faster won’t change anything but the amount of time the CPU busy-waits before it can execute the next instruction.

    Aside from “filling an even smaller single-digit percentage of the available store” and “sitting idle slightly longer while the toolhead moves along the path at comparatively geological speeds”, what does this compression achieve?

    1. In Prusa’s case specifically, they’re working around a problem of their own creation: they are using an ESP8266 to provide WiFi on their printers, which is already slow by itself, but then the data is also transferred through UART from the ESP8266 to the main microcontroller, which just slows it all down even further, so any space savings mean huge savings in the amount of time it takes to transfer a large gcode file to the printer. And when I say it’s slow, I mean it is *REALLY SLOW*.

      They should have used a proper 5GHz-capable WiFi chipset from the get-go and connected it over SPI or some other high-speed interface. What’s worse is that the problem was known for years from the experience with the Prusa Mini, but they decided to just go ahead and repeat the same mistake with their just-released new printers as well!

      1. Reading the article again it looks like Prusa are using this as an optional compressed format to solve this problem and have open-sourced it because that’s what they do and it might save someone else reinventing that wheel.

        Not sure there’s much worth being upset about here.

        1. I didn’t say I am upset about them introducing this bgcode stuff; smaller gcode files are obviously an improvement. I just simply explained the rationale behind their motivation to do this.

  7. My $.02, this is going the wrong way. Gcode really is assembly language for CNC motion control. We haven’t moved on from assembly language because someone wrote a compression algorithm. We moved on from it because someone came up with a better abstraction. All this is going to do is make it harder to tweak on the fly and deprecate hardware that can’t read it.

    You can’t abstract something if you don’t know what you’re abstracting. That’s the problem with CNC. GCode has to be optimized for the strength of the motors divided by the mass of print head/cutter. Extrusion printers need to be optimized for the heat characteristics of the hot end and the flow characteristics of the plastic. CNC routers need adjusting for the number of blades on the end mill.

    The approach that manufacturers have taken is to stiffen and over-engineer everything so that the differences are minimized, but we all have to tweak dozens of things to make our prints come out right.

    If you want to make something less unpleasant than Gcode, you need something that translates geometry into motion, but figures out the parameters on its own. We are DEFINITELY not going to get that from a compression algorithm.

    1. Except we’ve never left assembly/machine code. We just don’t bother touching it when a compiler will do the work for us. That’s the whole purpose of a Slicer: Compile geometry into GCODE. It would just be easier and faster if that GCode was in some form of bytecode, not ASCII.

    2. “If you want to make something less unpleasant than Gcode, you need something that translates geometry into motion,”

      In 3D-printer speak, that is an STL file and a slicer.
      And whether the M2M intermediate is readable G-code, or a binary format does not matter much.

    3. If you did that, you’re just trading in the problems with GCODE for new problems. Lack of control namely.

      GCODE is extremely low level by design. You essentially control every single motion the machine makes. This is bad because the files are huge, and machine specific. It’s good, because you get extremely fine control over the machine, while keeping the machine and firmware simple. All the feeds and speeds, toolpaths, and other calculations are done by your PC, which is much faster and has a better UI.

      Imagine if you had to mess with the machine settings and firmware on a part by part basis to get a good result. That would be terrible! And if you tried to include those settings in the file, you’re right back to machine specificity again.

      GCODE is not a bad solution really. Honestly it’s refreshing how ubiquitous and human readable it is (for an intermediary format). Native support for arcs would be nice though..

  8. From the repository:

    GCode

    G-code data.
    Parameters
    field     type      size     description
    Encoding  uint16_t  2 bytes  Encoding type

    Possible values for Encoding are:

    0 = No encoding
    1 = MeatPack algorithm
    2 = MeatPack algorithm modified to keep comment lines

    So for now it’s just a container for the thumbnail and other Prusa-related stuff, instead of something new.
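
    Reading that field back out is about as simple as binary parsing gets; a small Python sketch, assuming the uint16_t is stored little-endian (check the spec for the exact offsets around it):

    import struct

    ENCODINGS = {0: "No encoding", 1: "MeatPack", 2: "MeatPack (comments kept)"}

    def gcode_block_encoding(header: bytes, offset: int = 0) -> str:
        (value,) = struct.unpack_from("<H", header, offset)   # uint16_t, 2 bytes
        return ENCODINGS.get(value, f"unknown ({value})")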

  9. Isn’t the biggest culprit here the fact that most models make it into the slicer in some tessellated format like STL? I mean, straight surfaces would come out fine no matter what the model format is, but without true curves to work from, all a slicer can do is turn the model’s 47 bazillion triangles into just as many G1 moves.

    Rather than trying to compress or convert the g-code, stop generating so damned much of it in the first place — slicers and modeling programs need to settle on using a better model format (though in some programs, that’s much easier said than done).

    In the meantime, there’s Arc Welder and its in-slicer equivalent, like SuperSlicer’s and PrusaSlicer’s arc fitting feature. I imagine proper configuration of that would cut quite a bit from final g-code, depending on the model’s geometry.

    1. I had to look that up. First results indicated either a punk rock band or an android simulator but this is more useful: https://plugins.octoprint.org/plugins/arc_welder/ Quite an appropriate name, as it “welds” many linear moves into a single arc (G2 or G3).
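
      As a toy before/after (coordinates invented), the kind of substitution it makes looks like this:

      ; before: a quarter circle approximated by many short linear moves
      G1 X10.000 Y0.000
      G1 X9.877 Y1.564
      G1 X9.511 Y3.090
      ; ...dozens more G1 lines...
      G1 X0.000 Y10.000
      ; after: one counter-clockwise arc, its centre given by the I/J offset
      G3 X0 Y10 I-10 J0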

      And indeed, generating a lot of linear vectors for what should be a curve is a very bad thing to do. I also love bi-arcs, but unfortunately FreeCAD does not support them. Bi-arcs are parametric curves in which the curve always goes through all control points (unlike with Bezier curves) and there are two arc segments between two control points, and they have the added advantage that the arcs are directly mappable to G-code, so there is no further degradation.

      If you want to push compression further for 3D printers, then finding some way to only map the difference between one layer and the next could give a good compression ratio. The downside is that the 3D printer software has to keep the previous layer in its memory too.

    2. PrusaSlicer and, by extension SuperSlicer, support STEP and 3MF formats, which are true 3D models. I always export my models from Fusion 360 as 3MF, for example. PrusaSlicer also does use true arc moves now.

    3. Arcs only fit circular features. I’d implement a form of Bezier splines instead: do a curve-fitting pass and the G-code instantly turns into far fewer lines of code. This should be possible on most 32-bit controllers. The curves will be rendered into stepper motor (micro)steps, so always the highest resolution the machine can handle, without the quadrillion straight-but-short G-code lines.
      Most constant-velocity G-code interpreters on machines do interpolation anyway; it’s ‘lossy’ as a way to get around zigzags, which will be averaged out, for example (to keep velocity as constant as possible and prevent the machine from shaking apart).

  10. My bigger bugaboo is with CAM software that defaults to extremely tiny linear segments in G01 instead of using the machine tool controller’s built-in ability to process G02/G03 circular moves.
    Like patenting a wheel with the novel idea of making it circular via millions of linear segments. Inanity exemplified.

    1. Not all machines support G02/G03, therefore supplying a list of tiny G01 segments lets the machine produce what you want, despite not having those commands. If your machine supports G02/G03 commands but you are not using them then you haven’t set up your CAM properly. No point whining about it.

  11. Wow so much debate. I like this idea. Converting back to human readable is important for occasionally debugging.

    I’ve had a couple printers that choke on large GCODE. Like one with a web interface that couldn’t upload more than 20MB, so we enable GZIP but then it times out, so we tweak the timeouts ……. screw that printer

  12. G-Code for CNC is fine; you can write very short parametric programs to do a lot of machining. A lot of the time I get in there and write glue G-code for pallet systems, plane changes, tool wear, etc… I don’t want to fudge around with binary. — Now I know this is related to 3D printing, but perhaps it’s because every single movement of the print head is a stream of G-code and not parametric? I mean, if you’re building a square cube, only the floor and an outline need to be programmed; the rest can just be macro calls to repeat, and you can even increment the Z axis in the sub-macro as well, so you can do a single line:

    M97 P1000 L25


    N1000 (layer code)
    G91 Z0.1
    G90
    … do the layer
    M99 (returns to caller)

  13. With SSDs and MicroSD cards at prices where they are practically being given away do we need this now?

    I guess if there was no disadvantage then fine. Every now and then though I find myself manually editing a gcode file. Sure, as the article says, one could still convert it to text, edit it, then flip it back to binary. But why add those extra steps?

    I’m all for optimization. But you have to optimize for the parameter that is in the shortest supply. I would say for most that is probably time, not storage space.
