OpenGL In 500 Lines (Sort Of…)

How difficult is OpenGL? How difficult can it be if you can build a basic renderer in 500 lines of code? That’s what [Dmitry] did as part of a series of tiny applications. The renderer is part of a course, and the 500-line limit keeps the project small enough for students to build their own rendering software. [Dmitry] feels that you can’t write efficient code for things like OpenGL without understanding how they work first.

For educational purposes, the system uses few external dependencies. Students get a class that can work with TGA format files and a way to set the color of one pixel. The rest of the renderer is up to the student guided by nine lessons ranging from Bresenham’s algorithm to ambient occlusion. One of the last lessons switches gears to OpenGL so you can see how it all applies.
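To give a flavor of that first lesson: Bresenham line drawing on top of a bare set-pixel primitive can be sketched in a handful of lines. This is a minimal illustration, not [Dmitry]’s actual code; the `set_pixel` callback here is a stand-in for the course’s TGA image class.

```cpp
#include <cstdlib>
#include <functional>

// Minimal integer Bresenham line sketch. The set_pixel callback stands in
// for the course's "set the color of one pixel" primitive.
void draw_line(int x0, int y0, int x1, int y1,
               const std::function<void(int, int)>& set_pixel) {
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;  // error term tracks distance from the ideal line
    while (true) {
        set_pixel(x0, y0);
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }  // step in x
        if (e2 <= dx) { err += dx; y0 += sy; }  // step in y
    }
}
```

Everything stays in integer arithmetic, which is the whole point of the algorithm and part of why it makes a good benchmarking target.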

As you might expect, if all you have is a call to set a pixel color, you have a lot of work in front of you. This probably isn’t for everyone, but if you’ve ever wanted to understand vertex shading, back-face culling, and tangent space normal mapping, this is the ticket. There is even work towards benchmarking different algorithms for things like line drawing, which is invaluable if you want to write efficient code.
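As an example of how simple some of those big-sounding topics turn out to be, back-face culling in a software renderer usually reduces to a sign check on the screen-space triangle area. A hedged sketch (the `Vec2` type and the counter-clockwise front-face convention are assumptions for illustration, not taken from the course):

```cpp
struct Vec2 { float x, y; };

// Twice the signed area of the screen-space triangle (a, b, c).
// Positive for counter-clockwise winding, negative for clockwise.
float signed_area2(Vec2 a, Vec2 b, Vec2 c) {
    return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
}

// With CCW front faces, a non-positive signed area means the triangle
// faces away from the camera (or is degenerate) and can be skipped.
bool is_back_facing(Vec2 a, Vec2 b, Vec2 c) {
    return signed_area2(a, b, c) <= 0.0f;
}
```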

We noticed [Dmitry] also did a 500-line ray tracer and why not? We’ve even seen that trick pulled off mostly in Excel. While the graphics stack is made to be educational, it makes us think of the small hardware that might be able to use an OpenGL stack.

18 thoughts on “OpenGL In 500 Lines (Sort Of…)”

  1. It looks good. However, he says
    “Note: It makes no sense just to look at my code, nor just to read this article with a cup of tea in hand. This article is designed for you to take up the keyboard and implement your own rendering engine. It will surely be better than mine. At the very least change the programming language!”

    Why on earth would you change the programming language? He used C++! Yes, I agree, in the fullness of time Rust may take over as the go-to best language, but until then…

    1. Because taking his code and making it work in a different language ensures you understand at least the basics of what the code is supposed to be doing, so you can verify your port is working correctly.

    2. The tutorial is OpenGL, so it also applies to WebGL for in-browser rendering. Which could be done in C++, but only if the tutorial were extended to include an introduction to WebAssembly!

  2. Saying “OpenGL” may give the impression this is hardware accelerated when it’s quite the opposite. Dmitry explains how to render 3D images by plotting each pixel in software, starting with drawing lines. This is about making a “graphics library,” which is the GL in OpenGL.

    1. Uhm, this is the “opposite” of being “hardware accelerated”?
      You know how modern OpenGL works, right? You provide it with buffers, some programs to run on those buffers in 2 passes, and that’s it.

    2. That’s what OpenGL is. It’s a set of bindings to the graphics hardware. You load up vertex buffers with geometric data, basically the dots of a connect-the-dots. You load up index data, which is the order it will connect those dots. You load up an (optional) geometry shader, a (required) vertex shader and a (required) fragment shader. You set a bunch of state flags which are effectively the configuration of the entire pipeline, like blending mode, winding order, wrapping modes and stuff like that. Then you issue one or more of the various draw commands and let the graphics hardware invoke the shaders as needed to work its way through every single vertex, then every single pixel the geometry would occupy on the screen. A library of functions and parameters to control the graphics hardware, hence graphics library.

      The shaders are where the hardware acceleration occurs. Those are tiny programs that run on your graphics hardware or, lacking such hardware, a software emulation of it. Which is used comes down to both how the programmer initializes OpenGL and what the end user’s computer supports. As programmers we can choose to force it to run in software mode, only in hardware mode, only in a specific version, in a range of versions, backwards compatible, forwards compatible, or in a compatibility mode which effectively tells OpenGL “use hardware acceleration if possible, otherwise use software mode”. The shaders are really where all that super fast magic happens; all of those vector, matrix and texture operations are tremendously faster on the graphics hardware than the CPU. So we call it hardware accelerated when really what we mean is “runs on hardware specifically designed to do this”.
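      That two-stage flow, a vertex shader over every vertex and then a fragment shader over every covered pixel, is essentially what a software renderer like [Dmitry]’s reimplements by hand. A toy C++ sketch of the idea (none of these names are the real GL API; the scaling vertex shader and flat-red fragment shader are made up for illustration):

```cpp
#include <array>
#include <vector>

struct Vec2 { float x, y; };
struct Color { unsigned char r, g, b; };

// Toy stand-ins for the two required shader stages. In real OpenGL these
// are GLSL programs run on the GPU; here they are plain functions.
Vec2 vertex_shader(Vec2 v) { return {v.x * 2.0f, v.y * 2.0f}; }  // scale by 2
Color fragment_shader(Vec2 /*pixel*/) { return {255, 0, 0}; }    // flat red

// Edge function: positive when p is to the left of the directed edge a->b.
float edge(Vec2 a, Vec2 b, Vec2 p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// "Draw call": run the vertex shader over every vertex, then invoke the
// fragment shader for every pixel the triangle covers in a w*h framebuffer.
// Returns how many pixels were shaded.
int draw_triangle(std::array<Vec2, 3> verts, int w, int h,
                  std::vector<Color>& framebuffer) {
    for (Vec2& v : verts) v = vertex_shader(v);
    int shaded = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};  // sample at pixel centers
            // Inside test via edge functions (counter-clockwise winding).
            if (edge(verts[0], verts[1], p) >= 0 &&
                edge(verts[1], verts[2], p) >= 0 &&
                edge(verts[2], verts[0], p) >= 0) {
                framebuffer[y * w + x] = fragment_shader(p);
                ++shaded;
            }
        }
    return shaded;
}
```

      The GPU does the same per-vertex and per-pixel work, just massively in parallel and driven by the state you configured through the API.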

  3. Old school rendering, and fun to know how it works (or worked). I definitely have enjoyed writing low-level renderers in the past and, like he said, you can do just about anything with just a pixel write. It’s like building your own CPU from gates/transistors.

    A fun series. Though not the point of this article, I wouldn’t set out to learn OpenGL these days (even though I’m very nostalgic about it). If you want to do modern rendering and use it generally, go Vulkan/Metal/DirectX/CUDA, etc. But it’s fine for concepts and just enjoying learning.

  4. In this age of bloatware and toy languages, building a small renderer makes a lot of sense. C and GLSL are ultra portable and performant, while simple enough to keep clutter out of the pipeline. I’ve made several engines, and recommend this method. Platform integration with xlib, winapi, android sdk, oculus etc. is usually a few hundred lines at most. Simple physics, inverse kinematics and other cool stuff can be written in tens of lines, precluding the need for huge libraries. I do use libs for heavy stuff like video compression and usually there are simple C libraries available if you look beyond the bloated examples often found on the net.

    1. Very interesting. I am looking to learn GUI programming, and I would like to start with low-level stuff in C using Cairo and Pango as cross-platform 2D drawing and font rendering libraries.

      Would they be a wise multiplatform combo for integrating with the native boilerplate libraries of various platforms like macOS (Cocoa and Quartz), Windows (NT, GDI[+]?), *NIX & GNU/Linux (XCB, Xlib, X Intrinsics), for bridging my 2D/3D graphics render (Vulkan) or compute shader (SPIR-V) code?

  5. Well, as an old man (meaning I started my first 3D renderer, sort of, on a ZX Spectrum 35 years ago), all I can say is that it doesn’t have to be related to OpenGL. And yes, the basics of 3D are important. And yes, you can do anything with just the capacity to draw a pixel at x,y coordinates. But, and I may be wrong, the idea that to understand you have to change your programming language makes no sense in my mind. For example, I can translate C++ to Delphi, Rust, plain C, C#, BASIC, Python, Go, Java, or even JavaScript or assembly because of a huge experience in programming. That means I understand these languages. But believe me or not, that does not mean I understand the algorithm I am working with.

    So the real exercise, in my opinion, is not to change language (and as far as I can see, if you master C and C++, all other languages take no time to learn). The real exercise is: OK, do it again without OpenGL, with just the graphics API of your OS. And do it again with, let’s say, DirectX. All in all, 3D is math. Do it again with a pen and paper. And once you understand it quite well… forget about all that and use a real engine: Ogre3D, Unity, Unreal Engine or whatever. Believe me, I tried to make a 3D engine once. Very frustrating; not a job for a man alone. When I made a step forward, the world around me moved 100 km forward. So… back to basics, and after that use real tools.

  6. I’ve been wanting to learn this stuff for years. I’ve written a couple very primitive ray tracers — one on Solaris in C++ in the 1990s, and one on the Atari 800 (emulated) in Pascal just last year. Just discovered 90s-era Blitz3D a few months ago and it’s been a great intro to vertex/surface/texturing/etc. concepts, without the ridiculously steep startup learning curve of OpenGL. I’ve been playing with it for months and still don’t understand blending modes — and God forbid I ever have to write a shader — but it’s a start!
