Project Zero Finds A Graphic Zero Day

After finding the infamous Heartbleed vulnerability along with a variety of other zero days, Google decided to form a full-time team dedicated to finding similar vulnerabilities. That team, dubbed Project Zero, has just disclosed a new set of findings, and this one’s particularly graphic: a group of flaws in the Windows Nvidia driver.

Most of the vulnerabilities found were due to poor programming practices. From blindly writing to user-provided pointers to incorrect bounds checking, these were simple mistakes that Nvidia quickly fixed. As the author put it, Nvidia’s “drivers contained a lot of code which probably shouldn’t be in the kernel, and most of the bugs discovered were very basic mistakes.”
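
To illustrate the bug class (a hypothetical sketch, not Nvidia’s actual code), consider a kernel handler that trusts a user-supplied pointer and length. The fix is simply to validate both before touching memory; probe_range() below is a made-up stand-in for whatever primitive the OS provides (e.g. ProbeForWrite on Windows):

    #include <stddef.h>
    #include <string.h>

    /* Hypothetical stand-in for an OS primitive such as Windows'
     * ProbeForWrite(): returns nonzero iff [p, p+n) is valid,
     * writable user-space memory. */
    extern int probe_range(void *p, size_t n);

    struct request {
        char  *user_buf;   /* pointer supplied by user space */
        size_t len;        /* length supplied by user space */
    };

    static char kernel_data[256];

    /* Vulnerable: writes wherever user_buf points, as much as len says. */
    void handle_request_bad(struct request *req)
    {
        memcpy(req->user_buf, kernel_data, req->len);  /* no validation at all */
    }

    /* Safer: bound the length, then verify the destination really is
     * user-space memory before copying. */
    void handle_request_good(struct request *req)
    {
        if (req->len > sizeof(kernel_data))
            return;                                 /* oversized request */
        if (!probe_range(req->user_buf, req->len))
            return;                                 /* kernel-space target */
        memcpy(req->user_buf, kernel_data, req->len);
    }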

When even our mice aren’t safe, it may seem that a secure system is unattainable. However, there is light at the end of the tunnel. While the bugs found showed that Nvidia has a lot of work to do, their response to Google was “quick and positive.” Most bugs were fixed well under the deadline, and Google reports that Nvidia has been finding some bugs on their own. It also appears that Nvidia is working on re-architecting their kernel drivers for security. This isn’t the first time we’ve heard from Google’s Project Zero, and in all honesty, it probably won’t be the last.

46 thoughts on “Project Zero Finds A Graphic Zero Day”

  1. I am more of a hardware guy (chip design in Verilog), so I may be completely off-base here. However, couldn’t some of the blame be laid at the feet of the language? The way that C handles pointers and even strings pretty much guarantees that you will have stuff like this unless you carefully audit your code every time. Why not have a language that will not allow stuff like this to happen by including the checks at compile-time?

    Software gurus, discuss.

    1. It’s not so much a problem of how C handles pointers as of how C allows the programmer to do whatever the hell he wants. That being said, there are newer programming languages that don’t allow this kind of pointer manipulation (because of bugs like this), but if you are doing low-level stuff like drivers or kernel code, you are going to be using C.

    2. Yes, some of the blame lies with the language. C assumes an intelligence and experience level that not all programmers have.

      I hate the idea of protecting people from their own ignorance, which is something newer languages do, and I also hate the idea of a single shitty dev making millions of users insecure, which is what you get when you use a language that expects a certain skill and intelligence level from its developers.

        1. This essentially boils down to the old abstraction debate. Unless one is using machine code there is always some level of abstraction, and for something like C that level is actually fairly high; it is by no means a “direct” programming language. So if abstraction already occurs, why not go for the smartest abstraction we can get?

          The ideal would be to go directly from natural language to a functional program, but that probably isn’t going to happen anytime soon.

          1. Good idea. Better still: the Logo programming language.
            I mean, who needs to access memory and process complex methods?

            Because watching a frame each second draw itself makes for a good gaming experience!

            (TROLLOLOLOLING, in case you couldn’t tell)

          1. For performance reasons, microcode is *extremely* close to assembly, and one of the reasons it exists is to allow for a way to patch the processor in the field.

            The translation of assembly to microcode is a very performant one. It’s not a full layer of abstraction as it is often assumed to be.

    3. C/C++ gives you lots of low-level control over what the software is doing. That level of control is probably important in a graphics driver that needs to be tuned for maximum performance.

      Sure, other languages might have built-in protection against this sort of thing, but those features are going to cost you extra CPU cycles.

    4. The key to this question is that C is a systems programming language. Systems programming requires controlling the nuts and bolts, which means the programmer has to know how to control the nuts and bolts without having everything fall apart. Modern C++ offers a lot of facilities for controlling the nuts and bolts in ways less conducive to things falling apart, but A) it still doesn’t force the programmer to use said facilities (so a programmer can still pretty easily build things that fall apart), and B) for reasons undefined the systems programming world is largely stuck on C, which does not have such facilities.

      1. I work at the very low level, and just using the new keyword can be difficult. It means you have something like malloc() that can be called by the function; that means you have a heap the compiler and linker understand. If you don’t have these things, then you can end up with problems. The same sort of issue exists for things like try-catch blocks. At least on x86, the code can generate an exception that gets picked up by a special exception handler. That may mean programming the CPU’s APICs, which is not a trivial thing to do.

        But it could be worse, we could still be programming this stuff in ASM.
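
        For illustration, this is roughly what you end up hand-rolling in a freestanding environment before anything like new or malloc() can work: a minimal bump allocator over a static arena (names and sizes are just for the sketch):

            #include <stddef.h>
            #include <stdint.h>

            /* A fixed static arena standing in for the heap that the
             * compiler and linker would normally set up for you. */
            static uint8_t arena[4096];
            static size_t  arena_used;

            /* Minimal bump allocator: hands out 8-byte-aligned chunks
             * and never frees. */
            void *bump_alloc(size_t size)
            {
                size_t aligned = (arena_used + 7u) & ~(size_t)7u;
                if (size > sizeof(arena) - aligned)
                    return NULL;              /* arena exhausted */
                arena_used = aligned + size;
                return &arena[aligned];
            }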

    5. Static analysis has its shortcomings.

      Give Rust a go.

      Some things are inherently unsafe; any low-level code at some point is going to want to manipulate a pointer. At some point a pointer to something (or nothing) goes in and a pointer to something comes out.

      Rust deals with this by packaging unsafe code in an unsafe block.

      Rust’s major shortcoming is that it’s a pain in the arse.

    6. I’ve been a pro C/C++ coder for over 25 years. Your assessment of C is correct. With Great Power Comes Great Responsibility. It’s the most portable and fastest language I know. But you can REALLY break stuff with it. Many moons ago I had a bug in my code running on a PS1 devkit that crashed so hard it took down the host PC. I suspect it caused the driver for the connector card to do something bad. LoL :)

      If Nvidia open sourced their drivers then they could well have an army of coders auditing the code. I understand their reluctance to do this, but they are losing more in free workforce checking the code than they ‘might’ lose by giving some secrets away, which I doubt.

      1. >If NVidia open sourced their drivers then they could well have an army of coders auditing the code.
        Yeah… The Heartbleed bug remained undiscovered for more than 2 years although the code is open source and the mistake was kind of trivial.
        What I want to say: we have too few people that have the knowledge to audit code AND are willing to do this in their free time. Open source code is a really good thing but sadly not a guarantee of secure code. :-/

    7. Heck, C++ gets flak for things like slow libraries; however, if you use those libraries many of the error cases get handled for you, hence the slowness. Being able to do things in a good way without losing the power of C to get things done is what C++ is supposed to be for.
      In other words, languages other than C can either forbid bad practices or just make good practices so easy you don’t know you’re doing them.

    8. Alas, idiots exist. When they’re the user, it’s bad enough. When they’re the programmer, it’s worse. But while coding is a race for the cheapest and countries are churning out as many mediocre-at-best coders as they can, we’re stuffed.

      Note that I refer not to engineers, as these are not they.

    9. There are several languages that are designed to help programmers write better code. The oldest/most prominent is Ada ( https://en.wikipedia.org/wiki/Ada_(programming_language) ), which is used for software for airplanes ( http://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html ) and other things that should not crash (pun intended). I tried to find any studies concerning the defect rate of Ada vs C, but it was hard to find any proof. http://www.adapower.com/index.php?Command=Class&ClassID=FAQ&CID=327 may give some hints.

      The problem with Ada is that there are not that many developers who know it, and that it is very hard to write bad code in it. Writing good code is costly and many companies do not want to afford it, at least as an upfront cost (however, many studies argue that the total cost of handling bugs and security breaches is higher than the cost of making something right in the first place).

      It is possible to write good code in C as well, but it requires discipline, good coding standards and proper peer reviews.

      1. Huh, I wanted to bring up the Ariane 5 incident, because the flight computer was programmed in Ada, but upon closer investigation it seems that the rumors I’d heard were actually false and the reason the rocket crashed had nothing to do with programmers trusting the language to be safe. Oh well, I still think that blaming the language for programmers’ incompetence is wrong and that it is worth using C as the correct abstraction level for code that is meant to be portable and readable, yet fast.

      2. Sadly the F-35, despite all the money spent on it, doesn’t even use Ada to any large degree.

        Apparently it’s 90% C and C++.

        Ada, from what I have seen, is hard to write code in, period. It’s much like its relative VHDL in that regard: very verbose. Another language that looks interesting but not as hard to grasp is ParaSail; it’s sort of a mashup of Ada-like languages and Python (though it is a compiled language).

    10. Most programming languages are intended to instruct a computer, i.e. the programmer (micro-/macro-)manages the computer (or an idealized model of it) with recipes. These languages are highly restricted in the kinds of sentences you can convey to the compiler: you can command imperatively, “add this integer to a pointer and then read that bit of memory into a variable,” but how do you write “for all input combinations, I require the output to satisfy this or that logical property”? How do you say in imperative languages “for all x, y elements of … (some property/fact)”? Ultimately security rests on verification, and so ultimately the tools of formal verification should someday merge with the tools of programming if we ever want systematically secure computational systems. To let the end user verify this could mean software will at some level need to be open source as a prerequisite for verification…

      I.e. I can look up an algorithm for calculating the convex hull of a set of points, see the proof that the algorithm works in some paper, and then implement it in a piece of software, but the compiler will never understand that the points not in the output all lie within the convex hull of the points in the output. I could write the proof in the comments for other programmers, either as a reference to the paper, or perhaps even as a formally verifiable proof for some verifier, but until either programming languages become aware of logic (in the sense of math/philosophy, not just boolean logic) or programming is done within a formal verifier where all the programming concepts are concepts in a formalized theory of computation, we will not attain verifiable security, nor literate code…
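
      The closest everyday C gets to stating such a property is a runtime assertion, which spot-checks the inputs that actually occur rather than proving anything for all inputs. A tiny sketch (the function and its names are purely illustrative):

          #include <assert.h>

          /* Clamp x into [lo, hi], assuming lo <= hi. The property we
           * would like a compiler to *prove* for all inputs, namely
           * lo <= result <= hi, can today only be spot-checked at
           * runtime for the calls that actually happen. */
          int clamp(int x, int lo, int hi)
          {
              int r = x < lo ? lo : (x > hi ? hi : x);
              assert(lo <= hi);             /* precondition */
              assert(lo <= r && r <= hi);   /* postcondition, this call only */
              return r;
          }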

      1. On the point of literate code, the idea is that the code explains itself, such that comments are not really necessary.
        But what is explanation, really? What is the answer to a why question? Why is pi irrational? The explanation is the proof that pi is irrational. Proof is explanation, so literate code is a mathematical description of code with its desired properties proven (by the programmer or a prover, or a combination during development)… Which axioms do we accept for a proof system? For programmers to accept each other’s proofs in collaboration, or for users to accept proofs that accompany software, they will need to agree on the axioms they accept, so ultimately we want explicit, practical belief systems formalized. There could be multiple schools of thought, each having their own axioms, and sharing proofs whenever they share the requisite axioms…

        1. The task of formalizing everyday belief systems (in order to make them practical, as opposed to armchair philosophy) seems very daunting, and it probably is. Let me illustrate with an example:

          There is a swimming pool, and it has among one of its rules: “No running.”

          How do we formalize “no running”? So walking is OK, but running is not. So where do we draw the boundary? 5km/h? A small child with short legs might need to run to reach that speed. The whole exercise seems pointless, until after thinking long enough you conclude that while walking “at all times at least one foot is resting on the ground” and as soon as this is no longer true, there is running going on… I am open to different definitions…

    11. There are tools to deal with known coding flaws, such as Flawfinder. The principle is that if you can define a coding malpractice as a pattern, and a secure alternative, then you can automate the security auditing of your code.
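
      For instance, an unbounded strcpy() is exactly the kind of pattern such tools flag on sight, with a bounded call as the suggested alternative (a minimal sketch):

          #include <stdio.h>
          #include <string.h>

          void greet_bad(const char *name)
          {
              char buf[32];
              strcpy(buf, name);    /* flagged: no bounds check, overflows on long names */
              printf("hello, %s\n", buf);
          }

          void greet_good(const char *name)
          {
              char buf[32];
              snprintf(buf, sizeof(buf), "%s", name);   /* bounded alternative */
              printf("hello, %s\n", buf);
          }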

    12. C pointers are only as secure as the programmer makes them. While it may be safe to go wild inside your own process’s address space, you have to treat security much more seriously when you cross into other processes. The thing is, C is a double-edged blade. As much as the end programmer can screw things up, a lot can be done to mitigate that just by exercising a little thought before punching out a hundred or so lines of code.

      1. Considering that a lot of non-self-driving cars are already vulnerable, I would hope it at least forces the manufacturers to do something, though in the current climate that would probably only involve some salty greasy palms and a law that puts the digital security of a vehicle on the owner and not the maker of the car, absolving them of all responsibility.

      2. Generally, control software in automotive industry is a lot more carefully developed and checked than graphics driver software. The number one objective in automotive is safety (because lawsuits), the number one objective for graphics drivers is probably performance (because more frames per second is more better).

  2. I am so glad that Nvidia releases the source code for their drivers so we can better determine whether these issues are there and fix them, as well as maximize open-source compatibility and not require people to develop their own open-source drivers LARGELY FROM SCRATCH… oh wait… why don’t they do this again?

        1. My sarcasm tag didn’t work in the first sentence.

          But in reality I find that there are way more exploits in closed source, and the exploits in open source software are usually quite obscure and take a bunch more work to utilize. Plus, the known vulnerabilities in closed source are only the ones that actually get published; there are typically many, many more hidden behind NDAs.

    1. The lack of open-source drivers for graphics cards is primarily caused (I assume) by business needs.

      Actually, there is some legal protection in calling something a “trade secret.” This is something that is not quite up to the standard (for whatever reason) of getting a patent. However, if you blab, then you lose this protection, so you are, in essence, just helping out your competitors. An argument could be made that the good will of the open-source community is worth more than the secrets, but that is a value judgment.

      In other cases, it could be due to licensing. If you license a technology from another company, those secrets are not yours to publish.

      I have no direct involvement with the graphics business, but these are the sorts of things that jump to my mind.

  3. Nvidia has really gone downhill as far as the drivers. Of course everyone knows about the silly GeForce Experience making you log in now. Aside from that, I had an issue with Shadowplay not working, and chalked it up to a game update that broke it. Next, PS4 Remote Play stopped working. Found out the last driver update was the cause of the second issue, and the previous update was to blame for the first. Then I come to find this article. It all makes sense now. They seem to keep churning out these updates whenever a new game comes out, “for optimization purposes”. I don’t understand: if these drivers are supposed to just work the way they are meant to, why are individual games dictating how these drivers are written and updated?

  4. Decades ago, I found a research paper that compared various programs written in ASM, C, and some high-level languages. What it found was that once the ASM and C programs were written in such a way as to be “safe” (proper bounds/index checking, pointer safety, etc.), the extra code meant they had no actual performance advantage over the high-level languages that had this built in.

    This added to the personal anecdotal evidence I had:

    When working at an aerospace company (as a load analysis engineer), my fellow engineers were having problems debugging their Fortran code. I suggested to the IT folks that we use the WATFIV compiler instead of IBM’s Fortran G and H. They balked at the “performance hit” of WATFIV’s run-time checking routines. They did give it a try when the Waterloo folks offered a free 30-day trial. I ran the benchmarks on my and some other engineers’ programs (with IT supervision; why trust those engineers?) and found that WATFIV outperformed even IBM’s optimizing compiler.

    Ditto years later when I had an “expert” coder who wrote some “highly optimized” spaghetti code that he insisted was essentially bug free. I rejected it (to management’s chagrin) and re-structured it myself to proper standards. I found numerous bugs just by structuring (including a patch of code that was executed multiple times when only needed once – ah, spaghetti code!). The structured result performed substantially faster.

    My moral: precision beats “performance” every time, and I consider the arguments for languages/coding styles that emphasize “performance” to often be bunk.

    1. I lean towards effective use of bounds checking, not bounds checking every opportunity one gets.

      I was dragged into a meeting regarding our SQL server and the stupidly low performance the UI was experiencing. Reports just dragged on for hours. Many of which came out messed up and were unusable. My job was to analyze the math used in the reports. The meeting wasn’t strictly about the bog slow server.

      I asked to look at the source code to try to figure out the math. To my surprise (or maybe not), there were hundreds of thousands of lines in there to fix shitty data. For example, one portion of the code dealt with six-digit numbers in a ##-#### format. The report had code in there to strip out alpha characters, remove and then re-add the ‘-’, and massage the number to present six digits, such as padding with zeros or truncating longer entries. None of it worked very well; a long dash, for instance, would cause the code to truncate to the first two digits. All in all, about a hundred or so lines of code just for that alone, all because the original programmer had decided to store the data as a 255-character string. What? Why?

      Turns out the database stored EVERYTHING as char arrays or TEXT blobs. That simplified the programming of the UI for data input but made a HUGE mess of the UI that extracted that data. I could actually (and still can to this day) store a date as “11th full moon” and the damn UI would accept it, and his stupid parser would try to make sense of it.

      With a straight face, with the CEO and all his minions staring at me, I looked straight at the programmer and asked him, “why aren’t you validating the data when you update the database? It would save you all this time and effort spent cleaning up the data every time you read the DB, and the reports would run much faster.”

      They sent him away to make the changes I suggested.
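
      Validating once on the way in is also trivially cheap. A sketch of the idea for the hypothetical ##-#### field above (not the actual system’s code):

          #include <ctype.h>
          #include <stdbool.h>

          /* Check a "##-####" identifier once, at input time, instead of
           * re-parsing free-form text on every report run. */
          bool valid_id(const char *s)
          {
              static const char pattern[] = "##-####";  /* '#' means digit */
              for (const char *p = pattern; *p != '\0'; p++, s++) {
                  if (*p == '#') {
                      if (!isdigit((unsigned char)*s))
                          return false;     /* non-digit where digit expected */
                  } else if (*s != *p) {
                      return false;         /* missing or misplaced '-' */
                  }
              }
              return *s == '\0';            /* reject trailing junk */
          }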
