32C3: Vector Video Games

There are a few classic video games that rely on vector graphics and special monitors. Asteroids is incomplete if you’re not playing it in its original arcade format. The same goes for Tempest, Lunar Lander, and the 1983 Star Wars arcade game. Emulation of these games is possible with MAME, but the display – like every display you can buy today – is still rasterized. The solution to this problem is to give MAME a vector display output that works in conjunction with adapter boards and DACs connected to an XY monitor.

For this year’s Chaos Communication Congress, that’s exactly what [Trammell Hudson] and [Adelle Lin] did. They’ve created an open source vector gaming system that connects MAME to XY monitors and oscilloscopes.

The build uses a custom board equipped with a Teensy 3.1 microcontroller and a 12-bit DAC to convert the XY coordinates sent by MAME into vectors that can be displayed on any XY monitor. This, of course, requires a patch to MAME, which the maintainers rejected as being an “unacceptably hacky way to achieve the intended result.” It does achieve the intended result, though: making dozens of vector games playable on whatever monitor supports vector graphics.
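
Under the hood, the trick is just pushing pairs of 12-bit samples out to the DAC fast enough that the beam sweeps smoothly from one endpoint of a line to the next. Here’s a minimal Arduino-style sketch of that idea – it assumes a generic MCP4922-type dual 12-bit SPI DAC on made-up wiring, so treat it as an illustration of the technique rather than the actual V.st firmware:

// Illustration only: stepping a dual 12-bit SPI DAC (MCP4922-style) along a
// line so an XY monitor draws a continuous stroke. Pinout and DAC are
// assumptions; the real V.st board and firmware may differ.
#include <SPI.h>

const int CS_PIN = 10;                           // DAC chip select (assumed wiring)

// Write one 12-bit sample to DAC channel A (X) or channel B (Y).
void dacWrite(bool channelB, uint16_t value) {
  uint16_t word = (channelB ? 0x8000 : 0x0000)   // channel select bit
                | 0x3000                         // gain = 1x, output active
                | (value & 0x0FFF);              // 12-bit sample
  digitalWrite(CS_PIN, LOW);
  SPI.transfer16(word);
  digitalWrite(CS_PIN, HIGH);
}

// Sweep the beam from (x0,y0) to (x1,y1) in small steps so the phosphor
// traces a line instead of two dots.
void drawVector(int x0, int y0, int x1, int y1) {
  const int steps = 64;
  for (int i = 0; i <= steps; i++) {
    dacWrite(false, x0 + (int32_t)(x1 - x0) * i / steps);
    dacWrite(true,  y0 + (int32_t)(y1 - y0) * i / steps);
  }
}

void setup() {
  pinMode(CS_PIN, OUTPUT);
  digitalWrite(CS_PIN, HIGH);
  SPI.begin();
}

void loop() {
  // Trace a square forever; a real driver would instead parse the vector
  // lists streamed from the patched MAME over USB serial.
  drawVector( 512,  512, 3584,  512);
  drawVector(3584,  512, 3584, 3584);
  drawVector(3584, 3584,  512, 3584);
  drawVector( 512, 3584,  512,  512);
}

A real driver also has to handle beam blanking and order the segments so the beam doesn’t waste time flying dark across the screen, but the core loop is exactly this.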

So far, [Trammell] and [Adelle] have gotten their system working on Vectrex consoles, analog oscilloscopes set to XY mode, and vectorscopes that litter every broadcast station and surplus shop. Check out [Trammell] and [Adelle]’s talk, and if you want to build the V.st vector display driver, the board is available from OSHPark.

32C3: Beyond Your Cable Modem

[Alexander Graf] gave an absolutely hilarious talk at 32C3 about the security flaws he found in cable modems from two large German ISPs. The vulnerabilities were very serious, resulting in remote root shells on essentially any affected modem, and the causes were trivial: unencrypted passwords in configuration files sent to the modems over TFTP or Telnet, for instance.

While [Alexander] was very careful to point out that he’d disclosed all of these vulnerabilities to the two affected German cable ISPs, he notably praised one of them for its speedy response in patching up the holes. As for the other? “They’d better hurry up.” He also mentioned that, although he’s not sure, he suspects similar vulnerabilities are lurking in cable modems in other countries. Oh dear.

A very interesting point in the talk is the way [Alexander] chose to inform the cable ISPs. Instead of going to them directly and potentially landing himself in jail, he went to the press and let his contacts there talk to the ISPs. This both shielded him from the initial heat and put a bit of additional pressure on the ISPs to fix the vulnerability – when the story hits the front page, they’d really like to be ahead of the problem.

There’s even a bone for you die-hard hardware hackers out there who think that all of this software security stuff is silly. To get the modem’s firmware in the first place, at minute 42 of the talk, [Alexander] shows briefly how he pulled the flash chip off the device and read it into his computer using a BeagleBone Black. No JTAG, no nothing. Just pulling the chip off and reading it the old-fashioned way.
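
The talk doesn’t dwell on the exact tooling – in practice most people would just point flashrom at the chip – but underneath it’s nothing more than clocking the standard 0x03 READ command out over SPI and capturing what comes back. A rough sketch of that in C++, assuming a 25-series NOR flash wired to the BeagleBone’s spidev interface (the device path, clock speed, and flash size here are all placeholders):

// Dump a SPI NOR flash over Linux spidev, e.g. from a BeagleBone Black.
// Chip size, speed, and /dev path are assumptions; adjust for the real part.
#include <cstdio>
#include <cstdint>
#include <cstring>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/spi/spidev.h>

int main() {
    int fd = open("/dev/spidev1.0", O_RDWR);
    if (fd < 0) { perror("open spidev"); return 1; }

    uint32_t speed = 1000000;                    // start slow: 1 MHz
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    const uint32_t flash_size = 8 * 1024 * 1024; // assume an 8 MB part
    const uint32_t chunk = 4096;
    uint8_t tx[4 + chunk] = {0};
    uint8_t rx[4 + chunk];

    FILE *out = fopen("firmware.bin", "wb");
    for (uint32_t addr = 0; addr < flash_size; addr += chunk) {
        tx[0] = 0x03;                            // READ command, 24-bit address
        tx[1] = addr >> 16;
        tx[2] = addr >> 8;
        tx[3] = addr;

        spi_ioc_transfer xfer;
        memset(&xfer, 0, sizeof(xfer));
        xfer.tx_buf = (uintptr_t)tx;
        xfer.rx_buf = (uintptr_t)rx;
        xfer.len = 4 + chunk;
        if (ioctl(fd, SPI_IOC_MESSAGE(1), &xfer) < 1) { perror("transfer"); return 1; }

        fwrite(rx + 4, 1, chunk, out);           // drop the 4 command/address bytes
    }
    fclose(out);
    close(fd);
    return 0;
}

Reading the chip’s JEDEC ID first (command 0x9F) is the usual sanity check that the wiring and supply voltage are right before committing to a full dump.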

If you’ve got an hour, go watch [Alexander]’s talk. It’s a fun romp through some serious vulnerabilities.

32C3: A Free And Open Source Verilog-to-Bitstream Flow For iCE40 FPGAs

[Clifford] presented a fully open-source toolchain for programming FPGAs. If you don’t think that this is an impressive piece of work, you don’t really understand FPGAs.

The toolchain, or “flow” as the FPGA kids like to call it, consists of three parts: Project IceStorm, a low-level tool that can build the bitstreams that flip the individual configuration bits inside the FPGA; Arachne-pnr, a place-and-route tool that turns a symbolic netlist into the physical layout that IceStorm needs; and Yosys, which synthesizes Verilog code into the netlists Arachne-pnr consumes. [Clifford] developed both IceStorm and Yosys, so he knows what he’s talking about.

What’s most impressive is that FPGAs aren’t the only target for this flow. Because it’s all open source and modifiable, it has also been used for designing custom ASICs – good to know when you’re in need of your own custom silicon. [Clifford]’s main focus in Yosys is on formal verification – making sure the synthesized design actually behaves the way the Verilog says it should. A fully open-source toolchain makes working on that problem possible.

If you’ve been following along with [Al Williams]’s FPGA posts – either his introduction or his more recent intermediate series, both of which are based on the relatively cheap Lattice iCEstick development kit – this video is a must-watch. It’s a fantastic introduction to the cutting edge in free FPGA tools.

32C3: Towards Trustworthy X86 Laptops

Security assumes there is something we can trust: the computer doing the encrypting is assumed to be trustworthy, and so is the computer doing the decrypting. This is the only logical mindset for anyone concerned about security – as long as the endpoints can be trusted, you don’t have to worry about all the routers handling your data on the Internet, eavesdroppers, or much of anything else. Security breaks down when you can’t trust the computer doing the encryption. Such is the case today. We can’t trust our computers.

In a talk at this year’s Chaos Communication Congress, [Joanna Rutkowska] covered the last few decades of computer security – Tor, OpenVPN, SSH, and the like. These are, by definition, meaningless if you cannot trust the operating system. Over the last few years, [Joanna] has been working on a solution to this with the Qubes OS project, but everything is built on silicon, and if you can’t trust the hardware, you can’t trust anything.

And so we come to an oft-forgotten aspect of computer security: the BIOS, UEFI, Intel’s Management Engine, VT-d, Boot Guard, and the rest of the overly complex firmware found in a modern x86 system. This is where the chain of trust for the entire computer starts, and if a computer’s firmware is compromised, it is safe to assume the entire computer is compromised.

Firmware is also devilishly hard to secure: attacks that defeat the write protection on the tiny flash chip holding it have already been demonstrated. A Trusted Platform Module could measure the firmware and only unlock the machine if the contents check out; this, too, has been shown to be vulnerable to attack. Another approach is the Core Root of Trust for Measurement, code that is supposed to live in immutable, ROM-like memory and measure the rest of the firmware. The specification doesn’t say where that memory is, though, and until recently the CRTM has been implemented in a tiny flash chip soldered to the motherboard. We’re right back where we started, then, with an attacker simply swapping out the CRTM chip along with the chip containing the firmware.

But Intel has an answer to everything, and its answer to this house of cards of firmware security is the Management Engine. This is a small microcontroller, built into every modern Intel chipset, that runs all the time and has access to RAM, the network hardware, and everything else in the computer. It is security through obscurity, though: the ME can elevate the privileges of other components in the system, yet nobody outside Intel knows how it works, no one has the source code for the operating system it runs, and it is an ideal target for a rootkit.

Is there hope for a truly secure laptop? According to [Joanna], there is hope in simply not trusting the BIOS and other firmware. Trust therefore comes from a ‘trusted stick’ – a small USB memory stick with its own flash chip that verifies the computer’s firmware independently of the computer itself.

This, together with open source firmware like coreboot, is the beginning of a computer that can be trusted. While the technology for a device like this could exist today, it will be a while before something like it is found in the wild. There’s still a lot of work to do, but at least one thing is certain: secure hardware doesn’t exist yet, but it can be built. Whether secure hardware ever comes to pass is another thing entirely.

You can watch [Joanna]’s talk on the 32C3 streaming site.

Hackaday At 32C3 And Shmoocon

We are just a few days away from the 2015 Chaos Communication Congress in Hamburg, Germany, and we’re happy to say that a couple of the Hackaday crew will be on hand. The annual event is one of the premier hacker conferences in the entire world. Both [Voja Antonic] and [Nava Whiteford] will be attending this year’s 32C3, which runs from Sunday the 27th through Wednesday the 30th.

[Voja] will be pretty busy working a booth that will show off two of his projects. One is his Single-Chip Gaming System and the other is his DIY Book Scanner. If you do want to track him down, he dusted off his Twitter account, @Voja_Antonic, just for the event.

[Nava] will be less tied down and looking for the best there is to see at the conference. If you want to connect with him, give his Twitter account a jingle: @new299.

2016 Shmoocon

Shmoocon is in the middle of January and boasts “Less Moose than Ever”. It’s notoriously hard to get a ticket for the annual hacker convention held in Washington, DC. We asked for three press passes and they were kind enough to provide one. We tried and failed to get tickets during the second public release, which sold out 900 passes in 7.58 seconds.

We’re Looking for One More Ticket!

We were able to purchase a single ticket second-hand, so along with the press pass we now have two. [Mike] and [Brian] are both planning to attend, but we’d like it if [Sophi] could be there as well. If you know of an extra ticket which we can buy at face value, please email mike at Hackaday with the details.

Will you be at Shmoocon? Want to meet up with [Brian], [Mike], and hopefully [Sophi], or know of an activity there we just shouldn’t miss? Ping us on Twitter (@szczys, @bbenchoff, @sophikravitz).

Also, how are our choices on con attendance so far? Leave a comment below and let us know what hacking events you think we just shouldn’t miss in the coming year.