Tips and Tricks for the C Pre-processor

The C pre-processor can help you write more concise, easy-to-follow code. It can also let you create a tangled ball of macros and #defines. [s1axter] wrote up a guide on how to use the pre-processor and keep your sanity.

We’ve seen some neat hacks with the C pre-processor, such as a full adder implementation, but this guide focuses on more practical usage. First, [s1axter] explains what the pre-processor does with your code by writing simple macros. Next up are macro arguments and the ‘##’ token-pasting operator for metaprogramming. Finally, we get a good explanation of why you need to worry about scope when using macros, and how to make your code safer with ‘do {} while()’ statements.
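To make those two tricks concrete, here is a small example of our own (the macro names are our invention, not from [s1axter]'s guide) showing ‘##’ minting related identifiers and a multi-statement macro wrapped in do {} while(0):

#include <stdio.h>

/* The ## operator pastes tokens together, so one macro can mint a
 * family of related names. */
#define DECLARE_GETTER(type, name) \
    static type value_##name; \
    static type get_##name(void) { return value_##name; }

/* Wrapping a multi-statement macro in do {} while(0) makes it behave
 * as a single statement, so it stays safe after a bare if(). */
#define SWAP(a, b) \
    do { \
        int tmp = (a); \
        (a) = (b); \
        (b) = tmp; \
    } while (0)

DECLARE_GETTER(int, speed) /* creates value_speed and get_speed() */

int main(void)
{
    int x = 1, y = 2;
    if (x < y)
        SWAP(x, y); /* expands cleanly, no stray semicolon trouble */
    printf("%d %d %d\n", x, y, get_speed());
    return 0;
}

Had SWAP been a bare { } block instead, the semicolon after SWAP(x, y) would end the if statement early and break any else clause attached to it.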

If you’re into embedded programming, this guide will help you understand some of the more complex pre-processor techniques out there. It’s helpful for making your code clearer and for abstracting away hardware dependencies in a few lines of code.

3-Sweep: Turning 2D images into 3D models

As 3D printing continues to grow, people are developing more and more ways to get 3D models. From hardware-based scanners like the Microsoft Kinect to software-based tools like 123D Catch, there are a lot of ways to create a 3D model from a series of images. But what if you could make a 3D model out of a single image? Sound crazy? Maybe not. A team of researchers has created 3-Sweep, an interactive technique for turning objects in 2D images into 3D models that can be manipulated.

To be clear, recognizing 3D components within a single image is a bit out of reach for computer algorithms alone. But by combining the cognitive abilities of a person with the computational accuracy of a computer, the researchers have been able to create a very simple tool for extracting 3D models. The user outlines the shape much as one might model it in a CAD package; once the outline is complete, the algorithm takes over and creates the model.
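Under the hood, each fitted part is essentially a generalized cylinder: a cross-section swept along an axis curve with a varying radius. A rough sketch of such a data structure in C (the naming is ours, not taken from the 3-Sweep paper or code):

#include <stddef.h>

/* One part recovered by 3-Sweep-style modeling: the user's strokes
 * fix the cross-section and the sweep path, and the fitting stage
 * estimates a radius at each sample from edges in the photo. */
typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3   *axis;    /* sampled points along the sweep curve   */
    float  *radius;  /* fitted cross-section radius per sample */
    size_t  samples; /* number of samples along the axis       */
} SweptPart;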

The software debuted at SIGGRAPH Asia 2013 and has caused quite a stir on the internet. Watch the fascinating video that demonstrates the software process after the break!

0x10c becomes a community-developed game

It’s official. [Notch], creator of Minecraft, has confirmed he’s shelved plans for 0x10c, the space-based block building and exploration MMO that features assembly programming as a core game component.

Over the last year or so since 0x10c was announced, a whole lot of programmers have picked up the game's fictional CPU, the DCPU, writing emulators for it and even getting this CPU, which exists only as a design document, running on an AVR. Needless to say, there are a lot of very skilled programmers who want this game to exist. Now, it seems, this community is forging ahead with the project without [Notch].
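The DCPU is simple enough that the core of an emulator fits on a page. Below is a heavily simplified sketch of our own, handling only register-to-register SET/ADD/SUB and skipping the spec's full operand-decoding rules (literals, memory addressing, the overflow register), so treat it as flavor rather than a conforming emulator:

#include <stdint.h>

typedef struct {
    uint16_t ram[0x10000]; /* 64K words of memory    */
    uint16_t reg[8];       /* A, B, C, X, Y, Z, I, J */
    uint16_t pc;
} Dcpu;

/* Execute one instruction; operands are assumed to be registers. */
static void dcpu_step(Dcpu *d)
{
    uint16_t w  = d->ram[d->pc++];
    uint16_t op = w & 0x1F;         /* low 5 bits: basic opcode */
    uint16_t b  = (w >> 5) & 0x1F;  /* destination operand      */
    uint16_t a  = (w >> 10) & 0x3F; /* source operand           */

    uint16_t  av = d->reg[a & 7];   /* pretend both operands    */
    uint16_t *bp = &d->reg[b & 7];  /* are plain registers      */

    switch (op) {
    case 0x01: *bp  = av; break;    /* SET b, a */
    case 0x02: *bp += av; break;    /* ADD b, a */
    case 0x03: *bp -= av; break;    /* SUB b, a */
    default: break;                 /* everything else omitted  */
    }
}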

This is a truly massive undertaking by the community. Not only do the current plans call for an open-world, procedurally generated, space-based MMO, it looks like these new developers will also be writing their own engine from scratch. If this were a commercial endeavour, it would require millions of dollars and many years to reach a rough alpha build, and the 0x10c community is doing it for free.

If you have experience in C++, OpenGL, and 3D game programming, the official signup thread is over on the 0x10c subreddit. Even if you’re not a programmer and only have experience in modeling or writing, your help would be greatly appreciated.

Retrotechtacular: Bell Labs introduces a thing called ‘UNIX’

Modern operating systems may seem baroque in their complexity, but nearly every one of them, except for Windows, natch, is based on the ideas of simplicity and modularity. This is the lesson that UNIX taught us, explained perfectly in a little film from Bell Labs in 1982 starring giants of computation: [Dennis Ritchie], [Ken Thompson], [Brian Kernighan], and others.

At the time this film was made, UNIX had been around for about 10 years. In that time, it had moved from an OS cloistered in giant mainframes attached to teletypes to slightly smaller minicomputers wired up to video terminals. Yes, smallish computers like the Apple II and the VIC-20 were around by then, but they were toys compared to the hulking racks inside Bell Labs.

The film explains the core concept of UNIX by demonstrating modularity, with a great example from [Brian Kernighan]. He took a short passage from a paper he wrote and found the spelling errors by piping the text through different commands from the shell. First the paper was split into one word per line, then the words were made lowercase and sorted alphabetically. All the unique words were extracted from this list and compared against a dictionary. A spell checker in one line of code, brought to you by the power of UNIX.

Retrotechtacular: How I wrote Pitfall for the Atari 2600

This week we’re taking another departure from the ordinarily campy videos featured in the Retrotechtacular section. This time around the video is only two years old, but the subject matter is from the early 1980s. [David Crane], designer of Pitfall for the Atari 2600, gave a talk at the 2011 Game Developers Conference. His 38-minute presentation rounds up to a full hour with the Q&A afterwards. It’s a bit dry to start, but he hits his stride about halfway through, and it’s chock-full of juicy morsels about the way things used to be.

[David] wrote the game for Activision, a company started after game designers left Atari upon being told they were no more important than the assembly line workers who put together the actual cartridges. We wonder if any heads rolled at Atari once Pitfall had spent 64 weeks as the number one selling game worldwide.

This was a developer’s panel, so you can bet the video below digs deep into coding challenges. Frame buffer? No way! The 2600 could only pump out 160 pixels at once: a single TV scan line. The programs were inescapably synced with the TV refresh rate, and were even limited in how much could be drawn within a single scan line. For us the most interesting part is near the end, when [David] describes how the set of game screens is nothing more than a pseudo-random number generator with a carefully chosen seed. But then again, the recollection of hand-optimizing the code to fit a 6k game into a 4k ROM is equally compelling.
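For flavor, here is a small C illustration of that world-generation idea (the taps, seed, and field meanings are ours, not [David]'s actual polynomial counter): an 8-bit linear-feedback shift register whose state describes one screen, and which can be stepped in either direction so walking left or right regenerates the neighboring screen from thin air:

#include <stdint.h>
#include <stdio.h>

/* Step the 8-bit state forward: the next screen to the right. */
static uint8_t screen_right(uint8_t s)
{
    uint8_t fb = ((s >> 7) ^ (s >> 5) ^ (s >> 4) ^ (s >> 3)) & 1;
    return (uint8_t)((s << 1) | fb);
}

/* Step backward: because the feedback bit was shifted in, the
 * previous state can be recovered exactly. */
static uint8_t screen_left(uint8_t s)
{
    uint8_t fb   = s & 1;
    uint8_t prev = s >> 1;  /* previous state, missing its top bit */
    uint8_t top  = (fb ^ (prev >> 5) ^ (prev >> 4) ^ (prev >> 3)) & 1;
    return (uint8_t)(prev | (top << 7));
}

int main(void)
{
    uint8_t s = 0xC4; /* a carefully chosen seed */
    for (int i = 0; i < 5; i++) {
        /* decode a few fields, as the game decodes objects per screen */
        printf("screen %02X: trees=%u pit=%u\n",
               (unsigned)s, s & 7u, (s >> 3) & 3u);
        s = screen_right(s);
    }
    s = screen_left(s); /* walking left rewinds deterministically */
    return 0;
}

No screen data is ever stored; every screen is recomputed from the register each time you walk into it, which is how 255 screens fit in a 4k cartridge.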

If you like this, you should take a look at an effort to fix coding glitches in Atari games.

The meaning of being a hard-core hacker from a 1985 recollection

Gather ’round children, we’re about to hear a story about the good old days. Except that this is really more of a horror story about what it used to be like as a code monkey. [John Graham-Cumming] shares his experience programming a 6502-based KIM-1 machine back in 1985. Simple, right? The caveat: there was no assembler, and no hardware for loading the finished code!

The machine in question was a label application tool for a production line. You know, product goes in bottle, label gets slapped on the side. But the slapping needed to be perfect, because consumers shy away from packaging that looks shoddy. Computer control would end up being far superior to the mechanical means the factory had been using, because it simplifies adjusting calibration and other parameters. [John] started from square one by interfacing the KIM-1 with the existing hardware. The KIM-1's hex keypad is how the program was entered into the device, but first [John] wrote the software out on sheets of notebook paper, hand-assembling the code before typing it in on the keypad. Kind of makes you appreciate all the tools you take for granted (like Eclipse), huh?
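If you've never hand-assembled code, it means doing the assembler's job on paper: writing each 6502 mnemonic, looking up its opcode, and encoding the operands (low byte first) yourself. A tiny made-up fragment, not [John]'s actual program, shows the flavor:

/* Hand-assembled 6502: the bytes on the left are what got typed
 * into the KIM-1's hex keypad; the comments are the paper copy. */
const unsigned char program[] = {
    0xA9, 0x01,       /* LDA #$01   ; load accumulator with 1    */
    0x8D, 0x00, 0x17, /* STA $1700  ; write it to an output port */
    0xA9, 0x00,       /* LDA #$00                                */
    0x8D, 0x00, 0x17, /* STA $1700  ; ...and clear it again      */
    0x4C, 0x00, 0x02, /* JMP $0200  ; loop back to the start     */
};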

[via Reddit]

Interpreting Brainf*#k on an AVR

We won’t call it useless, but we will ask why [Dan] wrote a brainfuck interpreter for the AVR.

It’s not generating code for the AVR; think of it more as a bootloader. To run a brainfuck program, [Dan] uploads it to the EEPROM inside his ATmega32, after which the microcontroller takes over and starts performing whatever instructions the brainfuck program contains. Because the whole thing runs off the EEPROM, the program size is limited to 1022 bytes. Enough for any brainfuck program written by a human, we think.
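The interpreter itself needs very little: a data pointer, an instruction pointer, and a match-the-brackets rule for the two loop commands. Here's a minimal desktop sketch of the same idea in C, reading the program from a RAM buffer where [Dan]'s version reads from EEPROM:

#include <stdio.h>

#define TAPE_SIZE 256

/* Interpret a brainfuck program held in memory. */
static void bf_run(const char *p)
{
    unsigned char tape[TAPE_SIZE] = {0};
    unsigned char *dp = tape;

    for (; *p; p++) {
        switch (*p) {
        case '>': dp++;         break;
        case '<': dp--;         break;
        case '+': (*dp)++;      break;
        case '-': (*dp)--;      break;
        case '.': putchar(*dp); break;
        case ',': *dp = (unsigned char)getchar(); break;
        case '[': /* if cell is zero, skip forward past matching ] */
            if (*dp == 0) {
                int depth = 1;
                while (depth) {
                    p++;
                    if (*p == '[') depth++;
                    else if (*p == ']') depth--;
                }
            }
            break;
        case ']': /* if cell is nonzero, jump back to matching [ */
            if (*dp) {
                int depth = 1;
                while (depth) {
                    p--;
                    if (*p == ']') depth++;
                    else if (*p == '[') depth--;
                }
            }
            break;
        }
    }
}

int main(void)
{
    bf_run("++++++++[>++++++++<-]>+."); /* prints 'A' */
    return 0;
}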

As for why [Dan] would want to build an AVR interpreter for a language that is nearly unreadable by humans, we honestly have no idea other than the common ‘because it’s there’ sentiment. There are some pretty cool projects out there that use brainfuck, including this genetic algorithm software developer. Right now, though, blinky LEDs are enough to keep us happy, so you can see a video of brainfuck doing its thing on an LED bar display after the break.
