Advanced PCB Graphics With KiCAD 6 And Inkscape

There are many, many video tutorials about designing the functional side of PCBs, offering tips on schematic construction and layout. What is a little harder to find are tutorials on the graphical side: creating artwork and driving the tools so it looks good on the finished board, leveraging the silkscreen, solder mask, and copper layers to maximum effect. [Stuart Patterson] presents his guide for Advanced PCB Graphics in KiCAD 6.0 and Inkscape (Video, embedded below) to help you on your way to that cool-looking PCB build.

Silkscreen layers in yellow, solder mask opening in red

The first step is to get your bitmap, whether you create it yourself or download it, and trace it into a set of vectors using Inkscape’s ‘Trace Bitmap’ tool. If you started with an SVG or similar vector file, you can skip that stage.

Next, create the PCB outline shape by deleting all the details that aren’t part of the outline. A little scaling here and there to get the dimensions correct and the first part is done. [Stuart] has an earlier video showing that process.

The usability improvements in KiCAD 6.0 are many, but one much-requested feature is the ability to group objects, just as you can in Inkscape and any other vector graphics tool for that matter. That means you can simply import that SVG outline into the Edge.Cuts PCB layer and all the curves will be nicely tied together. Next you select the details you want for the silkscreen layer, the solder mask removal layers, and any non-circuit copper. In Inkscape it is wise to use the layers feature to assign each material type to a uniquely named layer, so they can be shown or hidden individually when exporting. This lets you handle the silk, mask, and copper PNG exports from a single master file, in addition to any vector details for the outline, slots, and holes.
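If you find yourself regenerating those bitmaps often, the export step is easy to script. Here’s a minimal sketch (our own, not part of [Stuart]’s workflow) that assumes an Inkscape 1.x binary on the PATH and layers labelled silk, mask, and copper in a hypothetical board_art.svg; it looks up each layer’s XML id and asks Inkscape to render just that layer to a PNG, keeping the full page area so every export stays aligned with the board outline.

```python
#!/usr/bin/env python3
"""Export each named Inkscape layer of a master SVG as its own PNG.

A rough sketch: assumes Inkscape 1.x on the PATH and that the artwork
layers are labelled 'silk', 'mask' and 'copper' in the Layers dialog.
Adjust names, DPI and paths to taste.
"""
import subprocess
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"
INK_NS = "http://www.inkscape.org/namespaces/inkscape"

MASTER = "board_art.svg"             # hypothetical master file
LAYERS = ["silk", "mask", "copper"]  # hypothetical layer labels
DPI = 600                            # export resolution for the bitmap importer

def layer_ids(svg_path):
    """Map Inkscape layer labels to their XML ids."""
    tree = ET.parse(svg_path)
    ids = {}
    for g in tree.iter(f"{{{SVG_NS}}}g"):
        if g.get(f"{{{INK_NS}}}groupmode") == "layer":
            ids[g.get(f"{{{INK_NS}}}label")] = g.get("id")
    return ids

def export(svg_path, xml_id, out_png):
    """Render one layer to PNG, keeping the full page area so every
    export lines up with the board outline."""
    subprocess.run([
        "inkscape", svg_path,
        f"--export-id={xml_id}",
        "--export-id-only",
        "--export-area-page",
        "--export-type=png",
        f"--export-dpi={DPI}",
        f"--export-filename={out_png}",
    ], check=True)

if __name__ == "__main__":
    ids = layer_ids(MASTER)
    for name in LAYERS:
        export(MASTER, ids[name], f"{name}.png")
        print(f"wrote {name}.png")
```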

Once you have PNG exports for the silk, mask, and so on, you need to create a footprint inside a board-specific library using the KiCAD image converter tool. It was interesting to note that you can export a new image footprint from the tool and paste it straight into the footprint editor, tweaking all the visibility details at the same time. That will save some time and effort for sure. Anyway, we hope this little tutorial from [Stuart] helps, and we will be sure to bring you plenty more in the coming months.

Need some more help with KiCAD? Check out this tutorial, and if you want a bit more power from the tool, you need some action plugins!

Continue reading “Advanced PCB Graphics With KiCAD 6 And Inkscape”

A vintage supercomputer with a unique dual-screen display

VCF East 2021: The Early Evolution Of Personal Computer Graphics

The evolution of computer graphics is something that has been well documented over the years, and it’s a topic that we always enjoy revisiting with our retrocomputing readers. To wit, [Stephen A. Edwards] has put together an impressively detailed presentation that looks back at the computer graphics technology of the 1960s and 70s.

The video, which was presented during VCF East 2021, goes to great lengths to demystify some of the core concepts of early computer graphics. There’s a lot to unpack here, but naturally, this retrospective first introduces the cathode-ray tube (CRT) display as the ubiquitous technology that supported computer graphics during this time period and beyond. Building from this, the presentation goes on to demonstrate the graphics capabilities of DEC’s PDP-1 minicomputer, and how its striking and surprisingly capable CRT display was the perfect choice for playing Spacewar!

As is made clear in the presentation, the 1960s featured some truly bizarre concepts when it came to cutting-edge computer graphics, such as Control Data Corporation’s 6600 mainframe and its accompanying vector-based dual-CRT video terminal, which wouldn’t look out of place on the Death Star. Equally strange at the time was IBM’s 2260 video data terminal, which used a ‘sonic delay line’ as a type of rudimentary video memory, using nothing but coiled wire, transducers, and sound itself to keep character information circulating between screen refreshes.
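If you want a feel for how a delay-line store works, here’s a toy Python model (our own invention, not from the talk): a row of characters circulates endlessly through a fixed-length ‘line’; on every refresh each character pops out of one end, gets painted on the screen, and is immediately fed back into the other end, with the CPU slipping in a replacement whenever it wants to change what’s displayed.

```python
from collections import deque

# Toy model of a delay-line character store: the bits for a whole row of
# text are always "in flight", and the terminal reads and rewrites them
# once per refresh instead of holding them in RAM.
LINE_LENGTH = 80                 # pretend the line holds one 80-character row
line = deque(" " * LINE_LENGTH, maxlen=LINE_LENGTH)

def refresh(write_char=None, write_pos=None):
    """One screen refresh: clock every character out of the line, 'draw'
    it, and feed it straight back in, optionally replacing one position,
    which is how new text gets stored."""
    drawn = []
    for pos in range(LINE_LENGTH):
        ch = line.popleft()      # character emerges from the output transducer
        if pos == write_pos:     # the CPU slips a new character in
            ch = write_char
        drawn.append(ch)         # the beam paints it on the CRT
        line.append(ch)          # and it goes back down the wire
    return "".join(drawn)

refresh(write_char="H", write_pos=0)
refresh(write_char="i", write_pos=1)
print(repr(refresh()))           # 'Hi' followed by 78 spaces
```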

These types of hacks were later replaced by solid-state counterparts during the microcomputer era. The video concludes with a look back at the ‘1977 trinity’ of microcomputers, namely the Apple II, Commodore PET, and TRS-80. Each of these microcomputers handled graphics in a slightly different way, and it’s in stark contrast to today’s largely homogenized computer graphics landscape.

There’s a lot more to this great retrospective, so make sure to check out the video below. When you’re finished watching, make sure to check out our other coverage of VCF 2021, including some great examples of computer preservation and TTL-based retrocomputing.

Continue reading “VCF East 2021: The Early Evolution Of Personal Computer Graphics”

Ray Casting 101 Makes Things Simple

[SSZCZEP] had a tough time understanding ray casting to create 3D-like scenes from a 2D map. So once he figured it out, he wrote a tutorial he hopes will be more accessible for those who may be struggling themselves.

If you’ve ever played Wolfenstein 3D you’ll have seen the technique, although it crops up all over the place. The tutorial borrows an animated graphic from [Lucas Vieira] that really shows off how it works in a simplified way. The explanation is pretty simple: from a point of view — that is, a camera or the eyeball of a player — you draw rays out until they strike something. The distance and angle tell you how to render the scene. Instead of a camera, you can also figure out how a ray of light will fall from a light source.
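For the code-minded, the whole idea fits in a few dozen lines. The sketch below (our own toy in Python, not taken from the tutorial) fires one ray per screen column across the field of view, marches each ray through a small grid map until it hits a wall, and turns the measured distance into a wall height, with a cosine term so flat walls don’t bulge in the middle of the view.

```python
import math

# Tiny grid map: '#' is a wall, '.' is open floor.
MAP = [
    "########",
    "#......#",
    "#..##..#",
    "#......#",
    "########",
]

SCREEN_W, SCREEN_H = 60, 20                    # "pixels" of ASCII output
FOV = math.pi / 3                              # 60 degree field of view
PLAYER_X, PLAYER_Y, PLAYER_A = 2.5, 1.5, 0.3   # position and heading on the map

def cast(angle, max_dist=16.0, step=0.02):
    """March along the ray until it enters a wall cell; return the distance."""
    dist = 0.0
    while dist < max_dist:
        x = PLAYER_X + math.cos(angle) * dist
        y = PLAYER_Y + math.sin(angle) * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def render():
    rows = [[" "] * SCREEN_W for _ in range(SCREEN_H)]
    for col in range(SCREEN_W):
        # One ray per screen column, fanned across the field of view.
        angle = PLAYER_A - FOV / 2 + FOV * col / SCREEN_W
        dist = cast(angle) * math.cos(angle - PLAYER_A)   # fisheye correction
        height = min(SCREEN_H, int(SCREEN_H / (dist + 1e-6)))
        top = (SCREEN_H - height) // 2
        for row in range(top, top + height):
            rows[row][col] = "#" if dist < 3 else "|"      # nearer walls drawn denser
    return "\n".join("".join(r) for r in rows)

print(render())
```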

There is a bit of math, but also some cool interactive demos to drive home the points. We wondered whether Demos 3 and 4 reminded anyone else of an obscure vector graphics video game from the 1970s. Most of the tutorial is pretty brute force, calculating points that you can know ahead of time won’t be useful. But if you stick with it, there are some concessions to optimization and pointers to more information.

Overall, a lot of good info and cool demos if this is your sort of thing. While it might not be the speediest, you can do ray tracing on our old friend the Arduino. Or, if you prefer, Excel.

The Faux-Vintage Becomes Vintage

For those who might have missed it, there was a brief period in the mid-00s when gamers everywhere eschewed consoles and PCs in favor of simple Flash-based games played in a browser. Among these was the game Peasant’s Quest, created by the folks at Homestar Runner and modeled after video games from the 80s. [deater] was a fan of this game and wondered if it would actually be possible to play this retro-styled game on actual retro hardware.

For the experiment he decided on using an Apple II, since this computer features rather often as a prop in games from the developers at Videlectrix. It turns out that with some determination it’s actually possible to run this game on the late-80s hardware with very few modifications. Squeezing the sprites into the required space was a challenge, as was getting the sound tracks to play properly, but in the end the game runs within the hardware’s 280×192 resolution with six colors. There are also detailed notes on how the Apple’s complicated graphics system works, for those willing to take a deep dive. There’s a lot going on here, but surprisingly few compromises needed to be made to get this to work.

The game itself is available on the project’s webpage for anyone who still has an Apple II kicking around, or for anyone who is willing to try it out in an emulator. Of course you could always play the original Flash version, but that’s missing a certain charm that decades-old retrocomputers bring to games. We certainly aren’t seeing video game controllers like those built for the Apple II anymore, for example.

Continue reading “The Faux-Vintage Becomes Vintage”

What Kind Of GPU Are You?

In the old days, big computers often had some form of external array processor. The idea was that you could load a bunch of numbers into the processor and then do math operations on all of them in parallel. These days, you are more likely to turn to your graphics card for number-crunching support. You’ll usually use some library to help you do that, but things are always better when you understand what’s going on under the hood. That’s why we enjoyed [RasterGrid’s] post on GPU architecture types.

If you can already tell the difference between IMR (immediate mode rendering) and TBR (tile-based rendering), this might not be the post for you. But while we knew the terms, we found a lot of interesting detail, including some graphics and pseudo code that clarified the key differences.
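As a rough mental model (ours, not [RasterGrid’s] pseudo code), the difference comes down to loop order: an IMR GPU pushes every draw call straight through to the full-size framebuffer in external memory, while a TBR GPU first bins primitives into screen tiles and then resolves one small on-chip tile at a time, writing each finished tile out exactly once. The runnable toy below counts framebuffer writes to make the bandwidth difference visible; all the names and numbers are illustrative.

```python
# A runnable toy contrasting the two loop orders. Primitives are filled
# rectangles with a depth value; the "framebuffer" is a plain 2D list
# standing in for slow external memory, and we count writes to it.
W, H, TILE = 8, 8, 4
DRAWS = [  # (x0, y0, x1, y1, depth, color) -- smaller depth = nearer
    (0, 0, 8, 8, 9, "."),   # background quad covering the screen
    (2, 2, 6, 6, 5, "A"),   # overlapping quad in front of it
    (3, 3, 7, 7, 1, "B"),   # and another in front of that
]

def pixels(prim):
    x0, y0, x1, y1, depth, color = prim
    for y in range(y0, y1):
        for x in range(x0, x1):
            yield x, y, depth, color

def immediate_mode():
    fb = [[(" ", float("inf"))] * W for _ in range(H)]
    writes = 0
    for prim in DRAWS:                      # IMR: draw-call order, whole screen
        for x, y, depth, color in pixels(prim):
            if depth < fb[y][x][1]:         # depth test against external memory
                fb[y][x] = (color, depth)
                writes += 1                 # every passing pixel hits the framebuffer
    return fb, writes

def tile_based():
    fb = [[(" ", float("inf"))] * W for _ in range(H)]
    writes = 0
    tiles = [(tx, ty) for ty in range(0, H, TILE) for tx in range(0, W, TILE)]
    # Pass 1: bin primitives by the tiles they touch (bounding-box test).
    bins = {t: [p for p in DRAWS
                if p[0] < t[0] + TILE and p[2] > t[0]
                and p[1] < t[1] + TILE and p[3] > t[1]] for t in tiles}
    # Pass 2: resolve each tile in fast on-chip memory, write it out once.
    for (tx, ty) in tiles:
        tile = [[(" ", float("inf"))] * TILE for _ in range(TILE)]
        for prim in bins[(tx, ty)]:
            for x, y, depth, color in pixels(prim):
                if tx <= x < tx + TILE and ty <= y < ty + TILE:
                    if depth < tile[y - ty][x - tx][1]:
                        tile[y - ty][x - tx] = (color, depth)
        for y in range(TILE):               # single burst write per finished tile
            for x in range(TILE):
                fb[ty + y][tx + x] = tile[y][x]
                writes += 1
    return fb, writes

for name, fn in [("IMR", immediate_mode), ("TBR", tile_based)]:
    fb, writes = fn()
    print(name, "framebuffer writes:", writes)
    print("\n".join("".join(px[0] for px in row) for row in fb))
```

Both paths produce the same picture, but the tile-based loop touches the framebuffer far fewer times, which is exactly why the approach is popular on bandwidth-starved mobile GPUs.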

Continue reading “What Kind Of GPU Are You?”

A Look At How Nintendo Mastered Dual Screens

When it was first announced, many people were skeptical of the Nintendo DS. Rather than pushing raw power, the unique dual-screen handheld was designed to explore new styles of play. Compared to more traditional handhelds like the Game Boy Advance (GBA) or even Sony’s PlayStation Portable (PSP), the DS seemed like a huge gamble for the Japanese gaming giant.

But it paid off. The Nintendo DS ended up being one of the most successful gaming platforms of all time, and as [Modern Vintage Gamer] explains in a recent video, at least part of that was due to its surprising graphical prowess. While it was technically inferior to the PSP in almost every way, Nintendo’s decades of experience in pushing the limits of 2D graphics allowed them to squeeze more out of the hardware than many would have thought possible.

On one level, the Nintendo DS could be seen as an upgraded GBA. Developers who were already used to the 2D capabilities of that system would feel right at home when they made the switch to the DS. As with previous 2D consoles, the DS had several screen modes, complete with hardware-accelerated support for moving, scaling, rotating, and reflecting up to four background layers. This made it easy and computationally cheap to pull off pseudo-3D effects, such as having multiple backdrop images scroll by at different speeds to convey a sense of depth.
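That parallax trick is simple enough to show in a few lines of code. The sketch below (ours, not [Modern Vintage Gamer]’s) scrolls three ASCII ‘background layers’ at different fractions of the camera speed; on the DS the same idea amounts to writing a different per-frame horizontal offset into each background layer’s scroll register and letting the hardware do the compositing.

```python
# Three repeating background strips, drawn far to near. Scrolling each
# one at a different fraction of the camera speed fakes depth, the kind
# of effect the DS's multiple hardware background layers make cheap.
LAYERS = [
    ("  ^       ^    ^     ", 0.25),   # distant mountains: scroll slowly
    ("   []   []    []  [] ", 0.5),    # mid-ground buildings
    ("_/\\__/\\__/\\__/\\__/\\__", 1.0),   # foreground: scroll at full speed
]
VIEW_W = 20

def frame(camera_x):
    """Compose one frame by sampling each layer at its own offset."""
    rows = []
    for pattern, factor in LAYERS:
        offset = int(camera_x * factor)
        row = "".join(pattern[(offset + i) % len(pattern)] for i in range(VIEW_W))
        rows.append(row)
    return "\n".join(rows)

for camera_x in range(0, 12, 4):       # move the "camera" to the right
    print(f"--- camera_x = {camera_x} ---")
    print(frame(camera_x))
```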

On top of its GBA-inherited tile and sprite 2D engine, the DS also featured a rudimentary GPU responsible for handling 3D geometry and rendering. Hardware-accelerated 3D could only be used on one screen at a time, which meant most games would keep the close-up view of the action on one display and use the second panel to show 2D imagery such as an overhead map. But developers did have the option of flipping between the displays on each frame to render 3D on both panels at a reduced frame rate. The hardware could also handle shadows and included integrated support for cel shading, which was a particularly popular graphical effect at the time.

By combining the 2D and 3D hardware capabilities of the Nintendo DS on a single screen, developers could produce complex graphical effects. [Modern Vintage Gamer] uses the example of New Super Mario Bros, which places a detailed 3D model of Mario over several layers of moving 2D bitmaps. Ultimately the 3D capabilities of the DS were hindered by the limited resolution of its 256 × 192 LCD panels, but considering most people were still using flip phones when the DS came out, it was impressive for the time.

Compared to the Game Boy Advance, or even the original “brick” Game Boy, it doesn’t seem like hackers have had much luck coming up with ways to exploit the capabilities of the Nintendo DS. But perhaps with more detailed retrospectives like this, the community will be inspired to take another look at this unique entry in gaming history.

Continue reading “A Look At How Nintendo Mastered Dual Screens”


Hackaday Links: January 24, 2021

Code can be beautiful, and good code can be a work of art. As it so happens, artful code can also result in art, if you know what you’re doing. That’s the idea behind Programming Posters, a project that [Michael Fields] undertook to meld computer graphics with the code behind the images. It starts with a simple C program to generate an image. The program needs to be short enough to fit legibly into the sidebar of an A2 sheet, and as if that weren’t enough of a challenge, [Michael] constrained himself to the standard C libraries to generate his graphics. A second program formats the code and the image together and prints out a copy suitable for display. We found the combination of code and art beautiful, and the challenge intriguing.
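In a similar spirit, though in Python rather than [Michael]’s C, and sticking to nothing but the standard library, here’s the sort of handful-of-lines program that can fill a poster: it evaluates a little interference pattern for every pixel and writes the result out as a binary PPM, a format plain enough to emit by hand and readable by most image tools. (Entirely our own toy, not one of the Programming Posters.)

```python
import math

# A poster-sized image from nothing but the standard library: compute a
# value per pixel and write a binary PPM file.
W, H = 800, 800

def pixel(x, y):
    """Interference of a few sine waves, mapped to an RGB triple."""
    u, v = x / W * 8 * math.pi, y / H * 8 * math.pi
    r = 0.5 + 0.5 * math.sin(u + math.cos(v))
    g = 0.5 + 0.5 * math.sin(math.hypot(u - 12, v - 12))
    b = 0.5 + 0.5 * math.cos(v - math.sin(u))
    return bytes(int(255 * c) for c in (r, g, b))

with open("poster.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (W, H))   # PPM header: magic, size, max value
    for y in range(H):
        for x in range(W):
            f.write(pixel(x, y))
```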

It always warms our hearts when we get positive feedback from the hacker community that something we’ve written has helped advance a project or inspire a build. It’s not often, however, that we learn that Hackaday is required reading. Educators at the Magellan International School in Austin, Texas, recently reached out to Managing Editor Elliot Williams to let him know that all their middle school students are required to read Hackaday as part of their STEM training. Looks like the kids are paying attention to what they read, too, judging by KittyWumpus, their ongoing mechatronics/coding project that’s unbearably adorable. We’re honored to be included in their education, and everyone in the Hackaday community should be humbled to realize that we’ve got an amazing platform for inspiring the next generation of hardware hackers.

Hackers seem to fall into two broad categories: those who have built a CNC router, and those who want to build one. For those in the latter camp, the roadblock to starting a CNC build is often “analysis paralysis” — with so many choices to make, it’s hard to know where to start. To ease that pain and get you closer to starting your build, [Matt Ferraro] has penned a great guide to planning a CNC router build. The encyclopedic guide covers everything from frame material choice to spindle selection and software options. If [Matt] has a bias toward any particular options it’s hard to find; he lists the pros and cons of everything so you can make up your own mind. Read it at your own risk, though; while it lowers one hurdle to starting a CNC build, it does nothing to address the next one: financing.

Like pretty much every conference last year and probably every one this year, the Open Hardware Summit is going to be virtual. But they’re still looking for speakers for the April conference, and just issued a Call for Proposals. We love it when we see people from the Hackaday community pop up as speakers at conferences like these, so if you’ve got something to say to the open hardware world, get a talk together. Proposals are due by February 11, so get moving.

And finally, everyone will no doubt recall the Boston Dynamics robots that made a splash a few weeks back with their dance floor moves. We loved the video, mainly for the incredible display of robotic agility and control but also for the choice of music. We suppose it was inevitable, though, that someone would object to the Boomer music and replace it with something else, like in the video below, which seems to sum up the feelings of those who dread our future dancing overlords. We regret the need to proffer a Tumblr link, but the Internet is a dark and wild place sometimes, and only the brave survive.

https://commiemartyrshighschool.tumblr.com/post/640760882224414720/i-fixed-the-audio-for-that-boston-dynamics-video