One of [aepharta]’s ‘before I die’ projects is a homebrew computer. Not just any computer, mind you, but a fabulous Z80 machine, complete with video out. HDMI and DisplayPort would require far too much of this tiny, 80s-era computer, and it’s getting hard to buy a composite monitor. This meant it was time to build a VGA video card from some parts salvaged from old equipment.
When it comes to ancient computers, VGA has fairly demanding requirements; the slowest standard pixel clock is 25.175 MHz, an order of magnitude faster than the CPU clock in early 80s computers. Memory is also an issue, with a 640×480, 16-color image requiring 153,600 bytes, or about a quarter of the 640k ‘that should be enough for anybody.’
To cut down on the memory requirements and keep everything a nice, round, base-2 number, [aepharta] decided on a resolution of 512×384. This means about 100k of memory would be required for 16 colors, and only about 24 kB for monochrome.
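For anyone checking the math at home, a framebuffer’s size is just width × height × bits per pixel, divided by eight. A quick sketch (our arithmetic, nothing from [aepharta]’s actual design) bears the numbers out:

```cpp
#include <cstdio>

// Framebuffer size in bytes for a given resolution and color depth.
constexpr unsigned framebufferBytes(unsigned width, unsigned height, unsigned bitsPerPixel) {
    return width * height * bitsPerPixel / 8;
}

int main() {
    // VGA's lowest standard mode, 16 colors = 4 bits per pixel
    std::printf("640x480, 16 colors:  %u bytes\n", framebufferBytes(640, 480, 4)); // 153600
    // [aepharta]'s power-of-two mode
    std::printf("512x384, 16 colors:  %u bytes\n", framebufferBytes(512, 384, 4)); // 98304, ~96 kB
    std::printf("512x384, monochrome: %u bytes\n", framebufferBytes(512, 384, 1)); // 24576, 24 kB
}
```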
The circuit was built from some old programmable logic ICs pulled from a Cisco router. The circuit could have been built from discrete logic chips, but this was much, much simpler. Wiring everything up, [aepharta] got the timing right and was eventually able to put an image on a screen.
After a few minutes, though, the image started wobbling. [aepharta] put his finger on one of the GALs and noticed it was exceptionally hot. A heatsink stopped the wobbling for a few minutes, and a fan stopped it completely. Yes, it’s a 1980s-era graphics card that requires a fan. The card draws about 3W, or about two percent of a modern, high-end graphics card.
Nearly a year ago, an extremely interesting project hit Kickstarter: an open source GPU, written for an FPGA. For reasons that are obvious in retrospect, the GPL-GPU Kickstarter was not funded, but that doesn’t mean these developers don’t believe in what they’re doing. The first version of this open source graphics processor has now been released, giving anyone with an interest a look at what a late-90s era GPU looks like on the inside. If you’re cool enough, there’s also enough supporting documentation to build your own.
A quick note for the PC Master Race: this thing might run Quake eventually. It’s not a powerhouse. That said, [Bunnie] had a hard time finding an open source GPU for the Novena laptop, and the drivers for the VideoCore IV in the Raspi have only recently been open sourced. A completely open GPU simply doesn’t exist, and short of a few very, very limited thesis projects there hasn’t been anything like this before.
Right now, the GPL-GPU has 3D graphics acceleration working with VGA on a PCI bus. The plan is to update this late-90s setup to interfaces that make a little more sense, and add DVI and HDMI output. Not bad for a failed Kickstarter, right?
Hold on to your hats, because this is a good one. It’s a tale of disregarding the laws of physics, cancelled crowdfunding campaigns, and a menagerie of blogs that take press releases at face value.
Meet Silent Power (Google translation). It’s a remarkably small and fairly powerful miniature gaming computer being put together by a team in Germany. The specs are pretty good for a completely custom computer: an i7 4785T, GTX 760, 8GB of RAM and a 500GB SSD. Not a terrible machine for something that will eventually sell for about $930 USD, but what really puts this project in the limelight is the innovative cooling system and small size. The entire machine is only 16x10x7 cm, accented with a very interesting “copper foam” heat sink on top. Sounds pretty cool, huh? It does, until you start to think about the implementation a bit. Then it’s a descent into madness and a dark pit of despair.
There are a lot of things that are completely wrong with this project, and in true Hackaday fashion, we’re going to tear this one apart, figuring out why this project will never exist.
[Gpuhackr] chose his username to explain exactly how he spends his time. For instance, here he’s using an STM32 Discovery board to drive an AMD Radeon HD 2400 graphics card. The ARM microcontroller isn’t actually using the PCIe interface on the card. Instead, [Gpuhackr] has patched into the debugging interface built into the card itself. This isn’t quite as straightforward as it sounds, but if you do the wiring carefully it’s a pretty interesting way to connect an ARM to an LCD monitor.
This project would be almost impossible if it weren’t for the open source code which AMD has released. This lets him implement the card’s 3D rendering features. The demo directly programs the UVD Xtensa CPU which is on the video card. It draws a cube with color gradients on each side. The cube spins while the debug information is overlaid on the screen. In this case the ARM chip/board is really being used as a programmer to upload some custom firmware. But we think a real code-ninja could implement a communications protocol to open up a simple way to drive the card in real-time.
Upgrading a desktop with a diamond cutting wheel
[Michail] needed a new graphics card. The only problem was his motherboard didn’t have any free PCI-E x16 slots available. Unable to find a PCI-E x1 card, he did what any of us would do and broke out the Dremel. Yes, he got it working, but don’t do this unless you know what you’re doing.
[Steve] recently got a Galaxy S3 and was looking for something to do with his old phone. It’s got WiFi, it’s got a camera, and with a free app, [Steve] now has an IP Webcam. Neat way to recycle a phone.
This is now bookmarked
We’re not much for plugging other blogs, but Math ∩ Programming – that’s intersection, remember – is really cool. Apparently it has been around for a little more than a year, and already there are quite a few really cool posts. How to use cellular automata to generate caves in video games and facial recognition through eigenvalues are amazingly in depth and show the theory behind some really cool techniques. Very, very cool.
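If you want the flavor of that cave-generation post without clicking through, the core trick is a two-state cellular automaton: seed a grid with random walls, then run a few smoothing passes in which any cell with five or more wall neighbors becomes a wall. Here’s a minimal sketch of that rule (our code, not the blog’s):

```cpp
#include <cstdio>
#include <cstdlib>
#include <ctime>
#include <vector>

using Grid = std::vector<std::vector<bool>>; // true = wall

// One smoothing pass: a cell becomes a wall if five or more of its eight
// neighbors are walls (out-of-bounds counts as wall).
Grid smooth(const Grid& g) {
    int h = g.size(), w = g[0].size();
    Grid next = g;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            int walls = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    if (dx == 0 && dy == 0) continue;
                    int ny = y + dy, nx = x + dx;
                    if (ny < 0 || ny >= h || nx < 0 || nx >= w || g[ny][nx])
                        ++walls;
                }
            next[y][x] = walls >= 5;
        }
    return next;
}

int main() {
    const int W = 60, H = 24;
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    Grid cave(H, std::vector<bool>(W));
    for (auto& row : cave)                      // seed roughly 45% walls
        for (std::size_t x = 0; x < row.size(); ++x)
            row[x] = std::rand() % 100 < 45;
    for (int i = 0; i < 5; ++i)                 // a few passes turn noise
        cave = smooth(cave);                    // into connected caverns
    for (const auto& row : cave) {
        for (std::size_t x = 0; x < row.size(); ++x)
            std::putchar(row[x] ? '#' : '.');
        std::putchar('\n');
    }
}
```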
Troll Physics: now wireless!
Remember [Fredzislaw100], the guy who puzzled the Internet with impossible circuits? He’s back again, this time with wireless LEDs. We’re guessing something similar to an induction charging system in the battery clip, wirelessly coupled to something under the paper, and that is wirelessly coupled to the LEDs. Your guess will probably be better than ours, though.
Not shown: Captain Obvious, Major Major
Pv2 [Zachary Ricks] of the U.S. Army thought we would get a kick out of the last name of one of the guys in his company. Yes, it’s ‘Hackaday,’ and yes, it’s a real surname. Here’s the full pic [Zach] sent in. Apparently it’s a name along the lines of ‘Holiday.’ Honestly, we had no idea this was a real surname, but we’re thinking Private Hackaday could use a care package or two (dozen).
Anyone up for sending a few hacker friendly (for [Zach] and a few other guys) care packages? Even socks or books or Oreos would make for an awesome care package. Email me if you want the mailing address.
Even though NVidia and ATI have been open-source friendly for a while now, there still isn’t a true open-source graphics card. [Anton] and [Per] are trying to fix that by building their own graphics card around an FPGA. The project is called ORSoC, and it’s available on opencores.com.
The guys are building the ORSoC graphics card around a Digilent Atlys FPGA dev board. So far, they can draw lines, textured triangles, bitmap or vector fonts, and throw a few 3D meshes up on the screen. This project isn’t intended to run advanced OpenGL or Steam on Linux, but for all the work that has gone into this graphics accelerator, it’s an amazing piece of work.
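To put ‘draw lines’ in perspective: the textbook way to rasterize a line with nothing but integer adds and compares is Bresenham’s algorithm, exactly the sort of primitive that maps nicely onto FPGA logic. A software sketch of the idea follows; to be clear, this is purely illustrative and isn’t taken from the ORSoC sources:

```cpp
#include <cstdio>
#include <cstdlib>

const int W = 40, H = 16;
char fb[H][W]; // a tiny character 'framebuffer'

// Bresenham's line algorithm: walk from (x0,y0) to (x1,y1), accumulating
// an integer error term to decide when to step in x, y, or both.
void drawLine(int x0, int y0, int x1, int y1) {
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        fb[y0][x0] = '#';
        if (x0 == x1 && y0 == y1) break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}

int main() {
    for (auto& row : fb)
        for (char& c : row) c = '.';
    drawLine(1, 1, 38, 14);  // one diagonal
    drawLine(38, 1, 1, 14);  // and its mirror
    for (auto& row : fb) {
        std::fwrite(row, 1, W, stdout);
        std::putchar('\n');
    }
}
```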
There are a few demos after the break: a cube rotating in 3D, and a demo drawing and translating polygons along with a few textures. The ORSoC is a bit slow, but that’s an artifact of the build not being optimized for the FPGA the team is using. If you’d like to test this graphics card, there’s a Git repository available. As a bonus, you don’t even need an FPGA to play around with this project; there’s also a software emulation of all the functions. Very neat.
[Taylor] popped a new graphics card into his computer, but before he could settle in for a round of gaming, his card started to overheat. He eventually tracked the problem down to an undersized power supply, but the prospect of cooking his new GPU to death made him think twice about how he was monitoring his system’s health.
To continually keep tabs on his video card’s temperature going forward, he put together a small circuit that will alert him if things start to get too hot. He mounted a small temperature sensor on his graphics card near the GPU, wiring it to an Arduino. The Arduino monitors his video card, lighting an RGB LED blue if conditions are alright. If the temperature rises above 50C, the LED changes to red, signaling a problem.
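The write-up covers the hardware, but the firmware side only takes a few lines. Here’s a minimal reconstruction of the idea, assuming a TMP36-style analog sensor on A0 and an RGB LED on two digital pins; the pin numbers and sensor choice are our guesses, not [Taylor]’s actual parts list:

```cpp
// Hypothetical sketch of [Taylor]'s monitor -- sensor and pins are assumptions.
const int SENSOR_PIN = A0;   // analog temperature sensor mounted near the GPU
const int RED_PIN    = 9;    // red leg of the RGB LED
const int BLUE_PIN   = 10;   // blue leg of the RGB LED
const float LIMIT_C  = 50.0; // alarm threshold from the write-up

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  // TMP36: 500 mV offset, 10 mV per degree C, read against a 5 V reference
  float millivolts = analogRead(SENSOR_PIN) * (5000.0 / 1023.0);
  float celsius = (millivolts - 500.0) / 10.0;

  bool tooHot = celsius > LIMIT_C;
  digitalWrite(RED_PIN,  tooHot ? HIGH : LOW); // red means trouble
  digitalWrite(BLUE_PIN, tooHot ? LOW : HIGH); // blue means all clear
  delay(500);
}
```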
We’re aware that there are all sorts of software applications that can monitor component temperatures for you, but the appeal of [Taylor’s] system is that it can be easily seen from across the room rather than via the desktop. That said, we think that his system could take advantage of his PC’s case fan lighting for a more visible warning, and it wouldn’t hurt to wire in an auto-shutdown feature in case his computer overheats while he’s away.