Google Ups The Ante In Quantum Computing

At the American Physical Society conference in early March, Google announced that their Bristlecone chip was in testing. This is their latest quantum computer chip, which ups the game from 9 qubits in their previous test chip to 72, quite the leap. This also trounces IBM and Intel, whose devices have 50 and 49 qubits respectively. You can read more technical details on the Google Research Blog.

It turns out that the raw number of qubits isn't the whole story, though. Qubits that stay coherent longer are important, and low-noise qubits matter because the noisier the qubits, the more redundant qubits you will need to get a reliable answer. That's fine, but it leaves fewer qubits for working on your actual problem.

Continue reading “Google Ups The Ante In Quantum Computing”

Images As Excel Files Are Gloriously Nasty

Almost every person of a technical persuasion who has worked in an office will have some tale of wildly inappropriate use of office technology for a task that could have been accomplished far more simply with an appropriate tool. There are jokes about people photocopying a blank sheet of paper when they need a few sheets themselves, but some of the real stories are every bit as surreal.

[Bjonnh]’s patience for such things was exceeded when he received a screenshot embedded in a Microsoft Word file. His response is both pointless and elegant, a Python script that takes a JPEG image and encodes it into an Excel file. It’s simply an array of cells whose background colours represent the pixels, and he warns us that the output files may take a while to load. We just had to subject it to a test, but are sorry to report that LibreOffice doesn’t seem to want to play ball.
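The idea is straightforward to sketch. This stdlib-only toy (not [Bjonnh]'s actual script, which targets JPEG input and .xlsx output) turns a grid of RGB pixels into an Excel 2003 SpreadsheetML file whose cell backgrounds carry the pixel colours; a real image would first be decoded and downscaled with a library such as Pillow.

```python
# Encode a pixel grid as an Excel 2003 SpreadsheetML (.xml) file:
# one named style per distinct colour, one solid-filled cell per pixel.
def pixels_to_spreadsheetml(pixels):
    """pixels: list of rows, each row a list of (r, g, b) tuples."""
    colours = sorted({px for row in pixels for px in row})
    style_id = {c: f"c{i}" for i, c in enumerate(colours)}
    styles = "".join(
        f'<Style ss:ID="{style_id[c]}">'
        f'<Interior ss:Color="#{c[0]:02X}{c[1]:02X}{c[2]:02X}" ss:Pattern="Solid"/>'
        "</Style>"
        for c in colours
    )
    rows = "".join(
        "<Row>"
        + "".join(f'<Cell ss:StyleID="{style_id[px]}"/>' for px in row)
        + "</Row>"
        for row in pixels
    )
    return (
        '<?xml version="1.0"?>'
        '<Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet" '
        'xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">'
        f"<Styles>{styles}</Styles>"
        f'<Worksheet ss:Name="Image"><Table>{rows}</Table></Worksheet>'
        "</Workbook>"
    )

# A 2x2 "image": red, green / blue, white.
doc = pixels_to_spreadsheetml([[(255, 0, 0), (0, 255, 0)],
                               [(0, 0, 255), (255, 255, 255)]])
```

Since every pixel becomes a styled cell, even a modest photo produces tens of thousands of XML elements, which is exactly why the output files take a while to load.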

So yes, this is a small departure from our usual fare of hardware, and it serves no use other than to be a fantastically awful misuse of office technology. If you’ve ever been emailed a PowerPoint invitation to the office party though, then maybe you’ll have cracked a smile.

If pushing your corporate spreadsheet to the limit is your thing, perhaps you’d also like to see it running a 3D engine.

Microsoft Quantum Simulator Goes To Linux And Mac

Everyone seems to be gearing up for the race to be the king of quantum computers. The latest salvo comes from Microsoft: they have announced that their quantum simulator will now run on macOS and Linux, with the associated libraries and examples fully open source. They have produced a video about the new release, which you can see below.

Microsoft also claims that their simulator is much faster than before, especially on large simulations. Of course, really large simulations suffer from memory problems, not speed problems. You can run their simulator locally or on their Azure cloud.
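To see why memory is the wall, note that a full state-vector simulation of n qubits must store 2^n complex amplitudes, so memory doubles with every qubit added. A quick back-of-the-envelope calculation (assuming 16 bytes per amplitude, a common choice for double-precision complex numbers):

```python
# Memory needed for a full state-vector simulation of n qubits:
# 2**n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (20, 30, 40, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits: {gib:,.2f} GiB")
# 50 qubits works out to 16,777,216 GiB, i.e. 16 PiB --
# far beyond any single machine, cloud-hosted or not.
```

Adding one more qubit doubles the requirement, which is why simulators top out around the same qubit counts regardless of how fast they are.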

Continue reading “Microsoft Quantum Simulator Goes To Linux And Mac”

Stretched PC Case Turned GPU Cryptominer

We don’t do financial planning here at Hackaday, so we won’t weigh in on the viability of making money mining cryptocurrency in such a volatile market. But we will say that if you’re going to build a machine to hammer away at generating Magical Internet Monies, you might as well make it cool. Even if you don’t turn a profit, at least you’ll have something interesting to look at while you weep over your electricity bill.

Sick of seeing the desktop machine he built a decade ago gathering dust, [plaggle24w5] decided to use it as the base for a cryptocurrency mining rig. Of course, none of the original internals would do him any good, but the case itself ended up being a useful base to expand on. With the addition of some 3D printed components, he stretched out the case and installed an array of video cards.

To start with, all the original plastic was ripped off, leaving just the bare steel case. He then jammed a second power supply into the original optical drive bays to provide the extra power those thirsty GPUs would soon be sucking down. Next, he designed 3D printed arms to push the side panel of the case out far enough that he could mount the video cards vertically alongside it. Finally, three case fans were added to the bottom to blow air through the cards.

While [plaggle24w5] mentions this arrangement does work with the case standing up, there’s obviously not a lot of air getting to the fans on the bottom when they’re only an inch or so off the ground. Turning the case on its side, with the motherboard parallel to the floor, allows for much better airflow and results in a measurable dip in operating temperature. Just hope you never drop anything down onto the exposed motherboard…

Mining Bitcoin on desktop computers might be a distant memory, but the latest crop of cryptocurrencies are (for now) giving new players a chance to relive those heady early days.

“The Commodore Story” Documentary Premieres Today

What is it about a computer that was introduced 36 years ago by a company that would be defunct 12 years later that engenders such passion that people still collect it to this day? We’re talking about the Commodore 64, of course, the iconic 8-bit wonder that along with the other offerings from Commodore International served as the first real computer to millions of us.

There’s more to the passion that Commodore aficionados exhibit than just plain nostalgia, though, and a new documentary film, The Commodore Story, seeks to explore both the meteoric rise and fall of Commodore International. Judging from the official trailer below, this is a film anyone with the slightest interest in Commodore is not going to want to miss.

It will of course dive into the story of how the C64 came to be the best selling computer in history. But Commodore was far from a one-trick pony. The film traces the history of all the Commodore machines, from the PET computers right through to the Amiga. There are interviews with the key players, too, including our own Bil Herd. Bil was a hardware engineer at Commodore, designing several machines while there. He has shared some of these stories here on Hackaday, including the development of the C128 (successor to the C64) and making the C64 speak.

We can't wait to watch this new documentary. It's set to start streaming on Netflix, Amazon, and iTunes, although we're unable to get firm dates on when, so keep some popcorn handy for a two-hour ride through computer history. Those of you in the Mountain View area, however, have an even better opportunity this evening.

The Commodore Story will be premiered live at 6:30pm PST at the Computer History Museum. Grab your tickets to the premiere and a Q&A session with Bil Herd, Leonard Tramiel, and Hedley Davis.

Continue reading ““The Commodore Story” Documentary Premieres Today”

A Two Tapes Turing Machine

Though, as with so many inventions, the origins of computing lie in the work of many people, Alan Turing is certainly one of the foundational figures in computer science. His Turing machine was a thought-experiment computing device in which a program performs operations upon symbols printed on an infinite strip of tape, and can in theory calculate anything that any computer can.
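The model is simple enough to capture in a few lines. This toy single-tape machine (a generic illustration, not [Olivier]'s two-tape hardware design) expresses its program as a transition table and increments a binary number on the tape:

```python
# Minimal Turing machine: the program maps (state, symbol) ->
# (symbol to write, head move, next state). The tape is a dict
# so it can grow in either direction, emulating an infinite strip.
def run(tape, program, state="right", blank="_"):
    tape = dict(enumerate(tape))
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        write, move, state = program[(state, symbol)]
        tape[pos] = write
        pos += {"R": 1, "L": -1, "N": 0}[move]
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Binary increment: scan right to the end of the number,
# then walk back left, flipping 1s to 0s until the carry lands.
increment = {
    ("right", "0"): ("0", "R", "right"),
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "N", "halt"),
    ("carry", "_"): ("1", "N", "halt"),
}

print(run("1011", increment))  # prints "1100" (11 + 1 = 12)
```

Everything interesting lives in the transition table; the machine itself just reads, writes, and moves, which is the whole point of Turing's abstraction.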

In practice, we do not use Turing machines as our everyday computing platforms. A machine conceived as an abstract academic exercise is not designed for efficiency. But that hasn't stopped [Olivier Bailleux] from building one using readily available electronic components. His twin-tape Turing machine is presented on a large PCB, and is shown in the video below the break computing the first few numbers of the Fibonacci sequence.

The schematic is available as a PDF, and consists mostly of 74-series logic chips, with the tape contents displayed as two rows of LEDs. The program is expressed as a pluggable diode matrix, but in a particularly neat touch he has used LEDs instead of traditional diodes, allowing us to see each instruction as it is accessed. The whole is a fascinating item for anyone wishing to learn about Turing machines, though we wish [Olivier] had given us a little more information in his write-up.

That fascination with Turing machines has manifested itself in numerous builds here over the years. Just a small selection are one using 3D printing, another using Lego, and a third using ball bearings. And of course, if you’d like instant gratification, take a look at the one Google put in one of their doodles for Turing’s 100th anniversary.

Continue reading “A Two Tapes Turing Machine”

Whatever Happened To The Desktop Computer?

If you buy a computer today, you’re probably going to end up with a laptop. Corporate drones have towers stuffed under their desks. The cool creative types have iMacs littering their open-plan offices. Look around on the online catalogs of any computer manufacturer, and you’ll see there are exactly three styles of computer: laptops, towers, and all-in-ones. A quick perusal of Newegg reveals an immense variety of towers; you can buy an ATX full tower, an ATX mid-tower, micro-ATX towers, and even Mini-ITX towers.

It wasn't always this way. Nerds of a sufficient vintage will remember the desktop computer. This was, effectively, a tower tilted on its side. You could put your monitor on top, negating the need for a stack of textbooks to bring your display up to eye level. The ports, your CD drive, and even your fancy Zip drive were right there in front of you. Now those days are long gone, and the horizontal desktop computer has been relegated to history. What happened to the desktop computer, and why is a case specifically designed for a horizontal orientation so hard to find?

Continue reading “Whatever Happened To The Desktop Computer?”