Modular Pockit Computer Is More Than Meets The Eye

“Modular” and “Computer” have historically sat at opposite ends of a rather awkward spectrum. One could argue that a hobbyist-grade PC is modular, but only to a point. Re-configuring it on the fly is not readily possible. Modular laptops are slowly happening, but what about handheld devices, where our needs might change on a regular basis?

Enter the Pockit: a fully modular IoT/edge computing device that can be reconfigured on the fly without having to reprogram it. Don’t browse away from this page without watching the demonstration video below the break. It just might be the “mother of all demos” for the current decade.

Like many other projects, a modular base provides the basic computing power, in this case courtesy of a Raspberry Pi. The base has twelve magnetic connectors, each with twenty I/O and power pins. When a module is added, the operating system detects it and loads an appropriate program on the fly. When more modules are added, the system automatically reconfigures itself so that every module has a purpose. This allows the Pockit to be an integrated IoT device, an edge computing powerhouse, a desktop computer, a Blackberry-esque handheld, a touch screen tablet, and much more besides.

For example, if a camera is added, it displays an image on a screen — if there’s a screen. If a button is added, it automatically takes a picture when the button is pressed. If you want the camera to be motion activated, just add a motion sensor. Done. External devices can be controlled with relays, and home automation integrates almost seamlessly.
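
We can only guess at how the Pockit firmware is organized internally, but the behavior shown in the demo (detect a new module, then wire it into whatever else is attached) maps naturally onto a rule-based event loop. Here's a minimal, purely hypothetical sketch of that idea in Python; the module names, detection hook, and rules are our own illustrative assumptions, not Pockit's actual code.

```python
# Hypothetical sketch of a Pockit-style auto-configuration loop.
# None of this is Pockit's real API; module names, events, and rules are
# invented purely to illustrate the "snap on a module, get a behavior" idea.

attached = set()   # modules currently detected on the magnetic connectors
activated = set()  # behaviors that have already been switched on
rules = []         # (required module set, behavior) pairs

def rule(*required):
    """Register a behavior that activates once all required modules are attached."""
    def register(func):
        rules.append((set(required), func))
        return func
    return register

@rule("camera", "screen")
def show_preview():
    print("Streaming the camera preview to the screen")

@rule("camera", "button")
def snap_on_press():
    print("Button press now takes a photo")

@rule("camera", "motion_sensor")
def motion_capture():
    print("Motion sensor now triggers the camera")

def on_module_attached(name):
    """Would be called by the hot-plug detector when a module snaps onto the base."""
    attached.add(name)
    for required, behavior in rules:
        if required <= attached and behavior not in activated:
            activated.add(behavior)
            behavior()

# Simulate snapping modules onto the base one at a time:
for module in ("screen", "camera", "button", "motion_sensor"):
    on_module_attached(module)
```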

There are a great number of features that we’re glossing over for the sake of getting to the point: Go watch the video and when you’re done, perhaps you’ll be as astonished as we are. We’ve expressed our love of modular hardware like the Pockit in the past, and after watching this demo, we can only hope that this is what the future of computing and electronics looks like!

Continue reading “Modular Pockit Computer Is More Than Meets The Eye”

Top side of the VL670 breakout board, with two USB connectors and the VL670 chip in the center.

A Chip To Bridge The USB 2 – USB 3 Divide

On Twitter, [whitequark] has found and highlighted an intriguing design – a breakout board for the VL670, accompanied by an extensive yet very easy to digest write-up about its usefulness and inner workings. The VL670 is a chip that addresses a surprising problem – converting USB 2.0 signals into USB 3.0.

If you have a USB 2.0 device and a host with only USB 3.0 signals available, this chip is for you. It might be puzzling – why is this even needed? It’s about the little-known dark secret of USB 3, one that anyone can deduce if they ever have to deal with a 9-pin USB 3.0 connector where one of the three differential pairs doesn’t quite make contact.

When you see a blue “3.0” port, it’s actually USB 2 and USB 3 — two separate interfaces joined into a single connector. USB 3 uses two unidirectional differential pairs, akin to PCI-E, whereas USB 2 uses a single bidirectional one, and the two interfaces on a blue connector operate basically independently of each other. There are many implications to this that are counterintuitive if you simply take “USB 3.0” for “faster backwards-compatible USB”, and they have painful consequences.
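
One practical upshot: a USB 2.0 device plugged into a blue port still enumerates on the USB 2 side, at 480 Mbps at best, no matter how fast the port claims to be. On a Linux host you can see which side each device actually landed on by reading its negotiated speed from sysfs. A quick sketch follows; the sysfs layout is standard Linux, but treat the snippet as an illustration rather than a polished tool.

```python
# List connected USB devices with the link speed each one actually negotiated.
# On Linux, every device under /sys/bus/usb/devices/ exposes a "speed" file:
# 1.5, 12, and 480 mean the device is on the USB 2 side; 5000 and up, the USB 3 side.
from pathlib import Path

for dev in sorted(Path("/sys/bus/usb/devices").iterdir()):
    speed_file = dev / "speed"
    if not speed_file.exists():
        continue  # interface entries have no speed attribute
    speed = float(speed_file.read_text().strip())
    product_file = dev / "product"
    name = product_file.read_text().strip() if product_file.exists() else dev.name
    side = "USB 2 side" if speed <= 480 else "USB 3 side"
    print(f"{name}: {speed:g} Mb/s ({side})")
```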

For instance, USB 3 hub ICs have two separate hub entities inside – one for USB 3 and one for USB 2. Even if you have a USB 3 hub plugged into a USB 3 port, multiple USB 2 devices plugged into it still cannot break through the USB 2 uplink limit of 480 Mbps. If you ever thought that a faster hub with a faster uplink would fix your USB 2 device speed problems – USB-IF engineers, apparently, thought differently; and you might have to find a workaround for your “many cheap SDRs and Pi 4 in a box” setup.

Continue reading “A Chip To Bridge The USB 2 – USB 3 Divide”

Are Apple Trying To Patent The Home Computer 45 Years Too Late?

In our recent piece marking the 10th anniversary of the Raspberry Pi, we praised their all-in-one Raspberry Pi 400 computer for having so far attracted no competing products. It seems that assessment might be premature, because it emerges that Apple have filed a patent application for “A computer in an input device” that looks very much like the Pi 400. In fact we’d go further than that: it looks very much like any of a number of classic home computers from back in the day, to the extent that we’re left wondering what exactly Apple think is novel enough to patent.

A Raspberry Pi 400 all-in-one keyboard console computer
Looks pretty similar to us.

Reading the patent, it appears to be a transparent catch-all for all-in-one computers, with the possible exception of “A singular input/output port”, meaning that the only port on the device would be a single USB-C port that could take power, communicate with peripherals, and drive the display. Either way, this seems an extremely weak claim of novelty, if only because we think that a few of the more recent Android phones with keyboards might constitute prior art.

We’re sure that Apple’s lawyers will have their arguments at the ready, but we can’t help wondering whether they’ve fallen for the old joke about Apple fanboys claiming the company invented something when in fact they’ve finally adopted it years after the competition.

Thinking back to the glory days of 8-bit computers for a moment, we’re curious which was the first to sport a form factor little larger than its keyboard. Apple’s own Apple ][ wouldn’t count because the bulk of the machine is behind the keyboard, but for example machines such as Commodore’s VIC-20 or Sinclair’s ZX Spectrum could be said to be all-in-one keyboard computers. Can anyone provide an all-in-one model that predates those two?

You can read our Raspberry Pi 400 review if the all-in-one interests you.

Via Extreme Tech.


A purple 3D-printed case with an LCD screen on the front and Pikachu on top

Avoid Repetitive Strain Injury With Machine Learning – And Pikachu

The humble mouse has been an essential part of the desktop computing experience ever since the original Apple Macintosh popularized it in 1984. While mice enabled user-friendly GUIs, thus making computers accessible to more people than ever, they also caused a significant increase in repetitive strain injuries (RSI). Mainly caused by poor posture and stress, RSI can lead to pain, numbness and tingling sensations in the hand and arm, which the user might only notice when it’s too late.

Hoping to catch signs of RSI before it manifests itself, [kutluhan_aktar] built a device that allows him to track mouse fatigue. It does so through two sensors: one that measures galvanic skin response (GSR) and another that performs electromyography (EMG). Together, these two measurements should give an indication of the amount of muscle soreness. The sensor readout circuits are connected to a Wio Terminal, a small ARM Cortex-M4 development board with a 2.4″ LCD.

However, calculating muscle soreness is not as simple as just adding a few numbers together; in fact, the link between the sensor data and the muscles’ state of health is complicated enough that [kutluhan] decided to train a TensorFlow artificial neural network (ANN), taking into account observed stress levels collected in real life. The network ran on the Wio while he used the mouse, pressing buttons to indicate the amount of stress he experienced. After a few rounds of training, he ended up with a network that reached an accuracy of more than 80%.
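
For anyone curious what that sort of model looks like, a tiny dense network over the two sensor channels gets the idea across. The sketch below is our own illustration rather than [kutluhan]'s actual code: the feature layout, labels, layer sizes, and the TensorFlow Lite conversion step are all assumptions, and his project write-up has the real thing.

```python
# Minimal sketch of a stress classifier over GSR + EMG readings (illustrative only).
import numpy as np
import tensorflow as tf

# Stand-in training data: each sample is [gsr_reading, emg_reading], labeled
# 0 (relaxed), 1 (tense), or 2 (stressed) according to the button pressed.
x_train = np.random.rand(300, 2).astype("float32")
y_train = np.random.randint(0, 3, size=(300,))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),  # one output per stress level
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=50, batch_size=16, verbose=0)

# Convert to TensorFlow Lite so a small board such as the Wio Terminal can run it.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
with open("fatigue_model.tflite", "wb") as f:
    f.write(tflite_model)
```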

[kutluhan] also designed a rather neat 3D printed enclosure to house the sensor readout boards as well as a battery to power the Wio Terminal. Naturally, the case was graced by a 3D rendition of Pikachu on top (get it? a mouse Pokémon that can paralyze its opponents!). We’ve seen [kutluhan]’s fondness for Pokémon-themed projects in his earlier Jigglypuff CO2 sensor.

Although the setup with multiple sensors doesn’t seem too practical for everyday use, the Mouse Fatigue Estimator might be a useful tool to train yourself to keep good posture and avoid stress while using a mouse. If you also use a keyboard (and who doesn’t?), make sure you’re using that correctly as well.

Continue reading “Avoid Repetitive Strain Injury With Machine Learning – And Pikachu”

3D Printed CPU Bracket Reduces Temperatures

What do you do when your motherboard bends your CPU? If you’re [Karta] or [Luumi], you 3D print a new retaining bracket to fix the problem. [Karta] originated the design, and [Luumi] also tried it and produced the video you can see below.

We like to think flat surfaces are everywhere, but it is actually very difficult to create something truly flat. You usually learn this when you try to maximize heat transfer between two surfaces. Getting two supposedly flat surfaces to touch is quite hard. CPU brackets use a combination of pressure and some sort of thermal media or paste to fill in any gaps between the CPU case and the heat sink. Intel’s LGA1700 bracket is an example, but there’s been a problem. Apparently, with recent CPUs, the bracket is a little too tight, and it bends the CPU’s case. It doesn’t hurt the CPU, but it does inhibit thermal transfer.

Others have “fixed” this problem by adding some washers to slightly raise the bracket. In both cases (washers or the printed bracket), there has been some very small improvement in CPU temperatures. [Luumi] says part of the problem is that his water cooling block is not completely flat and needs to be lapped. [Karta], however, reported a 7 degree drop in temperatures, which is pretty significant.

We love seeing how 3D printing can fix or improve things you own. They talk a lot about lapping in the video, and, in some cases, people actually risk lapping the IC die itself to make it flatter. It can help, but the risk is relatively high and the gain is relatively low.

Continue reading “3D Printed CPU Bracket Reduces Temperatures”

Two revisions of Wenting's custom SSD board - earlier revision on the left, later, sleeker and more complete, on the right.

Custom SSD Gives New Life To Handheld Atom PC

People don’t usually go as far as [Wenting Zhang] has – designing a new IDE SSD board for a portable x86 computer made in 2006. That said, it’s been jaw-dropping to witness the astounding amount of reverse-engineering and design effort he casually handwaves away as no big deal.

The Benq S6 is a small MID (Mobile Internet Device) with an Atom CPU, an x86 machine in all but looks. Its non-standard SSD’s two gigabytes of storage, however, heavily limit the OS choice – Windows XP would hardly fit on there, and while a small Linux distro could manage better, it’s, and we quote, “not as exciting”. A lot of people would stop there and use an external drive, or a stack of adapters necessitating unsightly modifications to the case – [Wenting] went further and broke the “stack of adapters” stereotype into shards with his design journey.

Tracing quite a few complex multi-layer boards into a unified and working schematic is no mean feat, especially with the SSD PCB playing host to two BGA chips, and given the sheer number of pins in the IDE interface of the laptop’s original drive. Even the requirement for the SSD to be initialized didn’t stop him – a short fight with the manufacturer’s software ensued, but was no match for [Wenting]’s skills. The end result is a drop-in replacement SSD even thinner than the stock one.

This project is well-documented for all of us to learn from! Source code and PCB files are on GitHub, and [Wenting] has covered the journey in three different places at once – on Hackaday.io, in a YouTube video embedded down below, and also on his Twitter in the form of regular posts. Now, having seen this happen, we all have one less excuse to take up a project seemingly so complex.

Hackers play with SSD upgrades and repurposing every now and then, sometimes designing proprietary-to-SATA adapters, and sometimes reusing custom SSD modules we’ve managed to get a stack of. If case mods are acceptable to you aesthetics-wise, we’ve seen an SSD upgrade for a Surface Pro 3 made possible that way.

Continue reading “Custom SSD Gives New Life To Handheld Atom PC”

IBM Eagle Has A Lot Of Qubits

How many qubits do you need in a quantum computer? Plenty, if you want to do anything useful. However, today, we have to settle for a lot fewer than we would like. But IBM’s new Eagle has the most of any quantum computer of its type: 127 qubits. Naturally, they plan to do even more work, and you can see a preview of “System Two” in the video below.

The 127-qubit figure is both impressively large and depressingly small. Each additional qubit doubles the amount of work a conventional computer has to do to simulate the machine. The hope is to one day produce quantum computers that would be impractical to simulate using conventional computers. That’s known as quantum supremacy, and while several teams have claimed it, actually achieving it is a subject of debate.
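
To put that factor-of-two scaling in perspective, a brute-force state-vector simulation has to store 2^n complex amplitudes for n qubits. The back-of-the-envelope calculation below is our own illustration, assuming eight bytes per amplitude, and shows why 127 qubits is hopelessly out of reach for a conventional machine.

```python
# Memory needed for a brute-force state-vector simulation: 2^n amplitudes,
# and every extra qubit doubles the total. Assumes 8 bytes per amplitude
# (single-precision complex); double precision would double these numbers.
BYTES_PER_AMPLITUDE = 8

for qubits in (30, 40, 50, 127):
    total_bytes = (2 ** qubits) * BYTES_PER_AMPLITUDE
    print(f"{qubits:>3} qubits -> 2^{qubits} amplitudes, {total_bytes / 2**30:.3g} GiB")
```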

As with any computer, more bits — or qubits — are generally better than fewer. However, a high qubit count is especially important for quantum systems, since most practical schemes require redundancy and error correction to be reliable on current hardware. What’s in the future? IBM claims they will build the Condor processor with over 1,000 qubits using the same 3D packaging technology seen in Eagle. Condor is slated for 2023, with an intermediate 433-qubit chip due in 2022.

Scaling anything to a large number usually requires more than just duplicating smaller things. In the case of Eagle and at least one of its predecessors, part of the scaling effort was readout hardware that is shared between multiple qubits. Older processors with just a few qubits would have dedicated readout hardware for each qubit, but that approach is untenable once you get to hundreds or thousands of qubits.

Qubit count isn’t the only measure of a quantum computer’s power, just as a conventional computer with more bits might be less capable than one with fewer. You also have to consider the quality of the qubits and how they are connected.

Who’s going to win the race to quantum supremacy? Or has it already been won? We have a feeling if it hasn’t already been done, it won’t be very far in the future. If you think about the state of computers in, say, 1960 and compare it to today, about 60 years later, you have to wonder if that amount of progress will occur in this area, too.

Most of the announcements you hear about quantum computing come from Google, IBM, or Microsoft. But there’s also Honeywell and a few other players. If you want to get ready for the quantum onslaught, maybe start with this tutorial, which mostly runs on a simulator.

Continue reading “IBM Eagle Has A Lot Of Qubits”