Optimizing Linux for Slow Computers

It’s interesting to consider what constitutes a power user of an operating system. For most people in the wider world, a power user is someone who knows their way around Windows and Microsoft Office, and can help them get their print jobs to come out right. For those of us in our community though, and in particular for Linux users, it’s a more difficult thing to nail down. If you’re a LibreOffice power user like your Windows counterpart, you’ve only really scratched the surface. Even if you’ve made your Raspberry Pi do all sorts of tricks in Python from the command line, or spent a career shepherding websites onto virtual Linux machines loaded with Apache and MySQL, are you then a power user compared to the person who knows their way around the system at a lower level and has an understanding of the kernel? Probably not. It’s like climbing a mountain with false summits: there are so many layers to power usership.

So while some of you readers will be au fait with your OS at its very lowest level, most of us will be somewhere intermediate. We’ll know our way around our OS in terms of the things we do with it, and while those things might be quite advanced we’ll rely on our distribution packager to take care of the vast majority of the hard work.

Linux distributions, at least the general purpose ones, have to be all things to all people, which means that the way they work has to deliver acceptable performance across multiple use cases, from servers through desktops to portable and even mobile devices. Those low-level power users we mentioned earlier can tweak their systems to coax out any extra performance, but the rest of us? We just have to put up with it.

To help us, [Fabio Akita] has written an excellent piece on optimizing Linux for slow computers. By this he means optimizing Linux for desktop use on yesterday’s laptop that came with Windows XP or Vista, rather than on that ancient 486 in the cupboard. To a Hackaday scribe using a Core 2 Duo, and no doubt to many of you too, it’s an interesting read.

In it he explains the problem as one more of responsiveness than of raw hardware performance, and investigates the ways in which a typical distro can take away your resources without your realising it. He looks at RAM versus swap memory and at schedulers, and tackles the thorny question of window managers head-on. Some of the tweaks that deliver the most are the easiest: the Great Suspender plugin for Chrome, for example, or making Dropbox less of a hog. It’s not a hardware hack by any means, but we suspect that many readers will come away from it with a faster machine.
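The RAM-versus-swap discussion on older machines usually comes down to the kernel’s vm.swappiness knob. As a hedged sketch (the filename and value here are illustrative choices, not prescriptions from [Fabio]’s article), a drop-in sysctl fragment might look like this:

```
# /etc/sysctl.d/99-desktop.conf  (hypothetical filename)
# Tell the kernel to prefer dropping page cache over swapping
# application memory out. The usual default is 60; lower values
# tend to favour desktop responsiveness on RAM-starved laptops.
vm.swappiness=10
```

Fragments in /etc/sysctl.d/ are picked up at boot, or can be loaded immediately with `sudo sysctl --system`.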

If you’re a power user whose skills are so advanced you have no need for such things as [Fabio]’s piece, share your wisdom on sharpening up a Linux distro for the rest of us in the comments.

Via Hacker News.

Header image, Tux: Larry Ewing, Simon Budig, Garrett LeSage [Copyrighted free use or CC0], via Wikimedia Commons.

Running Intel TBB On a Raspberry Pi

The usefulness of Raspberry Pis seems almost limitless, with new applications being introduced daily and with no end in sight. But, as versatile as they are, it’s no secret that Raspberry Pis are still lacking in pure processing power. So, some serious optimization is needed to squeeze as much power out of the Raspberry Pi as possible when you’re working on processor-intensive projects.

The simplest way to accomplish this optimization, of course, is to reduce what’s running down to the essentials. For example, there’s no sense in running a GUI if your project doesn’t even use a display. Another strategy, however, is to ensure that you’re actually using all of the available processing power that the Raspberry Pi offers. In [sagiz]’s case, that meant using Intel’s open source Threading Building Blocks to achieve better parallelism in his OpenCV project.
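TBB’s headline primitive is tbb::parallel_for, which splits an iteration range across the Pi’s cores for you. As a rough sketch of what that buys (written with plain std::thread so it compiles without TBB installed; the hand-rolled chunking below is exactly what tbb::parallel_for would automate, and parallel_apply is our own illustrative name):

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Split the data across one worker thread per hardware core and apply
// `body` to each element concurrently. With TBB you would instead write
// tbb::parallel_for over the index range and let it pick the chunking.
void parallel_apply(std::vector<float>& data, void (*body)(float&)) {
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (data.size() + workers - 1) / workers;
    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        std::size_t begin = w * chunk;
        std::size_t end = std::min(data.size(), begin + chunk);
        if (begin >= end) break;  // fewer elements than workers
        pool.emplace_back([&data, body, begin, end] {
            for (std::size_t i = begin; i < end; ++i) body(data[i]);
        });
    }
    for (auto& t : pool) t.join();  // wait for every chunk to finish
}
```

For an embarrassingly parallel per-pixel operation, which is the shape of many OpenCV inner loops, this kind of split can approach a 4× speedup on the Pi’s four cores.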

Continue reading “Running Intel TBB On a Raspberry Pi”

Lessons in Small Scale Manufacturing From The Othermill Shop Floor

Othermachine Co. is not a big company. Their flagship product, the Othermill, is made in small, careful batches. As we’ve seen with other small hardware companies, the manufacturing process can make or break the company. While we toured their factory in Berkeley, California, a few interesting things stood out to us about their process which showed their manufacturing competence.

It’s not often that small companies share the secrets of their shop floor. Many of us have dreams of selling kits, so any lessons that can be learned from those who have come before are valuable. The goal of any manufacturing process optimization is to reduce cost while simultaneously maintaining or increasing quality. Despite what cynics would like to believe, this is often entirely possible, and sometimes embarrassingly easy, to accomplish.

Lean manufacturing defines seven wastes that can be optimized out of a process.

  1. Overproduction: Simply, making more than you currently have demand for. This is a really common mistake for first-time producers.
  2. Inventory: Storing more than you need to meet production or demand. Nearly every company I’ve worked for has this problem. There is an art to having just enough. Don’t buy one bulk order of 3,000 screws to cover six months; order 500 screws every month as needed.
  3. Waiting: Having significant delays between processes. These range from running out of USB cables to simply having to wait too long for something to arrive on a conveyor belt. Do everything you can to make sure the process is always flowing from one step to another.
  4. Motion: If you have a person walking back and forth between the ends of the factory to complete one step of the manufacturing process, this is wasted motion.
  5. Transport: Different from motion, this is waste in moving the products of each individual process between sections of the assembly.
  6. Rework: Get it right the first time. If your process can’t produce a product that meets specifications, fix the process.
  7. Over-processing: Don’t do more work than is necessary. If your part specifies 1,000 hours of runtime, don’t buy a million-dollar machine to get 2,000 hours out of it. If you can find a way to do it with one step, don’t do it with three.


The first thing that stuck out to me upon entering Othermachine Co.’s shop floor was their meticulous system for getting small batches through the factory in a timely manner. This allows them to scale their production as demand fluctuates. CNC mills and 3D printers are definitely seasonal purchases, with sales often increasing in the winter months when hackers are no longer lured away from their workstations by nice weather.

As the seven wastes warn, it would be a bad move for Othermachine Co. to make too many mills. Let’s say they had made an extra 100 mills while demand was at a seasonal low. If they found a design or quality problem from customer feedback, they’d have to commit to rework, potentially throwing away piles of defective parts. If they wanted to push a change to the machine or release a new model, they’d either have to rework the machines, trash them, or wait until they had all sold before improving their product. Even worse, they might find themselves twiddling their thumbs waiting for their supply to decrease enough to start manufacturing again. This deprives them of opportunities to improve their process and leads to a lax work environment.

One way to ensure that parts are properly handled and inventory is kept to a minimum is with proper visual controls. To this end, Othermachine Co. has custom cardboard bins made that perfectly cradle all the precision parts for each process in their own color-coded container. Since the shop floor is quite small, this lets them focus on making spindle assemblies one day and motion assemblies another without wasting time between each step. It also means someone can easily re-kit the parts for a recently completed step without interrupting the process currently underway.


It’s hard to define what is over-processing and what isn’t. My favorite example of what isn’t, and something I’ve fought for on nearly every factory floor I’ve worked on, is proper torque-limiting screwdrivers. They’re a little expensive, but they are a wonderful tool that helps to avoid costly rework and over-processing. For example, let’s say you didn’t have a torque-limiting screwdriver. Maybe your customers would complain that occasionally a screw came loose. One way to solve this would be the liberal application of Loctite. Another would be an additional inspection step. Both of these are completely unnecessary extra steps, as most screws will hold as long as they are torqued properly.

In one factory I worked in, it was a common problem that a recently hired worker would overtorque a screw, either stripping it or damaging the parts it was mating together. A torque-limiting screwdriver takes the worker’s physical strength out of the equation while reducing their fatigue throughout the day. It’s a win/win. Any time a crucial step can go from unknown to trusted with the application of a proper tool or test step, it is worth it.

Another place where Othermachine Co. applied this principle is the final machining step for the CNC bed, which produces a large amount of waste chips. Rather than having an employee waste time vacuuming out every Othermill after it has gone through this process, they spent some time designing a custom vacuum attachment. This essentially removed an entire production step. Not bad!


With the proper management of waste it is entirely possible to save money and improve a process at the same time. It takes a bit of training to learn how to see these wastes, and it helps to have an experienced person around to learn how to respond to them properly, but with a bit of practice it becomes a skill that spreads to all areas of life. Have any of you had experience with this kind of problem solving? I’ve really enjoyed learning from the work stories posted in the comments.

Optimizing your electronics projects with a camera

What do you do when you have a microcontroller you’re trying to optimize? One method is using a debugger, but for AVRs and the like that’s not a very common technique. For lower-level electronics projects it can be nearly impossible. [cnlohr] built a small Minecraft server that listens to in-game redstone circuits, but the performance of his real-world to block-world bridge wasn’t what he hoped. He came up with a pretty clever way of figuring out what was slowing his server down without any special gear at all.

[cnlohr]’s Minecraft server is just a simple AVR microcontroller, Ethernet adapter, and SD card affixed to a beautiful glass PCB. Downloading a largish file from the server resulted in a rate of about 55 kbps, much slower than he expected. He wasn’t quite sure what the hangup was, so he took a camera and, with a long exposure time, took a very blurry picture.

The Minecraft server has a blue LED that lights when the SD card is being accessed. In the picture above, [cnlohr] saw that SD card access was taking far too long, and that if he wanted to optimize the code, this would be the place to start.

Not bad for a dead-simple method of seeing where the code on your microcontroller project is slowing down.

Continue reading “Optimizing your electronics projects with a camera”

Behind the scenes of a 1K graphics demo

Programmer/designer [Steven Wittens] has posted a fantastic write-up on the black art of producing compact demo code, dissecting his own entry in the 1K JavaScript Demo Contest. The goal is to produce the best JavaScript demo that can be expressed in 1024 characters or less and works reliably across all standards-compliant web browsers.

[Wittens] details several techniques for creating a lot of visual flash in very few bytes, including the use of procedural graphics rather than fixed datasets, exploiting prime numbers to avoid obvious repetitions in movement, and strategically fudging formulas to save space while adding visual interest. These methods are just as applicable to other memory-constrained situations, not just JavaScript — some of the contest entries bear a resemblance to the compact microcontroller demos we’ve previously showcased, except running in your browser window.
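The prime-number trick is easy to quantify: two looping motions only realign after the least common multiple of their periods, so co-prime periods push the visible repeat far into the future. A small sketch (the example periods are illustrative, not taken from [Steven]’s entry):

```cpp
#include <cstdint>

// Euclid's algorithm for the greatest common divisor.
std::uint64_t gcd(std::uint64_t a, std::uint64_t b) {
    while (b != 0) {
        std::uint64_t t = a % b;
        a = b;
        b = t;
    }
    return a;
}

// Frames until two oscillators with periods p and q line up again:
// lcm(p, q), dividing before multiplying to avoid overflow.
std::uint64_t repeat_length(std::uint64_t p, std::uint64_t q) {
    return p / gcd(p, q) * q;
}
```

Periods of 60 and 64 frames (sharing a factor of 4) repeat after only 960 frames, while nudging one to the prime 61 holds the repeat off until frame 3904, nearly four times longer for free.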

The contest runs through September 10th, allowing ample time to come up with something even more clever. Whether he wins or not, we think [Steven] deserves special merit on account of having one of the most stylish blogs in recent memory!