Guest Rant: From Bits To Atoms

I’ve been a software developer for quite a while. When you spend long enough inside a particular world, it’s easy to wind up with an ever-narrowing perspective. You start seeing everything from a software point of view. As the saying goes, when your only tool is a hammer, you tend to treat every problem as NP-Complete. Or something. I forget how that goes.

Anyway, the point is, it’s always good to broaden one’s horizons, and solve as many different kinds of problems as possible. To that end, I started to get into hobby electronics recently. The journey has been very enlightening in a number of ways.

First of all, I learned that hardware is really not so different from software. The same principles of engineering design and debugging apply. When designing a complex hardware system, you use the same basic approach as a large software system. You break it down into increasingly smaller components until all the pieces are small enough to be implemented and verified in isolation. Then you start implementing the pieces, connecting them, and debugging issues that occur as a result of those interactions. Gradually you wind up with a complex entity that functions, but is too large to understand all at once.

Second, I learned that hardware people have cooler tools (like, with blinking lights and stuff). Some of them make noises. Some of them are hot or sharp.

Hardware arguably has more of what are thought of as “hard bugs” in production software engineering. These are bugs that are difficult to reproduce, or behaviours that are non-deterministic. These types of problems are common in very large systems, realtime systems, and multithreaded systems. An experienced software developer has encountered and solved many of these, often spending days or weeks on a single bug. Luckily, that knowledge maps very well to hardware debugging. This is not to say hardware development is more difficult than software development. I think they are about equal in this regard, but the challenges tend to lie in different areas.

Good software engineering is largely about minimizing the hard bugs. A well designed system will experience fewer memory leaks, fewer deadlocks, fewer race conditions, and so forth. Hardware is a world of AC noise, electromagnetic interference, static electricity, parasitic capacitance, and all kinds of other random phenomena that can make debugging difficult. Similarly to software, much of good hardware design comes down to minimizing these sorts of influences.

In the end, I’m learning that hardware can be immensely satisfying, because you end up with something you can hold in your hand when you’re done. Don’t get me wrong, I love writing software. However, it does suffer from the problem that everything you make is fleeting. I think back to all the hundreds of hours I spent on projects in grade school, for example. Those games and demos are all gone now, probably forever. The floppies are theoretically still in my mom’s basement, but they are most likely unreadable by now. Modern code still has the same problem, in a different form. Sure, with the internet and constant migration, the code itself can live forever. However, all the layers of languages, SDKs, libraries, and drivers that it depends on become obsolete if not constantly maintained. Backward compatibility has a window, which the code will slide out of if not kept up. At that moment, it ceases to function. Software is, and always will be, fleeting.

Perhaps the acceptance of the idea of hard work going into something fleeting is the difference between hardware and software people. Software people are used to thinking “virtually”, and long ago came to terms with the idea that when a project is done, it is pushed aside, perhaps forgotten, and all focus is put on the Next Thing.

This is a great time to explore both sides of the hardware/software divide. Microcontrollers and friendly development boards are blurring the lines between the ‘wares. The barrier to entry to the other side is getting lower all the time. So, if you’re a software person, try getting your hands dirty and sling a little flux. If you’re a hardware person, try your hand at a genetic algorithm, a pathfinder, or a quicksort. You might be surprised at how awesomely the other half lives.

 * * *

About the author:

[Quinn Dunki] has been making games for 32 years, the last 17 of which have been professional. She was recently AI Architect at Pandemic Studios, on the dynamic open-world title “The Saboteur”. She now pursues consulting, independent development, mixed-media engineering projects, and writing. She has a mobile software company and consultancy called One Girl, One Laptop Productions, which makes games and other applications for iOS, Android, Mac, and PC. In her spare time she takes pictures, races cars, hacks electronics, fabricates computers, and berates her friends with sarcasm.

83 thoughts on “Guest Rant: From Bits To Atoms”

  1. I’m in the exact same situation.
    I would add that, compared to software, learning material on the theory is much harder to find for electronics (IMHO this is improving fast, though).

    1. Same boat here – hardware requires a bit of an investment by comparison. I also hate having to wait on parts I was only able to order from China. ;)

      Having the knowledge of software programming has made the hardware journey much easier than it would have been otherwise.

      1. That’s one book. In the software world, you can pick up 50 or more books on any given topic. And something about an older version of a piece of software (say an older Apache or MySQL) will still be somewhat useful, as long as it’s just a minor upgrade and you don’t mind reading a change log. And finding that book for a reasonable price? Amazon wants $100 for used copies!

        But finding a set of electronics books that cover a topic like transistors from etching to use, BJT to MOSFET, and what all the different graphs mean can be a bit of trouble. And that’s for just one piece of common hardware that’s probably built into most of the ICs that we all use.

    2. A lot of that has to do with not having ‘old enough’ books. Since the problems that once required specific hardware solutions have been solved with a general-purpose computer, no one writes books on electronics hardware other than college professors and the like. ;)

    3. I’m not sure, but there is plenty of learning material on YouTube. The US military training videos really can’t be beat, and there are also videos others have created that are just as good, where the graphics might be more pleasing to the eye. I recently viewed one of Dave Jones’s (EEVblog) videos where he said he hasn’t yet come across an error in Wikipedia articles on electronics topics. Wikipedia has books that can be downloaded. Most of the Navy Electricity and Electronics Training Series can be found online; search for NEETS.

      1. abcdefghijklmnopqrstuvwxyz

        That’s a complete alphabet.
        There are no errors.

        Do you think that putting just these 26 glyphs in front of children is the best way of teaching it,
        or do you think it requires careful practice and defined, structured lessons?

        Sure you can “read” about lots of things on Wikipedia, that does not make it a good “learning” resource.

        A video of a person writing the alphabet is no better a lesson; sure, you might see how to hold a pen, but unless the camera angle is just right you won’t actually see the pen strokes.

        The same is true with videos of electronics. A Wikipedia page that explains what a transistor is gives you some clue how to use it, but not any really good information about device parameters or how to read charts etc., and fast-paced five-minute videos that explain theory badly are not good learning resources either (or, in the case of EEVblog, half-hour videos that often explain stuff badly). Not that EEVblog is bad, I do enjoy watching the videos, I just don’t think it’s a good learning resource. Or at least no substitute for a properly structured lesson, either written down, or a properly planned-out lesson video with structure, rather than a rant about whatever Dave feels like talking about today.

  2. I often think about the different kinds of software. Some software is written to be thrown away after it’s done its one and only task (eg. Apollo flight control software). This software may take many man-years to write, yet is only ever used for a few days, hours or less. Some software is deployed once and lives forever (the bespoke stuff running most companies) – it becomes the fabric of an organisation. Other software is deployed many times in exactly the same form (console games). Much software lives longer than anyone ever expected – the COBOL underlying many accounting and banking organisations, for example – or the stuff running Air Traffic Control in the USA. Some of this stuff now has to be treated as “do not touch” and we can only amend its functionality through the layers of software on top. It’s like the soul which outlives the body, flitting from one set of hardware to another when the original dies.

  3. It’s funny, but the projects I remember most fondly were electromechanical – coil winders – because I could see the end results of the software I wrote. I’ve also had a warm spot for old-style pinball machines for the same reason.

      1. I still remember the shock I felt when I realized I could identify a pinball machine halfway across an arcade just by the sounds it made. That’s when I knew I had been playing too much.

    1. the first time I opened the back on an electromechanical pinball machine I was in awe!

      Picture a 20 by 20 array of relays and a couple of rows of 10 contact “step” relays.

      Then I noticed a pile of small pieces of paper down the bottom of the back plane, with “A12”, “B16” etc printed on them.

      It took me a while to realise it was all the labels from the relays that had fallen off!

  4. Not to nag, troll, or pick on the article too much, but I take issue with this line, which I see a variation of a lot:

    “This is a great time to explore both sides of the hardware/software divide. Microcontrollers and friendly development boards are blurring the lines between the ‘wares. The barrier to entry to the other side is getting lower all the time.”

    I know you mean this from your perspective, and you admit you haven’t been doing the hardware side of things for all that long, but I think saying things like this tends to confuse people. “Friendly” dev boards and microcontrollers have been around a lot longer than people tend to think. The BASIC Stamp has been around for many years now, PICs have been around a long time, etc. Just because the internet loves to think it’s inventing the ground beneath its own feet, and there are new fads in hardware development, doesn’t mean there weren’t things that came before. Yes, Arduino has introduced a new generation of kids to microcontrollers, but it’s far from the first thing out there. It’s the same thing as a BASIC Stamp; if anything, the Arduino made the software side of things easier for hardware guys because you can steal all your code, or “examples” of code, from a library. But the BASIC Stamp even had that going for it. Really, though, they aren’t that different hardware-wise…

    I think people either don’t know about what existed before their time, want to think “they were the first to…” or just don’t care to give credit where credit is due. All that tends to make people think the older stuff was somewhat more difficult to get into than the new stuff, which really isn’t true at all.

    I realize that people also love stuff that is cheap, and they see C as the universal programming language; you don’t really need a fancy IDE to code with it, so anything that uses C and has a low entry price is going to be the darling. But please don’t think that the things that came before were any harder or any less innovative!

      1. I’m glad you rolled that back.

        I started getting into microcontrollers before Arduino was available. I started with the BS2 — I HATED coding for that thing! Once I realized how slow it was I built a DAPA programmer and bought my first AVR. After picking up an AVR Dragon (a proper programmer/debugger for $50?!? awesome) I never looked back.

        I may have my history clouded, but it seems to me that the ability of an average person to purchase single units of programmable microcontrollers and a cheap programmer is more of a new thing that really DID lower the bar.

          1. I made my first programmer (a JDM programmer); it cost me about £3 in components. (I’ve still got it and it still works. It’s on Vero board, insulated with some duct tape to stop shorts if I put it down on anything metal.)
            MPLAB was free.
            The HI-TECH compiler was free.
            ICProg (to get the code onto the chip) was free.

          the chips were ordered as samples from Microchip -and were free…

            The idea that there is a cost barrier to entry into electronics is false now, and was false a decade ago.
            The idea that even the cheapest of Arduino clones removes a cost barrier is also pretty false. They are no less cool, but as said above, they are more like a tool that lowers a software-writing barrier rather than a hardware-cost barrier…

          1. The first programmer I ever used was thumb wheels: set each address/data, press the write button – writing code for 1802s.
            The first programmer I built ran from the PC parallel port (modded to be bi-directional), a few counters, buffers and gates, running Forth on the PC to send an image to program microcontrollers.
            Currently I am on the third incarnation of my CNC router. It’s just so satisfying to draw something, then cut/shape it from wood, build it, then take it out in the surf and ride it – yep, wooden surfboards.

          2. Exactly the point, really (in this case): it’s not that Arduino lowers the cost of entry.

            Instead of thumb wheels you could have just used a set of resistors to set values if you wanted, so the cost of a programmer could be pence… (It doesn’t compare to the Dragon programmer that Mike mentioned, saying he got a great programmer for fifty dollars. The average “hacker” has most of the JDM components lying about; the transistors are the only thing that are “specialist”. Basically, I got a similar-grade programmer for 1% of what he paid after my parts bin was raided; if I had to buy the parts, it’d have been the few quid I said before…)

            The Arduino-style language, being very much based on C, doesn’t really make micro programming any easier either if you’re starting from scratch…

            If you think Arduino won because it was cheaper to enter, you’re wrong. If you think Arduino won because it’s easier to program, again, I don’t think so.

            Arduino “won” its market share by having good marketing. The code libraries that allow programs to be built easily (by pasting together premade blocks) are a great motivator; they reduced the skill needed to program microcontrollers…

            But as everyone has pretty much always said (and as cheap clones prove), the Arduino is not the product that removed the cost of entry.

            I agree with the article, it is a good time to be “into” electronics, there is always something new online, always a new code sample to copy, always a new circuit block to see… Just that “into” and learning might be different things in this context.

        2. I learned to love coding for BASIC Stamps. If software for a PC is like writing a novel, then an Arduino program is a poem, and a BASIC Stamp program is a haiku. In unsophisticated hands a haiku might seem a cheap and uninspiring novelty, but the Japanese haiku masters would labor for months over the three little lines, finding the perfect metaphor, the perfect season word, the perfect cutting word, to finally create a tiny objet d’art that can move even the most hardened critic.

          BASIC Stamps taught me to be concise, clear, and defined, knowing that, like the haiku, which simply provides a bootstrap for the imagination, the real magic happens outside the code, in the circuit.

        3. Not only that, but they have cheap ones with support for freaking USB!
          About the only thing I can see beating that is if you eventually can buy a currently $500-1500 FPGA for $5 and it comes with open software not owned by the likes of Keil.

      2. I appreciate the intellectual honesty you just demonstrated. That’s a rare thing on the Internet.

        You’re right that in a sense there’s nothing new under the sun. Arduino is just PC104 all over again. PC104 was just a rehash of the 8-bit open platforms of the time, like the Apple II. Then there are the early CPU trainers like the KIM-1. Everything is derived from something else.

        However, I would argue that part of the barrier to entry is approachability and attainability. Something like an Arduino or a Pi is very easy for any lay person to buy, cheap as dirt, the documentation and community are great, and the learning curve is shallow. Those are all big improvements over embedded systems of the past.

          1. To jump back a bit, yes, the Arduino is largely marketing, but at its core it’s a well-understood plug-and-play platform where you don’t need to run out and buy anything else, unless you’ve been living under a rock and need a (standard) USB cable.

            The Pi and BeagleBone Black are far more impressive, really. For under $50 you can get a full Linux system that’s more powerful than any PC from before 1998 or so, and just as electronics-hobbyist friendly as an Apple II was. Modern TVs also work far better as makeshift computer monitors.

          2. No, I don’t live under a rock. I didn’t have to go out and buy anything for my $68.11 MC68HC11 eval board from the late 1980s. I even shipped a line of products based on that and an embedded PC. Nor did I have to buy anything for my <$10 68HC908 board, which even comes with a professional development tool, CodeWarrior.

            My $30 router or my $30 Dockstar can run Linux, and they don't have multiple junior-engineer-level design issues like the RPi.

          3. The Arduino plug-and-play complete system did the trick. No electronics experience needed to get going. An hour out of the box and you were up and running, blinking LEDs faster, slower… YOU were making things happen, AND IT WAS RIGHT AWAY!

            Take the BS1 and BS2, for example. I did many years of these myself. Buy the thing… make a cable… pick a terminal program… build a power supply… Maybe a week later you got a hand-wired LED to work, but by day 2 or 3 the non-tech user had already lost interest.

            Yah, that Arduino LED and simple startup was a great move! It captured and swelled the initiate’s enthusiasm within the first hour or two in their hands. The products that came before had technical challenges to overcome before even getting a shot at being powered up, much less running the first program… but that was just the state of technology; it took some work.

            What will your kids have to play with?

        1. When I was a kid, I went to the electronics store, and got 74xx series gates, flip flops, shift registers, 4040 counters, and had great fun with those. The barrier to entry was just as low as now. The only difference was that the things were more primitive. But for learning and understanding electronics, a bunch of simple TTL logic will teach you as much as an Arduino.

        2. What really seems new to me, is the combination of “Homebrew Electronics” and “online community”.

          I could have started playing with something like PC-104 or Basic STAMP or a Rabbit 2000, or something like that, in the late 1990s. What I couldn’t do then, was just google “stepper motor controller programming and circuits for arduino”. Information wants to be free on the internets.

          Kudos for diving into the hardware. I have all the bits to start playing with Arduino and the TI MSP430 (cheap like borscht) but I have not plunged in. My 15 year old son is more dauntless than I and has launched right into it.

          The combination of things you’re interested in reminds me a lot of Jeri Ellsworth. Are you and her friends by any chance? You’d probably enjoy hanging out, compare notes on hardware hacking, stock car racing, welding, and not working at Valve. :-)


    1. I can see both sides of this perspective. I think it’s more about the accessibility of the hardware, and the options therein, than a technology issue. So instead of a first-timer or electronics hobbyist fumbling around on DigiKey looking at spec PDFs, there are resources like SparkFun, Instructables, Hack-a-day, MAKE magazine, etc. Perhaps that’s what Dunki means when she says “barrier to entry”.

      However, simple and foundational resources exist and are important for long-term understanding. See this fun project on the MAKE channel; a “breadboard” was once, quite literally, a wooden breadboard. That sort of experimentation can still continue today, even with all of the new, fancy Arduinos and IDEs. And arguably, the “REAL” breadboard is much easier to understand, conceptually, than the Arduino… but everyone learns differently, so it’s a matter of where to start when getting into hardware.

    2. Come on, you have to be kidding! BASIC Stamp compared to Arduino?

      I don’t know much about the history of PIC, but I really don’t think the BASIC Stamp shows that hobby electronics was as accessible in the past as it is today. Sure, Stamps may have been easy to program, but they cost a couple hundred dollars, and that in a time when a dollar was worth much more than it is today… They may have been useful to universities and particularly well-to-do high schools, but that is hardly accessible at the hobbyist level!

      As a kid (late 80s, early 90s) I remember reading about and wanting a BASIC Stamp. It just wasn’t going to happen on my little paper-route budget. Even then I was already ordering occasional components from Mouser. That’s the same company I now buy my Atmega328s from, along with the clock hardware to use them as ‘Arduinos’, for only a few dollars. If anything like that had existed back then, I could have bought piles of them!

      I still see an occasional ‘intro to microcontrollers’ kit collecting dust at local Radio Shacks. It’s pretty much just a book, a BASIC Stamp, and a tiny breadboard. They still want $150!

      I must admit, though, I was a bit too buried in the software side of things for a while and missed the early days of Arduino. Maybe I missed something that came before Arduino? PIC perhaps? I don’t know, but if anything was hobbyist-accessible the way Arduino is before Arduino came along, I do know it was NOT the BASIC Stamp!

      1. What you wanted came along a few years later – the PicAxe.

        PIC chip with a bootloader, stick on breadboard and download program (RS232). About twice the price of the bare chip.

        (BASIC Stamps actually used PIC clones – the Scenix chips.)

      2. You could just get a single PIC 16F84 for a few bucks, and program that. It had flash memory (I think it was the first flash chip they sold), and could be programmed in-circuit with a cheap parallel port programmer.

      3. That was my problem as well. I learned Apple BASIC on a ][e and was happy when I got a Laser 128 clone with lots of RAM. But after a few years, when I wanted to move to hardware, I didn’t want to go back to BASIC. I had already started on C, first in Windows with a pirated compiler and shortly after by dual-booting to Linux. Going back to BASIC felt insulting when I knew there were better chips out there.

        As for programmers, I could find schematics for the AVR programmers at the time, but all of them used a small AVR chip to interface between serial and the target. Which meant that, to build a programmer, I needed a programmer. And I was in hill country, where I was one of the few people interested in this hobby. So the option became “buy a $50 programmer” – fine if you had a job while still in pre-college school, but it wasn’t something my parents were going to loan me $50 for. By the time the cheaper options came along, I was in college at a well-known tech school for computer science, and between writing OSes and deciphering grammarless languages while working on combinatorics homework (the only easy course I had), between courses on stats or differential equations (not tough, but a real sexist bastard of a professor with tenure)… time available was better used for catching a few hours of sleep. By the time I had time to work on hardware uC stuff again (okay, I was still doing AC audio stuff, guitar pedals and the like, that whole time), the Arduino was around. And then the Papilio for FPGA, and all the clones for ARM and PIC and every other platform.

        So, it is a different time now. Sure, if you had a mentor back in the pre-internet days, or a local BBS, or parents who didn’t care about phone bills, you could have gotten lots of info on the BASIC Stamps or AVRs, and if you had the right catalogs (I had those; dad brought DigiKey stuff home from work to use as scrap paper in the shop once the new catalogs came in) you might be able to get samples, or you might get scared off by the dev-board prices. But we had neither a BBS nor a mentor. Now even the public library has books on Arduinos, and they are cheap enough that a school kid working as a babysitter could pick one up. And you don’t need a local tutor anymore.

        ~Quin (one n, not the same person as the article’s author)

    3. You make an interesting point. . . I was just thinking about accessibility.

      As a child, electronics (thanks to Radio Shack and surplus stores, etc.), fireworks, and model rockets were far more accessible than computer programming was. So I explored (and exploded) the world with the tools that I had and that were given to me by kindly engineers.

      Today there is a plethora of software development environments and tutorials at one’s fingertips, all of which can be transmitted electronically to just about any place in the world. That is not so with hardware, where the basic tools (such as an oscilloscope) are less attainable.

      So the rocket pyromaniac of the 1990s is now the app developer of today.

      Long live curiosity and rule breaking.

  5. What people are accomplishing with software engineering these days is breathtaking and it’s a pleasure for me to interact with really good code.

    Disclaimer: I do not write good code.

    Fluffing aside, I find hardware engineering much more interesting and have focused on it professionally. Hardware engineering is the unsung hero that enables us to experience software every day. My job is to struggle with physics, and it is really, really difficult. The end result is highly tangible and therefore gratifying / horrible.

    I have never had that experience with code.

    Hopefully there will be some sort of evolution in FPGA technology allowing for hardware systems that dynamically reconfigure to better execute code, line by line or in parallel. . . Some kind of object oriented HDL.

    1. I know some have envisioned such a device, where RAM, processing, and storage elements can be reconfigured on the fly, all out of the same multifunctional material. I recall reading an article that imagined a future computer being essentially a block of such material, with no discrete components at all, just a blob which configures itself to whatever task is at hand.

      (I’d have to call it Odo.)

      1. A self-reprogrammable FPGA. I had a computer engineering buddy who was working on that post-bachelor’s for a local company and the college. One small core determined what would be needed by scanning the main core’s upcoming instructions, and would then shift other parts around to keep in-use RAM available but change where other parts are, or even whether they exist: remove an ADC or DAC or MIMO unit to get more RAM, or vice versa. Some interesting papers exist on it, and I think he’s still working on similar stuff out at what he and my electronics engineer buddy call “Black Mesa” (a mid-west government-run lab… scary friends).

  6. In software, patterns are used everywhere to solve problems with known solutions. (Simple patterns like using a for-loop to print the contents of an array, up to more complex patterns like visitors and application-document-view objects.)

    As someone who needs to learn electronics, I assume similar patterns must exist. Are there any books or websites that provide examples?

    1. If I’m understanding your post correctly, I’d imagine that a simple understanding of Ohm’s, Watt’s, and Kirchhoff’s Laws is a start. Kirchhoff’s especially seems like common sense, but it can make things clearer.

      Resistors add in series, capacitors add in parallel, but the formula to determine parallel resistance and series capacitance is the same – 1/(1/X1+1/X2…+1/Xn), where X stands for your resistors or capacitors. IIRC you’re actually adding the conductances of the resistors and then inverting the sum back into a resistance (or a capacitance, as the case may be).

      1/frequency = period, 1/period = frequency, that sort of pattern?

      I’m just lucky to have good instructors, interested in teaching every student, not just the ones at the top. They work hard to make things understandable to the students who even struggle with basic arithmetic (one of my lab partners mixes up “divided into” and “divided by” and confuses the heck out of himself.) They write all their own lab manuals though, so there’s not really anything I could point out to read.
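      The series/parallel formulas in this comment can be sketched in a few lines of Python (the function names are mine, purely for illustration, not from any lab manual):

      ```python
      def series(values):
          """Resistors in series (or capacitors in parallel) simply add."""
          return sum(values)

      def parallel(values):
          """Resistors in parallel (or capacitors in series):
          add the reciprocals (the conductances), then invert the sum."""
          return 1 / sum(1 / v for v in values)

      # Two 100-ohm resistors in parallel give 50 ohms:
      print(parallel([100, 100]))   # 50.0

      # Three resistors in series just add up:
      print(series([10, 20, 30]))   # 60

      # Period and frequency are reciprocals: a 50 Hz signal has a 0.02 s period.
      print(1 / 50)                 # 0.02
      ```

      The same `parallel` function works for capacitors in series, since the math is identical.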

    2. That’s an excellent question Rich, and a tricky one to answer. You need someone who is deep enough into software to know what you really mean by Design Patterns (in the software engineering sense), and someone deep enough into hardware to know if an analog (no pun intended) exists.

      I’m probably not qualified to answer, but my impression is that the answer is yes, with an asterisk. There are definitely best practices and common approaches to familiar problems, but I don’t get the sense that it’s organized as formally. Someone might look at a problem and say, “oh you need a band-pass filter here”, or “a half-adder and a shift register would solve that”. However, I’ve never come across an agreed upon set of names for patterns, along the lines of Decorator or Delegate.

      The closest analog to the Design Patterns book itself might be The Art of Electronics by Horowitz and Hill. It’s often regarded as the bible of the craft.

      1. There is no substitute for a proper textbook when you want to really develop a serious knowledge of the subject. However, a newbie just starting out might spend several weeks just learning the basics before ever getting a chance to apply anything.

        I’d recommend starting out with projects off the internet, both written articles and YouTube videos. Just keep in mind… some of what you see is probably wrong. It’s not that hard to catch the errors, though; just make sure to look at many sources of the same information. It becomes clear pretty quickly that one source is wrong when everything else disagrees!

        Once you have played around a bit this way and you are hooked, then get a big fat college level textbook and start thoroughly going through it cover to cover. Remember, you are studying this one because you want to, not because there is a test next Friday like back in school. Take your time, learn it all, not just what you think will be on the quiz and get it worked into your long term memory. Don’t just shove it into your head far enough that it lasts until the end of the semester.

        I might suggest getting some math books as well, at the very least a good calculus book. I’m assuming here you are already well studied in algebra; if not, get that too. If you want multiple ways of doing things, get a linear algebra book as well. It isn’t strictly necessary, but it will teach you different ways to do the same things that are easier in some circumstances (especially if you are into software as well). If you don’t want to study these math books outright, just keep them for reference for when you get to a part in your electronics book that requires them.

        Plan on this taking a long time… the rest of your life, actually. It’s a journey, not a destination. Don’t forget to stop and build something now and then, and keep it fun! Even if that means you have to supplement with an occasional YouTube video or the like when you want to know something fast and haven’t reached that chapter yet.

      2. I have to disagree with the part about “just learning the basics” from websites or YouTube just because the person is a web-celeb. Same thing with software: if you are putting in the time, learn it properly and don’t learn bad habits.

        I don’t know how many times I have seen people who do not understand circuits putting in a part just because they saw some Joe on a website doing so. It is important to understand the limitations and gotchas of doing things a certain way (the “what”, “when”, “where”, “why”, and “how”). That, unfortunately, does not propagate in the YouTube/web-celeb copycat culture.

        1. “if you are putting in the time, learn it properly and don’t learn bad habit”

          Sure.. if you start out wanting it badly enough to do that. Starting with a light introduction though isn’t just about becoming a good engineer. It’s about testing the waters, determining that yes, this is something you are interested in and it is worth putting in that work.

          If everyone had to start with the most basic of basics and learn all the theory and proper techniques in order, I suspect the majority would lose interest long before they ever got to actually build something.

          I do agree though.. once you have the bug go for it. Get a textbook and study for real.

  7. I found this article very interesting, mainly because I can see what Quinn Dunki is getting at.

    I have been doing hardware design and programming for industrial machinery for quite some time now, and I find that being able to see a project through to the end, on both the hardware and software sides, is hugely satisfying.
    When I have written the code for a machine and I install it on site with the customer and see the machine in action, I can mentally see my code being executed as the machine cycles through its sequence. The customer just stands on the sidelines, oblivious to the many hours of coding that went into programming the machine. The customer doesn’t see the little things. The things that prevent the machine from spewing paint all over the floor, or cutting somebody’s hand off.

    Good hardware and software design feels like a little world of its own. If you can master both the hardware and software in that world, the feeling is immensely satisfying.

  8. I agree here, totally. It’s too easy to get stuck specializing in one thing, and we forget (or aren’t taught) that doing the opposite is beneficial. When my dad was in school, he took drafting, but it was tied to machine shop. After the class was done drawing up their projects, they would swap drawings and then go build each other’s project. This obviously drove home the faults and pitfalls to watch out for.
    Playing a bit of piano makes you a better guitarist. Pitching a bit makes you a better hitter.
    Sometimes it’s better to code a blinking LED than build a circuit to do it. Sometimes it’s better to add another IC to a board than bit-bang a signal. It’s good to have these options and not be afraid of them.

  9. I’ve found a lot of similar experiences lately. The only thing I might not agree with is that software is the only one that’s fleeting. Hardware degrades, breaks down, and so on as well – just, as with everything else, in slightly different ways. You don’t replace code to keep it working, but you do dust it off, check the packaging for damage, and so on.

    The coolest experience is when you merge the two. A massive software package making lights blink throughout a space the size of a football field is an awesome sight to behold.

  10. Getting back into electronics myself. Started in software to make our electronic things work, then spent more years than I care to admit doing boring business software. You know, the kind where you are required by pointy-haired management to design in then figure out the hard bugs.

    Hit a snag the other day that I wouldn’t have expected from software. Making a 3v system, but the silly output driver didn’t see a “1” unless it was higher than 3.4v. Ok, in software, you can usually assume a 1 is a 1 and a 0 is a 0. :D

  11. Quinn, your post (or rant) was amazing, and largely spot on. As are most of the comments. I got started programming in Apple BASIC and R6502 assembler, then switched to the PC, and finally landed on the BS2. Nothing wrong with that guy. Also sampled the Arduino. (I still don’t like that one!) And now, on and off, writing Linux stuff on both Intel and ARM via the Pi. But all of that doesn’t even come close to the R6502 for completeness.

    And the expression is, “Everything is connected to everything else. That is why it’s so hard to keep a secret.”

    Consider this, gang: the BASIC Stamp went through a lot of design ideas. They even used the late Scenix chips, then went back to the PIC after Scenix was forced out of business.

    And Trui, I still use SN74-series logic, some obtained as far back as the late ’80s, and some obtained last year. There is still a purpose for that.

    Quinn, you’re still an amazing individual.

    To finally end this comment: “Live long and prosper,” as said by Spock.

  12. I started in high school, blowing up ICs and letting out the magic smoke. After a while, I got a little too comfortable with electrocution, so I focused on software. A couple of years ago, I took a computer architecture class and used Verilog to model a mini-MIPS processor. Cool. It *is* easier than it used to be to do hardware. And now all the power supplies have covers, so it’s safer, too, to do digital electronics. It’s really tempting to dabble in hardware again…

  13. I checked out of being a computer programmer for precisely this reason. Looking back on my 30-year career, I thought of all the 80-hour weeks, the deadline rushes, the times my fingers went numb from sitting at the keyboard too long, etc., and I asked myself: of all the things I have made, what still exists? The answer was exactly nothing. It had all turned to digital dust. I was paid well for making this digital dust, but for my remaining years, I am going to make things that are intended to last. On my deathbed, I want to be able to look back on my life’s efforts and know that someone somewhere is still getting some joy or benefit from them.

    1. I was a Mainframer for 13 years but Y2K made me never want to see another line of code again. I worked as a tooling machinist for a year among other hands-on transitional jobs. Now I’m a field service engineer for large industrial lasers, and travel around North America tweaking optics and electronics in factories and labs. That’s a hands-on job that requires mechanical, electronic and software skills.
      I went to electronics tech school in the ’70s but it was never more than a hobby and side business until now. Contributing to Make Magazine has really forced me to step up my game.
      Programming was a great way to spend my worklife between about age 25 and 40, but I doubt I could sit still long enough to do it for a living now that I’ve gotten used to working on my feet, with my hands.

  14. Yes, the line between software and hardware development is becoming thinner. If engineers use software to develop devices and mechanisms, why can’t programmers use software for the same thing? The programming world already has all the necessary tools to work on engineering designs of any complexity. The creation of a fully automated personal fabricator, with a set of automatic manipulators and sensors able to make everything needed for the various parts and the complete product, would erase the line between developing software and developing devices, mechanisms, and other things. The product would be developed in digital form; its parts would then be printed or otherwise made automatically, assembled automatically, and then tested, both automatically and manually. After that, the prototype would be disassembled and recycled back into the raw materials from which new samples would be made, and all necessary corrections would be applied to the drawings, the assembly scripts, and the embedded software after testing each assembled prototype. The result would be a debugged product, ready for printing/fabricating on an automatic fabricator, which anyone could duplicate, refine, and modify, in the same way that open-source software is developed now. It would be open source for things and mechanisms. Certainly, in physical engineering everything is a bit more difficult (in some cases; in other cases, easier), owing to the need to be guided by the laws of physics, chemistry, mechanics, etc. But from a programmer’s point of view, all these laws and patterns are in fact just one more domain-specific logic.

    1. And yes – most of this process can even now be performed completely in virtual form, with physics simulators, models, and so on. But automated prototyping and production of physical objects will be a necessary part of this technology, to make it truly open source.

  15. I’m looking at the intro picture for this article, and that is a prime example of how not to mount resistors vertically. The tolerance band should be the lowest band, next to the board, so the resistor reads top to bottom. Learn how to form the component properly, too: pinch the body of the resistor between your thumb and forefinger, then flip the protruding lead down over your thumbnail. That starts the radius; then finish it by bending the lead all the way down. Neatness counts. There’s a right way, then all the other ways.

    Now excuse me because I have to go be ill here. Disgusting!

      1. Not really.

        Surface mount parts are great for finished products but absolutely terrible for prototyping. I will never get why some people like to design a multi-layer PCB with silk screen, plated vias and order it from a professional board house for a circuit that they have never yet tried to assemble. You can’t change anything!

        It’s much better to have something where you can change out parts and wiring at will until your design is perfected. Then, if you plan to produce a bunch of them for sale, or if you want to miniaturize your creation, it might make sense to go surface mount. Until then I would stick to breadboards and perfboards.

          1. On my last project I think I went through 15 different iterations where I was not just swapping parts but actually moving wires, in a matter of minutes. On a breadboard it was easy: just pull something out and push something in. I have no idea how I would have done this using surface-mount parts. Solder on leads and make them through-hole?

            RF circuits don’t like breadboards so much, but that’s what Manhattan-style construction is for.

  16. Maybe it is just time for a new language and form of communication.

    Starting in the ’80s with a soldering iron and some chips, I could nearly ‘read’ schematics before I could properly read an actual language. After the usual experiences, developing hardware and software, starting on the Apple II, Sinclair, PC, Acorn, Atari, whatever, I ended up in the IT consulting area.

    In the end it doesn’t matter if it is hardware or software. It is not even about programming languages or system names. The only argument is about ‘documentation’: find a form of talking and writing so the task can be understood by everyone.
    More basic and important is the ‘meta-level’: find a way of talking and writing that is understood.

    The only drawback if The Universal Language can be established: the consultants are not needed any more.

  17. I can relate to what Quinn has written. I’m a software engineer myself; I’ve spent 10+ years on web and ERP systems. Most everyday work in software, unless you get into NASA or Google, is pretty boring. You get to do an occasional architecture that is interesting and elegant, but usually you just manage to stamp out something that barely works before they throw a new thing at you.

    Well, what are the everyday challenges for the everyday programmer who didn’t get into NASA? Some web-based stack and some mobile app stack. Some GUI work, some slow-resource-access optimization, some architecture, and that’s basically it. Maybe some data mining and statistics if you get lucky. All the rest is just stamping out a particular digital model of reality, be it a customer buying a product or social photo sharing.

    On the other hand, the hardware world has piles of interesting challenges to offer as soon as you wander into it just a tiny bit deeper than the first tutorial hit on Google, especially when you are analyzing input data from the real world. Damn, I wrote an FFT implementation when I was a student without ever realising what it was for; only when I got into hardware did I finally understand what it is and what cool stuff you can do with it.

    In a nutshell, I feel that the stuff I encounter when doing electronics, especially digital signal processing, makes me a better engineer. I know very well how to code, but now I’m relearning all the tools mathematics has to offer, and I never encountered such a richness of problems requiring such intricate solutions in everyday programming.
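
    The FFT point above is easy to demonstrate. Here is a minimal sketch (numpy assumed; the signal, noise level, and sample rate are all invented) of the classic use of an FFT: recovering the dominant frequency of a noisy real-world signal from its magnitude spectrum.

```python
import numpy as np

fs = 1000                                       # sample rate in Hz (invented)
t = np.arange(0, 1, 1 / fs)                     # one second of samples
rng = np.random.default_rng(0)                  # fixed seed for repeatability
# A 50 Hz sine buried in Gaussian noise stands in for a "real world" input.
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(len(t))

spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
freqs = np.fft.rfftfreq(len(signal), 1 / fs)    # frequency of each bin
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # strongest bin, skipping DC
print(dominant)                                 # 50.0
```

    Even at this modest signal-to-noise ratio, the 50 Hz peak towers over the noise floor, which is exactly why the FFT shows up everywhere in signal processing.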

  18. Over the weekend I made an ATmega8-based AC mains frequency counter, then used the ATmega8 itself to generate a 50Hz signal and blinked an RGB LED with the two signals.

    It was easy to make, but the interesting part was watching the LED shift in time between the two colors.

    It got even more interesting when I started to reason about why the two signals shift in time relative to each other. Could the mains frequency be fluctuating around and between 49 and 50Hz, or could my quartz crystal produce a big enough error?

    Then I started to optimize my code to make sure there was no “lag” caused by the operations. I ended up writing binary operations by hand, and eventually gave up when I landed on some very shady MIT binary arithmetic page.

    Eventually realizing that I hadn’t accomplished anything.

    1. That shifting against the mains frequency was a mix of two things: your crystal frequency being slightly off (but stable), and the mains frequency being deliberately shifted just a small amount by your generating plants.

      Why? Well… everybody has clocks at home, and a lot of them are made cheaper by not relying on a crystal for a timebase; they use the mains frequency and count the cycles to determine what one second is. Power plants don’t keep this absolutely accurate day to day, but they do week to week, so your clock does not drift on average and stays correct.

      Why do they slightly change the frequency? Well… if plant #1 is at 20% load and plant #2 is at 80% load, they shift the phase relationship of the two so plant #1 is slightly leading in phase, which makes more power be drawn from it than from #2. Now, with a few hundred power plants, all this is a bit difficult to manage, and they don’t really hold exactly 50 or 60 Hz while doing it (because of how many plants they are balancing). But they do count every cycle and come back to speed everyone up or slow everyone down once in a while, to keep your bedside alarm clock running accurately.

      You just measured something rather difficult to notice! Kudos!
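
      The cycle-counting arithmetic above can be sketched in a few lines. This is a hypothetical illustration (a 50 Hz grid and invented deviations): a mains-synchronized clock gains or loses exactly the grid’s accumulated frequency deviation, and a later deliberate shift in the other direction cancels the error.

```python
# Hypothetical illustration (all numbers invented): a clock that counts
# mains cycles advances one "second" for every 50 cycles it sees, so its
# error is the grid's accumulated frequency deviation, not crystal drift.
NOMINAL_HZ = 50.0

def clock_error_seconds(actual_hz_per_hour):
    """Seconds gained (+) or lost (-) by a cycle-counting clock."""
    error = 0.0
    for hz in actual_hz_per_hour:
        cycles = hz * 3600                    # cycles the grid delivered this hour
        error += cycles / NOMINAL_HZ - 3600   # clock seconds minus real seconds
    return error

# Half a day running 0.01 Hz slow puts the clock about 8.6 s behind...
print(round(clock_error_seconds([NOMINAL_HZ - 0.01] * 12), 2))   # -8.64
# ...and an equal stretch run 0.01 Hz fast cancels the error out.
day = [NOMINAL_HZ - 0.01] * 12 + [NOMINAL_HZ + 0.01] * 12
print(abs(clock_error_seconds(day)) < 1e-6)                      # True
```

      The week-to-week correction the comment describes is the same idea on a longer timescale: the utility tracks the total cycle deficit and runs slightly fast later until it reaches zero.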

  19. The Arduino community has certainly helped blend the two, making it easier and more fun. I got into software and hardware at the same time, and I was surprised at how similar hardware and software systems actually are. I was able to relate what I learned on one side to the other, starting with logic gates.
