Growing Up With Computers

My son is growing up with computers. He’s in first grade, and had to list all of the things that he knows how to do with them. The list included things like mousing around, drawing ghosts with the paint program, and — sign of the times — muting and unmuting the microphone when he’s in teleconferences. Oh yeah, and typing emojis. He loves emojis.

When I was just about his age, I was also getting into computers. But home computers back then were in their early years as well. And if I look back, I’ve been getting more sophisticated about computers at just about the same pace that they’ve been getting more sophisticated themselves. I was in grade school during the prime of the BASIC computers — the age of the Apple II and the C64. I was in high school for the dawn of the first Macs and the Amiga. By college, the Pentiums’ insane computational abilities had just started to match my need for them to solve numerical differential equations. And in grad school, the rise of the overclockable multi-cores and GPUs powered me right on through a simulation-heavy dissertation.

We were both so much younger then.

When I was a kid, computers were playthings, and now that I’m a grownup, they’re powerful tools. Because of this, computers have never been intimidating. I grew up with computers.

But back to my son. I don’t know if it’s desirable, or even possible, to pretend that computers aren’t immensely complex for the sake of a first grader — he’d see right through the lie anyway. But when is the right age to teach kids about voice recognition and artificial neural networks? It’s a given that we’ll have to teach him some kind of “social media competence” but that’s not really about computers any more than learning how to use Word was about computers back in my day. Consuming versus creating, tweeting versus hacking. Y’know?

Of course every generation has its own path. Hackers older than me were already in high-school or college when it became possible to build your own computer, and they did. Younger hackers grew up with the Internet, which obviously has its advantages. Those older than me made the computers, and those younger have always lived in a world where the computer is mature and taken for granted. But folks about my age, we grew up with computers.

78 thoughts on “Growing Up With Computers”

      1. I learned programming by starting with Basic on ZX Spectrum, then Turbo Pascal on PC, then mixed the Pascal with x86 assembly (I learned instruction sets and differences between various processors in the 8086-P6 line), then 8051 assembly, then C on 8051, C on PC, C on AVR-8, C on ARM Cortex M, assembly and C on PIC, then assembly on ARM Cortex M.
        It depends on the specific person; for me, assembly and basic C are many times simpler than using high-level languages.

    1. Why would you teach your kid a dead programming language? There are things I hate about Python, but if my kid wants to learn about programming, the first language she’ll learn is Python. It runs on everything, it’s easy to learn, and there are a ton of libraries and example code.

          1. Lua is used as a scripting engine in many games. If you want to create advanced content in Roblox, you need to know Lua.

            Going back to the ’80s home computer era, kids learned BASIC to make their own games or modify existing ones, not for intrinsic reasons. I wonder how many software engineering careers were started by putting lewd messages into NIBBLES.BAS?

          2. As chango said, Lua is used in some games. I had the same experience as John, doing my first “real” “programming” in Lua (specifically in Roblox). It’s a great tool for learning to code because unlike the many corny attempts to “make coding into a game”, this actually *is* a game that allows you to *actually* code.

          3. Back in the nascent PSP Homebrew days, a Lua interpreter showed up and I started playing around with it. I made several decent programs out of it. It was quite powerful for a scripting language (compared to HTML or DOS batch files which were my level of programming before Lua), and it introduced me to a ton of programming basics that I never could wrap my head around in prior attempts. So, yeah. Lua is alright. It’s still floating around as others have pointed out. It’s very logical and seems newbie tolerant.

          1. I used Lua in u-boot to do factory diagnostics. Instead of releasing new binaries when something was added or changed on the board, I could update what I called a “config file”, which was a pretty lengthy Lua script.

          1. That’s a good point. They can learn the concepts of procedural programming before being able to read and write their native tongue. They also don’t require the discipline needed to ensure their code is perfect to compile. Missing a character in a sea of characters is not something little kids are good at noticing.

      1. The older, simpler computers like the Z80 machines and their most bare-metal languages are well worth learning – they are simple enough to comprehend how the CPU processes it all – the nuts and bolts of how a computer works, which is very important knowledge.

        Rather than the pile of abstraction upon abstraction of a ‘modern’ language. Though learning one of those should then be much easier.

        1. I’m not so inclined to believe the CPU matters.

          But nobody was doing “bare metal”. The Altair had the front panel, and I recall it could be single-stepped. The KIM-1 had a great monitor in ROM. Not only could you put code in memory, but you could set breakpoints and single-step, and there were routines to get bytes from the keypad and display results. It’s been 40 years, but I think you could check registers and status codes.

          It’s that environment that made it friendly. You could put in one opcode, and any operand, and see the results, and then string it together with other instructions to write code, still not needing to go “bare metal”.

          That same sort of thing could happen today if a similar monitor ran under Linux.

          The real difference is you can’t hand-assemble with today’s CPUs.

          I’ve never used C much, so when I needed a simple program a few years ago, I relearned the same way. Extra lines to display values, and to print “here we are” so I could follow my errors.

          1. I would say it matters enough to be worth learning the more bare-metal language for something simple enough to actually understand – that way you have a far better grasp of what your program actually needs to do and how it does it, which should mean more efficient coding choices and a better intuitive grasp of the potential security issues and bugs that all the abstraction layers hide from you while not actually always preventing them.

            I’ll admit I haven’t done so myself – my understanding of this, which is still more limited than I would like, comes from the other end: learning FPGAs and soft-core CPU development. I’m still far from mastering that, and HDLs certainly feel like a lot of work to get your head around.

        2. Anything simple enough to understand every part of is probably obsolete or at least legacy. It’s important that someone understand how CPUs work, but I don’t need to know more than the most basic level of that, because I’m not a CPU designer and that’s an implementation detail I don’t see, because I don’t code assembly.

          I really don’t buy the whole ‘specialization is for insects’ thing. The only way to be able to do a bunch of different things is to practice them, and there are only so many hours in the day. Nothing as advanced as the modern world would happen without some degree of specialization and a whole lot of acceptance of black boxes.

          I’d rather have something that works perfectly that I don’t need to understand, than something that breaks down every few years but that I know every detail of.

      2. Every version of Python eventually dies. But you’re right about lots of online resources, and it’s not just examples but other people to talk to about it.

        I think the draw of some of these 8-bit systems as a starter is that they were smaller and easier to hold within a single brain. They were less of an opaque magic box than a modern language with garbage collection and type inference.

        I’m of the mind that people who are interested in learning stuff should learn all the different systems, even if they just sit through a short video lecture on how some of these old systems worked, rather than diving right into development on a Z80.

        I’d lean more towards an Arduino than a Z80 for kids looking to get some experience with low-level and systems programming. It’s popular enough that resources are available. C is a little hard to learn, but it’s cut-and-paste friendly, and most Arduino programs are relatively simple and avoid complicated data structures and exotic language features. Running without an OS is both liberating and challenging: not having files, networking sockets, graphics, and multitasking easily available is limiting, but it also means you don’t feel pressure to dive into those things right away.
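        For reference, the canonical first Arduino program is about this small. A minimal blink sketch using only standard Arduino core calls (pinMode, digitalWrite, delay, LED_BUILTIN):

        ```c
        // The classic Arduino "blink": about the simplest complete program.
        void setup() {
          pinMode(LED_BUILTIN, OUTPUT);      // configure the LED pin as an output
        }

        void loop() {                        // runs over and over after setup()
          digitalWrite(LED_BUILTIN, HIGH);   // LED on
          delay(500);                        // wait 500 ms
          digitalWrite(LED_BUILTIN, LOW);    // LED off
          delay(500);
        }
        ```

        No OS, no files, no boilerplate main(): the whole mental model fits on one screen, which is exactly the appeal for a beginner.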

  1. 1975 to 1995 is the statutory Golden Age of personal microcomputing – it was personal computing’s equivalent of the Cambrian Explosion. Thereafter computers settled down and, really, became sort of boring (even though vastly more powerful).

    If you lived during the years when BASIC was in the ROM of most personal computers, there at your fingertips as soon as the computer was turned on, you lived during a rather special era – consider yourself lucky to have witnessed it first hand.

    1. And not just BASIC: a few had Forth, at least one had a version of APL, and then there was the SuperPET with BASIC, APL, FORTRAN, COBOL, and Assembler at the flick of a switch.

      Always there, always available as soon as you turn on your machine.

      I miss that. The languages were slow compared to what you run today, but you could have ideas and changes running in seconds.

      1. It’s surprising what still uses FORTRAN these days – not just old NASA space probes. A few programs that use a lot of math behind the scenes tend to have portions of their code written in FORTRAN because it’s nice and efficient. An uncle of mine used to write for the PET back in the day. I remember him being particularly proud of a little program he called “speed up pet” [his terrible sense of humor]; it effectively kept the disk spinning so you wouldn’t need to keep waiting for it to spin up and seek. Personally, I miss things like having hardware mapped to memory addresses. Object-oriented programming proved to be a nightmare for me to learn, coming from the realm of asm, BASIC, and batch.

    2. yeah i really wonder how (and what) younger generations learn. the computer was so finite then, a small reference library could really cover everything you needed to know. now, you have to tune out the overwhelming complexity and focus on one element of it. it’s a *lot* easier to get things done, but you get a totally different view of the computer and you get used to the majority of the layers being completely opaque to you.

      but man, i did just do malloc(0x100000000ULL) the other day. the future truly is amazing
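      To see that in context: a minimal sketch, assuming a 64-bit system with enough memory (or an OS that overcommits), where the 4 GiB request simply succeeds:

      ```c
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      int main(void) {
          size_t n = 0x100000000ULL;      /* 4 GiB, more than many '90s hard disks */
          unsigned char *p = malloc(n);
          if (p == NULL) {                /* still worth checking, even today */
              puts("no 4 GiB for you");
              return 1;
          }
          memset(p, 0, n);                /* touch the pages so they really exist */
          printf("got %zu bytes at %p\n", n, (void *)p);
          free(p);
          return 0;
      }
      ```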

      1. This, too! I remember running into Bill Gates’ (apparently apocryphal: https://www.computerworld.com/article/2534312/the–640k–quote-won-t-go-away—-but-did-gates-really-say-it-.html) 640K after I read an article about Markov Chains in Scientific American. Now, with smart programming, one could have worked around the sparse matrices, but I was like 15, damnit.

        Nothing says YOLO like calling out a “ULL” in a malloc call. That’s just badass!

        1. Why would Bill Gates have said that? He didn’t design the IBM PC, he just sold IBM BASIC and the operating system.

          People did make weird pronouncements, so I can’t even remember if I heard Bill’s quote back then. But every time it’s come up in recent years, it just hits me that he wasn’t the designer, so why would he say it?

          It’s on the same level as people thinking Steve Jobs was a technical type. Or that the Apple II was the first “personal computer”.

          1. He’s a businessman, and the success of his business was loosely tied to the success of the PC platform. Of course he’d want to downplay an obvious architectural limitation.

      2. I’m 16. The number of platforms, languages, libraries, and IDEs is overwhelming, but I now realize: if it works, it works.
        The other thing is that with the internet showing every amazing project everyone else has done, it makes it seem like anything less than a groundbreaking project isn’t worth doing.

        1. Ah come on, you’re 16! Consider what level your skills were at 20 years ago and what they will be 20 years from now; keep working and you’ll be making groundbreaking stuff.

        2. Not only groundbreaking, but also with a social presence and video-editing skills that folk love, and so much more. And then, once you’ve dedicated so much into that groundbreaking thing, that took years to achieve, you’ve got to come up with ever more, on ever tighter time constraints, lest your fifteen minutes of fame be lost to bit-rot, and you’ve got to start all over again.
          Yep… you’re getting it!

        3. “The other thing is that with the internet showing every amazing project everyone else has done, it makes it seem like anything less than a groundbreaking project isn’t worth doing.”

          I hear that. I think that’s part of the appeal of retro-computing: that I could do something that was never done back when those machines were new technology.

        4. It usually takes a little reading to get to the part where they’ve spent a lifetime getting the experience and then four years of downtime development.

          Everything looks like magic when all you see is the end result. Take it as inspirational rather than discouraging.

        5. – I hear you on the ‘internet showing every amazing project’ making your own seem not worth the effort, and it does feel like that has become a great downside of all the information at our fingertips.
          – I grew up more in a time (along the lines of this article) when you could get a book from a library to figure out how to do something technical with a console, or later a PC – but not necessarily complete your idea. You felt like your ideas were novel, creative, and worth pursuing; you learned a lot in the process of executing them; and you had the satisfaction of feeling like you created something new and great when you were done – leaving you wanting more.
          – Today you can get an idea that is novel to you, google something, and see 20 people have had the same or similar ideas and built it already – and probably a few of them have built something vastly more refined and complex (quite possibly having had less limited time and resources to pursue it than yourself). It takes much if not all of the fun out of it, at least for me. Your fun new idea for a build is now just a copy of someone else’s documented work, more about following instructions than creating and problem solving, and many times you just end up with something less impressive than what you found along the way (not necessarily due to your skills and abilities, but to realistic resource and time constraints again). Even if you have the motivation to build your idea at this point, it’s hard to look at the outcome as an accomplishment rather than as ‘not as cool’, falling short of the similar projects you came across while researching.
          – I’m not sure how we best deal with this, but it does seem to be a great hindrance to the younger generations ‘catching the bug’ of building, creating, and learning – not to mention an issue for us ‘older’ folk also.
          – I guess the main thing to keep in mind is that every challenge you take on expands your own skills and knowledge base, regardless of whether someone else has done it already, or how well they did it. At least in builds for our own enjoyment, maybe we need to go out of our way a bit sometimes to avoid looking at similar projects – and if you run into a stumbling block, look for answers to that very specific stumbling block, and not at similar projects/solutions in general. Trying to build a rabbit-repelling water turret for your garden? Keep the googling to servo pinouts, image recognition, etc. You, and likely your friends and family, will still be impressed with what you come up with, not having seen similar. Only if you submit it to HaD will someone point out ‘so-and-so made this one 10x better, and with some better design decisions’ and deflate your ego :-).

          1. I’m 30ish so I grew up with the rise of the modern internet rather than computer software/hardware.

            I think the current proliferation of projects and information is great!
            If you want to learn about metal working and casting, fishing, sports, electronics, gardening, home repair, car repair, woodworking, programming, cooking, baking, animal care, or whatever; there are still the print resources but with a quick internet search you can find others out there with the same interests to learn from and grow with.

            So yeah, for me it’s important to understand ‘why’ something works but redoing the research from the ground up (and the associated time commitment) is less important to me.

            I might not be an inventor but I would rather be something of a renaissance man and know/do lots of different things than be an expert in a specific area.

            Oh yeah, home brewing.

            I wouldn’t worry about the younger generation, just keep putting the opportunities out there and they will take advantage of it. There will always be people who want to take advantage of it even as there will always be people that only know how to interact with a phone interface and never know (or need to know) how to use other input devices.

            Just some thoughts.

  2. I have no idea when I first noticed TV; it was there, and not a novelty to me.

    But about 1969, there was a story in the paper about two kids around my age who built their own computer. That’s when I wanted my own. (In retrospect, they must have built one of those demonstrators out of straws or cardboard, but I didn’t know that at the time.)

    I couldn’t afford a computer in 1975, but it was part of my life; I was fifteen. I have always had a computer since May 1979.

    I think there’s a difference between wanting something, seeing it develop, and something that’s everywhere.

    Lots of talk about “digital natives”, but they had nothing to do with it. It’s an appliance, their skill is mostly social. They didn’t create this world, it was created for them, and created as a thing where they could easily participate.

    Much of the population waited until it was a Disneyfied space, easy to use and controlled. Even groups that had webpages in 1996 weren’t really using the medium effectively; they were just markers. They waited until Facebook and Twitter were already being used by the masses to jump in.

  3. I feel immensely fortunate to have grown up with early computers. By necessity we had to learn how the computer works. Of course, the immense power that people have at their fingertips today means that people can get amazing things done on a computer without any real knowledge of how things work inside it. The problem is that without that basic understanding, the ability to troubleshoot is lost when something doesn’t work as expected.

  4. Computers are just mind amplifiers. Work on maximising the developmental potential of the child’s mind; then, when they are ready, they will be in the best position to decide which details about which technologies best serve their needs.

    It also pays to have a lot of interesting stuff accessible, and to have a family culture involving open dialogue about a wide range of topics, so that when a topic comes up and the child shows some interest, you are there for them with some stuff from your hoard to show them how it works and why they may find it useful. They may find something interesting and explore it further, sometimes deeply, or just for a while before moving on; at times they surprise you by not being interested at all.

    You have to let the child guide you when it comes to exploring knowledge because it is their curiosity that motivates them and later in life it contributes to a habit of lifelong learning. The key is to build an environment of opportunity and variety for their minds to expand into.

    1. “You have to let the child guide you when it comes to exploring knowledge because it is their curiosity that motivates them and later in life it contributes to a habit of lifelong learning. The key is to build an environment of opportunity and variety for their minds to expand into.”

      Thank you for this.

  5. I struggle with this question with my own kids as well. I too grew up in parallel with home computers. My first computer was 8-bit, and I learned to program it in BASIC and assembly language in elementary school. In middle school I upgraded to 16-bit, and in high school to 32-bit.
    This made each step accessible, comprehensible, and intuitive. My children, however, do not seem to take to programming or creating much beyond what can be done in video game level editors, and I find myself wondering how much of that is just a difference in personality and temperament vs. how much comes from being overwhelmed by the depth and complexity of the systems involved. The limitations that made home computers so accessible during my childhood provided both a challenge and a goal that felt within reach: the gap between professionally produced games and software vs. hobbyist games and software did not seem nearly as insurmountable as it does today, when commercial offerings are built by enormous teams of developers and specialists rather than by individuals or small teams.
    I worry that today’s children will not have the empowering experience of realizing “hey, I could do that!”.

    1. If they do like editing games then just try to get more technical with it. I learned a lot of my Windows knowledge by installing my own PC games, modifying, cheating, etc. Makes you feel like a 1337 haxor for editing an ini file

  6. I’m so far finding this discussion very disappointing. Y’all seem to be looking at this as though computers are tools, or references, or toys, but no one seems to be considering the power these things have for flat-out mis-informing the user… how are y’all teaching your kids to be skeptical about sites’ legitimacy?
    We learned these things as they progressed, too… bootblock virii are just silly and annoying, then those that infected your programs and manipulated your files… also annoying, but nothing like the societal impacts of email takeovers that send messages under your name… or “teaching” websites that are flat-out /wrong/… or “cancer pills” from “canada” for prices folk can actually afford, that are placebos at best. We grew up alongside those progressions, learning to recognize and weed through those questionable resources, but now we have generations, both old and new, being thrown into that deep end. What’re we doing about this? Not even discussing it, judging by the article and comments so far.

    1. I think the discussion here is more about teaching kids programming (it has kinda deviated from the article’s main point).
      In general, regarding misinformation, people shouldn’t forget that we’re all born cave-men, and that ignorance exists even when there’s a crazy amount of information available right there; learning to sort through it is a subtle skill that requires a lot of education in itself.

  7. In my opinion, in the same way that we don’t stick kids in front of “War and Peace”, expecting them to excel at reading, because it’s 1000 times more complex than “The Gruffalo”; nor do we stick children in front of Mathematica, because it’s so much better than a four-function calculator; it doesn’t make sense to teach programming or understanding computers using modern multi-core, multi-GHz, 64-bit machines.

    You really do need to start at the bottom, on computers that boot in <1s straight into a self-hosted programming language with direct access to some simple graphics and sounds. Everything else is an illusion.

    1. Good comment Julian.

      When I started there were barely any graphics, and virtually no sound. This encouraged you to use your imagination and ingenuity.

      Booting to a self hosted language within a second might seem anachronistic these days, but I have wasted years of my life waiting for a wretched PC to boot.

      Now you can buy an ESP32 based board that has VGA, PS/2 mouse and keyboard, SDcard and headphone jack which will emulate a whole range of retro machines for about $20.

      1. I’m sad we stick kids in front of Arduino in C++, or “scratch-duino”, instead of in front of one of these things when we want to teach them computing…

        Writing digitalWrite(LED_BUILTIN, HIGH) doesn’t teach you anything about the computer…
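        For contrast, here’s a minimal sketch of the same blink with the abstraction peeled away: bare AVR C for an Uno-class board, where the built-in LED on digital pin 13 is bit 5 of PORTB on the ATmega328P. This is roughly what digitalWrite() is hiding:

        ```c
        #define F_CPU 16000000UL   /* Uno clock rate, needed by util/delay.h */
        #include <avr/io.h>        /* memory-mapped register definitions */
        #include <util/delay.h>

        int main(void) {
            DDRB |= (1 << DDB5);              /* data direction register: PB5 is an output */
            for (;;) {
                PORTB |= (1 << PORTB5);       /* set bit 5 of the output register: LED on */
                _delay_ms(500);
                PORTB &= ~(1 << PORTB5);      /* clear bit 5: LED off */
                _delay_ms(500);
            }
        }
        ```

        Same behaviour as the blink sketch, but now the pin is visibly just a bit in a memory-mapped register, which is the lesson being argued for.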

    2. > You really do need to start at the bottom, on computers that boot in <1s straight into a self-hosted programming language with direct access to some simple graphics and sounds.

      I think the root of this is: instant gratification. Kids need that to keep them motivated. Anything that is complex and takes a long time before it ‘works’ will discourage them. But if they start out with simple things that gratify quickly, then they will start to understand that more complex things give more gratification, even if it takes longer before that gratification arrives.

      This is what kept me going. I remember the first time I discovered the chr$() function in BASIC, which helped me understand that all those characters that I was putting on-screen with ‘print’ were basically numbers poked into certain memory locations. And the weeks after that, I spread out to poking characters on screen in nice ways, making lines and ‘windows’. And then realising that I could speed it up a thousand times by using assembly instructions and usr(). And then realising that poking into other parts of memory did other things, and etc. etc. etc. Just because of the initial realisation that characters on-screen were just numbers in memory locations.
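      That realisation carries straight over into C, where a char is just a small integer. A tiny sketch of the same idea, with an ordinary array standing in for screen memory:

      ```c
      #include <stdio.h>

      int main(void) {
          printf("%c\n", 65);          /* the chr$(65) moment: prints A */

          char screen[6] = {0};        /* pretend this is screen RAM */
          screen[0] = 72;              /* "poke" 72 ('H') into location 0 */
          screen[1] = 73;              /* "poke" 73 ('I') into location 1 */
          printf("%s\n", screen);      /* prints HI: the characters were just numbers */
          return 0;
      }
      ```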

  8. As a baby-boomer I grew up on the cusp. I remember my first computer; it had to be the early 60’s. It was a mechanical contraption that was activated by pulling/pushing a mechanical slide. There were mechanical bails (wire rods that slide in and out) which in turn caught projections/pegs on the mechanical slide. So by putting pegs in the slides you could, for example, make the ‘computer’ count.

    I wish I still had that: what a cool toy. Also wish I had saved the “Mr Machine” I got for Christmas one year. Wonder what that would be worth today?
    Mark

    1. You are talking about the DigiComp from the 1960s. You can find them on eBay for anything from $250–500. My older brother had one, and I got to play with it a little when he wasn’t around.

      1. I was one of those kids who took everything that sat around too long apart. My parents really started to pay attention when I could put them back together and they worked. Some of them didn’t work when I took them apart. I got my first job as an electronic technician when I was 15.

  9. “it doesn’t make sense to teach programming or understanding computers using modern multi-core, multi-GHz, 64-bit machines.”

    Wrong. It even makes more sense to do it now than ever before.

    My son is 16, and I’ve taught him programming (languages, data structures, design, debugging) since he was 2. The idea is to get them interested in creating things other people use, not just sitting at a computer using what other people write.

    I.e., creating instead of just consuming.

    Yes, computers are more powerful, yes you can google and find a program that does 80% of what you want. But there is even more need for people who understand what’s going on and can write good software than there ever has been. And for people who look at https://xkcd.com/2347/ and understand the problem – and can write something that doesn’t depend on 57 libraries…

    So to answer the question in the article – start them at a young age with something visual like Scratch, then move them into programming in a text-based language to do things they want to be able to do… And one day, if they take an interest, they will be able to program just about anything to do just about anything…

    1. I really like your take! I like the idea of teaching kids using bare-metal simple computers, but the truth is that it won’t be useful to most of them. In the end the goal is just to get kids into doing things, creating things. And even more than that, the truth is that 99% of people will never be interested in computing anyway; we’re all so different, and we tend to forget that the average Hackaday reader is anything but average…

      1. Val, IMO Ian’s implementation is correct. Start teaching kids at the tippity-top of the tech stack, then go backwards. Ever seen how inspired kids get when they realize what they can do with Scratch programming? They don’t need to know what the silicon does; they need to internalize the feeling of why they should care ASAP.

  10. I think some of us were lucky to grow up pre-home-computer and grow into the technology. I had an AWS protoboard: it had a 4-bit processor, a hex keypad, an LED display, and a breadboard to link to circuits you design. Later came a TRS-80, and training at NCR tech school, where the cash registers were programmed in binary, and BCD was converted to binary.

  11. As a teen in the mid-70s, I liked building electrical devices; there were plenty of parts to scavenge from TVs and other gear. From there I turned to early 8080/Z80 machines, adding to them with a soldering iron. I am still building and using computers, my latest an AMD 3800 with an NVidia GPU running Kubuntu 20.04 – the GPU used from Python. I learned to program on those early machines and still love it.

    Have 2 teenage daughters who are “digital natives”, and yes, touch screens are very intuitive to younger children. Computers come naturally to them, too. However, even having started in my teens, after all these decades I too have gone native.

  12. Kids today aren’t really growing up with computers. They’re growing up with media devices. The nuts and bolts that we had to learn in the 1970’s and 1980’s are abstracted out. I was shocked to learn that a coworker who had been doing many personal projects with Arduinos for years did not know how 2’s complement binary math worked. This is a thing you learned within a few weeks back in the day if you wanted to make the computer do anything useful at all, but when I explained to him that binary integer math is circular and the difference between a signed and unsigned integer is whether you put the zero at the top or bottom of a clock face, his mind was blown. And don’t even get me started on all the bugs I’ve found in industrial crap because they don’t teach people the difference between floats and reals any more.
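    The clock-face picture is easy to demonstrate in C. A short sketch (narrow signed conversions are technically implementation-defined, but every mainstream platform uses two’s complement): the bit pattern 0xFF reads as 255 or -1 depending only on where you put the zero, and arithmetic wraps around the circle either way:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint8_t u = 0xFF;                  /* zero at the "bottom": range 0..255   */
        int8_t  s = (int8_t)0xFF;          /* zero at the "top":   range -128..127 */
        printf("0xFF unsigned: %u\n", u);  /* 255 */
        printf("0xFF signed:   %d\n", s);  /* -1: same bits, different label */

        uint8_t w = 255;
        w = (uint8_t)(w + 1);              /* one step past 255 wraps to 0 */
        printf("255 + 1 -> %u\n", w);

        int8_t m = 127;
        m = (int8_t)(m + 1);               /* one step past 127 wraps to -128 */
        printf("127 + 1 -> %d\n", m);
        return 0;
    }
    ```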

  13. How many people actually sit down and write a program anymore? Now it’s all apps and internet-connected media. Every day on every news channel, on both TV and radio, you hear “get our app” or “download our app”… How about just watching the TV or listening to the radio? Now you have to have an app?

    Computers aren’t computers anymore, like when the C64 and Apple //e were around. How many remember sitting for many hours on a weekend typing in a machine language program from “Compute!” to play a new game? Nowadays, there’s no effort. When Doom first came out, it was one of the most downloaded pieces of software anyone wanted. Some downloaded it via modem, which took hours. Nowadays, with our gigabit networks and lightning-fast internet connections, you can download a game in a few minutes. No effort needed.

    Unless you’re going into writing software for a living, most don’t need to learn to program computers. With AI, they’ll probably be programming themselves in the future.

    1. I do. I’m an English teacher in Japan and I taught myself to program because I wanted to do things that I can’t buy or download. Around here, we tend to be creators. Effort IS the game.

  14. Fortunately, assuming you have a decent school district, many of your concerns should be figured into your child’s curriculum.

    Libraries, and librarians, have changed significantly from what you may remember from elementary school. As a public school librarian, my wife teaches media literacy (including discerning truth from disinformation), internet safety (including topics such as data privacy and malware), and basic coding with tools such as code-a-pillar, ozobot, and hour of code, beginning in kindergarten.

    1. This sounds promising… except, I heard on NPR just two years ago that some entire /states/ don’t even have computer curriculums in their schools, “yet.” 20friggin19. Here’s hoping your librarians’ new roles and expertise are more commonplace!

  15. One thing that I think is pretty important to remember, and took me a while to get, is that your kids are not you and will find their own things that interest them – the only thing you can really do is expose them to as much as possible and see what sticks in their brains.

    To me, messing around with the ZX81, VIC-20, and Commodore 64 in high school opened an environment where I could be a world builder, and now, 40+ years later, that’s what I’ve been using for most of my adult life to pay my bills and feed my hobbies.

    My kids had their domain names before they were born and an old laptop to play around with before they could walk, and even though I taught them binary and basic programming, and built microcontroller-based projects with them when they were young, up until now they haven’t really been all that interested in doing a deep dive into the nuts and bolts of programming. But, you know, that’s o.k. They’re using Blender, doing video editing, and using the computer as the tool it has become, in ways that didn’t exist when I was a kid, to build the things and the worlds they are interested in. And it’s all good.

  16. I cut my teeth on school-district Apple IIs and IBM PS/2s (the 4th-6th grade campus got a lab of Model 50s, the ones all in one box like the early Macs), my grade-school best friend had an Atari 800, and my mom’s parents got Macs by 1986 or so (an aunt was in graphic design and early desktop publishing) — but my first home computer was a TRS-80 Model I, which we got in 1988; it was nine years old at the time. We got an Apple IIgs a year later, but I credit the TRS-80 with giving me the retrocomputing bug very early.

  17. Most of the young kids I’ve run into these days are only comfortable within their social media UI. They panic when confronted with a desktop, thinking they broke something, and panic when, God forbid, a terminal pops up. The future looks bleak for the new generation.

    1. Nah, it all depends on the point of view. Back in the early days, there used to be fewer computer users, but they were more educated. Nowadays, we have many more users *in addition*, but with less knowledge. Someone could argue that the educated users haven’t declined in numbers, thus. ;)

  18. So if beginners need simpler times, write an emulator. Implement Forrest Mims’ PIP-1, or CHIP-8 from the 1802. Or an improved Sweet16 from the Apple II. Or invent a new imaginary CPU. Put in the good things, like the ability to single-step, add breakpoints, and display the registers. Add a simple assembler, like the mini-assembler that came with the Apple II, and a disassembler. Make it an integrated program, for learning.

    And then a single program can run on any existing computer.
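    A sketch of how small such an emulator can start out. The instruction set below is invented for illustration (not PIP-1, CHIP-8, or Sweet16), but it has the single-step-and-show-the-registers behaviour described above:

    ```c
    #include <stdio.h>
    #include <stdint.h>

    /* A made-up 4-opcode accumulator machine, purely for illustration. */
    enum { HALT, LOAD, ADD, PRINT };

    typedef struct {
        uint8_t pc, acc;     /* program counter and accumulator */
        uint8_t mem[16];     /* the whole of "memory" */
    } Cpu;

    static void step(Cpu *c) {  /* fetch, decode, execute one instruction */
        switch (c->mem[c->pc++]) {
        case LOAD:  c->acc  = c->mem[c->pc++]; break;
        case ADD:   c->acc += c->mem[c->pc++]; break;
        case PRINT: printf("  output: %u\n", c->acc); break;
        }
    }

    int main(void) {
        Cpu c = { 0, 0, { LOAD, 2, ADD, 3, PRINT, HALT } };
        while (c.mem[c.pc] != HALT) {
            printf("pc=%02u acc=%03u  (Enter to step)", c.pc, c.acc);
            fflush(stdout);
            getchar();          /* single-step: one instruction per keypress */
            step(&c);
        }
        return 0;
    }
    ```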

  19. Well, my point was, computers aren’t computers in the way we remember from 50 years ago. Back then you had to type on a keyboard to enter a program written by someone else if you wanted the computer to do anything useful; otherwise it would sit there with a blank screen, blinking a cursor. Now, 50 years later, thanks to Windows, the computer has become an information portal where anything can be looked up online 24 hours a day, 7 days a week. You couldn’t do that 50 years ago. If you wanted to know something, you watched the news, went to the library, or read it in a book or a magazine.

    Now you have a computer in your pocket that’s more powerful than the C64 and //e, even more powerful than the computers used in the Apollo space missions and moon landings. 50 years from now, there will probably be implantable computers that tie into the brain to give you a world of knowledge in an instant. (We are the Borg, resistance is futile.)

    For thousands of years, humans have searched for immortality. Well, we have achieved a kind of immortality: 40 years after he passed, I can still find references to my grandfather online. For those tracing their family history, it’s a smorgasbord of knowledge and information. I’m waiting for the 1950 census to come out in 2022.

    Yes, in 50 years the advancements in technology have been incredible. Now if we could only learn to take care of our planet and stop killing each other.

    1. Most of what’s written about my family was written for traditional places; it’s simply online now. But then, I found my father’s thesis paper online just the other day.

      But I can challenge what was written about my family. The internet gives publishing to the masses, but most of the mass has nothing to say. Even 25 years ago, the internet was already seen as a passive place, a source of information (and full of people wanting to “curate” because they felt they were better than others).

      Technical hobbies were fringe in the old days, and people who wanted encyclopedias and other reference books around were a minority.

    2. We haven’t even come close to achieving immortality. Finding information on people loses integrity with each decade you go back, and finding it on people now is incredibly inaccurate, skewed towards a popular meme life that has little to do with people’s whole lives. Unless you think most women under 30 really do have fox ears and a snout.

      At the same time, we still have information from thousands of years before us. The holes we have in the written record are largely due to intentional destruction, like the burning of the Library of Alexandria. We see the same inclination today, with ransomware attacks and deletions by states like China becoming more effective and penetrating.

      We may find very little of ourselves documented a few decades from now as the walled gardens we’ve placed the data in go dark or restrict access. The environment we’re in is neither dystopia nor utopia, but people need to stop pretending that 50+ years ago we were wandering around in some informationless stupor while now we have perfected record keeping and information sharing. It’s just not true.

  20. I think that most comments are a tiny little-bit biased and waaay too negative. ;)

    Sure, knowing the basics of computing is very important in these digital times, but that doesn’t necessarily mean that everyone needs to know how x64 personal computers work. Especially because they are no longer as relevant as they used to be. Just think of UEFI, TPM, and the Minix 3 systems found in Intel chipsets – it’s a high-level mess. Gone are the days of the IBM AT, real-mode assembly, the ISA bus, VGA, and classic ports and peripherals. USB alone is a nightmare, let alone ACPI and PCI Express.

    Today’s single-board computers (SBCs) are quite similar to primitive home computers like the VIC-20, ZX81, etc. Just think of the Arduino UNO or the Raspberry Pi Pico. They can be programmed in a mixture of BASIC and C (Arduino IDE) or in conventional BASIC dialects as found on the PICAXE type of BASIC Stamps. Or look at the Parallax Propeller chip.

    In addition, they support bit-banging the I/O pins, just like in the good old days of the USER port and PEEK and POKE. Personally, I think that young people can learn a lot by programming experimental computer boards; they can do a lot of the things that home computers were used for: measuring stuff, performing actions when needed, etc. Give them a robot arm and they will start coding. ;)
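    On the PEEK-and-POKE point: in C on a microcontroller, the old USER-port trick becomes a volatile pointer at a fixed address. A minimal sketch with an invented register address; the real one comes from the chip’s datasheet or vendor headers:

    ```c
    #include <stdint.h>

    /* Hypothetical GPIO output register. The address is made up for
       illustration only; check the datasheet for the real one. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40010000u)

    void pulse_pin3(void) {
        GPIO_OUT |=  (1u << 3);   /* "poke": set bit 3, the pin goes high */
        GPIO_OUT &= ~(1u << 3);   /* clear bit 3, the pin goes low */
    }
    ```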

  21. ” I don’t know if it’s desirable, or even possible, to pretend that computers aren’t immensely complex for the sake of a first grader — he’d see right through the lie anyway. ”

    It’s not really necessary. You drive in cars, fly in planes, drive on roads, use plumbing, wear clothing, consume electricity, etc. All of these are systems and implementations that are massively complex, and all of them make use of technologies that took millennia to refine.
