Free As In Beer, Or The Story Of Windows Viruses

Whenever there’s a new Windows virus out there wreaking global havoc, the Linux types get smug. “That’ll never happen in our open operating system,” they say. “There are many eyes looking over the source code.” But then there’s a Heartbleed vulnerability that keeps them humble for a little while. Anyway, at least patches are propagated faster in the Linux world, right?

While the Linuxers are holier-than-thou, the Windows folks get defensive. They say that the problem isn’t with Windows, it’s just that it’s the number one target because it’s the most popular OS. Wrong, that’d be Android for the last few years, or Linux since forever in the server space. Then they say it’s a failure to apply patches and upgrade their systems, because their users are just less savvy, but that some new update system will solve the problem.

There’s some truth to the viruses and the patching, but when WannaCry is taking over hospitals’ IT systems or the radiation monitoring network at Chernobyl, it’s not likely to be the fault of the stereotypical naive users, and any automatic patch system is only likely to help around the margins.

So why is WannaCry, and variants, hitting unpatched XP machines, managed by professionals, all over the world? Why are there still XP machines in professional environments anyway? And what does any of this have to do with free software? The answer to all of these questions can be found in the ancient root of all evil, the want of money. Linux is more secure, ironically, at least partly because it’s free as in beer, and upgrading to a newer version is simply cheaper.

Story time. I used to work for the US gov’t. In our Bureau, we had a few thousand Windows XP installs. When Vista came out, they looked into upgrading, but for monetary reasons, had to put the project on hold. (They dodged that bullet.) But then along came Windows 7 and the end-of-life plans for XP. Even so, it took a number of years to get through all of the security and compatibility testing required to make the switch. And that’s just the cost of labor. On top of this, they had to pay for all new software licenses. I’m sure they’re working through the same thing with Windows 10 right now.

The US Bureau of Labor Statistics isn’t badly funded by government standards, and certainly better supplied with talented technical people than many bureaucracies. And yet, the act of upgrading the system caused some real institutional pain and required real effort. We can only guess at what the budget of a rural hospital’s IT department looks like, but I’d guess they’ve got a lot fewer resources to work with. Why are a bunch of nuclear physicists at Chernobyl still running XP? Because it’s what they can afford.

The Moral

The point of this story is a simple one. The cost of upgrading Windows is non-trivial, and Microsoft is always going to insist on receiving payment for newer versions of their OSes — fair enough, that’s how they make money after all, and they need to pay their coders and shareholders. But this will push some institutions, not to mention individual users, to forgo upgrades and keep on limping with out-of-date or otherwise unpatchable systems, which will be ripe for mayhem. There will always be insecure Windows systems out there because you have to pay to upgrade. It’s all about the money.

And although Microsoft eventually offered free patches for XP against WannaCry, they allegedly held back the release of the patch for a few days, in an attempt to shake down some of their former customers who had not yet upgraded. On one hand, you can hardly blame them: they’re stuck supporting 15-year-old software at this point. But they also need to make users pay for XP support so that they’ll have an incentive to buy the next thing. And there’s that wedge preventing security-relevant upgrades again.

Microsoft isn’t the only company out there making money on OSes. Android may be free, but since new versions of Android are often bundled with new phones, phone companies are reluctant to release the new hotness for their old devices. But when the hotness also comes bundled with improved security only available to those with new phones, it puts the same sort of sand in the OS-upgrading gears, even though the OS is notionally free.

Free as in Beer

I’ve been using Linux since it was installable on 3.5″ floppies over dialup, so I’m probably one of those “Linux zealots”. And I definitely value the ability to read through kernel code and add new drivers if I feel like it, although I’ve only done it once in twenty years. Still, one of the most attractive features of Linux to me is the “free as in speech” aspect.

But I’m also an economist by training, so I see the invisible hand working nearly everywhere. And watching wave after wave of Windows viruses attacking outdated systems that should have been upgraded made me wonder why, and I think it’s all about the Benjamins. So if you’re a fellow Linux zealot all caught up in “free as in speech”, spill a little for the power of “free as in beer”. After all, it might just be why you’re not running an unpatched Mandrake system on that old Pentium in the basement.

211 thoughts on “Free As In Beer, Or The Story Of Windows Viruses”

    1. I had to google that as well; it makes sense once you get the definitions: free as in beer = costs no money, free as in speech = no restrictions, open. He’s saying that the fact Linux updates cost nothing is a non-trivial contributor to why it is less virus-prone.

      1. hmmm… Windows updates: free.
        Red Hat support subscription: not free.
        Windows update procedure: automatic (unless it is turned off, possibly by policy).
        Linux update procedures: require user intervention.

        Anyway, he said upgrades, not updates…
        The problems are exactly the same (regardless of whether you are using Windows, Linux, UNIX, OS X, Android, etc.): the problem isn’t the cost of the license, it’s the time (and money) to test, certify, and approve, combined with a real possibility of having to re-develop or re-buy.

        There are situations (especially in hospitals) where there is going to be some software written a decade or more ago that is used for some critical function (like storing records or keeping appointments); you can’t simply upgrade the OS without extensive test plans, testing, and generated test data (you can’t use real patient data to test!).

        Then if the software fails…? Well, what then? Hundreds of thousands of dollars for a software re-write.
        By the time you count up the time for staff to generate a feature list, managers to collate that, a project manager to form a plan, create a tender for people to bid on, and analyse those bids to choose a contract winner… well, that’s a load of money right there. Then you need to pay developers, project managers/scrum masters, CEOs; all projects have to cover business expenses (heat and lights in the office); and there are taxes…

        The software costs a LOT.

        If you could get something simple for 100 grand you’ve done well, and that’s still cost you 10k per year over the ten years since it went in! Remember all that testing before you realized you needed new software? (At this point, just lost time.) If you’re lucky you can re-use some of the test plans with the new software, but chances are the new software has a new workflow, so you just throw it all away and spend all that labor cost again.

        Then you get to the really big money problems…
        Diagnostic machines.

        The cost of re-writing appointment booking software is going to be trivial compared to the software running on a diagnostic machine, probably using a specialist IO card (possibly requiring something like ISA slots, which can’t host a modern OS!), with a million-dollar (probably more) MRI scanner on the end of it…

        Vista came out ten years ago. The lifespan of these ultra-expensive diagnostic machines is greater than that, so it is unsurprising that they may be running software for a now-arcane OS.

          1. You care enough to comment :)

            Specifically, no I am not a “Windows fan”
            I’m a best tool for the job fan, in some cases the only tool for a job fan…

            That is why I keep an old XP laptop for running car diagnostic tools; there is no choice about that.

            Other than that I *mostly* run Linux, either Debian or CentOS by choice.
            Windows on a work machine where that is enforced.

          2. “You care enough to comment :) ”

            No I didn’t.

            Like the alleged superiority of Windows over Linux or Linux over Windows, it’s all just a figment of the imagination.

        1. Nice straw man! Didn’t I see you set fire to Nicolas Cage once?

          Linux != Redhat

          There are many many free Linux distributions. If you like Redhat so much try Fedora. If you just want a great, lean base for a production server made from tested, secure software versions and a painless update procedure… go for Debian.

          Also, many of them, including Red Hat, can be set up to automatically download and apply updates. Most people don’t, but if that’s really what you want then you certainly can! I’ve seen MySQL/Debian servers run for years with automatic updates turned on. Nothing broke and nobody ever noticed update-caused downtime. It happens so fast that connections still haven’t timed out when the server comes back up! Compare that to Windows updates… I think Microsoft would take 10 minutes just to update Notepad.exe!
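
          For what it’s worth, the Debian way is a couple of commands; a minimal sketch, assuming the stock unattended-upgrades package:

            sudo apt-get install unattended-upgrades
            sudo dpkg-reconfigure -plow unattended-upgrades
            # the second command writes /etc/apt/apt.conf.d/20auto-upgrades, roughly:
            #   APT::Periodic::Update-Package-Lists "1";
            #   APT::Periodic::Unattended-Upgrade "1";

          From then on, security updates install themselves on a daily schedule with no user intervention.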

          Support subscription? Yeah, sure. Nobody is going to give one of those for free. So what? I’ve dealt with paid support, Microsoft and otherwise plenty of times. I’ve also used the internet to find answers myself both for Windows and for Linux.

          Once you develop decent search skills paid support sucks in comparison. Somebody on the internet has already documented your issue. You are not a special and unique snowflake. Yes, I know the business world insists on having a paid account with someone in order to sleep at night. That’s not the OS’s fault. That’s just the result of a field being dominated by old, technically incompetent people that don’t get the power of the internet.

          1. No, Linux does not equal Red Hat, but I never claimed that.
            However, of all the operating systems that *may* be used, select Linux distributions are the only ones that charge for product (and security) updates… This isn’t talking new versions; this is the stuff that was basically broken but not known about at release… Windows has not charged for OS updates, ever. (Upgrades are a different story, where, as I’ve pointed out, the biggest cost is process, not licensing.)

            Liking Red Hat EL doesn’t mean using Fedora; the closest comparison is probably CentOS.

            Whilst you are completely right to say that if you want rock-solid performance and in-place upgrades Debian is hard to beat, please also try to understand that “rock solid” comes at the expense of timely updates. Debian IS great, but imagine it had been in the same position as Windows with regard to the viruses listed in the article… If there had been (if there could have been) an equivalent exploit for Debian, it is almost certain that the Debian project would not have had a patch put out, tested, and pushed to stable, including for versions so old that they are no longer updated, within a week.
            It may have reached testing or unstable… but most systems would have been vulnerable.

            I know you *can* schedule updates on Linux; the point is that it is neither enabled out of the box nor recommended. I take the idea that you can restart your servers and have all services back up before a connection times out with an incredibly large pinch of salt…

            The author is making the argument that the cost of upgrading Windows is too high; my point is, compare like with like…
            If you want corporate “backup” or someone to blame, if you want paid-for support rather than a forum, then you buy Windows, or Red Hat, or Oracle Linux, etc.
            All have capital costs…

            The further point of what I wrote is actually, the largest cost of a “proper” upgrade is going to be testing.
            And the largest cost of an upgrade done without that testing is going to be the cleanup…

            The business world doesn’t want paid same-day support because old people don’t get the Internet, can’t work a search engine, or don’t have the patience to wait for an answer on a forum. My experience is that the kind of businesses that insist on a Red Hat subscription are generally the kind with huge turnover (stuff like hedge funds managing millions or billions, where a few hours more downtime waiting for answers from Experts Exchange or trying to read mailing list archives could mean substantial losses), or the kind of industries that are bound by regulation (FSA, SOX, etc.) to use software with a vendor support offering.

            That you don’t understand this suggests more ignorance on your part than on the part of your crusty senior managers/colleagues.

        2. But if you know the MRI machine will last 50 years, why would you make the stupid architectural choice of using an OS with a 10-year lifespan? It’s not a 50-year machine any more!

          1. Then one should look at one’s design choices, specifically one’s coding practices. Besides, if your machine has a 50-year service life, shouldn’t the onus be on you as the designer to make sure that it is practically usable 50 years later?

            I know people hate standards and protocols, but they are there for a reason. You could internalize all the needed processing power, even if you decide to do it by just putting a PC in the case with a data diode on the output; it could serve everything up over an FTP server or similar to allow for OS-independent use.

        3. Agreed.

          “But then along came Windows 7 and the end-of-life plans for XP. Even so, it took a number of years to get through all of the security and compatibility testing required to make the switch. And that’s just the cost of labor. On top of this, they had to pay for all new software licenses.”

          Apart from the licences, Linux has to go through the same process. How does the license cost compare to the labour cost?

          I see that as the problem. Most managers only look at the capital expenditure. They can’t see the labour cost and assume Windows is bad because of the license cost.

          I’m having a similar problem trying to get my work PC upgraded. The SSD isn’t big enough to hold our source tree. It builds in 30 min on the SSD, or 1 hr 30 min on a traditional hard disk. I can’t get approval for a $200 SSD that would pay for itself in my time within 3 builds (an hour saved per build, so three builds save three hours; at any realistic loaded hourly rate that’s more than $200).

          1. “Apart from the licences, Linux has to go through the same process. How does the license cost compare to the labour cost?”

            Cost of Linux admins vs. cost of Windows admins may also be a factor.

            “I’m having a similar problem trying to get my work PC upgraded. The SSD isn’t big enough to hold our source tree. It builds in 30 min on the SSD, or 1 hr 30 min on a traditional hard disk. I can’t get approval for a $200 SSD that would pay for itself in my time within 3 builds.”

            What? Do you have to sit and actively watch your builds, making it impossible to do anything else?

            If running a build takes over your only desktop, the answer is to get a second desktop IMHO.

          2. Even if he doesn’t have to sit and watch it, it does add time to a testing cycle; if one were fixing bugs and had to verify, change, and verify, that time quickly adds up.

            The principle behind it is something I have run into many times for many different reasons, from buying better radios to server purchases and software.

          3. “Even if he doesn’t have to sit and watch it, it does add time to a testing cycle,”

            As I said, this justifies a second machine, not a faster single machine, IMHO.

            “if one were fixing bugs and had to verify, change, and verify, that time quickly adds up.”

            Programming wasn’t always an iterative, whack-a-mole exercise to see if you figured out the next bug… Kids!

          4. @Ken

            I don’t speak in acronyms. I prefer full, clear language; it tends to prevent misunderstandings, especially when the nerd world is reusing acronyms for other things.

            I refer to your statement:-

            “If running a build takes over your only desktop, the answer is to get a second desktop IMHO. ”

            I assume IMHO means ‘in my honest opinion’. Are your opinions usually false? There is a world of difference between being frank/candid and being honest.

        4. While working on a General Motors account, I heard about a PC that controlled a large, expensive, old machine in a plant. The PC was running DOS 6.0. The vendor had long since gone out of business, so they were stuck with it. The hardest part for the support staff was finding old PC power supplies and other parts that eventually needed replacement.

    2. “Free as in beer” and “Free as in speech” are not mutually exclusive, GNU.

      “Free as in speech” means source available, maybe at cost.

      “Free as in beer” means free to use, but you may not have access to source code.

    3. Very often in the Linux world Linux is both.

      However, they are neither mutually inclusive nor exclusive.

      Free beer: the concept of free beer is that it is free. It doesn’t cost anything; however, you don’t necessarily know how it was made or what is in it.

      Free speech: this is a bit of a colloquialism used to represent “liberty”; you have the agency to change what is there. However, you don’t necessarily get it for free; once you have it, you are free to make changes.

      The Linux kernel itself, at the time of writing this, is both. You are allowed to download it for free, but also allowed to change it freely. Notice the dual meaning of free in this case.

    4. The definition of Linux is itself a 3-day argument. Linux as used in this article is free as in speech, but the vast majority of things in Linux are also free as in beer. In my experience, another cost savings is that there are generally fewer things that don’t work when upgrading Linux than Windows, and fewer things that work differently.

    5. It is both, but the focus of the GPL is that it is free as in free speech. French makes the distinction between the two frees: libre is free as in freedom or free speech, and gratis is free as in free of cost. It’s not to say someone can’t charge money for GPL code, but that you must have the freedoms provided by the license.

  1. it seems to me the largest factor is that windows is so uniform. there are numerous linux devices out there but they are all different, you can’t target “linux”, you have to target certain versions of certain packages on certain distributions in order to make a working exploit. meanwhile XP devices are all so similar that work on a single exploit could yield vast returns, as we are seeing recently.

    but that’s just a marginal effect. which is really why i’m writing: a marginal effect is all we ever get. can’t dismiss something that “only helps around the margins”. free updates, incremental updates, diverse distributions, many eyes, a non-infantile approach to development, these are all things that help linux be marginally more secure than windows.

    1. ^That.
      And maybe that the OS with the most talented haters is Windows.

      Pretty much all the Windows users I know are not Linux-haters. Actually, they don’t care about Linux at all, if they even know that it exists.
      Some Windows users don’t like OS X; they don’t want to touch it, and definitely not write software for it, like a virus.

      Most, if not all, of the Linux users I know are the biggest M$ haters, like most users on HaD, but they have no problem writing software on Windows, plus they have the needed knowledge.

      I simply don’t understand why anybody would waste time on hating an OS.

      1. Only a very small, but very vocal, minority of Linux users ‘actively hate’ Microsoft (and/or Windows). As you mentioned above, most people simply don’t care.

        The problem is the ‘obsessives’, be they Windows users, Linux users or OS X users. They are completely irrational about the matter and this is reflected in their behaviour. Particularly in their irrational demand that everyone use their preferred version of whatever…

        Obviously, there’s no reasoning with them. If you tell them you don’t care it simply drives them to greater efforts because “You SHOULD care!” :)

        Ignoring them doesn’t work either, it just infuriates them and things can get really nasty when that happens.

        1. That so much.
          As with most things on the internet, from politics to religion to editor or OS battles;
          Only the loud voices seem to have opinions. Only the loudest voice gets heard.
          The vast majority of people don’t care, or use both interchangeably, or whichever is convenient for the task at the time.
          Nobody shouts at the top of their lungs with vitriol that they ‘like both Linux and Windows equally for different tasks and anyone who chooses a side will go to hell!!’

      2. heh, i remember looking around a couple years ago and realizing i don’t hate windows anymore. i used to *hate* it because i had no choice, it was a constant thorn in my side any time i tried to do anything. but it’s basically been over a decade since i’ve had any involvement at all in windows. all my application development is cross-platform (roughly, posix) or web (which now means chrome, not IE), or android. and the decision, 15 years ago, to deny knowledge of computers when making casual acquaintances has paid off. it’s literally been a decade since anyone asked me to fix their PC. pfew!

        1. Why I stopped pontificating:

          When some totally illiterate computer users ask me to look after their computer, I know it will be a Windows box full of trojans. I just tell them I use Linux and they normally just leave me alone, and go bug someone else :) ROFL

          Having said that, you can use even DOS 5 or Windows 98 to control processes, and do other stuff, quite well. The trick is: UNPLUG the darned thing from the internet!

          1. When my brother asked me to set him up with a computer, I built him an XP box. When he inevitably got some nasty malware, I told him: “Look, I can restore this system, and we can go through the same exercise every six months or so; or you can go through a bit of a learning curve and have something that won’t require me to come here that often.”

            He was willing to try Linux, and it’s going on 5 years now. I upgrade him (usually for performance reasons) every couple of years, and he has been fairly happy. I’m more happy. I would have put him on Apple, but he can’t afford it.

            Windows is the OS you get when you just want a computer. Linux is what you get when you’re fed up with Windows, and Apple is what you get when you have plenty of money.

          2. > Having said that, you can use even DOS 5 or Windows 98 to control processes, and do other stuff, quite well. The trick is: UNPLUG the darned thing from the internet!

            It’s impractical to completely isolate most stuff now. That process you’re controlling, someone wants to know its status and someone else wants to tweak its parameters…

          3. Don’t forget FreeDOS!!! I installed it in VirtualBox a few weeks ago and partied like it was 1990! Text based browsing at its finest. And free. BTW the free licensing allows easier testing and development of complicated systems due to not having to keep track of every instance.

      3. Pretty much. Linux users took time out of their busy days to continue to write viruses for Windows because they didn’t want to stink up their own place. Even today, you can find the standard ‘Linux is &^%$ secure’ yet every admin/super user has some sort of backdoor to accomplish what they want (which means others do too smh). I wish we could all get along but it won’t ever happen. Browsers are the closest we will ever come.
        MS is on the right track with win10. Its weakness is that it plays well with other kids (which is also its strength), as any long-suffering net admin from the 90s/early 00s will tell ya. Linux has come a long way in usability by the unwashed masses since its “release”, and only several thousand distros to choose from lol. Meanwhile OSX, oh who am I kidding? No one even uses that OS for anything real on a daily basis due to Apple’s policies and unsubstantiated cost lol. How about developing an OS rather than buying a third-party print-thru of *nix heh heh.
        OS/2 ftw!

    2. Lack of uniformity is the reason people that know XP hate 10. Nothing has the same name or is in the same place. Linux has to be the most uniform OS out there; everything that does anything is exactly the same on all versions. If you choose to wrap that in various coloured interfaces and button clicks with a mouse, then that is your choice; it has nothing at all to do with the fundamental OS.

      1. “Linux has to be the most uniform OS out there, everything that does anything is exactly the same on all versions.”

        At their internals level perhaps, but to an end-user Ubuntu, Fedora, etc are not ‘exactly the same’, they start with different GUI interfaces, have users jump through different apps and screens to set IP addresses, join WiFi networks, add printers, etc.

          1. As I said, you can decide what you want to use on top of the OS; that is your choice completely and there are hundreds of ways to do it, but that does not alter the fact that the OS is identical. Micro$hite struggles to separate the OS from the applications and it will forever be crap for that reason.

          1. My point is, you are talking about a level of similarity few end users are exposed to.

            The average end user lives in the graphic UI, desktop manager, etc., not at the deepest internal levels.

            An end-user accustomed to working on a WinXP-era Linux distribution will be just as lost when their desktop is replaced with a Win10-era Linux distribution as they would going from WinXP to Win10.

            I guess the point you think is meaningful is that the Linux System Calls are similar across distributions – fine, but before you criticize Windows realize that the vast majority of system calls from WinXP still work in Win10. The reason some WinXP software doesn’t run is because coding practices improved over time, and what was tolerated under WinXP is not under Win10, for example, many WinXP applications run in administrator mode, which prevents those applications from running under Win10.

            MS Office 2003 could probably run on Win10, if one wanted to, for example, while many WinXP games probably won’t.

          2. The end-users wouldn’t be the ones responsible for making and distributing hacks/viruses/etc., so I don’t think the end-user perspective on how heterogeneous the Linux environment is really matters to the overall discussion here.

          3. You’re thinking like a power user.

            Think about it from an end user perspective. The typical desktop user, who thinks programs aren’t installed because they don’t have shortcuts on the desktop. They don’t care if FluxBox or XFCE is better than MATE, they don’t care what ALSA is and why their program needs it, they get a blank page syndrome when you open a terminal window.

            If they want to install something, they search Yahoo for Google, then search Google for the name of their program. They download an .EXE or .MSI installer, which doesn’t work. They search “[application] linux” and come across some weird string someone posted on a forum and told someone else to copy/paste into a terminal window. They open a scary “DOS window” and paste what they were told, and encounter an error because they copied an apt-get command but the distribution they run relies on yum.
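
            (The sad part is that both spellings are one-liners; only the packaging family differs. “foo” here is a stand-in package name:

              sudo apt-get install foo   # Debian, Ubuntu, Mint
              sudo yum install foo       # RHEL, CentOS, older Fedora

            Same action, mutually useless instructions.)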

            They don’t care about how the computer works. They don’t want to put in tons of effort to learn something new. All they know is that something about their computer is different now, therefore it is broken and unusable. All they want is their blue W and the Mozzarella Foxfire.

        2. Yep, and those are just frontends to NetworkManager, CUPS… etc in 99% of cases. In any case, the route is often obvious.

          For a printer, a lowest-common denominator way has been to fire up a browser, and go to http://localhost:631/.
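
          (The same goes from the shell; adding a driverless IPP queue is one lpadmin call. The printer name and address here are invented:

            lpadmin -p labprinter -E -v ipp://192.168.1.50/ipp/print -m everywhere

          After that, the queue shows up in every distro’s print dialog, whatever the frontend.)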

          CUPS, incidentally is the same tool used for managing printers as in MacOS X. Apple liked CUPS so much, they bought the company!

        3. What GUI?

          A GUI for Chernobyl?!?!

          If you built a network of radiation monitors would you really build it as a GUI app? GUI programs require a user. The user must log in. The user must click an icon. Ok… Windows does have a registry hack that allows it to automatically log in but this is exactly that… a hack.

          This system should come back up immediately after a power outage.

          I think a much more reasonable scenario would be a daemon (on Linux) or a service (on Windows). This would be running any time the computer is on, caching data in some database (also a daemon or service) and maybe sending emails or some other form of messages if bad things happen.

          If it also needs a UI.. then one node is a web server.
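
          As a rough sketch of that daemon approach on a systemd-based Linux (the rad-monitor name and binary path are made up for illustration), the unit file at /etc/systemd/system/rad-monitor.service is about ten lines:

            [Unit]
            Description=Radiation monitoring daemon

            [Service]
            ExecStart=/usr/local/bin/rad-monitor
            Restart=always

            [Install]
            WantedBy=multi-user.target

          Enable it once with sudo systemctl enable --now rad-monitor, and it comes back up on every boot, power cut included, with nobody logged in.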

          What about that hospital IT system?

          Well.. again.. the important stuff should be happening in some servers somewhere. The server needs no GUI.

          As for tools… It amazes me to see a Windows desktop on hospital X-Ray, EEG, etc. screens. Why a full general-purpose computing environment on a tool?!?! Even if you use a ‘familiar’ OS in the background that CAN be used as a desktop OS, the pieces to actually make it a desktop should NOT be there! Why must it be possible to play solitaire on a piece of lifesaving equipment?!?! In Linux it is easy to automatically start some single-purpose program in a screen buffer or even in a dedicated X session. In Windows… I suppose with unsupported registry hacks…
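
          (On the Linux side that can literally be a one-line session; the scanner-ui path is hypothetical:

            echo 'exec /opt/vendor/scanner-ui' > ~/.xinitrc   # that one program IS the X session
            startx                                            # no desktop, no taskbar, no solitaire

          When the program exits, the session exits with it.)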

          Finally there is whatever the nurses/doctors use for manual data collection? Who cares what the GUI is? You have a centralized server and it is 2017. They should be able to do everything through the web browser. A cheap tablet with a synced bluetooth keyboard and mouse should be all that is needed. The last I checked Firefox and Chrome look the same on Linux, Windows and OSX. They are different but most reasonable people are fine with the browsers available on Android and iOS too.

          1. Wow, impressive – you thought of a use case that doesn’t require a GUI, that proves what, exactly?

            Hospital equipment, ATMs, manufacturing tools, etc. use ‘full-blown’ Windows because it’s a well-known environment, programmers are cheap, tool-chain costs are low, and you don’t have to invest time in spinning up your own OS from GNU/Linux bits.

            Ever heard of Windows Embedded? That’s a pared-down environment for POS tools, ATMs, etc – it also posts a Windows start-up screen, but is a smaller, more stable, environment.

          2. The Chernobyl disaster was in, from memory, the 1980s. SaaS and REST didn’t exist. Of course they have a GUI.

            How many apps does the average smartphone user have? Yeah, most of them could/should just be web pages. But they’re not.

            (BTW in Windows it is also very easy to run an arbitrary program in place of explorer.exe)

            I’ve built many industrial control systems. I’m a huge Linux fan. Guess what, all those industrial systems ran Windows. The software vendors only target Windows.

            I hope one day your utopia of “everything via web” is realised, but it’s simply not reality today.

      2. at the POSIX level, linux is exceedingly uniform, but above that it is a real hodge-podge. just in mailer daemons, i’ve used sendmail, smail, postfix, exim, courier. if you had an exploit against sendmail (as many did, in the day), it would only have worked on about 3% of my total linux history, it’s hardly worth the bother. i’ve got different versions of glibc, my android devices all use bionic, i’m pretty sure i’ve got a computer running uclibc. all of the obvious attack surfaces are extremely varied. even kernel-based exploits will only work on certain versions and configurations, and i don’t think i’ve ever owned two devices using the same kernel version, let alone the same kernel binary. it’s really incidental to why i use linux, but it sure does make it a lot harder to make one exploit to rule them all and in the darkness bind them. even shockingly universal exploits like heartbleed or that bash environment variable expansion thing are only applicable to a tiny minority of linux installations (i.e., they won’t do anything to android).

        1. Every PC box running the same edition and build revision of Windows is running exactly the same kernel and other core files. That makes it far easier to exploit any discovered security hole on a large scale.

          With the smaller gross market share of Linux and with so many more variations in use, any discovered vulnerability has a much smaller attack opportunity. A malware author would have to have exact knowledge of what Linux setup the target was using in order to pull off an attack on a specific system. Other systems that happen to have a compatible vulnerability would just be collateral damage or camouflage.

          Thus virus and malware attacks on Linux fall into two broad categories:
          General annoyance/nuisance across whatever systems happen to have the exploit.
          Specific targeting, plus hitting any other systems that happen to be vulnerable and have the same network access as the target.

          There’s some plot fodder for a TV show. Was the murder victim’s Linux computer directly targeted or were its files trashed as coincidental happenstance because the victim hadn’t yet installed the latest updates to defend against the new “LinWeed” worm?

      3. “Lack of uniformity is the reason people that know XP hate 10”

        Indeed. Most people don’t want ‘new and shiny’ – the obvious exception being the OOH-SHINY brigade. They want familiarity, consistency. They don’t want to have to hunt all over the UI because the last update changed the layout.

        I suspect that in many cases the only reason for the change is the developers have run out of ideas and/or other work to do.

        In a corporate environment it’s a complete pain in the arse. ‘Productivity’ drops, support ‘costs’ go through the roof until most people get used to it. Some never do. And then it starts all over again with the next update cycle.

        1. IMHO, the lack of uniformity was with Windows 8, and 10 was Microsoft’s answer to end-user displeasure. I consider myself to be on the Linux side of this discussion, and I can’t understand why someone can’t get over the “hump” with a desktop like MATE. There is actually more delta between Windows releases. I have been using a particular theme for about 9 years now. I know that I can still get TWM to work if I desired. MATE and my theme look more familiar to me, and more similar to Windows, than Windows 10 or 8 do. Theoretically one should be able to minimize end-user churn from cycle to cycle, especially in our super-expensive hospital example.

          It is the visual change that drives sales with hardware/software. You can’t sell an upgrade unless the PHB can see the difference.

          1. Boot up and log in to a Windows XP and Windows 10 system and you’ll see they are very similar. Open Office 2003, 2007, 2010, or 2013 on the XP box and on the Windows 10 box, and they will function identically as far as a typical user is concerned. Every other difference a typical end user will stumble into running Windows 10 in place of WinXP can be handled with one simple instruction – Type a description of what you want to do/see/open in the Cortana dialog box on the taskbar. (To see the device manager, Type “device manager”, to add a printer type “add a printer”, etc)

            People unfamiliar with Windows like to imagine that the default desktop in Windows 10 is the ‘color blob’ touch-optimized default they saw in Windows 8; a Windows 10 default desktop looks an awful lot like a WinXP default desktop.

  2. nope. win10 systems are vuln to the last not-wannacry outbreak. then there’s java exploits or java based malware hosted on compromised sites. then there’s the medical device/workstation with a vendor who sees no need to ever upgrade. throw in scada & it’s all kinds of fun…or just a CF

  3. You bring up a very valid point and a fact many people seem to overlook when dealing with industries that rely solely on Microsoft products for their infrastructure.

    Call me a zealot all you like, but it’s why I think it’s worth looking at Linux still if you’re not doing any heavy gaming and just need it for basic computer uses. The nice thing I find about using Linux is that security is implicitly built into the system itself.

    Unlike Windows, in Linux trying to blow away your stack causes a fault and is met with serious system-integrity errors. The other main catch-all (Windows 7 tried to do this) was having provisioned resources for user accounts (superuser and loser accounts) already built into the workflow. You can’t randomly install things if you’re not sudo.
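
    A toy demonstration of that stack protection, assuming GCC (the file name and the deliberate overflow are contrived; the one-liner C program is just illustration):

      printf '#include <string.h>\nint main(int c, char **v){ char b[8]; if (c > 1) strcpy(b, v[1]); return 0; }\n' > smash.c
      gcc -fstack-protector-strong -o smash smash.c    # compiler places a canary before the return address
      ./smash AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA         # overflow trips it: "*** stack smashing detected ***"

    The overwritten canary is caught on function return and the process is killed instead of executing hijacked code.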

    If you look at some of the newest distros, things like Xubuntu, Kubuntu and Mint have been around for a while and are much closer to the nice desktop experience that Windows offers. It might take a bit of time to configure, but you can get the same if not better results productivity-wise with these setups. Curious what would happen if governments actually switched and went the open-source route? Would they worry about “security” from a basic distro? I know things like Red Hat are used by the military and security organizations. Wonder why they don’t just switch all their machines to something like that?

    1. Just saying, Windows actually has most of the prophylactic security measures Linux has now (ASLR, W^X, and recent versions of VS/MSVC have optional glibc-like heap validity checks and stack canaries IIRC). Doesn’t mean shit if they’re turned off to make legacy stuff work though.

      Windows is just so huge that there are so many ways to get in, and to stay in hidden. Probably no one person understands every component, and with the way Microsoft is organised, nobody has any incentive to. They’re stuck with an insane amount of technical debt and have to work out how to secure it. That’s not to say the Linux desktop ecosystem’s attack surface isn’t oversized. X11 is a nightmare.

      1. I suspect a lot of Microsoft’s problems are due to the usual corporate inertia at the middle management level.

        Losing a couple of billion dollars during Ballmer’s deranged attempt to force his way into a mobile market already dominated by others probably didn’t help either.

          1. Their biggest mistake was trying to make one UI and code base for both desktop and mobile.
          Apple avoided doing this for good reason which is why there’s both iOS and OSX.
          Another even larger mistake is all the spying in Windows 10.

          1. Indeed.

            Blame Ballmer and his childish “I wanna Windows tablet and I wanna Windows phone and I wanna… I wanna… Waa, Waa, Waa”.

            For me the biggest problem is the deliberate crippling of games compatibility.

    2. I think it mostly comes down to management.
      There would need to be a widely supported Active Directory equivalent (SSO service, file shares, DNS services, policies, etc…).
      And since there are so many different linux distros, this is very hard to do.

      1. That’s not the main reason. All of those exist in one form or another. And the problem isn’t just organisational inertia and bribes… erm… I mean customer incentives…
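
        (For the AD-equivalent piece specifically, Samba 4 can act as an Active Directory domain controller: SSO, policies, DNS, file shares. A rough sketch, with the realm and domain names invented:

          sudo apt-get install samba
          sudo samba-tool domain provision --realm=CORP.EXAMPLE.COM \
              --domain=CORP --server-role=dc --dns-backend=SAMBA_INTERNAL

        Windows clients can then join CORP.EXAMPLE.COM much as they would join a Microsoft domain controller.)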

        Aside from the technical difficulties, major changes in working practices can be quite traumatic for many people. Some have difficulty making the adjustment, others simply cannot.

        And then you’ve got the self-styled ‘computer experts’ who insist on being able to install whatever they want and/or make whatever changes they want regardless of corporate policy, security considerations, etc.

        They tend to be rather ‘vigorously’ opposed to Linux for some strange reason…

        1. I don’t object to major changes in working practices…if they’re done for a good reason.

          Changing UIs for the sake of changing them is NOT a good reason to impact my productivity. Microsoft cares much less about improving their software than they do about “new looks” and forced upgrades (how many versions of .doc have there been?). I use Windows at work because I have to, but every new release brings a new UI for me to learn.
          And most of the new “features” are non-negotiable.

          At home, it has been Linux for going on 15 years. I was on Ubuntu, but when they replaced Gnome with the tiled “Unity” interface, I dumped them for Mint MATE. I’ve been very impressed.

          1. Indeed. As I mentioned above, most people don’t want new and shiny.

            I switched from Ubuntu when they started wholesale replacement of ‘standard’ components with ‘home grown’ substitutes.

  4. Very naive. To answer the question correctly: there are so many XP machines for the same reason, popularity, and, to be more precise, the supportability of the OS. It ran all the software (of that time) and you mostly knew whether the HW would run before you bought it. You may argue that this is not the case any more today, but it is. Have a look at printers, scanners, “new” chipsets, …

    Then, the majority of the cost of an upgrade is not the OS license, but the HW, all the other SW, …
    Then, you might get “resistance” from the users (costs “produced” by users), because things are different with the new system.

    There is no point (except security) in modifying a system that is doing its job perfectly. So let’s look at support, shall we:
    How long did free, user-friendly Linux distributions have support back in the day, and how long did XP have support? How long are your devices supported before they drop out of the mainline kernel? And so on.

    I have an old Android phone. It’s most likely unable to even run a currently supported version. What now? Blame me for not updating THIS phone?

    1. You hit the nail right on the head. With enterprise support agreements, the software costs of the upgrade are very close to 0. Now let’s count the number of CIOs who said, “I want to incur the cost of testing, operationalizing, and retraining so I can deploy a new version of an operating system whose claim to fame is an interface that mimics a failed tablet OS”. That number is also close to zero.
      Unlike cars, software never wears out, so there really isn’t a pressing need to keep changing the OS with new feature versions. Every pointless line of code adds new vulnerabilities, so this constant upgrade cycle just helps the bad guys. A continuously patched version of XP would be as solid as any Linux system, and would help everyone’s bottom line (except the software vendors’).

      1. Software never wears out?
        Wow, I know of a lot of abandonware that would like to read that memo…

        Software has several ways of becoming obsolete, even without any changes; there are very real limits to what can be done in any given OS before it becomes easier to start over, utilizing the tools you have bootstrapped yourself to at that particular stage of development.

        1. radically off topic, but: i’ve been playing x-wing 1993 (floppy disc edition) on a 486 (found in a barn) with a joystick from the 1990s (kraft premium III!), and dr-dos (free edition). it doesn’t have any more bugs than it did the first time around and it’s still fun :)

          *some* software ages tolerably.

          1. Of course it will! That old computer will run contemporary software offline until the hardware fails. Same deal with all those old Windows 98 gaming PCs everyone seems to be building these days. I have an old Macintosh SE that still runs MacBandit, SimCity, and Lode Runner every bit as well as it did in 1993.

            The problems start coming when people try to take old software, drag it into 2017 kicking and screaming, and put it on the Internet.

    2. This.
      Businesses don’t buy computers because computers. They buy computers to solve a business problem. If the computer continues to solve that business problem, there is no reason to upgrade.
      …except for the fact that computers are constantly under attack.

      The problem is not really Windows so much as the business model of Windows (and software in general). If you buy a piece of software once (or it comes with the computer) then the software maker doesn’t get any money for the continued support of the software, so they feel obligated to make a newer, shinier version of the software to get more money out of current users.
      Then, since they are no longer making money on the old software, it is logical for them to discontinue support and updates for the old software, which puts customers who invested in it at risk.
      As much as I hate subscription software, probably the best solution is for Microsoft to continue perpetual support for Windows XP on a subscription basis. They have actually done this, it is called “Windows Custom Support”. Unfortunately, it is prohibitively expensive, and negotiated on a business-by-business basis. They know certain corporate and government users will pay it, so they have no incentive to make it reasonable to smaller businesses. Which is why we have situations like this… support is available, but most can’t afford it, so they suffer. If Microsoft was not so keen to discontinue their older products after the newer ones launch, a lot of suffering might be spared.

      This is where, as the article rightly points out, Linux makes more sense. Unfortunately, outside of running web servers, a lot of software is simply not available for Linux. This often has to do with the concept that all software on Linux must be free / open source, and commercial software companies can’t see a way to make a living under those circumstances.

    3. I’ve known Linux since Slackware 1, and I switched from Windows to Lubuntu 16.04 last year. It was a close call and took a lot of time. I hated switching and I still like the usability of Windows 7 better. The process of switching is still not finished. A lot of things are worse now or do not work at all. But I don’t regret switching! One main argument for switching was support. This is kind of ironic considering the support periods that Microsoft granted for XP and Win7. Well, times are changing…

      Switching from Windows to Linux is expensive and requires a lot of knowledge and groundwork. But there is one thing even more expensive: Switching back. If you’re unsure, don’t switch. Using the sledgehammer will not work out the way you want.

  5. Cost isn’t the only stumbling block for a lot of these organizations.
    Microsoft has been continually ramping up imaging complexity while at the same time leaving out more and more customization features.
    Creating an XP image with the exact subset of software you want on the machine (no bloatware) with uniformity in desktop icons, taskbar shortcuts, and start menu no matter who logs into the machine is trivial.
    Trying to do that same thing with Windows 10 is an absolute nightmare.

  6. One word: Shellshock. How an arbitrary code execution vulnerability existed for 25 years in one of the most used (and presumably inspected) pieces of open source software is incredible. Heartbleed was nothing by comparison. This bug is older than Windows 95, and IMHO it throws a big wrench in the idea that open source software is inherently more secure.
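
    For the curious, the widely circulated one-line Shellshock test (CVE-2014-6271) was:

      env x='() { :;}; echo vulnerable' bash -c 'echo this is a test'

    A vulnerable bash executes the trailing command while importing the function definition and prints “vulnerable”; a patched bash just prints “this is a test”.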

  7. While we all like to wave the Linux flag (I’ve been using it since ’92, here and there), I’m not sure it’s as perfect as you make out.

    XP was launched in 2001. How many people are _actually_ running Linux boxes now that have been patched continuously all the way from then to now, rather than reinstalled with later distributions? Bear in mind that’s 3 years before Ubuntu came into existence…

  8. Had Windows 10 since the start. Have not seen a virus or any adware in the years I’ve used it. But I put that down to staying updated and the brilliant Defender software, cause I don’t use any antivirus software at all except Defender, and well, I can happily say Windows 10 has been much like Windows 8.1: incredibly reliable for me. And yet I use the internet and go to dodgy sites all the time lol, including downloading. It’s the one thing Microsoft has gotten right over the last 4 years. Shame about Edge, Skype and mobile.

    1. And shame about the telemetry that may spike the CPU and/or disk IO of a Skylake machine to 100%,
      the sudden updates that will break programs and settings, and the forced new settings with fewer features (Creators Update).
      Of course, that is assuming you chose to upgrade to Windows 10 AT ALL.

  9. If you’re still using XP or any very old Windows variant, and yes I put 7 in there, then what can people expect? 10 years of support, to me, is long enough for any OS.
    As a Windows user and someone that would love to leave Microsoft: even though I love Windows 10, Linux just doesn’t fit for me. Gaming is a pain in the bum with Wine, and I love my Adobe products; also, hardware compatibility was a big issue when I last tried Linux. All these things make it a non-starter against the very reliable Windows 10. Like I said, I wish Android or iOS would make a move into serious PC gaming, cause a Microsoft fan I am not; unfortunately I can’t stand Linux either. Well, the versions I have used.

    1. Yeah if you are a gamer, Windows is the only sensible choice. I am a developer and hacker and don’t do anything with games so Linux is ideal for me. If Adobe would make a way to run Lightroom on Linux I would dump the Windows 10 machine I use just for that purpose.

      1. Indeed.

        Very few of my preferred ‘genre’, so-called ‘legacy’ FPS games, run properly under Wine, so I have no choice but to dual-boot. Which means I get my ‘nose rubbed’ in the differences almost every day. It can be rather frustrating at times.

        Windows 10 isn’t an option for the same reason, without the Wine of course :)

      1. Nah… It’s a combination of his hands shaking with rage at our audacity in daring to suggest that his favourite O/S might not be absolutely the very best O/S ever and the froth from his mouth dripping into the keyboard which is interfering with his ability to type coherently :P

    2. I hear your complaints. Linux can be a difficult pill to swallow and has become quite bloated in the pants over the years with all of the shiny updates (imagine if people had to read 5 wiki articles just to decide which desktop environment to run during install, or otherwise smdh), making it not run as well on older hardware (which is the opposite of the early days of DSL and Puppy).
      If you do like Android, it is possible to install it on a PC, but it really depends on your hardware. It has been a while since I have done it, but How-To Geek puts out an article on it every year, so check there for the latest images and Rufus links. As with most ported software, ymmv.
      I used to be a PC gamer but made the switch to consoles several generations ago and have enjoyed it. Granted I miss out on all of the mods and keyboard mouse control but I also enjoy not having my PC’s config and services all over the place for security sake. That was mainly because I was cheap about buying the latest bleeding edge graphics card though and had to jump thru hoops sometimes.
      Either way have fun and keep on gaming :)

  10. I am a Linux user and use Windows only under duress (I have a Windows machine just to run Lightroom and Photoshop). Never had a bit of trouble with my Linux box getting hacked (nor my Windows machine either, hmmm ….).

    But the point of the article was that there were economic barriers that kept people running XP. I don’t entirely agree. The problem is that the people running the projects using these old XP machines haven’t budgeted for upgrades. And I know that this is widespread. And it is not always the fault of the IT staff. So let’s just say it, for a company to be using old XP machines for something safety critical is inexcusable and almost criminal. But the attitude is that when the hardware and whatever software is first put on it is paid for, the game is over. Foolish — and laughable in the health care community where they should be rolling in money (but maybe all this is different across the pond).

    I know of a company with a bunch of production and test fixtures that still run DOS! They keep the machines sputtering along and cross their fingers and have no plan for the future. A major aerospace company — and I am sure it is not an odd exception. The barrier is mindset, not money.

    For that matter, I have seen some Linux machines get hacked, ones running some ancient install that was never getting upgrades or patches. The same mindset led to the same results.

    1. Any large company (say, more than 10 seats) should purchase volume-agreement software licenses for Windows OSes, not retail OEM licenses. If they purchase under a volume agreement, they pay a small annual fee and have immediate access to subsequent releases during the term of their license.

      It is cheaper than buying retail licenses as the product is upgraded, but not cheaper than, say, skipping several interim releases (for example going from XP, skipping Vista and 7, then buying Windows 8 licenses).

    2. Two things everyone forgets about Windows:
      1. It just works. Unless your hardware is made of unobtanium, Windows will work or let you install working drivers. And it’s backwards compatible. It has “uniform” APIs. It has a GUI that can be used for 99.9% of tasks. It just works. The drawback of having such an OS family that works is that viruses can run on Windows as easily as other software. And there is a Windows version that guarantees 100% security against any viruses, malware and ransomware. It’s Windows ME, and it protects itself by crashing before any virus has a chance to run on it.
      2. There is lots of special software and hardware that is necessary in some companies, and it can’t be replaced or upgraded because there is nothing to replace or upgrade it with. I heard about a company that bought 200 SBCs based on the 80486 just to act as spares in their industrial machinery. Polish State Railways used an old Odra 1305 minicomputer (manufactured in 1974) until May 1st, 2010 at a cargo train station. It kept track of trains and cargo, managed timetables and did other things too. It was just too reliable to replace and had too much useful software. The same problem exists with old Windows boxes that just can’t be replaced yet or have no substitutes in software or hardware…

    3. Nope, not an exception! I interned at an engineering firm that made aerospace components, and they had an electron beam welder originally built in 1954, retrofitted for computer control in the early ’90s, using DOS and custom software in Visual Basic! And they put stuff in jet engines!
      In this case though, it’s definitely economic, not mindset.

      Sometimes old operating systems are used because that’s what the proprietary equipment software was made for and the manufacturer doesn’t support it any more. It’s the same with CNC welders as it is with MRI machines or mail sorters, or printing presses or any expensive computer-controlled machine.

      Perhaps mindset would be at play for not wanting to learn new software and train employees.

    4. DOS is common enough in manufacturing that there are specialty companies that still make machines with ISA and PCI busses, and the problem is most definitely money.

      Say you have a $100,000 machine which is controlled by a full size ISA interface card. The company that made the machine either doesn’t support it any more or is long out of business. So today, your choice is (1) toss the entire machine and spend another $100K for a new one, (2) spend probably more than $100K reverse engineering the interface and developing your own control software (and maybe you’re a machine shop and “IT” is a novel by Stephen King as far as you know), or (3) find someone who can sell you a 486-based computer with an ISA bus you can run your software and plug your card into, even if it costs $5,000. I can tell you which choice every business will make.

      A LOT of the legacy pervasiveness is about device drivers. XP device drivers don’t work at all on 64-bit machines, so every hardware interface in use in the XP days is now orphaned unless the manufacturer feels like developing new drivers, which now have to be signed by Microsoft (for $$$$$) in order to run. And a lot of those manufacturers don’t care or aren’t even around any more. They’d rather sell you a new $100K machine for some reason.

      1. a hundred grand machine would be nice. :)
        I know of whole assembly lines that are controlled by machines running Windows NT4…
        as you say, replacing Windows often isn’t just an £80 license cost.

  12. You probably should be running anti-virus on your linux boxes, in this day and age.

    I run Clam-AV pretty regularly, make sure the firewall is tuned right, etc.
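
    In case it’s useful, here’s roughly how I script the sweep (a minimal sketch; assumes `clamscan` is installed, and `/home` is just an example target):

    ```python
    # Minimal sketch: run a recursive ClamAV scan and act on the result.
    import subprocess

    result = subprocess.run(
        ["clamscan", "--recursive", "--infected", "/home"],
        capture_output=True, text=True,
    )
    print(result.stdout)

    # clamscan exit codes: 0 = clean, 1 = something found, 2 = error
    if result.returncode == 1:
        print("Infected files found -- quarantine or investigate.")
    ```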

    absolutely no-one is invincible. For a number of years Mac users have been chanting the mantra “we don’t get viruses” and they are slightly quieter about it now…

    1. In general, there is a very low probability that anti-virus software will stop a kernel-level problem.
      Most admins include clam-av as a filter for email to protect other Windows/MacOS clients.

      Almost all viruses/worms publicized as “Linux pwned” are in fact application-specific attack vectors that have nothing to do with the kernel. The Snowden leaks were very clear about which are regularly used.

      Heartbleed was specific to a popular version of OpenSSL, Shellshock was bash+modcgi/dhcpd/X11, and the recent FUD is from the systemd+Ubuntu >=16.10 DNS resolver. Note, many Linux systems use a read-only operating system partition, and enable the hardened features of the kernel (Red Hat & Gentoo always took more precautions over usability).

      The hilarity of the obvious “look at the other guys… not our steaming dump” distraction presented by “security experts” only highlights a fundamental problem. While some Windows viruses can run under WINE, Java, and Flash… most do not… even when users demand their malware be compatible (not sarcasm here, users ask to import trojans all the time).

      *nix boxes are often targets for bot-nets given the versatility of the OS (and default passwords, for example), and there are always holes in every application left open to the public. However, the core kernel itself is nowhere near as vulnerable, given people have hammered on it for over 20 years already. As a side note, I can’t recall ever witnessing a fire-walled *nix based workstation go down without a failed hard-drive or user error.

      Android OS is probably an exception to this *nix phenomenon — as most other *nix systems are more like an ecosystem than a specific OS.

      1. Be cautious with how well hardened you think the linux kernel is, given it includes a whole bunch of drivers too. Just as an example the bug behind CVE-2017-2636 got introduced back in 2009, and has only recently been fixed. The attacker only needs to win once in order to succeed, but the defender has to win every time to stay safe. There is little room for complacency regardless of the OS/apps that you run.

        1. Privilege escalation flaws again require existing access to a system, and workstations would still need an infection vector in the first place. If I recall, there have been fewer than 3 remotely exploitable kernel bugs in 10 years, and most are difficult to reproduce on a live system.

          We use Docker, crypto signature image signing plug-in (auto-versions on corrupted instances), and some RancherOS containers. The instance may be compromised, but unlikely given the minimal VM kernel is task specific. The “cloud” concept is very useful, but only if you run the thing yourselves — and keep your IDS policies secret.

          Yer right in a way, as some people do take a Desktop enabled-everything kernel and run a server on it… but the constant probes from the web force them to evolve quickly.

  13. One of the complaints about getting hospitals off XP is the big iron (not mainframes) they operate that can only run XP: MRI machines, CT machines, all sorts of medical devices which need the smarts of a full computer, and chose Windows as their OS, for which there is no upgrade available. The same goes for governments running proprietary systems, industrial businesses running CNC machines with x86 boxen inside, museums with gigantic databases that practically can’t be reproduced, etc.

    Money is certainly a problem, but straight-up compatibility and “This thing won’t run a newer OS” are also deadly.

    1. “One of the complaints about getting hospitals off XP is the big iron (not mainframes) they operate that can only run XP: MRI machines, CT machines, all sorts of medical devices which need the smarts of a full computer, and chose Windows as their OS, for which there is no upgrade available.”

      Only if you feel the need to put your MRI, CT, and various other machines on the public internet – and why would you do that?

          1. Nope. Windows boxes ran the SCADA software that controlled them. These were not connected to the Internet. They got infected from USB pendrives used by employees. It was the equivalent of an STD: they put their things in the wrong holes…

          2. “They got infected from USB pendrives used by employees.”

            Curious as to how that virus got on those USB pendrives… perhaps when they were inserted into, oh, I don’t know – windows computers attached to the internet?

            As I understand from the Stuxnet documentary I watched, the idea was they targeted computers in a certain IP range (Iran), set the virus to spread to anything it could, and when it found a firmware file for the microcontroller models they were interested in, it modified it to house the virus until a certain day.

          3. Ah, well that’s already a solved problem. “Don’t put frigging USB drives into the frigging computers enriching the bastard uranium, for fuck’s sake” ought to be a law anywhere that messes with enriched uranium. It’s already part of basic security theory.

            Sure people are gonna do it anyway, bringing in stuff from home on USB sticks. So you either need to be an utter Nazi and severely punish people who do, or spread a story about that guy who was sacked for bringing a USB stick in, which had a virus and infected the whole company network for a week, during which nobody could do any work so they all stayed at home unpaid. The guy was fired, lost his pension, and they took him to court for breaching his contract, he lost his house.

            Something like that anyway, bit of social engineering. If the pencils up nostrils thing can spread to every school in Britain, some suitable prophylactic meme can do the same in companies. If we could work some severe anal trauma into the story, it’d probably be more successful, more memorable. Hm perhaps in the future there’ll be a job of memeticist, making up stories to get a certain effect. Advertising, “journalism”, PR and the like already do that, my version’s just one step further from that.

            Anyway… you’d expect better from nuclear physicists! Particularly somewhere like Iran! I would imagine causing catastrophic failure of the nuclear weapon program would get you a time with the security services you really wouldn’t enjoy.

          4. The USB drives would have been much less of a threat if Windows didn’t by default invisibly run any autorun software “helpfully” installed on the drive. Microsoft’s urge to “help” you will be your ruination every day. Turning off autoplay and autorun is one of my standard tasks when prepping a new computer for myself at work, because USB drive promiscuity is simply unavoidable in many situations. But if you are only exchanging data files, and you never deliberately run programs from those drives, they should be perfectly harmless.
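
            For what it’s worth, the fix scripts easily when you’re prepping machines in bulk. A minimal sketch (assumes Python and admin rights on the Windows box; `NoDriveTypeAutoRun` with the 0xFF “all drive types” value is the standard documented policy setting):

            ```python
            # Minimal sketch: disable AutoRun for every drive type, machine-wide.
            # Requires admin rights; NoDriveTypeAutoRun is the documented policy value.
            import winreg

            key_path = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"
            with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                                    winreg.KEY_SET_VALUE) as key:
                # 0xFF = disable AutoRun on all drive types, USB sticks included.
                winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
            ```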

      1. What if you need the machine to access a shared database on an internal LAN? You need a way to get files from it, and the only possible ways to do so are networking and removable media. Both are vulnerable to almost all attacks.
        There are less vulnerable alternatives such as air-gapping, but they’re not cheap and may be incompatible with such old systems.

        1. You could always proxy/shield/firewall the legacy OSes by putting them behind an individual (one per device) linux ‘firewall’.
          This ‘firewall’ could fetch the files and redistribute them (or proxy any other required feature).
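
          To sketch the idea (paths and the whitelisted file type below are made up for illustration): the middlebox mounts the trusted server share and mirrors approved files into a directory re-exported read-only to the legacy box, which never talks upstream.

          ```python
          # Minimal sketch of "fetch and redistribute": mirror whitelisted files
          # from the server share to a folder the legacy machine can read.
          import shutil
          import time
          from pathlib import Path

          UPSTREAM = Path("/mnt/server_share")    # mount of the trusted server (example)
          DOWNSTREAM = Path("/srv/legacy_share")  # re-exported read-only to the XP box

          def mirror_once():
              DOWNSTREAM.mkdir(parents=True, exist_ok=True)
              for src in UPSTREAM.glob("*.csv"):  # whitelist the file types you expect
                  dst = DOWNSTREAM / src.name
                  if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
                      shutil.copy2(src, dst)      # copy data and timestamps

          while True:
              mirror_once()
              time.sleep(60)  # poll every minute; the XP box never initiates anything
          ```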

        2. You do know about private networks, right? Why should an MRI machine be accessible over the internet?

          I assume the MRI machine has a network jack, and it shares files with a server, but neither the server nor the MRI machine have a need to be publicly accessible on the internet.

    2. There’s an added bit of headache due to government bureaucracy in healthcare: when a medical device manufacturer jumps through all the hoops to get a new device approved (e.g. an MRI machine, CT machine, etc.), the computer that’s used to interface with the equipment is included in the validation process and, once approved, *cannot be modified in any way* without going through the whole validation/registration process over again.

      Even *if* hospital staff had the soundness of mind to want to upgrade their CT/MRI machines from Windows XP to Windows 7, they couldn’t without de-validating their highly regulated piece of equipment. This actually goes to the extreme that installing updates/patches from Microsoft for that specific OS can, technically, de-validate your equipment.

      1. And I have seen plenty of times when updates introduced new problems, so prohibiting updates is an entirely reasonable point of view in this context. So keep the machine off the network and remove all the media devices
        (floppies, USB ports, network for that matter) — or pay the piper. It is all a question of how serious you are about being secure.

        1. Their fear that an update might cause a fatal error of the non-metaphorical kind is quite understandable. I’ve seen cases where patches to Windows broke very common software with a large user base – it’s quite plausible that a simple service pack might break a bit of custom software on a piece of medical equipment, and upgrading to a whole new Windows version would be even more likely to break things.

        2. A machine, properly validated, should be known to work 100% properly as-is; there should be no need to ever update it.

          Want to improve/change the machine? You will need to re-validate the machine, to prove it is still 100% correct.

          1. that is only true in an idealized sense, no software is 100% validated. in fact i would go so far as to say 100% validation is practically impossible in pretty much any system, it requires knowing unknown unknowns.

            even then a 100% validated system is only that so long as there is no change in system requirements, no change in external requirements of that machine and no requirements for communication with other machines.

            if any of the above are true then the validation will quickly become obsolete itself.

          2. Do you understand how validation works?

            You establish defined requirements, configure tests to prove the requirements are met, and once all the tests are passed, the software/hardware solution is considered ‘validated’.

            It doesn’t mean the software is perfect, it means the software meets specified requirements.

            As I said, once validated, there should be no reason to update the software, unless of course requirements change or changes are made to the underlying system.

            You can validate a medicine dispensing machine; its requirements can be defined and proven to be met. You can not, in any practical sense, validate a word processor or spreadsheet.

            I validated trivial phone applications for managing clinical drug trials – that software was proven to meet all documented requirements before put into production, and the drug companies would audit our validation work before they would allow the software to go live.
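
            To make the requirements-to-tests mapping concrete, here’s a toy illustration (not any real validation protocol; the function, limit, and requirement IDs are all invented):

            ```python
            # Toy illustration: each documented requirement becomes a test, and the
            # passing suite is the evidence the system meets its requirements.
            import pytest

            MAX_DOSE_MG = 500  # hypothetical validated safety limit

            def dispense(dose_mg):
                """Hypothetical dispensing routine with a hard safety limit."""
                if dose_mg <= 0 or dose_mg > MAX_DOSE_MG:
                    raise ValueError("dose outside validated range")
                return dose_mg

            # REQ-001: doses within the labeled range are dispensed unchanged.
            def test_req_001_valid_dose():
                assert dispense(250) == 250

            # REQ-002: doses outside the safety limits must be refused.
            def test_req_002_out_of_range_refused():
                with pytest.raises(ValueError):
                    dispense(MAX_DOSE_MG + 1)
            ```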

          3. “If the machine worked 100% properly… there would be no need to update it as the software would 100% properly deflect all attacks.”

            If one of the requirements was to ‘properly deflect all attacks’ – of course, you may find it hard to define ‘all attacks’ during software testing.

          4. Agreed… it may be hard, it may be expensive to verify and implement… but a machine that is vulnerable clearly cannot be working “100% properly”.

            If I, as an unauthorised user, can plant my own code in the firmware of a security camera and have it run `wget https://krebsonsecurity.com/`, then clearly the device is not working 100% properly, its firmware has an egregious security hole!

          5. so you just reiterated the point i was making?

            100% validation as opposed to just validation sort of hints at what i meant, i was setting up an extreme thought experiment to illustrate the issue with validation as a concept.

            you also didn’t address the issue with change, any validation only works so long as the issue is unchanging and well defined, over time any and all validation in a dynamic system will become obsolete, system designers take this into account nowadays.

          6. you both seem to be willfully ignoring each other’s points.

            Sure, there is one-time validation/certification (“this has passed BSI test number…”, for Britain),
            and then there are moving targets (like PCI-DSS) that require continual change and updates as a part of the validation.

            i’m not sure *how* system designers are meant to take into account stuff that changes in windows update versions. – that’s just rubbish, you can’t quantify the unknown unknowns!

            For example: the amount of consumer equipment that was made redundant overnight on the release of XP (which dropped ring 0 DLL hardware access) – suddenly heaps of ISA sound cards, MIDI cards, educational toy robots, and printers just didn’t work.

            Then there was the time PL2303 drivers from Windows Update suddenly bricked a load of devices that genuine consumers had bought in good faith, which happened to have counterfeit Prolific chips in them.

            I assume you are a system designer (as you say that they will plan around future changes that they don’t know about) – how would you have planned for them? (You even have the benefit of hindsight now…)

            you literally can’t! in ’92 ISA was literally the only bus architecture available. by 2001 (not even a decade later) most hardware made for it stopped working, with no XP-compatible hardware access methods… (and remember, you already sold that machine; going back and spending time re-writing software (for free!) is not a good commercial decision).

            as for the fake chip thing, “plan your supply lines better” is not a good answer given that counterfeit chips have ended up in military gear.

          7. we are all pots calling the kettle black hehe, when rereading the posts we are all more or less saying the same thing.

            i literally made the same unknown unknowns point above, you are right you cannot solve those issues ahead of time, but you can plan towards having such issues and the resources and infrastructure to deal with it, ie. updates, which is why someone might need them and why software might need to change even if validated.

          8. So there’s validation in the sense of what the word means practically, and there’s validation in the FDA/international equivalent organization sense.

            Yes, software lives in a dynamic world subject to ever-changing issues. The FDA doesn’t see it that way and, additionally, requires extra rigor and documentation that creates a cost that’s a huge barrier to re-validating a system that’s already been validated. The latter is why we still have tons of unpatched older OSs in the wild.

    1. Exactly… and even on Linux this can be risky, and will take time and effort.

      We’ve had an Ubuntu VM running a customer site auto-update the version of Docker it was running (to a pre-release!!!) which did not recognise the containers, and so effectively deleted them.

      *Thankfully*, it left the volumes intact, so `docker-compose up -d` re-created the containers, linked those to the volumes, and everything came back. We’ve been wary of OS updates ever since, and we now use a known-good .deb package for Docker, updating it on an as-needed basis.
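
      For what it’s worth, the “known-good version” policy can be enforced rather than just remembered. A minimal sketch for Debian/Ubuntu (the package name and version string are placeholders, not recommendations):

      ```python
      # Minimal sketch: install an exact package version, then hold it so
      # unattended upgrades can't swap it out. Run as root on Debian/Ubuntu.
      import subprocess

      PKG = "docker-ce"
      KNOWN_GOOD = "5:20.10.7~3-0~ubuntu-focal"  # placeholder version string

      subprocess.run(["apt-get", "install", "-y", f"{PKG}={KNOWN_GOOD}"], check=True)
      subprocess.run(["apt-mark", "hold", PKG], check=True)  # freeze the package
      ```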

      1. What kind of idiots launch pre-release, unfinished untested software as automatic upgrades? Maybe that’s a problem with Linux, in a paid-for product someone would be shouted at by their boss for doing that, which might affect their job and their income. With Linux, you face sarcastic sed puns and maybe that twat from XKCD will do an amusing “romance / language / etc” stickman comic about you.

        Anyway… that system failed.

        1. Hmmm…I remember an experience of learning to disable automatic updates on Wondoze computers because a Dell video driver update bricked about a hundred identical machines overnight. Lesson learned? Only roll out necessary updates after proving out the updates don’t break important things on a couple of guinea pigs.

          And before anyone says it’s a Windoze problem, I also have experience with MacOS breaking perfectly good USB devices like printers and scanners and hardware interfaces.

          The problem is that the software vendors have incentive to force updates to make customers buy new hardware and software. Customers don’t want to upgrade or change anything that’s working just fine the way it is.

          That’s why I’ll NEVER buy a car made by a software company.

          1. “Hmmm…I remember an experience of learning to disable automatic updates on Wondoze computers because a Dell video driver update bricked about a hundred identical machines overnight. Lesson learned? Only roll out necessary updates after proving out the updates don’t break important things on a couple of guinea pigs.”

            Larger installations should have their desktops in a domain, and that domain, coupled with a proper WSUS server, will allow the administrator to withhold all software updates until they specifically approve them, which should happen only after the admin tests the prospective update on test machines to ensure it won’t crash/corrupt their production systems.

            At 100+ workstations, you would have qualified as large in my opinion.

    1. First thing I change when I get in front of someone’s PC. I generally ask, but usually they don’t understand the question. Which means they NEED that stupid, dangerous setting turning off. How long did it take the hackers of the world to think of “nakedlady.jpg.exe”? And yet it never occurred to Microsoft during decades they’ve been shipping OSes that do it.

      One more big Microsoft WTF. You tend to forget about them when you use it regularly. I read an article that claims Windows gets more stable as time goes by, because users subconsciously train themselves not to do the things that will make it crash.

        1. Yeah but once you’re infected, the preventive-security game is over, and you move to the next stage. If you’re infected you’re already vulnerable to basically everything. Changing a user-interface setting I suppose is nice for helping your fellow malware-writers, but there’s worse to worry about.

  14. I think everybody is missing the simple reason why Windows has so many viruses made for it: Windows has more high-priority target machines using it. Government, business, and healthcare all rely almost exclusively on Windows-based machines. State-run espionage wants info from government machines, business-run espionage aims for businesses, and identity theft goes for healthcare. Plus, all three systems are good targets for ransomware, especially healthcare, where loss of systems costs lives. All of the other users are just caught in the crossfire from attacks aimed at these three major targets.

    After all, you aim your attacks where they can do the most damage and get the highest payoff. Android systems don’t keep the kind of data, in the amounts and quality, that makes it worthwhile to major actors in the malware scene. Same for Macs and Linux machines. Sorry Linux, you just aren’t important enough until the majority of the above-listed three targets shift to your OS. So the smugness of Linux and Apple users is misplaced. It’s the fact that they aren’t major players that makes them lower-priority targets.

  15. Linux still works on all but the most ancient of hardware – there is no reason to exclude anything.
    What if your custom software and hardware that works under XP doesn’t exist for, or is too slow under, Windows 7 or 10 (Vista had minimum requirements)? Even the GWX “shove the free upgrade down your throat” campaign often broke working systems.

    The “free as in speech” means there is likely to be a “free as in beer” version or upgrade path. There’s also the footprint reduction – Windows needs a lot of junk to be running, Linux can close all the TCP ports that aren’t used and the remaining ones can be protected since you have access to the whole stack. WannaCry was in some obscure Windows SMB networking thing that might not even be used, but it’s there, left on, not firewalled.
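
    And “is that obscure SMB thing actually exposed?” is trivially checkable. A quick sketch (an illustration, not a security audit; the address is from the TEST-NET documentation range):

    ```python
    # Quick check: is TCP 445 (the SMB port WannaCry spread over) reachable?
    import socket

    def port_open(host, port=445, timeout=2.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("SMB (445) reachable:", port_open("192.0.2.10"))  # TEST-NET example
    ```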

    I have ZERO plans to go to Windows 10 for that reason. I don’t want Cortana, nor the background privacy rape that is hard to turn off (and I won’t get ANY security updates if I manage it). I don’t want local Office, or Office 365, or much of any network connecting and sending my most intimate and private data to Microsoft (where they can give it to the NSA or an ad targeter). It isn’t even “free as in beer”, so I pay for it, and they make money off my data.

    1. The spying and open ports are why I feel Windows 10 can never be secure as shipped.
      Now if all the spying, forced updates, unnecessary processes, and open ports were gotten rid of, and only updates you decided were relevant applied, then it could be a secure and reliable OS.

    1. Thank goodness hackers can’t use the Linux source code to find vulnerabilities! Nope, they’ll have to stick to hacking into closed-source code Windows…

      /sarcasm

      1. I have always wondered about that since the first time I heard about linux and open source. …if everyone can see exactly how it works, how is it more secure?

        I get that somehow, it works, else it would not be around today. but as I currently understand it, it is like going:

        “here is my security system to my house. this is how it works; where the window and door sensors are. this is where the wires run to the main control box, and this is where it connects to the land line or internet for monitoring. here are its communications protocols for the case that someone were to break in. etc…”

        that would be more than enough information for anyone with even a basic level of DIY electronics and computer capability to figure out how to defeat.

        I am not trying to be sarcastic or anything, I genuinely have not comprehended how open source software can be secure. I understand that the community looks at it and can contribute improvements and all that, but it still leaves open the fact that if you can see/know all about it, you can use that information against it.

        I do however put all that aside and do run linux on my main computer. my laptop runs windows 10 (free [and regrettable] upgrade from 7) for adobe products, ableton and serato. it dual boots to linux to do what windows does not. –if my laptop were powerful enough, like my desktop, I would just run linux as the main OS and windows in a virtual machine.

        1. The belief is that there are more good eyes reviewing code before it goes live to ensure it is secure, than there are bad eyes looking for exploits after it goes live.

          1. In practice, the “many eyes” approach works well with Linux because of the extremely modular design of Linux, combined with a robust internal security model that allows services to run with limited access to system resources.
            In the real world, security flaws are not discovered by reviewing source code, but through inputting out-of-band data. Once a problem is discovered, the many-eyes approach is the most effective method of locating and correcting the bug.

  16. i always say im a windows user first. i know my way around the system better. or do i? never once have i written a windows driver. but i wrote an i2c joystick driver for raspbian. makes me wonder how much i depend on black boxes in the windows world. on the linux side you can at least peer in and see whats going on.

    i havent really liked windows much since they discontinued my old skool start menu. now im running classic shell just to be able to use the damn thing. linux doesnt look so bad. last few distros ive used impressed me a lot. reactos on the other hand looks promising. post beta it might be a good platform for running legacy software. but i dont think it will keep itself up to par of the real windows. its possibly for the people who will stop using xp when they pry it out of their cold dead hands. people who want to run legacy software on hardware that is a few years old.

  17. I’ve used Linux since 1996, and if I tallied up the number of hours I contributed to FOSS projects, I have well and truly paid more in kind for all of my Linux installations than any Windows user would have. I’m not a zealot either; you can use whatever OS you like, just don’t ask me to help you fix it, or even show you any empathy during your self-inflicted disasters, if it isn’t Linux or BSD.

    I’m not sure where that quote about things never happening on Linux comes from, mature IT people know that everything is broken, everything has security holes and the average user is a danger to themselves and the systems they use, it is just that some operating systems are worse than others while also being larger targets and it is that combination along with the low proficiency of the average user that makes Windows systems a problem.

    1. “I’m not sure where that quote about things never happening on Linux comes from, mature IT people know that everything is broken, everything has security holes and the average user is a danger to themselves and the systems they use,”

      It’s the next slide after the Linux ‘world domination’ slide in every PPT stack extolling the virtues of Linux.

          1. Don’t go the full aut on me, you got the point, or you are never going to get it. Why would I need to look at this legendary PPT? I’m not a Windows user and I’ve already listed the points I do make if a Windows person asks me what the issue with Windows is.

  18. It is dubious whether Linux has more installed instances in the server space. There is zero question that Windows dominates the desktop space. So yes, Windows is the bigger target. And Linux has certainly had its fair share of packages over the years that have had security issues. It is not immune, just less publicized. And in a large corporate environment, it does in fact take time to vet patches. They are not just rolled out to tens or hundreds of thousands of computers. It is not unusual for a patch to fix one thing and break something else, so patches need to be vetted, and that is a process.

    It is easy to play armchair quarterback; it is an entirely different thing to be in the trenches.

  19. My reasoning for using Linux over Windows is that there’s generally less stuff-around:

    – Updates of what YOU want to update, when YOU want to apply them, not when the computer thinks it should
    – Moving between versions on a desktop is generally less painful… backing up /home means you’ve got 99% of your user config. Cherry-pick some things from /etc and you’re golden, most of the time.
    – No “activation” crap to worry about. (This is the reason I still keep an ISO and license key of Windows 2000 around for VMs.)
    – The UI is much the same as it always was: I use FVWM as my “desktop”, and it has changed little in 10 years, despite moves between several laptops, a Raspberry Pi 3, various desktops and between Gentoo and Debian OSes.
    – Fresh install, most of my hardware, if not all of it, JustWorks. Part of this is because I’m usually careful about what I buy, erring on the side of conservatism and what has worked well for me in the past. (e.g. avoiding nVidia graphics and Broadcom WiFi like the plague, preferring vendors that either directly maintain open drivers or have historically assisted those in maintaining said drivers).
    – Windows I can guarantee I’ll need to spend a good few hours tracking down the drivers for each device: my laptop workstation (Panasonic CF53) for example has about 3 devices I couldn’t find Windows 7 drivers for… but on Linux, everything works fine. My current work desktop (AMD Ryzen 7 based system), aside from the motherboard taking great umbrage to an old PCI NE2000 network card clone I shoved in a spare PCI slot, has been painless to set up under Debian 9.
    – As a software developer, I find Linux much easier to write software for. Particularly for asynchronous programming: in Linux, everything (files, network sockets, devices) is a file handle… waiting for events is a matter of calling select() on it (see the sketch after this list). In Windows? Ohh no, we couldn’t possibly provide such abstractions for you!
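
    A minimal sketch of what that uniformity buys you: one select() call waiting on a TCP socket and a pipe at the same time (Unix only, which is rather the point):

    ```python
    # Minimal sketch: sockets and pipes are both file descriptors on Linux,
    # so a single select() call can wait on either kind of event source.
    import os
    import select
    import socket

    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))  # any free port
    listener.listen()

    read_fd, write_fd = os.pipe()    # a pipe is just another fd
    os.write(write_fd, b"ping")

    ready, _, _ = select.select([listener, read_fd], [], [], 1.0)
    for source in ready:
        if source is listener:
            conn, addr = listener.accept()
            print("connection from", addr)
        else:
            print("pipe says:", os.read(read_fd, 4))
    ```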

    Linux isn’t perfection… but in general, it has worked much the same way for years. About the only major change to the inner workings has been systemd… and even that I’m getting my head around. Desktop Windows is a pain in the arse, and Windows CE is an utter joke! I consider both “legacy OSes” these days.

  20. Interesting. I’ve never met an American or European who didn’t just use ISOs with an activator… even though it’s only $200.00 per BIOS.

    Also, pertaining to viruses: there is no reason for anti-virus engines to still be using signature engines. This is 100% the reason for ransomware and worms. I shouldn’t be able to load a driver or do DLL injection without a buffer overflow, and even then simple API-based behavior analysis should flag the process and freeze it with the debug API or a driver-based kill. AV vendors have the money and talent to fix the problem.

  21. I still have two machines running XP. Why? Because I have a mountain of software, and even hardware, that will run on them but not on later versions of Windoze. Both machines are completely trouble and virus free… How? BECAUSE THEY NEVER GET ON THE NET! No connection to outside world. No Wifi or Cat5 cables connect to my Internet connection. I never use them for browsing or downloading. So if you MUST run XP to support expensive CAD or CAM applications, protection from malware is simple: NEVER CONNECT THEM TO THE OUTSIDE WORLD.

        1. The OTHER simple defense I use on XP ( and all my windoze machines) is to turn off autorun on all connected drives, including USB ports. Yes, you could conceivably pass a virus to a machine from flash transfer, but FAR less likely with no Internet connection and all autoruns turned off.

      1. They’re not oblivious; they’re not software engineers, and they need data that’s not on their LAN or hard disk.

        Viruses and worms exist because there is money in the security industry. Cryptography and MMU security have been around since the 90s. Most malware doesn’t even use memory corruption to get root, infect the MBR, or install a rootkit. Anti-viruses will readily allow anything to run if they don’t find a known byte pattern.

        The people writing the OS and security products want it this way. On Linux, BSD, and OSX you at least need a buffer overflow.

        1. Cases in point: CreateFile in userland to overwrite the MBR, and WriteProcessMemory, also from userland, to modify the memory of any other process. Anti-virus products allow all of this, on top of allowing driver loading through buffer overflows.

  22. Some of this is true, but a lot isn’t. Linux kernels are pretty stock, and you can still get a few hundred thousand users of any distribution. Still, it’s true that it’s not millions and millions. But there are other factors. Some versions of BSD Unix are very strict about security, and this influences Linux developers. Having very specific areas of computer memory 1) dedicated to user input (I/O), and 2) *NOT* allowing that entire address space to be executable, means that any I/O is sanitized regardless of application. I don’t know if Windows does that.

    Also, a very conscious decision was made a long time ago to not allow auto-run applications through *any* mail client. What does that mean? You *can* click on unknown emails, and they *won’t* download viruses or eat your hard drive. They open; it’s a silly message or whatever. If there appears to be some executable in a hidden attachment, and if you really want, you can make it executable and run it. On Linux, .exe files are just a binary blob (they were designed to run on someone else’s operating system, after all), so you can look at one, go through the binary, strip out strings and figure out what it does (or maybe just delete it). Microsoft could have gotten rid of auto-run on their mail clients a long time ago, and it would have saved a huge number of viruses from infecting millions of users, but the problem is that many users *want it* because it makes using the computer easy. They don’t have to know nerdy computer stuff. And so all Windows users suffer.

    There is also the fact that drivers on Linux are ‘centrally written’ — since most companies don’t write Linux drivers for their hardware, Linux developers must write drivers for the hardware. And if you write drivers for 3 or 4 different kinds of hardware, you write more general software, with specific changes for each kind of hardware. And if you find a bug in one driver and fix it, you are likely fixing bugs for several or many others.

    The other part is that there is collaboration between Linux developers. It doesn’t matter if they work for Intel or AMD or Novell or Oracle or IBM. Companies sponsor Linux developers. Proprietary software developers are locked into the company they work for, and they also don’t get input from university professors and research labs. It doesn’t make the software perfect, but it helps a lot, and having the source code available to change means people who know about computers can ‘try stuff’. It might not work, but it lets people try. No non-disclosure agreements, no paperwork, no license fees, no one looking over their shoulder. No “all of your base are belong to us” ™.

    1. I think the choice NOT to change back to Windows was wise, given the €11m already saved, dodging the ransomware fiasco, and avoiding the additional hardware costs needed to run Windows 7.

      It was some staff who were asking for Windows 7, after Microsoft lobbying did its job.
      http://www.techrepublic.com/article/no-munich-isnt-about-to-ditch-free-software-and-move-back-to-windows/
      http://www.zdnet.com/article/munich-sheds-light-on-the-cost-of-dropping-linux-and-returning-to-windows/

      The admins and managers wisely said NO to the upgrade rhetoric.
      https://www.youtube.com/watch?v=lITBGjNEp08

  23. Something that hasn’t been mentioned (…or at least I didn’t see it) was using low-cost computers as thin clients with *nix app servers, like LTSP. That way, the only hardware that will need to be upgraded tends to be the servers; even old 486’s can act as terminals. Contrast that with schools/etc. that spend millions of dollars every few years to bring their regular setup up to date!

    1. “Contrast that with schools/etc. that spend millions of dollars every few years to bring their regular setup up to date!”

      You apparently have no idea how public schools operate/update their systems:

      Schools subscribe to Education Assurance, and where I used to work, 1,500 laptops and desktops had licenses to install the latest desktop OS, all the Windows Server OSes, ancillary software (SQL Server, SCCM, etc) and the latest version of MS Office for the princely sum of less than $50K/yr.

      As for desktops/laptops, they were on hardware rotation schedules of about 5 years, and after about 5 years the machines were typically ‘used up’: daily-use laptops in the hands of high school students don’t physically last much longer than 5 years; desktops fare better, but they don’t last forever in a public school classroom.

      When desktops were replaced, we typically went with off-lease refurbs as the purchase price was half of a new desktop and the performance was adequate for our student’s needs (we bought refurb desktops by the pallet load).

      Oh, and the Internet connections are typically subsidized by a federal program called e-rate.

      Schools typically replace hardware because they are no longer able to rely upon 5-year-old laptops that have been abused and mistreated every day by a dozen or so different users.

      1. I’m not sure where you are, but that wasn’t my experience in BC, Canada with desktop units.

        I tried to convince my local school board to go with LTSP, but politics and apathy reared their ugly heads. (To get an idea of what can be done, I was chatting with a fellow back then who was running 110 internet/office clients on a dual-processor 500 MHz Pentium 3!) What’s more, students could have logged in from their own laptops or home computers. Greece has really stepped up, though, to the point where they have 58% of all LTSP installations globally.

        1. If they can’t run the same applications at home that they do at school and that parents use at work, the community will reject any such proposal.

          You can talk about how open office really almost just as good for their purposes till you are blue in the face, but it won’t get you anywhere.

          And really, when you start talking about using 15-20 year-old computers, you lose almost everyone. First off, brand-new computers can’t last 5 years in a public school setting – kids literally tear them apart during class. Second, you’ll wind up pouring your hardware savings into additional support technicians applying liberal doses of chewing gum and baling wire to keep machines running.

          You over-estimate the enthusiasm of parents for supporting their children not learning Windows, OS X, and iOS/Android tablets.

  24. In many cases upgrades can’t be made. It is not uncommon that special equipment controlled by a computer won’t work with an upgraded OS. Replacing a million-dollar working device for an OS upgrade? I think not.

    The insanity comes into play when some of these devices are connected to the public internet with little or no safety in between.

    1. I moved to a primarily Linux desktop after Windows 7, and liked having software that could port anywhere.
      Indeed, we pumped almost $1.4m into applications for linux on our hardware — the public doesn’t want to know what is driving the thing… they don’t care… and we don’t advertise the OS as the Hacker trope scares people.

      Yer right tho, we still have 3rd party machinery that only supports Windows XP or Windows 7 versions (CAD stuff that costs way more money for an upgrade, or some AutoCAD f_ckery with cloud work flow).

      We use a tunnel to Guacamole inside a Docker instance to access the isolated internal LAN’s Windows workstations. They do not have wireless/web access, and a static linux docker instance is serving the LAN share.

      Note, Windows 10 was made by Marketer/Despotic managers for the gullible — It was a new low even for Microsoft.

  25. When you work for a company that won’t replace production equipment that is programmed with a cassette tape because the machine still “works”… Try to explain that their 5-year-old multimillion-dollar machine needs a $100k+ hardware/software upgrade because the software supporting it is no longer supported.

  26. the standard Linux praise, obviously.

    > So why is WannaCry, and variants, hitting unpatched XP machines, managed by professionals, all over the world?

    Because they are not USED by professionals. I have been in IT for several decades. There is NO WAY you can prevent a USER from fucking up your work.
    Simple rule, just as an example, issued at a Red Cross organisation: NO OFFICE DOCUMENTS are to be mailed, only online document editing allowed, period.
    What are people doing? Mailing excel sheets, word docs etc. because “they have to get their work done and cannot take care of stupid IT professionals’ personal wishes”.
    What happened? They got their share of encrypted machines.

    That happens every single day. Believe me, you can ruin Linux desktops just as well – just have a USER at it.

    Blaming everything on evil OS – no matter which – is outright stupid. Education of users should be on top of your rant list.

    But telling THAT to a Linux advocate is … ah what the heck.

    1. every time i visit a “controversial” comment section on HaD i try to look for common sense and perspective, sadly it is often in vain, with people moving down whichever path they find best, full steam ahead and no thought to the consequences.

  27. the reason we still use XP at work? believe it or not, it’s a matter of deprecation. if you look around long enough you find some things are just missing in every new major version that somebody depended on. no matter how backwards compatible the new system tries to be, there’s always some permanent failure of removed functionality. DOS support is not the same, shell features disappear, and good luck finding drivers for that perfectly good printer barely 10 years old. the quick launch bar was a nice feature of XP that got deprecated; we’re slowly upgrading to 7 still, and this is one we use constantly that has no built-in replacement. cost is another factor, but the biggest is that our staff (mostly untechnical old ladies that don’t even own their own pc) cannot cope with the huge change in interface this presents.

  28. I think you’re very wrong here. It’s the cost of upgrading, yes. But that’s not the cost of Win10 licences, it’s the cost of retesting, upgrades to £5k/seat software, or moving an entire system because it’s no longer available for Win > XP.
    Linux has fewer issues on this because updates are usually more backward compatible, I think. The issue is labour and risks, not licences.

  29. The reason there are few/no viruses on phones… I’m sure it has to do with phone app devs being more interested in just tracking people and their photos than in rendering the phones useless. With Windows PCs there is a hater mentality towards that OS and towards its users.

  30. There is another big reason hospitals and many government agencies still run XP, and you just brushed past the issue: certification. In industries where secrecy/privacy or guarantees of functionality are required (medical devices, top-secret servers), any change as simple as adding a 10-year-old Windows patch requires a full recertification. Recertification takes money, yes, but also time. Thus most groups do not bother with the time, expense or effort to have to spend more money and do more work. Thus the computer controlling your MRI machine is still running XP, because there is a disincentive to patch. Thus when malicious users get access, the countless vulnerabilities are known and exploitable. Groups like the NSA are similar (though presumably the NSA’s budget is a little higher, so likewise is their incentive).

  31. Since this thread seems to talk a lot about Linux vs Windows, let me ask a question:
    I’m an engineer (not a PE though). I design a variety of things – from the product definition for a client to the embedded code that goes into the product. As my name implies, I write that target code in a language that few like. But I digress…
    A large number of the tools available to me (e.g. Cypress software which uses gcc but only runs on Windows, old Freescale software, etc.) run only under Windows.
    I still run 7 (professional) because I still have engineering tools that only run under XP. I had to finally toss out my eprom programmers (one ISA, one parallel port) because new hardware could not support them.
    I run Office 2000 because it’s good enough. I dislike OO calc because the graphing is terrible and they don’t have a reasonable version of solver yet. Solver is one of the best pieces of Office I know of. I find writer/word equivalent, but I’m not a power user.
    So – my question:
    Can I use Linux? I’ve tried it on multiple occasions in the past (back at least 15 years ago) and I always end up going back to Windows. Can I run excel 2000 under Linux?
    Frankly I don’t care what OS I’m using. I just want to run my engineering apps. I’m looking at using KiCad, but in the past I was forced to use Windows as that’s where all the reasonable PC board CAD packages were. Changing CAD packages is painful, but now there is a cross-platform alternative.
    And I like gui configuration screens because I spend my time building products. I don’t want to mess with the OS. I might change configurations once every year or two. Too long for this old brain to remember all those command line details.

    Any thoughts?
    I haven’t tried Linux in a few years but I doubt that tools have ported much to it. And the Linux world still seems to be proud of the distinct lack of gui config tools. (I’ll get flamed for this, I’m sure)
    So I’ll repeat – I don’t want to think about the OS. I want to install it (put a disk in and walk away), install apps, and then use them. Do I develop high-tech products? – yes; do I want to know a lot about my OS? – no.

    1. You can use KiCad on either platform. I hope you aren’t seeing a switch to KiCad as being dependent on a switch to Linux.

      I wish I could talk more about comparing spreadsheet programs but you ARE a power user compared to me. I have never had a reason to use a spreadsheet beyond just as a way to put data into neat tables. For the greater (would Linux make a good desktop for the average person?) discussion I would guess that a lot more people are like me in this regard.

      But… Excel 2000? Yes, the Office 2000 suite has run in Wine for a long time. Totally new, up-to-date versions of MS Office I am not so sure about, but Excel 2000 should be a cakewalk!

      What do you need to configure? Some Linux distributions are designed for ‘power users’ that WANT to configure things. These ones can be a lot of work. End user oriented distributions though usually take hardly any effort at all. For the most part it’s choose ‘try it (in RAM)’ vs ‘install it’. Once you chose install tell it you want to automatically re-partition / format the hard drive. Finally choose a username and password.

      When it finishes it reboots. Enter your username and password. Ta da.. it’s a GUI complete with a mouse pointer and clickable icons just like Windows. Click the one that says something about installing software or a store or whatever. Then click to install each application you desire. It’s just like installing apps on a phone but usually they are all free.

      I don’t know what all this ‘configure’ that you speak of is. Well.. ok.. there are 100s of thousands of lines worth of configuration files and even millions of lines of source code all that you CAN CHOOSE to alter. If you just want to sit down and use the computer for a task though… that stuff is usually optional!

      Of course I am talking about installing Linux on a computer where all the hardware has good open-source driver support. I’ve mostly installed on Desktops where I chose the parts with Linux in mind. My own observation is that most people who just want to ‘try out’ Linux want to put it on their old laptop. Usually it is something that came with Windows on it originally and built by a company that has an agreement with Microsoft! They are made with slightly customized chipsets that differ from their commodity hardware cousins enough to require their own drivers. It should be no surprise that such beasts require a bit of effort to tame them enough to install Linux. It isn’t easy to install OSX on such devices either is it?

      1. “My own observation is that most people who just want to ‘try out’ Linux want to put it on their old laptop. Usually it is something that came with Windows on it originally and built by a company that has an agreement with Microsoft!”

        YIKES!

        You mean like they agreed to put bloatware on the laptop in order to get a discount/offset on their Windows license?

        Those BASTARDS!

        Of course, that it makes the hardware more affordable is meaningless to you…

    2. I have a similar problem with a lot of scientific software. My solution has been VM’s for everything, and then lean on the companies to provide linux-native versions. It does give some additional hassle, but on the other hand I can use just about any software I want. Need that specialized computation package from linux? Done. Need that ancient data processing program written for windows 98? Done.

      If you are willing to spend money, there has been some interesting work done toward virtualizing individual apps; it looked very promising to me, but YMMV.

    1. 99% of programs won’t run on it. Well, more than that but the keyboard warranty only covers pressing 9 so many times. I’m sure it’s cosy when nearly all of your software is written by the same 2 people, but you’re going to lag behind in what you can do.

      That said, I’d imagine the reason there’s no viruses is because nobody bothered trying to write one. An obscure minority OS in unlikely to be “bulletproof”.

      1. More than 99% of diesel cars won’t run on petrol either.

        RISC OS was roughly contemporary with Windows 286 and 3.x, so its heyday was a long time ago, although it still exists. If I recall, it had very nice applications adequate for doing actual work. The vector drawing package was very nice. Bulletproof? Well, the core OS was immune to being overwritten because it was in ROM.

        Also, being the native OS of ARM processors (originally) it has a certain claim to fame.

  32. Another aspect that seems to be overlooked is hardware support. I work in the physical sciences, and very frequently hardware support is actually the deal-breaker in OS upgrades. New OSes don’t support the specialized hardware, either because the software required to run an umpteen-hundred-thousand-dollar instrument isn’t compatible with the new OS, a critical driver doesn’t run, or, worst of all, you now need specialized hardware (remember ISA cards?) to use the instrument, which still has to be compatible with everything else. Any or all of these can and do stop the upgrade process cold. This is on top of all the economic considerations: funding agencies often will not allow use of funds for computers and software even if you have the money.

    It’s going to take decades for some of those old systems to be upgraded if they ever are. My lab has an instrument running Windows 95, and it won’t be upgraded until the instrument physically falls to pieces. We need it to run. If we upgrade Windows, it won’t run. I can only imagine how much worse it would be with something like a nuclear reactor or hospital where lives are at stake.

    1. “New OSes don’t support the specialized hardware, either because the software required to run an umpteen-hundred-thousand-dollar instrument isn’t compatible with the new OS, a critical driver doesn’t run,”

      That sounds like a lack of vendor support, not OS support.

      “or, worst of all, you now need specialized hardware (remember ISA cards?) to use the instrument, which still has to be compatible with everything else.”

      That’s a hardware issue, not an OS issue, the vendor needs to evolve their product with the industry…

      1. As a developer from the vendor side, sciences tend to be a buy-once-use-forever kind of market. Funding is a hit-and-miss thing for academics especially. We still get service requests for old serial instruments made in the 90s that are not supported on 64-bit Windows because we don’t sell that instrument anymore and haven’t for twenty years because it was replaced with something else.

        But as a vendor, we can’t afford to maintain a huge body of software like that which doesn’t sell instruments. Volumes tend to be low and more often than not, the NRE of back-supporting an instrument far exceeds the cost of them buying a new one.

  33. No one cares what I think, but I prefer Linux over Windows/Apple. Apple only wants to let you run software of which they approve, as does Google (Android). Microsoft overcharges for bloated software – as do many software companies. Just as an example, why pay hundreds of dollars for Photoshop (or however much it is a month now that it’s subscription) when GIMP does (pretty much) everything Photoshop can do, but for free? Same for operating systems. Why help line Bill Gates’ pockets when I can run Linux for free and do everything that I can do with Windows? Yeah, I can’t play the latest greatest games, but I don’t do that anyway. Besides that, Linux is more stable and less virus-prone. Before I went Linux, I got the occasional virus, and as a technician, I see more than my fair share of infected systems. I have never gotten a virus in my 10+ years of running Linux, nor has any customer’s infected computer been running anything other than Windows.

  34. lol why did you let a noob write this article. the cost of windows licensing is very, very small as far as expenses go in systems development. making any change to a running system that can not fail costs thousands of times the price of a windows key. if you thought the price of upgrading to windows 10 was about $30/pop for corporate licensing, and that using linux will always save money because there’s no license, then you were completely wrong.

  35. Not buying it. I work in IT security. I spent most of last week doing forensics on a cracked Ubuntu 16.04 box that was auto-patching. In my environment, Macs are less of a problem, but they have had notable issues from time to time.

    These kinds of love letters focus only on theoretical technical merits, often with straw-man arguments which are un-provable and pander to people’s internal biases.
