23 thoughts on “The Problem With Software”

  1. As someone who’s been a software developer for nearly 30 years, I can only say it’s gotten worse, due to more complexity. The amount of ridiculous overhead (wasted libraries, mutated protocols, erroneous, outdated, and even simply false documentation) that goes into making a “simple” web-based application borders on the obscene. We have multi-core CPU chips that execute billions of instructions per second, and storage that could have held 1995’s entire Internet on one disk platter, to say nothing of the entire Library of Congress. Our handheld devices are more powerful than the supercomputers of a generation ago, and yet Windows doesn’t boot up in less than 2 minutes (counting the “hidden” apps that have to start before it’s fully operational), and we have simple “hello, world” apps that require hundreds of megabytes of storage.

    Long ago the industry abandoned the “KISS” principle, which once was used to gauge how robust software was by counting its “function points” and removing extraneous ones. That was back when “Computer Science” was actually trying to be a discipline. Early on I realized that the more places you call functions, the more places there are for the whole thing to fail, so I learned to write code that doesn’t trust input data and doesn’t trust the results of any function call, even the ones I wrote. This leads to “defensive programming,” which is too expensive to justify to management, but that hasn’t stopped me from sneaking in “useless” error-checking wherever I can.
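
    A minimal sketch of what that defensive style looks like in C (the names are hypothetical, not from any real codebase): validate every input, and check every return value, even from your own functions.

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical record parser: trusts neither its caller nor its input. */
    int parse_record(const char *line, int *out_value)
    {
        if (line == NULL || out_value == NULL)  /* never trust the caller */
            return -1;

        char *end = NULL;
        long v = strtol(line, &end, 10);
        if (end == line)                        /* no digits found at all */
            return -1;
        if (v < 0 || v > 65535)                 /* reject out-of-range data */
            return -1;

        *out_value = (int)v;
        return 0;
    }

    int main(void)
    {
        int value;
        /* Check the result even of our own function. */
        if (parse_record("42", &value) != 0) {
            fprintf(stderr, "bad input record\n");
            return EXIT_FAILURE;
        }
        printf("parsed %d\n", value);
        return EXIT_SUCCESS;
    }
    ```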

    If I felt like depressing non-programmers, I could write many horror stories about the projects I’ve been on, much like the author wrote about his friends in the industry, both commercial and government.

    1. “Windows doesn’t boot up in less than 2 minutes”

      You can actually fix that problem: get a recent UEFI motherboard with a fast CPU, Windows 8, and an SSD, and Windows will boot in well under a minute. (Linux will boot just as fast on the same system.)

      1. Yeah, my Windows gaming PC has an SSD for the boot drive and boots to the login screen in ten to fifteen seconds; desktop takes another five or ten after login.

        Similar for my netbook running Ubuntu.

        But, still, the OS could probably do a lot of things to improve boot performance on a hard drive. The average Windows machine seems to thrash the disk for two or three minutes at startup, presumably because it’s demand-loading a ton of programs and DLLs rather than loading the whole thing in one go.
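
        For what it’s worth, on POSIX systems an application can at least ask the kernel to pull a file into the page cache in one sequential sweep instead of demand-faulting it in piece by piece; a rough sketch of the idea (Windows has its own prefetcher, so this is only an analogy):

        ```c
        #include <fcntl.h>
        #include <unistd.h>

        /* Hint the kernel to read the whole file into the page cache now,
         * in one sequential pass, instead of paging it in on demand. */
        int prefetch_file(const char *path)
        {
            int fd = open(path, O_RDONLY);
            if (fd < 0)
                return -1;
            /* offset 0, length 0 means "the entire file" */
            int rc = posix_fadvise(fd, 0, 0, POSIX_FADV_WILLNEED);
            close(fd);
            return rc;
        }
        ```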

  2. In my day job, we write software that has to run 24/7/365. All the major bugs we’ve had in the last few years are due to third-party libraries, or to connections to third-party systems that barf on some data we send them or send us bad data that we reject or drop: memory leaks, memory corruption, going batsh*t crazy and spawning an ever-increasing number of requests, or, as with Heartbleed, failing to validate data from other systems before processing it, then accessing invalid memory and crashing.
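
    Heartbleed came down to exactly that last failure: trusting a length field supplied by the peer. A simplified sketch of the pattern (illustrative only, not OpenSSL’s actual code):

    ```c
    #include <stdint.h>
    #include <string.h>

    /* Echo back 'claimed_len' bytes of the peer's payload. The vulnerable
     * version trusted claimed_len; the whole fix is the bounds check. */
    int build_echo(uint8_t *out, size_t out_cap,
                   const uint8_t *payload, size_t payload_len,
                   size_t claimed_len)
    {
        /* Validate the peer's length field against what actually arrived. */
        if (claimed_len > payload_len || claimed_len > out_cap)
            return -1;          /* reject: don't read past the buffer */
        memcpy(out, payload, claimed_len);
        return 0;
    }
    ```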

    Years ago, we’d write all our software ourselves, at least everything that ran between the operating system and the graphics interface. Today, it’s largely a matter of taking a bunch of third-party libraries and hooking them together with the logic required for our particular usage, and many, if not most, of those third-party libraries are crap. But no one is willing to pay enough for us to write our own code that isn’t.

    Plus, the number of software developers has exploded, which means the average software developer is far less skilled than before. The dot-com boom, for example, brought in many who just wanted to throw some crap together and become the next pets.com, and who cares whether it’s still working after the IPO?

    Frankly, these days I’m more surprised when something works than I am when it doesn’t.

  3. If you can’t tolerate reading this horrible alternating background image/giant font style, allow me to summarize: most of the article is about software security – the least ordered item on the menu. I work in the software security industry, and what I’m about to say is heresy.

    To most of us, our personal safety, the safety of our families and the security of our couch, golf clubs and television set are a lot more important than whether or not our web server is installed properly, but you’d never know it from how geeks go on about software security. How many locks do you have on your front door? Why don’t you have bars on the windows? With rare exception, the vast majority of us know our homes can be broken into by any remotely competent band of thieves, but it doesn’t keep us up nights worrying. Even those of us who have safety screens or an alarm system are more likely to worry about whether the people we want to get into our home can do so than about keeping undesirables out.

    With rare exception, no-one gives a shit about security. Asking why programmers don’t write more secure software is akin to asking why builders don’t make more secure houses – few people want them.

    1. The cost of physical intrusion is higher than the cost of cyber intrusion. If you break into a house, the homeowner may be waiting with a gun, there could be a dog, or a cop could be waiting for you when you come out. The number of houses or businesses a diligent burglar can enter is at best a handful per day, while a little bit of clever code can infect 500,000 machines in an evening.

      Add in, for physical security, the insurance companies demanding security, managers demanding locks, and people paying taxes for cops. On the internet, a wild-west free society of libertarians, well, there are some cyber task forces, but that’s about it.

      I guess the market doesn’t value security.

  4. The political system is a complicated bureaucracy of laws impossible to reconcile or understand, operated by corrupt civil servants gaming the system. Software is a complicated assemblage of libraries and practices directed by users who don’t understand their own goals and lack knowledge of how software functions (instead treating it as if it were magical).

    Perhaps people have reached the limits of their brain power. After this: simplification, poverty, wistfulness about when things were better.

  5. Yeah, we know all this stuff. We paid the price – gladly – for having ever more complex and wonderful software, by making the software so complex that no one can understand it.
    My hospital’s head of IT security is pretty clear about what his job is: making sure that the hospital doesn’t get sued or fined for allowing unauthorized access to patient data, by doing enough encryption and such that no one can blame us when it happens. (HIPAA pardons data loss on a lost laptop if it is encrypted, even if the password to the encryption is taped to the laptop.) He knows data can and will be lost, but we don’t want to pay fines for it.

    1. I guess I would add that this is the _right_ choice in today’s IT environment. It is the only way to make very large, very complex, very awesome applications. They just involve too many pieces, and levels of pieces. Accept this or give up on having Madden 2014.
      The real question is, what guidelines should people follow given that this is the case?

  6. Whenever I thought of software nightmares, I would remember and chuckle about the point in Vernor Vinge’s novel “A Deepness in the Sky” wherein, 8,000 years from now, people (well, sort-of people) are still tracking down bugs that originated deep in the Microsoft heart of their systems.

    It’s now become evident that there may be a lot more than Microsoft providing several-thousand-year-old bugs.

  7. I repair computers for a living, and the number one question I get asked after someone’s computer crashes is, “Why did this happen?” I tell them, “Some interaction among the applications, services, and hardware systems running on the computer produced a ‘perfect storm,’ so to speak, that corrupted a key component of the operating system needed to boot the computer; that, or your hard drive crashed.”

    I will say that it has gotten insane how many patches Microsoft has been pushing out since the Heartbleed fiasco. Now when I rebuild a computer with a clean baseline install of Windows 7, it takes a few days for the computer to download and install the 400-some-odd fixes/patches that have been released to date. It’s past time for Microsoft to release another service pack for Windows 7.

  8. This article illustrates why I think “The Singularity” is and always will be a geek fantasy.

    1. Can you describe your thinking in more detail?

      When I first read your comment, my reaction was to say you could just as well say “This article illustrates why I think X is and always will be a geek fantasy”, where X is any complicated software goal, particularly those requiring privacy and a high degree of competence. Before PayPal, you could say “this article illustrates why online money transfers will always be a geek fantasy”.

      Are you saying that there won’t be an ever-increasing speed-up in computational capability or are you saying that “The Singularity” itself, whatever that means, can’t happen because it would depend on too much complexity, or are you saying something else?

      Why wouldn’t we just have a very buggy, very insecure sort of Singularity, just as our own brains are incredibly complicated, but rather buggy and quite insecure?

      1. Hosting a human consciousness in a machine (which necessarily won’t be representative of the brain), without corrupting the consciousness to the point where it was no longer the same consciousness, would require a perfect operating system and a perfect brain-function emulator.

        1. I think the answer there is to accept “sufficiently the same” rather than “exactly the same”.

          After all, to go right to an extreme example for illustration purposes, you can puncture a brain with a bullet or knock it with a hammer, and sometimes it’ll keep on ticking, to the extent that the person and their friends all agree that the person is still the same person.

          Brains, like other body parts, seem pretty robust. Presumably personalities hosted by brains are too. Software can be robust as well – a computer program that assembles an image might have a glitch here and there (I have one right now that draws all lines except ones parallel to the horizontal axis, due to a silly geometry mistake; a sketch of that sort of bug follows below) but the image still comes through to the extent that the output is still useful.

          Think of a personality as an old-time radio show: you turn the dial, trying to get good reception, static, static, and then hey, turn the dial a bit more and you hear the radio program; you can understand the story the actors’ voices are enacting, even though there is still a little static in the background…. Similarly, we can pick out voices at a crowded, noisy party, we can see even when our glasses are splattered with mud, and we can reason even when we have “food coma” or too much alcohol or any number of other sources of minor brain impairment… I realize this is impressionistic rather than scientific, but my point is that a personality might be robust enough to handle some noise, noise in the form of bugs giving bad data every once in a while.
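
          Since I mentioned the geometry bug, here is a hypothetical reconstruction of that kind of mistake in C: a line rasterizer that steps along the y axis, so a horizontal line (y0 == y1) silently draws nothing while every other line comes out fine.

          ```c
          #include <stdio.h>

          /* Deliberately buggy rasterizer: it iterates from y0 toward y1,
           * so when y0 == y1 the loop never runs and the horizontal line
           * silently vanishes -- the rest of the image still looks right. */
          void draw_line(int x0, int y0, int x1, int y1)
          {
              int step = (y1 >= y0) ? 1 : -1;
              for (int y = y0; y != y1; y += step) {  /* bug: skips y0 == y1 */
                  int x = x0 + (x1 - x0) * (y - y0) / (y1 - y0);
                  printf("plot(%d, %d)\n", x, y);
              }
          }
          ```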

  9. I’ve said it here before but I guess it bears repeating. Every software product starts life as a vital organ and then, over time, owing to uncontrolled growth and complexity, becomes a tumor.

  10. Back in 1990, I had something of an epiphany about the nature of software. I was a Space Surveillance Crew Commander for the Cobra Dane intelligence radar at Shemya, AK. On a routine Saturday day shift, my crew quietly went about our tasks. My Space Console Operator (SCO) was setting up short-term fences (radar search patterns) for high-interest objects, something she had done hundreds of times before. When she entered the information and hit compute, the radar’s computer crashed. Crashing a computer that controls a 16-megawatt phased-array radar is no small thing. Our crew quickly executed the checklist procedures and my SCO resumed her task. When she once again hit compute, the computer crashed. Working with our Raytheon contractors, we duplicated the crash twice more to confirm what was happening and then abandoned short-term fences for that day. The investigation showed that a small, routine patch had uncovered a bug that had existed in the Fortran compiler’s math library since it was created. At the time, the Dane still used the original computer system that dated from the mid-1970s. The bug had lain dormant for over 15 years before the right set of circumstances revealed it.

    Computers have multiple layers. Inside the CPU, there is microcode. As the Pentium floating-point error of the mid-1990s proved, microcode can have errors. Above that, you have the computer’s BIOS, which can and does have errors. On top of that, there’s the physical design and construction of the computer itself, which can have errors, especially as the motherboard ages. On top of that, there’s the operating system with its host of errors. Next, you have the applications that are running, each with their own set of errors. In most cases, the applications are written using compilers which, as my example shows, can have errors of their own.
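
    The Pentium case is worth making concrete: a few lines of arithmetic were enough to expose a flaw baked into the chip’s divide hardware. This is the commonly cited check built from Thomas Nicely’s failing operands; a correct FPU prints 0, while the flawed original Pentium printed 256.

    ```c
    #include <stdio.h>

    /* Classic FDIV check: x - (x / y) * y should be (essentially) zero. */
    int main(void)
    {
        double x = 4195835.0, y = 3145727.0;
        printf("%f\n", x - (x / y) * y);
        return 0;
    }
    ```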

    Any given programmer only has control over his own code. His code could, in theory at least, be perfect but is still subject to all of those other errors. Nothing he can do will eliminate all those other errors. At best, he can try to make his code fault-tolerant of known errors beyond his control. There is no such thing as an error-free computer system. You’d have to validate every one of those layers and that is beyond anything we could reasonably accomplish. If you had unlimited time and money, I suppose you could custom build a computer system with validated code at every layer. However, it’d have to be a simple system with rigorous controls and it would cost a fortune.

  11. …And yet somehow the world goes on…

    After forcing myself through the linked article, I discovered that a journalist with an interest in computers wrote an unsourced article saying “everything is broken.” She also went on (and on) about the NSA, as if that has anything to do with the article. Or does she think the NSA can only spy on people via software vulnerabilities?

    Here I thought I would read something written by a security expert or coding professional. Instead I get “True Confessions” anonymous stories.

    What’s depressing is that people take her seriously.

  12. Solid code can be written. Most programmers don’t know what solid code is. Simplicity is the first requirement (elegance, actually, but that’s harder to understand). However, you usually have some dependency on code you have no control over. Richard Stallman and others have tried to address the problem.

  13. Back in the late ’70s, when I started programming professionally (microcode, assembly, C), my father (a NASA pilot) handed me a paper written by a colleague at NASA Ames. The conclusion was similar even then, but regarding the robustness and reliability of control-systems software instead of web apps, etc.
    Plus ça change, plus c’est la même chose. (The more things change, the more they stay the same.)
    I have the cure, by the way. Just read this anecdote:
    Scene: a bunch of real-time systems programmers, griping that the Air Force wants “program load verification” (don’t ask, this was 1982) included in the system. “That will slow the system to a crawl!” we all yelled. Enter a grizzled old (OK, maybe 35) Sr. Systems Engineer. He regards us with a jaundiced eye, and slowly drawls (Texan-style) “I want y’all to remember, there’s NUCLIAR BOMBS attached to this thang. Get’n the target data k’rect is kind’a more m’portant than speed.”
    (Yeah, he actually pronounced it NUCLIAR. Sue me.)
    So we slowed the system down to get it right (and thank God it was never stress-tested in action). I frequently remembered that incident in my later, less bomb-ful, career.
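
    For anyone wondering what “program load verification” amounts to: after loading the program image, you read it back and check it against a value recorded when the image was built, before you trust it. A toy sketch (the checksum scheme and names are invented for illustration):

    ```c
    #include <stdint.h>
    #include <stddef.h>

    /* Toy load verification: sum the bytes of the loaded image and compare
     * against the checksum recorded at build time. Slow on 1982 hardware,
     * hence the griping, but it catches a corrupted load before the system
     * acts on bad target data. */
    int verify_load(const uint8_t *image, size_t len, uint32_t expected)
    {
        uint32_t sum = 0;
        for (size_t i = 0; i < len; i++)
            sum += image[i];
        return (sum == expected) ? 0 : -1;  /* 0 = verified */
    }
    ```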

Comments are closed.