12 thoughts on “Backlash”

  1. I dunno – I kinda see “the singularity” as World of Warcraft getting better each year. Both UI and graphics/immersion improve. Eventually, it beats “real life” for 99% of the population.

    It already does for a not-insignificant portion.

  2. I am against the notion of injecting my conscious self into a machine and losing my physical body. I also doubt immortality would be a societal advantage; it seems more like a disadvantage. Who needs an immortal Hitler or Stalin?

    Quoting Max Planck, one of the few scientists who was actually productive in his later years:

    “A new scientific truth does not triumph by convincing opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

  3. I am a software engineer. I am certainly not interested in having my consciousness downloaded into a machine where it can be erased, modified, or copied at will by someone else for who knows what reasons. Nor am I interested in merging my mind with someone else’s to make some kind of über-consciousness or hive mind, to “uplift” human development, or anything else like what I keep hearing the singularity proponents say.

    Life extension? That is something I could use. But with a self-healing autonomous physical body. Not inside a machine. Where I retain my individuality, and my mind is mine alone.

  4. “Life extension? That is something I could use. But with a self-healing autonomous physical body. Not inside a machine. Where I retain my individuality, and my mind is mine alone.”

    So, you don’t think you’re a machine…?

  5. While Stalin died of old age, Hitler didn’t. And it’s worth noting that a lot of dictators, crime lords, and other unpleasant people died of something other than old age. What this suggests to me is that we have ways to deal with immortal unpleasant people so that they aren’t always around.

    Another thing that puzzles me is that apparently the article only considers “backlash” from within the transhumanist community (incidentally, I consider that a sign the Singularity meme is finally running out of gas even among the people most likely to believe in it).
    One doesn’t have to stray far to see resistance from a lot of other groups. For example, academia is traditionally hostile to the idea that it could be replaced at some point by machines. It also produces a lot of the ethicists who propose control of scary technologies. Religion varies greatly in its reaction to technology and to building intelligence, but it’s likely that a large portion will see the construction of an artificial intelligence (typical Singularity stage) smarter than humans as sacrilege.

    Then there’s a huge pile of Luddites of the socialist or environmentalist persuasion. Given that it’s likely somebody or something will profit greatly from a singularity event, this “greed” and the scary technology involved will spur them to opposition. They already oppose many technological advances like genetic modification, nuclear power, financial derivatives, and industry in general. My view is that there has been a huge backlash against technological advancement starting in the ’60s. Recently, they’ve stepped up the aggressiveness of their belief system with the “Precautionary Principle”: the idea that you should be required to show a technology is sufficiently safe before you’re allowed to develop it.

  6. If you want to put it in those terms, sure, I am a machine made from carbon-based “wet life” nanotechnology. But the ultimate “improvements” or “upgrades” from the Singularity camp seem like retrograde steps to me. Direct brain-computer interfaces that allow inputting data into the brain? Mao would have loved those.

    Inheritance, mutation, selection, and crossover are cool and have served life well (a toy sketch of those four operators in code appears at the end of this comment). I think of it as the free-market, competitive evolutionary model versus the planned-economy, genetically engineered/machine-mind model.

    I have little doubt that human genetic engineering will eventually be used. I can even guess at the results. We often say in software that only 1 in 3 software projects works as expected (this is often due to the complexity of modern software, which can run to hundreds of thousands or millions of lines of code). Our genetic code is of similar or worse complexity. There will be successes, there will be failures, there will be Y2K scares, there will be Ariane 5 launch failures. Life will go on, as long as we don’t stupidly mandate everyone into a monoculture by imposing “upgrades” on people.

    The day we stop reproducing is the day our species has stagnated and will eventually die. So hopefully we will not start down a path that leads there. So yeah, I am not that interested in a virtual World of Warcraft future.
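
    For anyone who hasn’t met those four operators outside of biology, here is a minimal, purely illustrative sketch of them as a toy genetic algorithm in Python. Every name and number in it (the bit-string target, the population size, the tournament size, the mutation rate) is invented for the example and isn’t taken from anything above.

    ```python
    import random

    # Made-up toy problem: evolve a 20-bit string toward a fixed target.
    TARGET = [1] * 20
    POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 100, 0.02

    def fitness(genome):
        # Count how many bits already match the target.
        return sum(1 for g, t in zip(genome, TARGET) if g == t)

    def select(population):
        # Selection: pick the fittest of three random candidates (tournament selection).
        return max(random.sample(population, 3), key=fitness)

    def crossover(a, b):
        # Crossover/inheritance: the child takes a prefix from one parent and a suffix from the other.
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def mutate(genome):
        # Mutation: occasionally flip a bit at random.
        return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population = [mutate(crossover(select(population), select(population)))
                      for _ in range(POP_SIZE)]
    print(max(map(fitness, population)), "of", len(TARGET), "bits match the target")
    ```

    Tournament selection is only one of many possible selection schemes; the point of the sketch is just to show how inheritance, crossover, mutation, and selection fit together.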

  7. Of course, Frank Herbert had already seen all of it coming before I was even born, so heh.

  8. “The day we stop reproducing is the day our species has stagnated and will eventually die.”

    We don’t have to stagnate and die just because our original species does. Evolution doesn’t account for our elaborate culture or our growing knowledge of the universe. Further, we’re entering a realm where the concept of a species no longer makes sense. Sure, we can keep doing things the way the birds and the bees do, but there’s little reason to keep it up when instead you can be anything you can imagine.

    Also, I agree with Rand. We are already machines. Sure, direct data download to the brain is kind of icky, but so is writing strange symbols on paper, making a fire, or growing stuff in the ground. We do all sorts of things that some previous humans would have found questionable or lazy. But we’ve found ways to work with the technology.

    We may even be figments in somebody’s computer simulation; you can never really tell from the inside.

  9. …singularity skeptic.

    “So, you don’t think you’re a machine…?”

    I may well be a machine, but I sure don’t understand how I think and, so far, neither does anybody else.

    “…but it’s likely that a large portion will see the construction of an artificial intelligence (typical Singularity stage) smarter than humans as sacrilege.”

    …seems to assume construction of an artificial intelligence is going to happen sometime reasonably soon (e.g. within the next 300 years).

    Indeed, the construction of an artificial intelligence is a typical singularity stage, and I think it’s a lot harder than singularity believers think it is. I think we’ll be able to make software do all sorts of things humans do by thinking, but creativity, desire, imagination, ambition, independent goal-setting, and (if necessary) consciousness are going to be tough nuts to crack.

    Of course, I could be wrong.

  10. The Singularity religion is based upon an unproved and unprovable assumption (or belief, if you will): that energy, matter, time, and space are all that exist. Article two of this dogma states that consciousness is entirely an epiphenomenon of the computational activity of the brain.

    I trust someone will let me know when these articles of faith are somehow demonstrated by means of reason — or, indeed, when someone manages to define and quantify consciousness. Until then, I’ll stick to my own beliefs: that not everything Real can be poked with a stick, and that a man is more than electrified meat.

Comments are closed.