Downloaded Brains versus VGER

And while I'm on a roll trying to dump pent-up writing thoughts: Last month, I read this article in the Guardian, which claims that we'll be downloading our brains in 2050.

Well, it's no secret that I'm a Ray Kurzweil fan and a Singularitarian geek eagerly working toward achieving The Rapture. But it really peeves me when people assume Moore's Law alone will get us there--the idea that, if we just had a Pentium chip with enough mega/giga/tera/petahertz, an assemblage of hardware would magically come to approximate our own brains closely enough to spontaneously gain consciousness, or at least to host one of our own existing cells of awareness.

The problem is that consciousness and cognition aren't a matter of brute-force processing, and even an infinite array of CPUs won't necessarily play a good game of Go or cause a mind to emerge. Sure, incomprehensible computational resources help, but it's the clever algorithms that matter.

Look, here's an analogy: Consider dynamite. It blows things up. Make better dynamite, blow bigger things up. Make infinitely powerful dynamite, and you can blow up anything in the universe. But, you're still just blowing things up--you're not making Mount Rushmore.

Another analogy: Consider a sports car. It goes fast. Make a better sports car, and it goes faster. Make an infinitely fast sports car, and it goes as fast as you can imagine. But it's not a boat or a plane. Drive that car into the ocean and--barring some quirk of inertia that lets insanely fast tires grip water or skip over it altogether--I'll bet the car sinks to the bottom rather than getting you from New York to London.

You can have all the power promised by Moore's Law projected out into infinity, but you'll never build a brain until you figure out what real tricks it has up its sleeves.

For now, I think efforts like Doug Engelbart's Bootstrap Institute are where it's really at. Human brains probably won't get replaced for a few hundred or thousand years yet, but in the meantime we'll wrap ourselves in layer upon layer of cognitive amplification. Hopefully, medicine will reach the point where the little magic meat in the middle can be sustained indefinitely, so we can all become our own versions of VGER.

Archived Comments

  • Les, one thing I've noticed about the computer-as-a-brain idea is that computers do a lousy job of forgetting. Your brain can hide away some useful fraction of your total memory from easy recall so it's not impinging on your day-to-day consciousness, but when computers lose memory or storage, it's through nasty things like disk crashes.
  • I agree that cycles are clearly no substitute for understanding the algorithm. That said, though, I believe we're quite likely to create consciousness (or things that look conscious) without really understanding the mechanisms of consciousness. I imagine that at some point we'll build a model of a human brain -- which might model operations at the neuron level or at the molecular level -- and it will act as if it's conscious. However, we won't actually understand what it's doing that makes it conscious. I think that's a more likely scenario than us understanding consciousness prior to building it.
  • IMO, consciousness forms from random interactions in massively connected systems. Consciousness will evolve out of systems which allow the random noise to exist and feed back on itself. It doesn't need to be fast, just large and patient enough to let things emerge. A randomly-seeded instance of Conway's Game of Life (with additional data injected now and then based on some sort of "sensory input") running on a slow computer would be much more likely to generate a (very slow-thinking) sentient being than a blazing-fast large-scale cluster of supercomputers which is programmed for a specific task. (A rough sketch of this setup follows these comments.)
  • fluffy: The notion that seems most attractive to me so far, with regards to the origins of consciousness, is that you need to have a reason to predict the future behavior of other critters. So, to the degree that you can model and simulate other critters--competitors and co-conspirators--you gain survival advantages. And then, one day, you start taking a shot at predicting your own behavior. Then, it's like pointing a camera at its own monitor, and all hell breaks loose.
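For the curious, here's roughly what that Game of Life setup might look like. This is just a hypothetical sketch: a randomly seeded grid stepping through Conway's standard B3/S23 rules, with a few cells flipped every so often to stand in for the "sensory input" the comment mentions. The grid size, the injection interval, and the `inject_noise` helper are all made up for illustration, not anything from the comment itself.

```python
import random

SIZE = 32          # grid is SIZE x SIZE, wrapping at the edges
STEPS = 200        # how many generations to run
INJECT_EVERY = 25  # how often to inject a little "sensory" noise

def random_grid(fill=0.3):
    """Randomly seeded starting state."""
    return [[random.random() < fill for _ in range(SIZE)] for _ in range(SIZE)]

def live_neighbors(grid, r, c):
    """Count live neighbors on a toroidal (wrap-around) grid."""
    return sum(grid[(r + dr) % SIZE][(c + dc) % SIZE]
               for dr in (-1, 0, 1) for dc in (-1, 0, 1)
               if dr or dc)

def step(grid):
    """One generation of Conway's B3/S23 rules."""
    new = [[False] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            n = live_neighbors(grid, r, c)
            new[r][c] = (n == 3) or (grid[r][c] and n == 2)
    return new

def inject_noise(grid, flips=5):
    """Flip a few random cells, standing in for outside 'sensory input'."""
    for _ in range(flips):
        r, c = random.randrange(SIZE), random.randrange(SIZE)
        grid[r][c] = not grid[r][c]

if __name__ == "__main__":
    grid = random_grid()
    for t in range(STEPS):
        if t % INJECT_EVERY == 0:
            inject_noise(grid)
        grid = step(grid)
        print(t, sum(map(sum, grid)))  # watch the live-cell population wander
```

Nothing sentient comes out of this, of course; the live-cell count is just a cheap way to watch the soup churn. The point is only that the loop the commenter describes (random seed, feedback, occasional outside input, lots of patience) is trivially easy to set up; the hard part is everything else.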