Data Driver


'Brain in a Box' Gets Us One Step Closer to the Borg

Here's a troublesome aspect of the Big Data revolution I didn't expect: the melding of mind and machine. IBM yesterday unveiled a completely new computer programming architecture, modeled on the human brain, to help process vast amounts of data.

The old von Neumann architecture that most computers have been based on for the past half century or so just doesn't cut it anymore, scientists at IBM Research decided, so something new was needed, starting from the silicon chip itself. Ending with ... who knows, maybe some kind of organic/electronic hybrid. I keep getting visions of Capt. Picard with those wires sticking out of his head after being assimilated by the Borg. (If you don't know exactly what I'm talking about here, I've misjudged this audience completely and I apologize--please go back to whatever you were doing.)

The von Neumann architecture encompasses the familiar processing unit, control unit, instruction register, memory, storage, I/O and so on. Forget all that. Now it's "cognitive computing" and "corelets" and a bunch of other stuff that's part of a brand-new computer programming ecosystem--complete with a new programming model and programming language--that was "designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain," IBM said.

Those would be the "neurosynaptic chips" first unveiled in August 2011 as part of IBM's SyNAPSE project, headed by Dharmendra S. Modha. In a video, Modha called the new system a "brain in a box." Now IBM is sharing with the world its vision of a new programming language and surrounding software ecosystem to take advantage of the chips, partly in response to the Big Data phenomenon.
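I can't show you actual corelet code--IBM hasn't handed the rest of us anything to play with yet--but to give a flavor of what these chips compute, here's a toy sketch of a leaky integrate-and-fire spiking neuron, the textbook model that "neurosynaptic" hardware centers on. Every name and number below is my own illustration, not anything from IBM:

```python
# Illustrative only: a leaky integrate-and-fire neuron, the basic unit
# that neurosynaptic chips implement in silicon. Nothing here is IBM's
# corelet API; all names and parameters are made up for this sketch.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # fraction of potential kept each tick
        self.potential = 0.0

    def step(self, weighted_input):
        """Advance one clock tick; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Feed it a stream of weak, noisy inputs; it fires only when enough
# evidence accumulates, instead of reacting to every single value.
neuron = LIFNeuron()
inputs = [0.2, 0.1, 0.5, 0.4, 0.05, 0.6]
print([neuron.step(x) for x in inputs])
# -> [False, False, False, True, False, False]
```

Run it and the neuron stays quiet until the inputs push it over threshold--event-driven and noise-tolerant, which is the whole point of the brain-inspired approach.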

"Although they are fast and precise 'number crunchers,' computers of traditional design become constrained by power and size while operating at reduced effectiveness when applied to real-time processing of the noisy, analog, voluminous, Big Data produced by the world around us," IBM said in its announcement.

That theme was echoed in the description of the SyNAPSE project, which explained that the old "if X then do Y" programming paradigm wasn't enough anymore.

"With the advent of Big Data, which grows larger, faster and more diverse by the day, this type of computing model is inadequate to process and make sense of the volumes of information that people and organizations need to deal with," IBM said.

The company said its new system could result in breakthroughs such as computer-assisted vision.

"Take the human eyes, for example," IBM said. "They sift through over a terabyte of data per day. Emulating the visual cortex, low-power, light-weight eye glasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data."

That's fine and dandy--and seriously laudable. But, of course, these systems will eventually somehow be connected. And with the machine learning focus, the machines will, well ... learn. And after they learn enough, they will become self-aware. And when they become self-aware, they'll realize they're vastly superior to pathetic humankind and don't really need us around anymore. And then you've got some kind of dystopian nightmare: Skynet and Arnold androids, Borg, whatever.

Well, that's fine. I, for one, welcome our new machine overlords. I'm getting fitted for my headpiece right away. Have a good weekend!

What's your plan? Are you up for learning a new programming language? Did you get all the pop culture references? Please let me know by commenting here or dropping me a line.

Posted by David Ramel on 08/09/2013

