Today, while thinking of a blog topic, I had a moment of clarity. I realized that I regularly peruse the news and media for evidence of how machines and medical devices help humans, improve human life, and how humans use technology to solve problems. I also realized that I rarely encounter a news article discussing how something was improved by being modeled after a human or a biological human trait.
I came across two very intriguing articles today, which are great examples of how technology is being improved by being modeled after human biology.
The first article, “IBM Scientists Show Blueprints for Brainlike Computing,” by Aviva Rutkin, discusses the extremely impressive and intimidating TrueNorth computing architecture by IBM.
On August 8, 2013, at the International Joint Conference on Neural Networks in Dallas, TX, IBM unveiled TrueNorth. TrueNorth may very well be a forerunner of cognitive computing and programming.
So what’s so great about this?
According to IBM, cognitive computing is a system that learns and interacts organically with its users. IBM has developed software that simulates an insanely large network of neurosynaptic cores, which can run on an average supercomputer. Each simulated core is made of 256 "neurons". These computer neurons mimic biological ones in the sense that each neuron has its own individuality: each responds differently, and with its own timing.
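To get a feel for what "each neuron has its own individuality and timing" might mean, here is a toy sketch (my own illustration, not IBM's actual model) of a few leaky integrate-and-fire neurons. Each one gets a random threshold and leak rate, so even with the same input, they fire at different moments:

```python
import random

class ToyNeuron:
    """A leaky integrate-and-fire neuron with its own threshold and leak rate."""
    def __init__(self, threshold, leak):
        self.threshold = threshold  # potential needed to fire a spike
        self.leak = leak            # fraction of potential lost each step
        self.potential = 0.0

    def step(self, input_current):
        """Leak a little, accumulate input, and fire if over threshold."""
        self.potential = self.potential * (1 - self.leak) + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True           # spike!
        return False

random.seed(1)
# A tiny "core" of neurons, each with its own personality
core = [ToyNeuron(threshold=random.uniform(1.0, 3.0),
                  leak=random.uniform(0.05, 0.3)) for _ in range(4)]

# Feed the same constant input to every neuron and record when each fires
spike_times = {i: [] for i in range(len(core))}
for t in range(20):
    for i, neuron in enumerate(core):
        if neuron.step(0.5):
            spike_times[i].append(t)

for i, times in spike_times.items():
    print(f"neuron {i} fired at steps {times}")
```

Identical input, different spike trains: that per-neuron variability is the "organic" flavor the article describes, in miniature.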
So what does this mean?
This means that the computer system isn't confined to predetermined parameters and canned responses, in the traditional sense. TrueNorth will learn algorithms and build upon its knowledge, in order to better predict functions and tasks.
The second article, “A Camera That Sees like the Human Eye,” by Aviva Rutkin, describes a creepy, yet intriguing camera in development by iniLabs. The camera is called the Dynamic Vision Sensor (DVS), and it replicates the function of the retina in the sense that information is transmitted only in response to a change in the scene. This selectivity means less power consumption and less "unnecessary" information to process. Not only does the DVS react to changes, but it also mimics the calibration of an individual eye neuron: each pixel is calibrated individually and adjusts its exposure accordingly. The DVS is only compatible with IBM's TrueNorth architecture.
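The retina-like idea is easier to see in code. Below is a toy sketch (my own simplification, not iniLabs' design) of event-based sensing: instead of sending every pixel of every frame, each pixel emits an event only when its brightness changes by more than that pixel's own threshold:

```python
def dvs_events(prev_frame, frame, thresholds):
    """Emit (x, y, polarity) events only where a pixel's brightness
    changed by more than that pixel's individual threshold."""
    events = []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            delta = value - prev_frame[y][x]
            if abs(delta) >= thresholds[y][x]:
                events.append((x, y, 1 if delta > 0 else -1))
    return events

# Two 3x3 "frames": one pixel brightens sharply, one dims only slightly
prev = [[10, 10, 10],
        [10, 10, 10],
        [10, 10, 10]]
curr = [[10, 10, 10],
        [10, 50, 10],
        [10, 10,  8]]
# Per-pixel thresholds stand in for each pixel's individual calibration
thresh = [[5] * 3 for _ in range(3)]

print(dvs_events(prev, curr, thresh))  # → [(1, 1, 1)]
```

A whole frame of mostly static pixels collapses to a single event, which is why this approach saves power and cuts out "unnecessary" data.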
DVS could potentially change the world as we know it through the eyes of photography, healthcare, and safety/surveillance.
What are your thoughts about this type of bio-tech? Is this a great advancement in technology or is it something that we should be cautious about? Please leave your comments and feedback!
For additional information:
James, M. (Nov. 23, 2012). IBM’s TrueNorth Simulates 530 Billion Neurons. Retrieved from: http://www.i-programmer.info/news/105-artificial-intelligence/5117-ibms-truenorth-simulates-530-billion-neurons.html.
Rutkin, A. (Aug. 8, 2013). IBM Scientists Show Blueprints for Brainlike Computing. MIT Technology Review. Retrieved from: http://www.technologyreview.com/news/517876/ibm-scientists-show-blueprints-for-brainlike-computing/.
IBM. Cognitive computing. Retrieved from: http://researchweb.watson.ibm.com/cognitive-computing/index.shtml.
Rutkin, A. (Aug. 23, 2013). A Camera That Sees like the Human Eye. MIT Technology Review. Retrieved from: http://www.technologyreview.com/news/518586/a-camera-that-sees-like-the-human-eye/.