Forget HAL 9000; Here Come the Cylons
For some time now, Google's ultra-secretive and appropriately titled "X Laboratory" technology center has been quietly creating one of the largest artificial intelligences in existence. After linking more than 16,000 computer processors to create a neural network with over one billion connections, researchers at the X Lab exposed their new creation to 10 million digital images taken from YouTube videos.
The experiment is just one example of the many possibilities unleashed by such large-scale software simulations, known as "deep learning" models. These simulations tap the power of massive computing data centers to build programs that can mimic higher-level brain functions such as vision and perception, speech recognition, and language translation. In fact, just last year Microsoft scientists presented research showing how such systems could be used to understand human speech.¹
To do this, the scientists mimicked what naturally takes place in the brain's visual cortex. "A loose and frankly awful analogy is that our numerical parameters correspond to synapses," said Dr. Ng, a member of the Google team. "It is worth noting that our network is still tiny compared to the human visual cortex, which is a million times larger in terms of the number of neurons and synapses," the researchers wrote.¹
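The idea that "numerical parameters correspond to synapses" can be illustrated with a toy network. The sketch below is not Google's model; it is a hand-built, plain-Python network whose weights and biases are invented for illustration. Each weight plays the role of one synapse, and the network computes XOR, a classic function that requires a hidden layer:

```python
import math

def sigmoid(x):
    # Smooth threshold function, loosely analogous to a neuron's firing response.
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # Each weight is one "synapse": a single tunable number.
    return sigmoid(sum(w * x for w, x in zip(weights, inputs)) + bias)

def xor_net(x1, x2):
    # Hand-picked weights; a real deep-learning model instead learns
    # millions or billions of such parameters from data.
    h1 = neuron((x1, x2), (20.0, 20.0), -10.0)    # behaves like OR
    h2 = neuron((x1, x2), (-20.0, -20.0), 30.0)   # behaves like NAND
    return neuron((h1, h2), (20.0, 20.0), -30.0)  # behaves like AND

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))
```

The point of the analogy is scale: this toy has nine parameters, while the Google network had roughly a billion, and the visual cortex has a million times more still.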
For example, much like our own learning process, it forms context-based connections to discern the difference between an apple you eat and a company you invest in.
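The kind of context-based disambiguation described above can be sketched as a toy program. The entities and keyword sets below are invented for illustration and bear no relation to how Google's Knowledge Graph is actually implemented; the sketch only shows the general idea of scoring candidate meanings against surrounding words:

```python
import re

# Hypothetical candidate entities for the word "apple", each with a few
# context words that hint at that meaning.
ENTITIES = {
    "apple (fruit)":   {"eat", "pie", "orchard", "juice"},
    "Apple (company)": {"invest", "stock", "iphone", "shares"},
}

def disambiguate(sentence):
    # Normalize the sentence into a set of lowercase words.
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    # Pick the entity whose context words overlap the sentence the most.
    return max(ENTITIES, key=lambda e: len(ENTITIES[e] & words))

print(disambiguate("I want to eat an apple pie"))       # apple (fruit)
print(disambiguate("Should I invest in Apple stock?"))  # Apple (company)
```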
The Knowledge Graph
When I first heard about the Knowledge Graph, I assumed that scientists were still years away from creating an artificial neural network that could teach itself, especially given existing limitations in processing power and the bewildering complexity of the human brain. Like many technological developments, these obstacles were overcome much faster than expected.
Granted, we are still a long way off from creating an artificial neural network to rival our own. However, the foundation stone has now been laid; from here, it's only a matter of time.
With time, aided by the power of big data and integrated networks, researchers will create a neural net that first rivals, then quickly exceeds, the intelligence of the average human brain. As scientists draw closer to that goal, they will rely ever more on increasingly intelligent computers to reach it.
This may take a few years, decades, or longer, depending on the limitations of processing capacity and speed, and on the pace of progress in fields such as quantum and bio-computing.
As we become ever more reliant on these highly networked and highly complex computer systems to control macro-functions such as our energy plants, cities, nuclear defense installations, and financial markets, the prospect of "turning them off" or "unplugging them" will become untenable.
Chris Horton is a contributor for Minneapolis-based Internet marketing company SyneCore Technologies. He is an informative resource for those interested in online marketing, earned media, app development, digital signage, software development, and various digital marketing technologies.
Business 2 Community is an independent online community focused on sharing the latest news surrounding Social Media, Marketing, Branding, Public Relations & much more. Every day we feature the thought leadership of our open community of bloggers and aim to provide a balanced view of the business landscape based on industry news, trends and real-life experiences.