Showing posts with label Central processing unit. Show all posts

Thursday, September 11, 2014

Neuromorphic Chips

Dr. Isaac Asimov, head-and-shoulders portrait, facing slightly right, 1965 (Photo credit: Wikipedia)
Is this what you see on the way to Singularity?

Neuromorphic Chips

Traditional chips are reaching fundamental performance limits.
..... The robot is performing tasks that have typically needed powerful, specially programmed computers that use far more electricity. Powered by only a smartphone chip with specialized software, Pioneer can recognize objects it hasn’t seen before, sort them by their similarity to related objects, and navigate the room to deliver them to the right location—not because of laborious programming but merely by being shown once where they should go. The robot can do all that because it is simulating, albeit in a very limited fashion, the way a brain works.
..... They promise to accelerate decades of fitful progress in artificial intelligence and lead to machines that are able to understand and interact with the world in humanlike ways. Medical sensors and devices could track individuals’ vital signs and response to treatments over time, learning to adjust dosages or even catch problems early. Your smartphone could learn to anticipate what you want next, such as background on someone you’re about to meet or an alert that it’s time to leave for your next meeting. Those self-driving cars Google is experimenting with might not need your help at all, and more adept Roombas wouldn’t get stuck under your couch. “We’re blurring the boundary between silicon and biological systems”
...... Today’s computers all use the so-called von Neumann architecture, which shuttles data back and forth between a central processor and memory chips in linear sequences of calculations. That method is great for crunching numbers and executing precisely written programs, but not for processing images or sound and making sense of it all. It’s telling that in 2012, when Google demonstrated artificial-intelligence software that learned to recognize cats in videos without being told what a cat was, it needed 16,000 processors to pull it off.
..... “There’s no way you can build it [only] in software,” he says of effective AI. “You have to build this in silicon.”
...... Isaac Asimov’s “Zeroth Law” of robotics: “A robot may not harm humanity, or, by inaction, allow humanity to come to harm.”
..... glasses for the blind that use visual and auditory sensors to recognize objects and provide audio cues; health-care systems that monitor vital signs, provide early warnings of potential problems, and suggest ways to individualize treatments; and computers that draw on wind patterns, tides, and other indicators to predict tsunamis more accurately.
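To give a rough sense of what "simulating the way a brain works" means in code: below is a toy leaky integrate-and-fire neuron, the basic computational unit that spiking neuromorphic chips implement directly in silicon instead of running a von Neumann fetch-and-execute loop. All parameters and the input drive are illustrative, not drawn from any real chip.

```python
import numpy as np

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return spike times (ms) for a given input-current trace."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks back toward rest while integrating input.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_thresh:       # threshold crossed: emit a spike
            spikes.append(step * dt)
            v = v_reset         # reset and start integrating again
    return spikes

# A steady drive above threshold makes the neuron fire periodically.
spikes = simulate_lif(np.full(200, 1.5))
print(spikes)
```

Computation here is event-driven — the neuron only "outputs" when it spikes — which is why chips built this way can do perception tasks on a smartphone power budget.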


Friday, July 27, 2012

Algorithms And Creativity


Can Creativity be Automated?
Computer algorithms have started to write news stories, compose music, and pick hits.
..... The process record labels use to find new talent—A&R, for "artists and repertoire"—is fickle and hard to explain
..... an algorithm tasked with finding hit songs.
.... hundreds of books and studies that have attempted to explain creativity as the product of mysterious processes within the right side of the human brain. Creativity, the thinking has been, proves just how different people are from CPUs.
..... When Novak submitted a song to McCready's engine through the Web, it was graded on a par with classic hits such as "I've Got a Feeling" by the Eagles and Steppenwolf's "Born to Be Wild".
You may have a hit.
Music X-Ray's algorithms use Fourier transforms—a method of separating a signal from the "noise" of complex data—to isolate a song's base melody, beat, tempo, rhythm, octave, pitch, chords, progression, sonic brilliance, and several other factors that catch a listener's ear. The software then builds three-dimensional models of the song based on these properties and compares it with hit songs of the past. Putting a just-analyzed song on the screen with No. 1 tracks of yore shows a kind of cloud structure filled in with dots representing songs. The hits tend to be grouped in clusters, which reveal similar underlying structures. Get close to the middle of one of those clusters and you may have a hit.
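The Fourier-transform step is easy to sketch. The snippet below is a hypothetical illustration only (Music X-Ray's actual pipeline is proprietary): it uses NumPy's FFT to recover the dominant pitches from a synthetic one-second chord, the kind of raw feature such software would then feed into its models.

```python
import numpy as np

sample_rate = 8000                          # samples per second
t = np.arange(0, 1.0, 1 / sample_rate)      # one second of "audio"
signal = (np.sin(2 * np.pi * 440 * t) +     # A4
          np.sin(2 * np.pi * 554 * t) +     # roughly C#5
          np.sin(2 * np.pi * 659 * t))      # roughly E5

# Transform to the frequency domain and find the strongest components.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / sample_rate)
top = freqs[np.argsort(spectrum)[-3:]]      # three strongest peaks

print(sorted(top))   # recovers ~[440.0, 554.0, 659.0]
```

A real system would repeat this over short windows to track melody, beat, and tempo over time, but the principle is the same: music decomposes cleanly into numbers.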
And not just music: writing, too. This bodes well for natural language search and communication, the idea that you could have a conversation with anyone in real time regardless of whether they speak your language.
Music lends itself naturally to being parsed by algorithms—mathematics is mixed up in every chord, beat, and harmony that we hear. ..... Bots can't yet script prose worthy of awards, but on some metrics of economic importance to publishers—such as number of page views a site registers—bots can be far more productive than any journalist. They can write articles in seconds ...... his newer ones have consistently composed classical music that imitates masters like Johann Sebastian Bach so well that people can't always tell the difference

Thursday, May 05, 2011

Intel's 3D Transistors: Moore's Law Marches On

This is a big leap in a core sector of computing hardware. We will reap the benefits for years. This is great news for the mobile web, and for computing in general.

Intel Explains

Friday, December 10, 2010

Eric Schmidt's Cloud Computing And My IC Vision


The Official Google Blog: Cloud computing: the latest chapter in an epic journey: It’s extraordinary how very complex platforms can produce beautifully simple solutions like Chrome and Chrome OS
...... but then there are very few genuinely new ideas in computer science. The last really new one was public key encryption back in 1975.
..... But the web is not really cloud computing—it’s an enormously important source of information, probably the most important ever invented. One major web innovation cycle happened in 1995—remember the Netscape IPO, Java and all of that—ultimately leading, in 1997, to an announcement by Oracle (and a bunch of other people including myself) called “the network computer.” It was exactly what the Chrome team at Google was talking about on Tuesday.
....... Moore's law is a factor of 1,000 in 15 years—so 15 years ago versus today, we have 1,000 times faster networks, CPUs and screens.
...... Asynchronous JavaScript XML, or AJAX, came along in 2003/04, and it enabled the first really interesting web apps like Gmail to be built.
...... LAMP, which stands for Linux, Apache, MySQL, PHP—and Perl, Python and various other Ps—evolved as a platform for the back-end.
........ Instead of building these large monolithic programs, people would take snippets of code and aggregate them together in languages like Java and JavaScript.
..... As usual, Larry and Sergey were way ahead of me on this. From my very first day at Google, they made clear that we should be in the browser business and the OS business.
...... we've gone from a world where we had reliable disks and unreliable networks, to a world where we have reliable networks and basically no disks. Architecturally that’s a huge change—and with HTML5 it is now finally possible to build the kind of powerful apps that you take for granted on a PC or a Macintosh on top of a browser platform.
....... a small team, effectively working as a start-up within Google
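Schmidt's "factor of 1,000 in 15 years" is just Moore's law compounded: a doubling roughly every 18 months gives ten doublings in 15 years, and 2^10 = 1024, which is where the round 1,000x comes from.

```python
# Moore's law rule of thumb: capability doubles about every 18 months.
years = 15
months_per_doubling = 18

doublings = years * 12 / months_per_doubling   # 10 doublings in 15 years
growth = 2 ** doublings

print(growth)   # 1024.0, i.e. roughly 1,000x in 15 years
```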
I am working on a blog post called Google Stole My Idea. I am only half kidding, of course. I first thought of the IC concept in 2000. That was before I ever read about Larry Ellison's network computer vision, which he had apparently talked about a few years before that.

The IC vision was the straw I hung on to when the dot-com collapse happened.