Manycore Processors: Everything You Know (about Parallel Programming) Is Wrong!

David Ungar is "an out-of-the-box thinker who enjoys the challenge of building computer software systems that work like magic and fit a user's mind like a glove." This is a summary of his talk at SPLASH 2011 in November 2011.
At the end of the first decade of the new century, chips such as Tilera’s can give us a glimpse of a future in which manycore microprocessors will become commonplace: every (non-hand-held) computer’s CPU chip will contain 1,000 fairly homogeneous cores. Such a system will not be programmed like the cloud or even a cluster, because communication will be much faster relative to computation. Nor will it be programmed like today’s multicore processors, because the illusion of instant memory coherency will have been dispelled by both the physical limitations imposed by the 1,000-way fan-in to the memory system, and the comparatively long physical lengths of the inter- vs. intra-core connections. In the 1980s we changed our model of computation from static to dynamic, and when this future arrives we will have to change our model of computation yet again.
If we cannot skirt Amdahl’s Law, the last 900 cores will do us no good whatsoever. What does this mean? We cannot afford even tiny amounts of serialization. Locks?! Even lock-free algorithms will not be parallel enough. They rely on instructions that require communication and synchronization between cores’ caches. Just as we learned to embrace languages without static type checking, and with the ability to shoot ourselves in the foot, we will need to embrace a style of programming without any synchronization whatsoever.
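To see why even lock-free code is not parallel enough, here is a rough C++ sketch (my own illustration, not code from the talk or the Renaissance project). The increment holds no lock, but the compare-and-swap is still an atomic read-modify-write that drags the counter's cache line, in exclusive mode, to whichever core is updating it, so 1,000 cores end up taking turns anyway.

    #include <atomic>

    std::atomic<long> counter{0};

    // "Lock-free" increment: no lock is held, but each attempt is an atomic
    // read-modify-write. Every retry forces the cache line holding `counter`
    // into exclusive state on the updating core, so the increments are
    // effectively serialized by the memory system.
    void increment() {
        long old = counter.load();
        while (!counter.compare_exchange_weak(old, old + 1)) {
            // On failure, `old` is refreshed with the current value; try again.
        }
    }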
In our Renaissance project at IBM, Brussels, and Portland State, we are investigating what we call “anti-lock,” “race-and-repair,” or “end-to-end nondeterministic” computing. As part of this effort, we have built a Smalltalk system that runs on the 64-core Tilera chip, and have experimented with dynamic languages atop this system. When we give up synchronization, we of necessity give up determinism. There seems to be a fundamental tradeoff between determinism and performance, just as there once seemed to be a tradeoff between static checking and performance.
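To get a feel for what giving up synchronization means, here is a toy C++ sketch of my own (the Renaissance project's actual Smalltalk system surely looks nothing like this). Each worker keeps its count on its own cache line and never synchronizes with the others; a reader sums the counts while they are still changing, so the total is approximate and differs from run to run, and that error is tolerated rather than prevented.

    #include <atomic>
    #include <thread>
    #include <vector>

    constexpr int kThreads = 64;  // e.g. one worker per core of a 64-core Tilera chip

    // One counter per thread, padded to its own cache line, so an increment
    // never needs another core's copy of the data.
    struct alignas(64) LocalCount { std::atomic<long> n{0}; };
    LocalCount counts[kThreads];

    void work(int id, long iters) {
        for (long i = 0; i < iters; ++i)
            counts[id].n.fetch_add(1, std::memory_order_relaxed);  // no ordering, no handshake
    }

    // The reader does not stop the writers: the sum it sees is only approximate
    // and nondeterministic. The inaccuracy is accepted, not prevented.
    long roughTotal() {
        long total = 0;
        for (auto& c : counts) total += c.n.load(std::memory_order_relaxed);
        return total;
    }

    int main() {
        std::vector<std::thread> workers;
        for (int i = 0; i < kThreads; ++i) workers.emplace_back(work, i, 1000000L);
        long t = roughTotal();           // read while the writers are still running
        for (auto& w : workers) w.join();
        return t >= 0 ? 0 : 1;
    }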
The obstacle we shall have to overcome, if we are to successfully program manycore systems, is our cherished assumption that we write programs that always get exactly the right answers. This assumption is deeply embedded in how we think about programming. The folks who build web search engines already understand this, but for the rest of us, to quote Firesign Theatre: Everything You Know Is Wrong!
This video, an interview with David Ungar of the IBM Renaissance project on programming manycore computers and non-determinism, is mind-boggling and goes beyond the text above.

 

David Ungar likes anything creative, from poetry to starting a business. He says the moment we build a 1,000-core processor is like discovering a new space. He also talks about Smalltalk, originally developed in the 1970s at Xerox PARC.

Note: On February 7, 2014, I added the original SPLASH 2011 talk, Everything You Know (About Parallel Programming) Is Wrong!: A Wild Screed About the Future.