The astounding, jaw-dropping computing resource needs of the Human Brain Project

I quote from my previous post, The Amazing Brain Research and Supercomputing, where I noticed these incredulous comments:
Apr 4 2013: Hello I have a question. If you can simulate 10 000 neurons with a supercomputer now, if you follow Moore's law you will be able to simulate 5 120 000 neurons in 10 years time. The brain has 86 billion neurons. How do you expect to achieve simulation of all of them?
The answer came from another reader:
 May 25 2013: with 168 super computers. Which might sound crazy but isn't that the idea with cloud computing? Grant it we would then potentially have a consciousness on the Internet which if we go by terminator is a bad thing. However it should be a consciousness equivalent to only 1 human which doesn't sound so bad.
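The projection in the first comment can be checked with a few lines of arithmetic. This is a sketch: the doubling cadence is inferred from the quoted numbers, and a clean doubling of simulable neurons per hardware generation is assumed.

```python
# The quoted projection: 10,000 neurons today, 5,120,000 in 10 years.
# That is 5_120_000 / 10_000 = 512 = 2**9, i.e. 9 doublings in 10 years,
# or a doubling roughly every 13 months. The classic 18-month Moore's-law
# cadence would give only ~2**6.7, about 100x, i.e. ~1 million neurons.
neurons_now = 10_000
doublings = 9
projected = neurons_now * 2 ** doublings
print(projected)  # 5120000
```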
Let's do some very simple math. Suppose we need 168 computers the size of the IBM Blue Gene, which costs $20 million and whose cooling alone, using Lake Geneva water, carries an annual price tag of $1 million. That means 168 * ($20 million), a total of roughly $3.4 billion in hardware, plus $168 million per year in cooling. The project's total funding is $1.3 billion, so the cost of the computer hardware alone is about 2.6 times the entire project budget.
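Taking the post's figures as given, the budget arithmetic above can be sketched as:

```python
# Back-of-the-envelope cost. All inputs come from the post:
# 168 Blue Gene-class machines, $20M each, $1M/year cooling per machine,
# $1.3B total Human Brain Project funding.
machines = 168
hardware_cost = machines * 20e6      # $3.36 billion, roughly $3.4B
cooling_per_year = machines * 1e6    # $168 million per year
project_budget = 1.3e9

print(f"hardware: ${hardware_cost / 1e9:.2f}B")                  # $3.36B
print(f"cooling: ${cooling_per_year / 1e6:.0f}M/year")           # $168M/year
print(f"hardware/budget: {hardware_cost / project_budget:.1f}x") # 2.6x
```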

This is exactly the dilemma CERN faced when searching for the God Particle. The cost of the hardware alone, under the buy-it-all mentality of yesteryear, is staggering. Yet they found the Higgs particle using collaborative computing, called High Throughput Computing (HTC), a concept pioneered by Dr. Miron Livny and now proven.

There is only one way to assemble the resources needed to map the brain in the Human Brain Project: sharing resources. The Open Science Grid (OSG) and the CHTC were major contributors in building the CERN computing infrastructure. All this know-how is available to the HBP and to the US Brain Activity Map initiative.

OSG management: Dr. Miron Livny (far right) and Frank Wuerthwein (second from right)
We have made big progress in linking supercomputer resources with High Throughput Computing providers like the OSG. See Dr. Frank Wuerthwein's article, SDSC's Gordon Supercomputer Assists in Crunching Large Hadron Collider Data.

Frank believes that a computing model is driven by one cardinal rule: "Technology must support Sociology" in our global enterprise and science efforts. The daring goals of the brain projects need a large-scale processing center, created dynamically from resources all over the world and seamlessly integrated into, for example, the Human Brain Project production system.

HBP sharing with the Brain Activity Map initiative is the logical next step. This computing infrastructure must be created first - we have the know-how to do it - before the brain map becomes a reality. This is how we identified the Higgs. We accept that particle as reality. The same will happen with the brain map.

Post Scriptum: Bosco

And we need a simple tool like Bosco to free the creativity of the thousands of scientists who will contribute. Bosco 1.2 is out. You will notice right away that it is much easier to install, particularly if you have never used cluster management software before. Have a look at the new Bosco QuickStart to see what we mean.

Bosco is available for download from the Bosco website.




