Pleasantly Parallel Computing

Jason Stowe, CEO of Cycle Computing, will be the keynote speaker at ISC Cloud'13 this September 23-24. Below are some excerpts from his interview with Nages Sieslack.

Jason calls HTC, "high throughput computing," "pleasantly parallel computing."

The terminology is not new. The term "Pleasantly Parallel Computing" was coined by Miron Livny, the CTO of Open Science Grid, at the SciDAC PI Meeting in Boston, June 2007. You can see Miron's slide presentation here.

Wikipedia also recommends using "pleasantly parallel computing" rather than "embarrassingly parallel computing".

It then took six years for this terminology to reach the mainstream.

Maybe we should start a new acronym, PPC. Jason built his company on HTCondor, the open-source product for HTC technology implemented in the Open Science Grid. By offering HTC and HPC applications commercially, on demand, Cycle Computing created a cloud. It is a new name for the same technology. Jason proved the practical benefits of the "pleasantly parallel" technology outside the realm of science and academia.

I can see a similar potential for Bosco, a product developed by Open Science Grid. We try to tame the complexity and bring all these powerful tools to mainstream scientists. We try to make HTC and HPC Pleasant Computing.
Many use cases for cloud-enabled technical computing seem to be in the life science realm. What do you attribute this to? 
Stowe: Many life science workloads, such as genome sequencing, or needle-in-a-haystack simulations like drug design, are "pleasantly parallel" or high throughput, where computations are independent of each other. In the case of drug design, a cancer target is a protein that, much like a lock, has a pocket where molecules can fit, like keys, to either enhance or inhibit its function. The problem is, rather than the tens of keys on a normal key chain, you have tens of millions of molecules to check. Each one is computationally intensive to simulate, so in this case, a drug designer has approximately 340,000 hours of computation, or nearly 40 compute years, ahead of herself. With utility HPC, this drug sequencing workload, which would have taken a year to set up at a price tag of $44 million, completed in just 11 hours at a cost of $4,372. Without utility HPC, it's safe to say this science would never happen. Even though life science was a logical proving ground for HPC in the beginning, other industries, including financial services/insurance, EDA, manufacturing, and even energy, are now capitalizing on these kinds of benefits.
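The figures quoted above are easy to sanity-check with a couple of lines of plain arithmetic (illustrative only, nothing specific to Cycle Computing's actual accounting):

```python
# 340,000 compute hours expressed in calendar years of continuous compute.
HOURS = 340_000
compute_years = HOURS / (24 * 365)
print(f"{compute_years:.1f} compute years")  # ~38.8, i.e. "nearly 40"

# Rough cost ratio between the dedicated-cluster estimate and the utility run.
cluster_cost, cloud_cost = 44_000_000, 4_372
print(f"roughly {cluster_cost / cloud_cost:,.0f}x cheaper in the cloud")
```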
What other types of HPC applications or industries do you think are most suitable for the utility model of computing at this point?  
Stowe: We think that utility HPC will be the single largest accelerator of human invention in the coming decades. We have many use cases, in energy, manufacturing, financial services, and many more, that prove how most modern sciences, especially Monte Carlo or data-parallel simulations, work great in the cloud. Researchers, quants, and scientists of all disciplines can now execute computational science and complex finer-grained analysis that was previously unapproachable due to cost or overhead. Consider the impact on financial services as an example: a Fortune 100 firm uses HPC in the cloud to launch its monthly risk report, a 2.5 million compute-hour Monte Carlo simulation that now completes over a weekend. A Fortune 1000 life insurance firm dynamically hedges risk across its entire portfolio, with nested stochastic-on-stochastic models and billions of inner paths for each annuity. Even at smaller scales, where scientists can start work in 10 minutes instead of waiting 6 weeks to get five servers, great science can now be done in a wide range of industries and applications.
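The "pleasantly parallel" pattern behind these Monte Carlo workloads can be sketched in a few lines of Python: each task runs independently with its own seed, workers never communicate mid-run, and results are only combined at the end. This is a minimal illustration of the pattern, not Cycle Computing's or HTCondor's actual API:

```python
import random
from multiprocessing import Pool


def estimate_pi(args):
    """One independent task: a Monte Carlo estimate of pi.

    Each task gets its own seed, so tasks can run anywhere,
    in any order, with no coordination between them.
    """
    seed, samples = args
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * hits / samples


if __name__ == "__main__":
    # Fan out independent tasks, then combine the results at the end.
    tasks = [(seed, 100_000) for seed in range(8)]
    with Pool() as pool:
        estimates = pool.map(estimate_pi, tasks)
    print(sum(estimates) / len(estimates))  # close to 3.14
```

Scaling this from 8 local workers to thousands of cloud nodes changes the scheduler, not the algorithm, which is exactly why these workloads fit the utility model so well.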
