Google's new Cloud Data Center Strategy
I am a long-time grid and cloud computing observer, blogger and dreamer.
Three years ago, everyone thought Amazon and Google were the very personification of cloud computing.
I read What Cloud and AI Do and Don’t Mean for Google’s Data Center Strategy. About a year ago, Google brought in new people to revitalize its cloud computing business.
The Alphabet subsidiary has been taking big steps to show that it is “dead serious” about its cloud business, to quote Diane Greene, the founder of VMware whom Google hired last year to lead this charge. In March 2016, only three months ago, we read Google to Build and Lease Data Centers in Big Cloud Expansion.
The man in charge is Joseph Kava, a Google VP who leads the company’s data center engineering, design, and operations. Here is what he says, with my comments.
Hybrid Strategy
We all know that as people move to the public cloud, they are developing a hybrid strategy. They are still keeping some of their apps and some of their systems either on-premises or in their colo, and they’re offloading a tremendous amount of workloads to the public cloud providers.
Of course we knew, but this is the first time Google admits it! The Diane Greene team made a difference. Ericsson adopted the hybrid model right from the beginning and said that we are not too late in cloud computing. From the Ericsson hyperscale website:
One cannot create the IoT as part of the Networked Society, or the Planet as a Service, by hosting everything on Google’s or Amazon’s classic public on-demand clouds. The problem: the infrastructure is dictating the business, instead of the other way around. It’s time to say goodbye to traditional IT infrastructure. Say hello to the new era of digital industrialization.
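To make the hybrid idea concrete, here is a minimal sketch of a workload placement policy. Everything in it (the workload names, the sensitivity flag, the 10 ms latency threshold) is an illustrative assumption, not any vendor's actual logic:

```python
# Toy placement policy for a hybrid cloud strategy: sensitive or
# latency-critical workloads stay on-premises, the rest is offloaded
# to a public cloud. Thresholds and categories are illustrative only.

def place_workload(name, sensitive, latency_ms_budget):
    """Decide where a workload runs under a simple hybrid policy."""
    if sensitive:
        return "on-premises"      # regulated data never leaves the colo
    if latency_ms_budget < 10:
        return "on-premises"      # too latency-critical for a WAN round trip
    return "public cloud"         # everything else is offloaded

print(place_workload("payroll", sensitive=True, latency_ms_budget=500))             # on-premises
print(place_workload("batch-analytics", sensitive=False, latency_ms_budget=60000))  # public cloud
```

Real placement decisions also involve cost, data gravity, and compliance, but even this toy shows why some systems stay in the colo while "a tremendous amount of workloads" moves out.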
Google proved Ericsson right, and not the other way around
The Internet of Things
The question asked is: What implications do you think IoT has for Google’s data center strategy? Joseph Kava replies:
We’ve already had the Internet of Things. They’re called smartphones. Android has over a billion registered things that are chatting with our data centers all the time.
Well, this is not so. Google Android and Apple iOS are platforms for smartphones using a specific OS. When a platform enters the market of a pure pipeline business, the platform virtually always wins as a business model.
Phones are not made to chat with data centers (yet). They are designed to send voice and text to other human beings. Android cannot offer a service secure and reliable enough to make one phone, the same phone, accessible in 187 countries seamlessly. Ericsson has both the experience and the technology to do that with any IoT device, not only phones.
Having the next billion interconnected things doesn’t really worry me, because those devices, whether they’re your refrigerator at home, or whatever those internet-connected things are going to be, they’re generally not going to be as chatty with data centers as your smartphone is. We’ve already dealt with it.
Published studies of small-scale IoT pilot trials contradict what Mr. Kava says. Just read IoT technologies hit their awkward tween years:
Even with pilot projects, the data volume generated by IoT technologies is massive.
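A back-of-the-envelope estimate suggests why. The message rate and payload size below are invented for illustration; the point is only that a billion devices multiply small numbers into big ones:

```python
# Rough daily telemetry volume for "the next billion interconnected things".
# Both per-device figures are assumptions chosen only for illustration.

devices = 1_000_000_000          # one billion connected things
msgs_per_device_per_hour = 4     # assumed: a report every 15 minutes
bytes_per_msg = 512              # assumed: a small JSON/protobuf payload

daily_bytes = devices * msgs_per_device_per_hour * 24 * bytes_per_msg
daily_terabytes = daily_bytes / 1e12
print(f"{daily_terabytes:.0f} TB/day")  # about 49 TB/day
```

Even at this deliberately low chat rate, the data centers would ingest tens of terabytes a day, which is hard to square with “we’ve already dealt with it.”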
Machine Learning: Tensor Processing Unit Boards
Google's TPU
This shows Google's leadership.
There are customized hardware platforms that machine learning runs better on. It doesn’t affect the way we design our data centers.
The Tensor Processing Unit board. The TPU is a chip, or ASIC, Google designed in-house specifically to power Artificial Intelligence systems in its data centers.
This is how it works:
The chip is tailored for machine learning. It is better at tolerating “reduced computational precision,” which enables it to use fewer transistors per operation. “Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly.”
Machine learning in the data center is key to processing big volumes of data. Derek Collison from Apcera predicts that all data will be ingested into machine learning engines, and he says:
I do believe that the notion of Hadoop 3.0 would simply be, "we will not even bother with it." We are going to plug our data into the Google Brain project.
Ericsson is a majority shareholder in Apcera.
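The “reduced computational precision” quoted above can be illustrated with a toy symmetric quantization from 32-bit floats to 8-bit integers. This is not the TPU's actual arithmetic, just a sketch of the trade-off: a quarter of the bytes per value, in exchange for a small, bounded rounding error.

```python
import random

# Toy symmetric quantization: 32-bit floats -> 8-bit signed ints.
# Illustrates the precision-for-density trade-off behind the TPU quote;
# the real hardware scheme is more sophisticated.
random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(1000)]

scale = max(abs(w) for w in weights) / 127.0       # map the value range onto int8
q = [max(-127, min(127, round(w / scale))) for w in weights]
dequantized = [qi * scale for qi in q]

print("bytes as float32:", 4 * len(weights))       # 4000
print("bytes as int8:   ", len(q))                 # 1000
max_err = max(abs(w - d) for w, d in zip(weights, dequantized))
print("worst rounding error:", round(max_err, 4))  # bounded by scale / 2
```

Four times as many weights fit in the same memory and datapath width, which is one concrete sense in which “we can squeeze more operations per second into the silicon.”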
If things look too beautiful to be true, you are right. There are new skills to learn on how to program many-core processors. Please see An Interview with David Ungar, IBM Research.
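A hint of that new skill, using only Python's standard library: the work must be partitioned into independent chunks with no shared state before many cores help at all. The prime-counting workload below is arbitrary, chosen only because it is CPU-bound:

```python
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # The many-core discipline: independent chunks, no shared mutable state,
    # results combined only at the end.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with Pool() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

The hard part Ungar talks about begins when the chunks must share state; then races, locks, and memory-model questions appear, and that is the skill that has to be learned.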
Nvidia Tesla P100
In April 2016, Nvidia announced a new chip called the Tesla P100 that’s designed to put more power behind a technique called deep learning. This technique has produced recent major advances such as the Google software AlphaGo that defeated the world’s top Go player.
Intel’s answer is the Xeon E7 v3 and the Lustre* file system, which is part of the Intel Scalable System Framework (Intel SSF).
From Intel's 18-core Xeon chips tuned for machine learning, analytics:
With its new top-line Xeon E7 v3 server chips based on the Haswell microarchitecture, Intel hopes to capitalize on the demand for this type of server. With up to 18 CPU cores, the chips are Intel’s fastest, and designed for databases, ERP (enterprise resource planning) systems and analytics related to machine learning...
Complex machine learning models can’t be distributed over the cloud or a set of smaller hyperscale servers in a data center. Instead, a more powerful cluster of servers is needed to run deep-learning systems, where the larger number of cores could power more precise analysis of oceans of data.
“To create an algorithm to look across thousands of genomes, and to look for correlations, is not the sort of workload that existed a few years ago,”
People still believe this graph. But there are too many new developments that are not reflected in it: it shows the past, not the future. The future is in having a cloud as reliable and easy to use as a cell phone, worldwide.
Note: The opinions on this blog are mine and not of my clients and/or employers.