Sunday, June 23, 2013

Oracle: what is going on?

How did Oracle miss for the second consecutive quarter on "disappointing" revenues?

The Internet and the media are full of "logical" explanations. Analyst Brad Reback writes:
“Given [this was] Oracle's second consecutive miss, we believe many investors will question if there are other forces at work besides the macro [business environment] and if Oracle is losing its relevance to newer technologies.”
He then brushes off his own conclusion: Reback holds a buy rating and a $37-a-share target price on Oracle’s stock.


In September 2011, almost two years ago, Marc Andreessen said Oracle is doomed: "not a single one of Andreessen-Horowitz's startup investments use Oracle"
Cloud computing has lowered costs so dramatically, it's made huge success stories like Facebook possible. "Ten years ago, Facebook wouldn't have been viable as a company or a business. The infrastructure costs wouldn't have worked. The checks to Oracle, Sun, BEA, and EMC would have crushed the company before it ever got off the ground."
 "I think the clock is really ticking and Oracle is the most vivid case. I have huge respect for Oracle, Larry Ellison is one of my idols -- I wouldn't quite say role model....But objectively looking where they are, they have all the old software, they've cranked up the maintenance fees. That's all well and good for customers that model, but they have not leaped forward into this change."
As an ex-Sun employee, I saw how Oracle took what appeared to be the cream of Sun and wasted away anything not useful in the short term.

Anything run in the cloud is much more economical than paying Oracle enterprise license fees. If you are curious, see the current Oracle price lists here.

For example, a Transportation Management license costs $16,100 plus $3,542 for each $1M of Freight Under Management, with a minimum of $25M. That works out to a minimum of $104,650 per year. If you add options, it easily shoots past $500,000 per year for $100M under management.
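A quick sanity check of that arithmetic, using only the list prices quoted above (this is an illustration of how the minimum bill adds up, not Oracle's actual invoicing formula):

```python
# Oracle Transportation Management list pricing, as quoted above.
base_license = 16_100          # flat license fee, USD per year
per_million_fum = 3_542        # USD per $1M of Freight Under Management (FUM)
minimum_fum_millions = 25      # contractual minimum: $25M under management

minimum_annual = base_license + per_million_fum * minimum_fum_millions
print(minimum_annual)  # 104650 -> the $104,650/year minimum cited above
```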

Even MySQL, supposedly an open-source program, is charged as "MySQL Cluster Carrier Grade Edition Subscription (5+ socket server)" at $20,000 per year per server.

This is very simple: Oracle will not be able to build a cloud business without eroding its own mighty enterprise software revenues. And in the transition to the cloud, there are many other players more powerful than Oracle.

All these alliances, with Microsoft and others, come a little late. And who is going to fix this? A CEO who lost his sense of humility, without enough hunger because he is super-rich? Huge egos make the most intelligent humans lose their gift of prophecy. He is surrounded by "Yes" executives, and all decisions are top-down. As reported by my ex-colleagues who worked for a while in Oracle, buying a laptop requires approval from the office of Executive VP Safra Catz.

Some inevitable change is about to come, sooner or later, and a temporary jump of the stock to $37, as predicted by Brad Reback, will not last, unless the very structure of Oracle changes miraculously.

Thursday, June 20, 2013

Umair Haque on careers, Upside Down

4. Build your personal brand. It'll help more than you think.

3. The point of the work you undertake isn't just to do stuff for other people. It's to develop you. Never confuse the two.

2. If it's a "job", stop. It's not going to develop your career.

1. You don't get what you don't negotiate for. So whatever you do, don't just negotiate for "money".

Ok. I'm going to make three points about careers, work, and jobs. I'll be brief.

Actually, he made four.

See also 2012: Tell the status quo to go to hell

Tuesday, June 04, 2013

The astounding, jaw-dropping need for computing resources in the Human Brain Project

I quote from my previous post, The Amazing Brain Research and Supercomputing:
I noticed these incredulous comments:
Apr 4 2013: Hello I have a question. If you can simulate 10 000 neurons with a supercomputer now, if you follow Moore's law you will be able to simulate 5 120 000 neurons in 10 years time. The brain has 86 billion neurons. How do you expect to achieve simulation of all of them?
The answer came from another reader:
 May 25 2013: with 168 super computers. Which might sound crazy but isn't that the idea with cloud computing? Grant it we would then potentially have a consciousness on the Internet which if we go by terminator is a bad thing. However it should be a consciousness equivalent to only 1 human which doesn't sound so bad.
Let's do some very simple math. Suppose we need 168 computers the size of the IBM Blue Gene,
which costs $20 million and whose cooling alone, with Lake Geneva water, has an annual price tag of $1 million.
That means 168 × ($20 million), a total of about $3.4 billion, plus $168 million per year in cooling. The project's total funding is $1.3 billion, so the cost of the computer hardware alone is about 2.6 times the entire project budget.
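These back-of-the-envelope figures are easy to verify; the machine cost, cooling cost, and machine count below are the numbers quoted in this post, not independent estimates:

```python
machines = 168                 # supercomputers, per the reader's comment above
cost_per_machine = 20e6        # USD, one IBM Blue Gene-class machine
cooling_per_machine = 1e6      # USD/year, Lake Geneva water cooling
project_budget = 1.3e9         # USD, total Human Brain Project funding

hardware = machines * cost_per_machine      # 3.36e9 -> "about $3.4 billion"
cooling = machines * cooling_per_machine    # 1.68e8 -> "$168 million per year"
print(hardware / project_budget)            # ~2.58, i.e. ~2.6x the budget
```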

This is exactly the dilemma CERN faced when searching for the God Particle. The cost of the hardware alone - using the yesteryear mentality - is staggering. Yet they found the Higgs particle using collaborative computing, called High Throughput Computing (HTC), a concept pioneered by Dr. Miron Livny and now proven at scale.

There is only one way to obtain the resources needed to map the brain in the Human Brain Project: sharing resources. The Open Science Grid (OSG) and the CHTC were great contributors in setting up the CERN infrastructure. All this know-how is available to the HBP and to the US Brain Activity Map initiative.

OSG Management: Dr. Miron Livny, far right, and Frank Wuerthwein, second from right
We have made big progress in linking supercomputer resources with High Throughput Computing providers like OSG. See Dr. Frank Wuerthwein in SDSC’s Gordon Supercomputer Assists in Crunching Large Hadron Collider Data.

Frank believes that a computing model is driven by one cardinal rule: "Technology must support Sociology" in our global enterprise and science efforts. The daring goals of the brain projects need a large-scale processing center, created dynamically with resources from all over the world and seamlessly integrated into, for example, the Human Brain Project production system.

HBP sharing with the Brain Activity Map initiative is the logical next step. This computing infrastructure must be created first - we have the know-how to do it - before the brain map becomes a reality. This is how we identified the Higgs. We accept this particle as reality. The same will happen with the brain map.

Post Scriptum: Bosco

And we need a simple tool like Bosco to free the creativity of the thousands of scientists who will contribute. Bosco 1.2 is out. You will notice right away that it is much easier to install, particularly if you have never used cluster management software before. Have a look at the new Bosco QuickStart to see what we mean.
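For readers who have never seen it: jobs flow through Bosco as ordinary HTCondor submissions routed to a remote cluster over ssh. A minimal submit-file sketch, where the hostname, username, and script name are hypothetical placeholders and the scheduler type (here pbs) depends on your cluster:

```
# Minimal HTCondor submit file for a cluster added through Bosco.
# user@cluster.example.edu and analyze.sh are hypothetical placeholders.
universe      = grid
grid_resource = batch pbs user@cluster.example.edu
executable    = analyze.sh
output        = job.out
error         = job.err
log           = job.log
queue
```

The remote cluster is registered once with `bosco_cluster --add user@cluster.example.edu pbs`; after that, `condor_submit` works from the laptop as if the cluster were local.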

To download Bosco, click below

Monday, June 03, 2013

The Amazing Brain Research and Supercomputing

Nages Sislack, the International Supercomputing Conference (ISC) Public Relations Director, opened my eyes to the Human Brain Project, often called the Blue Brain Project. She interviewed the project leader, Henry Markram, in preparation for "Supercomputing & the Human Brain Project", a session to be held on Tuesday, June 18, at ISC'13 in Leipzig.

Nages Sislack, Public Relations ISC
The Intense World Theory on Autism

Before going into the details of why brain research exploded almost overnight, I noticed that Prof. Markram and his wife Kamila Markram - a well-known brain researcher as well - are parents of autistic children. This experience motivated them to extraordinary achievements. In 2010 they published The intense world theory - a unifying theory of the neurobiology of autism:
... the Intense World Theory predicts that severely autistic people that cannot speak or interact at all have locked up abilities even greater than savants. In other words, those autists classified as severely mentally retarded by the psychiatrist, may be the greatest savants of all. 
Dr. Kamila Markram
 we will learn how valuable the autistic community is for society. We will adapt the planet to embrace rather than lockup autistic people. Normal people guess at the world, while autists process information completely, comprehensively. This feature would not be good for survival in the jungle, but in human society, we can nurture these individuals and they can make a fantastic contribution to society. We will begin special compensation to families with autistic children as if they are potential Olympic athletes of the world. 
This is music to my ears. Trying to solve the autism mystery from a totally different point of view started what is so far the most important scientific project of the 21st century.

$1.3 billion in funding in Europe, and $3 billion proposed in the US by the Obama Administration:

To add to the brain-mapping mix, President Obama in April announced the launch of an initiative called Brain (commonly referred to as the Brain Activity Map), which he hopes Congress will make possible with a $3 billion NIH budget. (To start, Obama is pledging $100 million of his 2014 budget.)

This adds to the $1.3 billion the Blue Brain Project has in Europe. But Wired Magazine is cynical:
For Markram, the American plan is just grist for his billion-euro mill. “The Brain Activity Map and other projects are focused on generating more data,” he writes. “The Human Brain Project is about data integration.” In other words, from his exalted perspective, the NIH and President Obama are just a bunch of postdocs ready to work for him.
Markram worked at the NIH, and everybody knows him there. Still, the project has its share of detractors, like John Horgan in Scientific American:
 This finding bolsters the argument that the Big Brain Projects–by funneling precious resources toward paradigms supported by flimsy findings–are premature.
The Luddites of two centuries ago in England have resurfaced in America. Horgan teaches at Stevens Institute of Technology (whatever that is) and is the author of four books, including The End of Science. Oh well...

The Blue Brain Project: Neuroscience, Medicine and Supercomputing

There is so much written on the Internet, and I wanted to extract, in a nutshell, why Professor Markram believes the brain could be simulated in a computer by 2020. In his opinion, we are stuck with several outmoded paradigms of thought. All quotes below are from a 2009 interview in Discover magazine.

The first is related to how the brain represents information, what I call an action-potential paradigm. It’s a spike-based paradigm. Neurons generate spikes of activity. You can think of these spikes as digital: zeros and ones. Individual spikes do not have enough information to represent perceptions. The current view of perception is based on analyzing these zeros and ones and trying to decompose or reverse engineer the representation that these zeros and ones capture. But I think that the zeros and ones—the information generated or emitted by single neurons—are a reflection of perception, not perception itself.
 This is the first Aha! The other big misconception is memory:
For 50 years we’ve been thinking of memory, despite all evidence to the contrary, as where you imprint changes in the brain. You go to your synapses, you go to your neurons, and you change them when you remember something. And then you’ve got to protect those changes. It’s an imprint. It’s called an engram. Hundreds of scientists have been chasing this engram. Where is the print of memory in the brain? They think it’s like a scar, a mark. This is one of the most fundamental mistakes in neuroscience. And the reason why I say it’s a mistake is simple. All evidence indicates that the neuron does not reset. The synapses do not reset. They are always different. They’re changing every millisecond. Your brain today is very, very different from what it was when you were 10 years old, and yet you may have profound memories from when you were 10. What has to be answered in neuroscience is this: How do you remember something from long ago when your brain now is actually different?
In our view, the idea that memory is held in the brain the same way it is held in a computer is fundamentally wrong.
So this is not a simulation in our preconceived sense of an engineering simulation.
You needed an entire, very powerful computer to run that simulation of one neuron. You needed the whole process. Of course, today computers are a little more powerful, and I can run a simulation of 100 neurons. But there’s really no point in doing a 100-neuron simulation. The reason is simple: A neuron lives in an environment. It receives thousands of inputs. So actually you need to make a quantum leap from one neuron to 10,000 neurons. You need to make that leap into what we call a microcircuit. A circuit of five neurons is not what the mammalian brain is made up of. To simulate the neurocircuitry that is creating the mammalian brain, you need to make a quantum jump in complexity. You need at least 10,000 computers to do that. And that’s what Blue Gene is—16,000 processors squeezed into a space the size of four refrigerators. It was important for us to have that many processors, because on each one we’ve got 1,000 neurons. The processors themselves did not have to be extremely powerful. They just needed enough memory to hold the neurons.
Henry Markram is a featured TED speaker. He says:
And we've got the mathematics to describe this process. So we can describe the communication between the neurons. There literally are only a handful of equations that you need to simulate the activity of the neocortex. But what you do need is a very big computer. And in fact you need one laptop to do all the calculations just for one neuron. So you need 10,000 laptops. So where do you go? You go to IBM, and you get a supercomputer, because they know how to take 10,000 laptops and put it into the size of a refrigerator. So now we have this Blue Gene supercomputer. We can load up all the neurons, each one on to its processor, and fire it up, and see what happens. Take the magic carpet for a ride. 
I noticed these incredulous comments:

Apr 4 2013: Hello I have a question. If you can simulate 10 000 neurons with a supercomputer now, if you follow Moore's law you will be able to simulate 5 120 000 neurons in 10 years time. The brain has 86 billion neurons.
How do you expect to achieve simulation of all of them?
The answer came from another reader:
 May 25 2013: with 168 super computers. Which might sound crazy but isn't that the idea with cloud computing? Grant it we would then potentially have a consciousness on the Internet which if we go by terminator is a bad thing. However it should be a consciousness equivalent to only 1 human which doesn't sound so bad.
In 2009, we had not yet discovered the Higgs particle. It was found later using CERN's collaborative computing, involving over 150 data centers around the world and based on the collaborative computing technology developed by Prof. Miron Livny of the University of Wisconsin-Madison. It is interesting to know that both Henry Markram and Miron Livny graduated from the Weizmann Institute of Science.

Helping Medicine

...there’s not a single neurological disease today in which anybody knows what is malfunctioning in this circuit—which pathway, which synapse, which neuron, which receptor. Doctors don’t even know this for a single drug—I mean, this is a multibillion-dollar industry!—that they’re giving for Parkinson’s disease, for depression, schizophrenia, attention deficit, autism, dementia, Alzheimer’s. When they give a drug, they have no idea what it does to this processor. And the neocortical column is the elementary processor for human beings to have coherent perception, attention, and memory. This is shocking. I mean, we are living in such a primitive time of medicine, you cannot imagine
How will the Blue Brain help?
 If you have the model where you’re able to embody all the key parameters, you can start exploring a hypothesis for a disease. When you tweak the model, you can see what kind of pathology occurs. You’ll be able to really isolate very precisely what has gone wrong. If a certain part of the circuit malfunctions, it’s going to display certain symptoms. You can actually simulate and test hypotheses for different diseases. If we know which pathways are malfunctioning, then we can look and see what it means for the circuit, what kind of information is it not able to process. This can guide drug discovery by letting you simulate the effect of a drug on the circuit. You’re going to find out exactly how it operates, what it is altering. Drug discovery is terribly expensive, just to find out how one drug could or could not work and all its side effects. Simulations could cut drug discovery costs by 70 or 80 percent. What we call in-silica-based drug discovery—simulation-based drug discovery—is going to be the future.

A case for Bosco

This new focus on brain research and on computerized reconstructions of the human brain based on biology will require a colossal amount of computing power. Simulating the brain is the natural successor among the big challenges computing has to solve.
Professor Idan Segev
The leader of the Blue Brain Project in Israel is Idan Segev of the Hebrew University. He received funding from the Safra Foundation, and in the interview we read of a great achievement:
Since 2001, he has been one of the leaders of the Blue Brain Project, the only participant to have a direct line to the Swiss IBM supercomputer, which costs $20 million and whose cooling alone with Geneva lake water has an annual price tag of $1 million. The supercomputer, he says, takes up about twice the area of his own office
The emphasis is mine: why is access to a supercomputer such a privilege, probably reserved for those initiated in this esoteric skill? What Professor Segev says between the lines is the following:
  • A single supercomputer, no matter how big, is a limited resource, unable to accommodate all the researchers in a project of such magnitude
  • We need tools to create an infrastructure of hundreds if not thousands of supercomputing nodes and clusters all over the world
  • We need a simple tool to access all these computers from a laptop, like a Mac
There is such a tool, called Bosco. Now in release 1.2, it will allow - one day in the near future - any student with a Mac to connect to clusters and supercomputers.

Project Blue Brain and the US Brain Activity Map need to provide democratic access to these resources.

Watch Henry Markram at TED 2012
