So, here goes - my review of the year 2012 in supercomputing and related matters. Thoroughly biased, of course, towards things that interested me throughout the year.
Predictions for 2012
Towards the end of 2011 and in early 2012 I made various predictions about HPC in 2012. Here are the ones I can find or recall:
- The use of "cloud computing" as the preferred marketing buzzword for large swathes of the HPC product space would come to an end.
- There would be an onslaught of "Big Data" (note the compulsory capital letters) as the marketing buzzword of choice for 2012 - to be applied to as many HPC products as possible, even where the relevance was only tenuous (just like cloud computing before it, and green computing before that, and so on ...)
- There would be a vigorous ongoing debate over the relative merits and likely success of GPUs (especially from NVidia) vs. Intel's MIC (now called Xeon Phi).
- ARM would become a common part of the architecture debate alongside x86 and accelerators.
- There would be a growth in the recognition that software and people matter just as much as the hardware.
So - how did those predictions match reality? (Or - how can I pretend I was right all along?)
Well, "cloud computing" is still with us as a label for a whole range of products and services that I think could be delivered just as effectively without that marketing label (and that already existed before "cloud" was invented). However, it is no longer being used as the first choice buzzword of HPC marketing departments. It is no longer guaranteed to turn up in any HPC panel discussion. "Cloud" has been replaced by "Big Data".
No-one can doubt the explosion of "Big Data" throughout the HPC community in 2012 - whether applied to all sorts of products that previously existed quite happily without a "Big Data" label, or in mandatory mentions during panel discussions on any aspect of HPC at every HPC event, or sometimes even in a context that is actually relevant! The "must-mention-at-any-cost" treatment of "Big Data" threatened to obscure the real issue within - both the technical challenges and the business opportunities. Data (big, fast, complex, whatever) and HPC have always been natural partners, and there is so much more potential still to come from this intersection.
In 2012, GPU computing finally became normal. So much so, in fact, that it has taken on the role of incumbent against the newcomer - Intel's Xeon Phi (I still think "MIC"). Anyone tracking the future of HPC, whether towards exascale computing or towards more energy- and cost-efficient individual HPC systems, will know that some form of manycore seems inevitable as the base processor architecture. GPUs and MIC each have their own strengths - and neither is likely to be the final form that sees us through the exascale era. However, both are here now.
Much of 2012 was spent in anticipation of the launch of Xeon Phi. NVidia continued to march onwards with the announcement of Kepler products. Both sides worked the marketing and technical conversations to highlight the advantages of their manycore solution. Even AMD joined in at SC12 with the S10000 (how many zeros?). I think it is safe to say we haven't seen the end of this battle yet.
2012 also saw the emergence of ARM as a serious option in discussions around processors that could take us both to exascale computing and to energy efficiency for individual-scale HPC systems. Several prototype/research projects using ARM for HPC progressed in 2012. Some vendors even announced ARM-based servers for technical computing. ARM still needs to evolve in various aspects to catch up with x86 (and even GPUs) in HPC, but ARM's creep into the HPC community has a quiet yet firm momentum.
Finally
So, as you can see, I am claiming that all of my predictions for 2012 have turned out to be true! It is possible I am a visionary seeing the future of HPC. It is more likely I was stating the flippin' obvious and steering clear of anything controversial :-)
However, I have yet to address my last prediction: "a growth in the recognition that software and people matter just as much as the hardware". I think this has only partly come true.
Yes, there are people out there who have seen the light (I am biased) and have realized that, however powerful the hardware is, it requires effective software/applications and people to really deliver the best science and engineering output. (Or, as someone told me in an excellent phrase "to actually realize some impact based on having moved a lot of electrons around a big heat-generator".)
And the theme of software and people does turn up in discussion panels and opinion articles reasonably regularly. But the focus has remained on measuring supercomputing efforts by the scale of the hardware: a supercomputer centre needs a big machine, and a centre with a big HPC software team but only a small supercomputer doesn't yet carry the same impact/prestige in the community.
There has been a steady growth in the number of commercial and public sector organisations investing in HPC software innovation services and skills development/training. (Which is good - because that is what we [NAG] deliver - and business has grown nicely in 2012.)
However, I think there is still so much untapped potential in software innovation that users and providers of HPC services can exploit.
One of the strongest outcomes of NAG's Computational Science and Engineering (CSE) Support Service for HECToR (the UK national supercomputing service) is a large body of evidence showing that investing in application software alongside the hardware delivers much more science than investing the same total budget in supercomputers alone.
Next - the rest of 2012 and looking onwards to 2013
In Part 2 of this review, I discuss the themes and events that emerged during the year (i.e. the ones I didn't predict). Then, in the first few days of the new year, I will make my predictions for 2013 in HPC. Follow me on Twitter (@hpcnotes) to catch the next blog posts - and for my general tweets on all things HPC.
2 comments:
For 2013, Big Data is still driving the forefront of HPC and its cloud computing facets. Frameworks like Hadoop are expected to grow in sophistication and usage as agile marketing and business intelligence now rely on parallel processing of metrics.
In my view, Big Data is both a real issue and an over-hyped one. I am not convinced that Big Data is driving the forefront of HPC, although it is probably one of the main drivers for the near future, alongside more traditional uses of HPC (both in industry and public sector).