The hpcnotes HPC blog - supercomputing, HPC, high performance computing, cloud, e-infrastructure, scientific computing, exascale, parallel programming services, software, big data, multicore, manycore, Phi, GPU, HPC events, opinion, ...
Friday, 19 August 2011
[Originally posted on The NAG Blog]
I’m sure something like this is familiar to many readers of this blog. The focus here is HPC, but there is a similar story for mathematicians, numerical software engineers, etc.
You've just met an old acquaintance. Or a family member is curious. Or you're at a social event (when social means talking to real people, not twitter/facebook). We see that question coming. We panic. Then the family member, friend or stranger asks it. We freeze. How to reply? Can I get a meaningful, ideally interesting, answer out before they get bored? What if I fail to get the message across correctly? Oops, this pause before answering has gone on too long. Now they are looking at me strangely. They are thinking the answer is embarrassing or weird. This is not a good start.
The question? “What do you do then?” Followed by: “Oh! So what exactly is supercomputing then?”
Thursday, 11 August 2011
Big Data and Supercomputing for Science
It is interesting to note the increasing attention “big data” seems to be getting from the supercomputing community.
Data explosion
We talk about the challenges of the exponential increase in data, or even an “explosion of data”. This is caused by our ever-growing ability to generate data. More powerful computational resources deliver finer resolutions, wider parameter studies, etc. The emergence of individual-scale HPC (GPUs etc.) that is both cost-viable and effort-viable gives increased data-creation capability to the many scientists not using high-end supercomputers. And instrumental sources continue to improve in resolution and speed.
So, we are collecting more data than ever before. We are also increasing our use of multiple data sources – fusing various sensors and computer models to form predictions or study scientific phenomena.
It is also common to hear questions such as: are we drowning in the volume of data? Is this growth in data overwhelming our ability to extract useful information or insight? Is the potential value of the increased data lost by our inability to manage and comprehend it? Does having more data mean more information – or less, due to analysis overload? Does the diversity of formats, quality, and sources further hinder data use?
Labels: data, exascale, strategy, supercomputing
Monday, 8 August 2011
Summer season big changes - football or supercomputing?
The world of supercomputing has gone mad.
So it seems as I catch up on the news around the HPC community after a week's vacation. Just today, the news of IBM walking away from half a decade's work on Blue Waters, and the story of an unknown organisation [now revealed to be NVidia] tempting Steve Scott to leave his Cray CTO role, have been huge news. But thinking back over the summer months there has been more.
The immediate comparison to me is that of the European football summer season (soccer for my American readers). Key players are signed by new clubs, managers leave for pastures new (or are pushed), and ownership takeover bids succeed or fail. It feeds a few months of media speculation and social gossip, with occasional breaking news (i.e. actual facts) and several major moves (mostly big surprises, but some pre-hyped long before). But clubs emerge from the summer with new teams, new ambitions, and new odds of achieving success.
The world of HPC is having such a summer, I think.
Labels: blue waters, cray, hpc, ibm, intel, ncsa, people, strategy, supercomputing
Friday, 24 June 2011
ISC11 Review
ISC11 - the big mid-season international conference for the world of supercomputing - was held this week in Hamburg.
Here, I update my ISC11 preview post with my thoughts after the event.
I said I was watching out for three battles.
GPU vs MIC vs Fusion
The fight for top voice in the manycore/GPU world will be one interesting theme of ISC11. Will this be the year that the GPU/manycore theme really means more than just NVidia and CUDA? AMD has opened the lid on Fusion in recent weeks and sparked some real interest. Intel's MIC (or Knights) is probably set for some profile at ISC11 now that the Knights Ferry program has been running a while. How will NVidia react to no longer being the loudest (only?) noise in GPU/manycore land? Or will NVidia's early momentum carry through?
Review: None of this is definitive, but my gut reaction is that MIC won this battle. GPU lost. Fusion didn't play again. My feeling from talking to attendees was that MIC was second only to the K story in terms of what people were talking about (and asking NAG - as collaborators in the MIC programme - what we thought). Partly because of the MIC hype, and the K success (performant and power-efficient without GPUs), GPUs took a quieter role than in recent years. Fusion, disappointingly, once again seemed to have a quiet time in terms of people talking about it (or not). Result? As I thought, manycore now realistically means more than just NVidia/CUDA.
Exascale vs Desktop HPC
Both the exascale vision/race/distraction (select according to your preference) and the promise of desktop HPC (personal supercomputing?) have space on the agenda and exhibit floor at ISC11. Which will be the defining scale of the show? Will most attendees be discussing exascale and the research/development challenges to get there? Or will the hopes and constraints of "HPC for the masses" have people talking in the aisles? Will the lone voices trying to link the two extremes be heard (technology trickle-down, market solutions to efficient parallel programming, etc.)? What about the "missing middle"?
Review: Exascale won this one hands down, I think. Some lone voices still tried to talk about desktop HPC, missing middles, mass usage of HPC and so on. But exascale got the hype again (not necessarily wrong for one of the year's primary "supercomputing" shows!).
Software vs Hardware
The biggie for me. Will this be the year that software really gets as much attention as hardware? Will the challenges and opportunities of major applications renovation get the profile they deserve? Will people just continue to say "and software too"? Or will the debate - and actions - start to follow? The themes above might (should) help drive this (porting to GPUs, new algorithms for manycore, new paradigms for exascale, etc). Will people trying to understand where to focus their budget get answers? The balance of hardware vs software development vs new skills? The balance of "protect legacy investment" against the opportunity of a fresh look at applications?
Review: Hardware still got more attention than software. Top500, MIC, etc. Although ease of programming for MIC was a common question too. I did miss lots of talks, so perhaps there was more focusing on applications and software challenges than I caught. But the chat in the corridors was still hardware-dominated, I thought.
The rest?
What have I not listed? National flag waving. I'm not sure I will be watching too closely whether USA, Japan, China, Russia or Europe get the most [systems|petaflops|press releases|whatever]. Nor the issue of cloud vs traditional HPC. I'm not saying those two don't matter. But I am guessing the three topics above will have more impact on the lives of HPC users and technology developers - both next week and for the next year once back at work.
Review: Well, I got those two wrong! Flags were out in force, with Japan (K, Fujitsu, Top500, etc.) and France (Bull keynote) waving strongly, among others. And cloud was seemingly the question asked at every panel! But in a way, I was still right - flags and clouds do matter and will get people talking - but I maintain that manycore, exascale vs desktop, and the desperate state of software all matter more.
What did you learn? What stood out for you? Please add your comments and thoughts below ...
Friday, 17 June 2011
ISC 11 Preview
ISC11 - the big mid-season international conference for the world of supercomputing - is next week in Hamburg.
Will you be attending? What will you be looking to learn? I will be watching out for three battles.
GPU vs MIC vs Fusion
The fight for top voice in the manycore/GPU world will be one interesting theme of ISC11. Will this be the year that the GPU/manycore theme really means more than just NVidia and CUDA? AMD has opened the lid on Fusion in recent weeks and sparked some real interest. Intel's MIC (or Knights) is probably set for some profile at ISC11 now that the Knights Ferry program has been running a while. How will NVidia react to no longer being the loudest (only?) noise in GPU/manycore land? Or will NVidia's early momentum carry through?
Exascale vs Desktop HPC
Both the exascale vision/race/distraction (select according to your preference) and the promise of desktop HPC (personal supercomputing?) have space on the agenda and exhibit floor at ISC11. Which will be the defining scale of the show? Will most attendees be discussing exascale and the research/development challenges to get there? Or will the hopes and constraints of "HPC for the masses" have people talking in the aisles? Will the lone voices trying to link the two extremes be heard (technology trickle-down, market solutions to efficient parallel programming, etc.)? What about the "missing middle"?
Software vs Hardware
The biggie for me. Will this be the year that software really gets as much attention as hardware? Will the challenges and opportunities of major applications renovation get the profile they deserve? Will people just continue to say "and software too"? Or will the debate - and actions - start to follow? The themes above might (should) help drive this (porting to GPUs, new algorithms for manycore, new paradigms for exascale, etc). Will people trying to understand where to focus their budget get answers? The balance of hardware vs software development vs new skills? The balance of "protect legacy investment" against the opportunity of a fresh look at applications?
The rest?
What have I not listed? National flag waving. I'm not sure I will be watching too closely whether USA, Japan, China, Russia or Europe get the most [systems|petaflops|press releases|whatever]. Nor the issue of cloud vs traditional HPC. I'm not saying those two don't matter. But I am guessing the three topics above will have more impact on the lives of HPC users and technology developers - both next week and for the next year once back at work.
What will you be looking out for?
Thursday, 26 May 2011
Poll: legacy software and future HPC applications
I've added a new quick survey to the HPC Notes blog: "Which do you agree with more for developing the next generation of HPC applications?"
Is the argument about "protecting our investment" a load of nonsense? Or is throwing it all away and starting again irresponsible?
I've only allowed the two extremes as voting options - I know you might want to say "both" - but choose one!
See top right of the blog home page. Vote away ...
For clues on my own views, and some good audience debate, see the software panel video recording from the recent NCSA PSP annual meeting here: http://www.ncsa.illinois.edu/Conferences/2011Meeting/agenda.html
Blog on the topic to follow shortly ... after some of your votes have been posted :-)
Thursday, 24 March 2011
Poll: exascale, personal or industrial?
I've added a new quick survey to the HPC Notes blog: "Which is more interesting - exascale computing, personal supercomputing or industry use of HPC?"
See top right of the blog home page. You can even give different answers for "reading about" and "working on"...
Labels: exascale, personal supercomputing
Investments Today for Effective Exascale Tomorrow
I contributed to this article in the March 2011 issue of The Exascale Report, by Mike Bernhardt.
"Initiatives are being launched, research centers are being established, teams are being formed, but in reality, we are barely getting started with exascale research. Opinions vary as to where we should be focusing our resources.
In this issue, The Exascale Report asks NAG's Andy Jones, Lawrence Livermore's Dona Crawford, and Growth Science International's Thomas Thurston where should we (as a global community) be placing our efforts today with exascale research and development?"
Labels: exascale, interview, software, strategy, supercomputing, The Exascale Report
Friday, 18 March 2011
Performance and Results
[Originally posted on The NAG Blog]
What's in a catch phrase?
As you will hopefully know, NAG's strapline is "Results Matter. Trust NAG".
What matters to you, our customers, is results. Correct results that you can rely on. Our strapline invites you to trust NAG - our people and our software products - to deliver that for you.
When I joined NAG to help develop the High Performance Computing (HPC) services and consulting business, one of the early discussions raised the possibility of using a new version of this strapline for our HPC business, reflecting the performance emphasis of the increased HPC activity. Probably the best suggestion was "Performance Matters. Trust NAG." Close second was "Productivity Matters. Trust NAG."
Labels: hpc, multicore, NAG, parallel programming, performance, software
Thursday, 17 March 2011
The Addictive Allure of Supercomputing
The European Medical Device Technology (EMDT) magazine interviewed me recently. InsideHPC has also pointed to the interview here.
The interview discusses false hopes of users: "Computers will always get faster – I just have to wait for the next processor and my application will run faster."
We still see this so often - managers, researchers, even programmers - all waiting for the silver bullet that will make multicore processors run their application faster with no extra effort from them. There is nothing now or coming soon that will do that, except for a few special cases. Getting performance from multicore processors means evolving your code for parallel processing. Tools and parallelized library plugins can help - but in many cases they won't be a substitute for re-writing key parts of the code using multithreading or similar techniques.
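The interview stops short of code, so here is a minimal sketch of what "evolving your code for parallel processing" can mean in the simplest case: a serial loop re-written with OpenMP multithreading. This is my illustration, not from the interview; the kernel and the function and array names are hypothetical. Compile with OpenMP enabled, e.g. gcc -fopenmp.

```c
#include <omp.h>
#include <stdio.h>

#define N 1000000

/* Hypothetical kernel: scale an array and accumulate the total.
   The serial version gains nothing from extra cores. */
static double sum_scaled_serial(const double *x, double a, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += a * x[i];
    return sum;
}

/* The same loop evolved for multicore: OpenMP splits the iterations
   across threads and safely combines the per-thread partial sums. */
static double sum_scaled_parallel(const double *x, double a, int n) {
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < n; i++)
        sum += a * x[i];
    return sum;
}

int main(void) {
    static double x[N];
    for (int i = 0; i < N; i++)
        x[i] = 1.0;

    printf("serial:   %f\n", sum_scaled_serial(x, 2.0, N));
    printf("parallel: %f (up to %d threads)\n",
           sum_scaled_parallel(x, 2.0, N), omp_get_max_threads());
    return 0;
}
```

The point is not the pragma itself but the work behind it: identifying the key loops, checking that iterations are independent (or reducible, as here), and restructuring data and algorithms so the threads have useful work to do.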
Thursday, 10 March 2011
Meeting HPC people
About a year ago, I wrote this article for ZDNet UK, describing what I thought were some of the key events in the supercomputing/HPC community.
I said: "Many people have rightly remarked that the HPC community really is that — a community — and that there is still a relatively high degree of connection between the various practitioners. In other words, despite its growing size and global reach, it feels like a small community. People know each other. Consequently, networking, whether technical or commercial, goes a long way to helping your business."
And: "Whatever your scale of technical computing, from multicore workstations to multi-thousand-node supercomputers, getting involved with the active HPC community can help you with your parallel computing goals. Online resources can help, but by far the most effective way of benefiting from the wider HPC community is by participating at the right events."
I listed some key events, with a comment about the nature and value of each.
I have now added a survey to this website (top right) to find out which events people plan to attend in 2011.
I may have missed out your favourite conference in the original article, or in the survey above, in which case I would like to hear about it too - maybe via the comments page here, or directly.
I hope to meet some of you when out and about in the coming year ...
I said: "Many people have rightly remarked that the HPC community really is that — a community — and that there is still a relatively high degree of connection between the various practitioners. In other words, despite its growing size and global reach, it feels like a small community. People know each other. Consequently, networking, whether technical or commercial, goes a long way to helping your business."
And: "Whatever your scale of technical computing, from multicore workstations to multi-thousand-node supercomputers, getting involved with the active HPC community can help you with your parallel computing goals. Online resources can help, but by far the most effective way of benefiting from the wider HPC community is by participating at the right events."
I listed some key events, with a comment about the nature and value of each.
I have now added a survey to this website (top right) to find out which events people plan to attend in 2011.
I may have missed out your favourite conference in the original article, or in the survey above, in which case I would like to hear about it too - maybe via the comments page here, or directly.
I hope to meet soome of you when out and about in the coming year ...
Labels: events, hpc, people, supercomputing