After a bit of griping about the terrible Wi-Fi at the Salt Palace, this year’s Supercomputing conference venue, the guys move on to a truly big question: Are the LINPACK and HPCC benchmarks actually useful? Should they be constantly re-evaluated? And shouldn’t you really test machines on the kinds of workloads they’re designed to run?
The catalyst for this discussion is the Blue Waters system, for which no LINPACK numbers have been submitted. Yes, the system is behind schedule, and sure, the team is busy doing science… but is it also a shot across the bow? Are they rebelling against industry philosophy?
If they are, that’s a good thing, according to Henry – because a system is about what you plan to do with it, not how many flops you can squeeze out of it. Rich agrees: what good is a giant LINPACK number on a system with reliability issues, where all your time and money goes into brute computation and no real science comes out? And the industry sectors doing meaningful work – where are their systems on the TOP500? They’re not playing this game.
Spoiler alert: Henry agrees with Dan on something. Really. It’s at the 10:00 mark, if you have to see it to believe it. We hardly believed it ourselves.
Check the Radio Free HPC website for new episodes and more great content from Rich Brueckner of insideHPC, Henry Newman of Instrumental, Inc., and GCG’s Dan Olds.
