How NOT to Evaluate Disk Reliability

A few months ago, Brian Beach, a Distinguished Engineer at cloud backup joint Backblaze, published a set of study-like blog posts relating their experiences with hard drive lifespans in their 25,000-plus-spindle environment.

The blogs garnered quite a bit of interest due to the subject matter and provocative titles like “How Long Do Hard Drives Last?”, “Enterprise Drives: Fact or Fiction?”, and “What Hard Drive Should I Buy?” The posts raise interesting questions and put forward controversial conclusions.

One of the most contentious claims came from the first blog (Simon Sharwood covers it here), in which Beach asserts that consumer-grade hard drives are actually more reliable than their supposedly industrial-strength (and definitely pricier) enterprise cousins. According to their research, enterprise drives failed at an annual rate of 4.6% vs. 4.2% for the consumer versions.

The bottom line, according to Beach, is that consumer drives are a better choice (even after factoring in the longer enterprise warranty) due to their higher reliability and lower cost.
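For context on how that kind of number gets computed: an annualized failure rate is essentially failures divided by drive-years of service. Here's a minimal Python sketch of the arithmetic; the drive-year and failure counts are hypothetical placeholders chosen only so the output lands on the quoted 4.2% and 4.6% figures, not Backblaze's actual data.

```python
# Minimal sketch of an annualized failure rate (AFR) calculation:
# AFR = failures / drive-years of service, expressed as a percentage.
# The inputs below are hypothetical placeholders, not Backblaze's data.

def annual_failure_rate(failures: int, drive_years: float) -> float:
    """Annualized failure rate, as a percentage."""
    return 100.0 * failures / drive_years

consumer_failures, consumer_drive_years = 420, 10_000.0
enterprise_failures, enterprise_drive_years = 46, 1_000.0

print(f"Consumer AFR:   {annual_failure_rate(consumer_failures, consumer_drive_years):.1f}%")
print(f"Enterprise AFR: {annual_failure_rate(enterprise_failures, enterprise_drive_years):.1f}%")
```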

Read More

Is Lenovo Now “Super”?

Back in May 2013, when IBM was first rumored to be selling part of their System x division to Lenovo, I did a story on how this might impact IBM’s position in the HPC market. It’s a good time to revisit that analysis and see what the picture looks like now.

As I noted then, it’s damned hard to get solid market share and sales figures for HPC systems. Some of this is due to the general fogginess of measuring overall system sales. But it’s mostly because it’s so difficult to divide “HPC systems” from “non-HPC systems”, since the true definition of HPC should include not only traditional research applications but also compute-intensive enterprise workloads like predictive analytics and many Big Data-related tasks.

So with this in mind, I’ve used the latest TOP500 Supercomputer list as my stalking horse to get a look at how the IBM-Lenovo transaction will affect the standings at the top end. It’s not a perfect proxy for the broader HPC market, but it’s better than nothing, right?

Chart-Topping Vendors

Since the early 2000s, IBM and HP have swapped the lead in the number of systems placed on the TOP500. On the most recent list, HP landed 196 systems vs. IBM’s 164. The next closest competitors are Cray with 48 systems and SGI with 17.
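If you want to reproduce that kind of tally yourself, the TOP500 project publishes each list in downloadable form. Here's a minimal Python sketch, assuming you've saved a list locally as top500.csv with a vendor column named "Manufacturer" (both the file name and the column name are assumptions; adjust them to match the actual download):

```python
# Sketch: count TOP500 systems per vendor from a downloaded list.
# Assumes a local export named "top500.csv" with a "Manufacturer" column;
# adjust the file name and column name to match the real download.
import pandas as pd

top500 = pd.read_csv("top500.csv")
vendor_counts = top500["Manufacturer"].value_counts()

print(vendor_counts.head(10))  # top ten vendors by system count
```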

Lenovo’s New System x Biz: A server force to be reckoned with?

Only eight months ago the industry was rife with rumors that IBM was selling all or part of its x86 business to Lenovo. It took a while, but the deal has gone down and it’s now up to the lawyers and accountants to finalize the paperwork.

This deal has turned out to be (way) more comprehensive than anticipated. Lots of folks, including me, figured that if IBM was looking to sell, they’d get rid of the low-end commodity part of the business. This would include the pizza boxes and towers that are the cheapest and least differentiated part of the line.

Rather than part out System x like a tired old jitney, IBM opted to sell the entire division – including the high-end, differentiated boxes that they’ve poured money into since the turn of the century. Here’s how an IBM briefing slide positioned what Lenovo is getting for their money:

Read More

All Student Cluster Competition, All the Time

The 2013 Supercomputing Conference is here, and you know what that means: we’re covering the Student Cluster Competition wall-to-wall. This is a live face-off of university students who have built their own supercomputers. They’re given real scientific workloads to run and…

Read More

“Predictalator” Forecasts Sports Futures

Have you ever wanted to lay a bet down on the Las Vegas sports books? Or take a bundle of money from the suckers in your office Final Four betting pool? Is it possible to win sports bets consistently over the long haul?

According to the guys at PredictionMachine.com, the answer is yes. Their secret weapon? Numbers, baby – it’s all in the numbers. They’ve built a formidable prediction tool called the Predictalator, the name they’ve given to their methodology and algorithms.

Over the last three years, the Predictalator has scored wins on better than 70% of their straight-up bets and better than 50% against the Vegas spread. That’s their combined record across all of the professional and college football, basketball, pro hockey, and Major League Baseball games they’ve picked in that span.

What’s interesting about their approach is that they aren’t looking at historical trends and trying to quantify their hunches. They’re taking a highly scientific and quantitative approach with sophisticated models and lots of simulations. In fact, their tool simulates each and every game 50,000 times before making a pick.
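The Predictalator's internals are proprietary, but the basic Monte Carlo idea (simulate a game tens of thousands of times, then pick the side that wins most often) looks roughly like the sketch below. The normal scoring model and the team averages are invented purely for illustration; they're not Paul Bessire's actual math.

```python
# Illustrative Monte Carlo pick, NOT the Predictalator's actual model.
# Each simulated game draws both teams' scores from a normal distribution;
# the scoring averages are made-up inputs for demonstration.
import random

def simulate_game(mean_a: float, mean_b: float, sigma: float = 10.0) -> bool:
    """One simulated game; returns True if team A outscores team B."""
    return random.gauss(mean_a, sigma) > random.gauss(mean_b, sigma)

def win_probability(mean_a: float, mean_b: float, sims: int = 50_000) -> float:
    """Fraction of simulated games won by team A."""
    return sum(simulate_game(mean_a, mean_b) for _ in range(sims)) / sims

p = win_probability(mean_a=27.5, mean_b=24.0)  # hypothetical scoring averages
print(f"Team A wins {p:.1%} of 50,000 simulated games")
```

Swap in a better scoring model and real inputs and you get the flavor of the approach, if not the accuracy.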

My pal Rich Brueckner and I had a chance to interview PredictionMachine.com founder Paul Bessire as part of our Radio Free HPC podcast a few weeks ago. In the program, Paul discusses the genesis of the Predictalator and how he turned his master’s degree in quantitative analysis into a full-time business.

Read More

Cray Turns Cluster Crank with ScaleMP

Does your performance in the data center suffer because you don’t have enough memory to really get the job done? Do you have apps that don’t perform well on clusters, or don’t parallelize at all? If this describes you or a loved one, read on, because Cray thinks they have the solution to what ails you.

Cray, with partner ScaleMP, recently announced two new systems that aim to cure your memory woes, in distinctly different ways.

Read More

Must-See Geek TV: Brainy science featuring explosions and Kumar

Are you sick of brainless, insipid, hackneyed, and utterly stupid TV shows? (I’m not, but some of you might be.) If you are, then direct your eyes and DVRs to a new show on Discovery Channel called The Big Brain Theory. The premise of the show sounds like typical reality fare: a set of contestants are competing to win a job in some organization. But this show isn’t typical at all – it is, in a word, amazing.

Contestants on The Big Brain Theory have to use real science to tackle truly difficult tasks. In last week’s premiere episode, teams watched a demonstration in which two small pickup trucks were crashed head-on at 35 mph. In the bed of each truck was a wooden box that exploded in the collision. The host explained that the crash generated 40g of force on the 160 lb package, which was rigged to explode if subjected to 25g or more. The task: design a mechanism that will ensure that the package experiences less than 25g in an identical crash.
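For a rough sense of what the contestants are up against, here's a back-of-the-envelope calculation (my own arithmetic, not numbers from the show). Treating the crash as a constant deceleration from 35 mph to a stop, the stopping distance follows from d = v^2 / (2a):

```python
# Back-of-the-envelope crash physics (my own rough arithmetic, not the
# show's numbers). Assumes a constant deceleration from 35 mph to a stop,
# so stopping distance d = v**2 / (2 * a).
G = 9.81                    # m/s^2 per g
V = 35 * 0.44704            # 35 mph in m/s (about 15.6 m/s)
LBF_TO_N = 4.448            # pounds-force to newtons

def stopping_distance(g_load: float) -> float:
    """Distance in metres needed to stop from V at a constant g_load."""
    return V**2 / (2 * g_load * G)

print(f"Stopping distance implied by the 40g crash: {stopping_distance(40):.2f} m")
print(f"Distance needed to stay under 25g:          {stopping_distance(25):.2f} m")
print(f"Peak force on the 160 lb package at 40g:    {40 * 160 * LBF_TO_N / 1000:.1f} kN")
```

In other words, the teams need to find the package roughly an extra 20 cm of controlled ride-down (or the equivalent in crush time) compared with the baseline impact.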

Read More

Radio Free HPC: Two-Fer Tuesday!

First up: this podcast. The Radio Free HPC team discusses Lustre and LUG 2013 with Brent Gorda. Now part of Intel in their High Performance Data Division, Gorda was CEO of Whamcloud when the company was acquired last summer. Gorda…

Read More

The IBM x-ectomy: HPC Impact

The increasing drumbeat tapping out news of IBM unloading all or part of their System x (x86-based servers) business onto Lenovo got me wondering about the HPC implications. What’s the possible impact on IBM, and the HPC market, if Big Blue dumps the x86 end of their system business?

These systems are big business, albeit at margins ranging from low to extremely low. As our pal TPM points out here, IBM has never been good at low-margin lines of business. However, IBM is pretty good at HPC, and HPC is a market where bunches of low-cost boxes are combined to build the largest, and some of the most expensive, systems in the world. And the innovations that drive HPC performance eventually find their way into enterprise and even consumer tech products.

It’s impossible to find good numbers on exactly how many HPC systems and individual models are in use today. In fact, it’s increasingly difficult to differentiate an HPC workload vs. an enterprise compute-intensive or Big Data workload. (I’d argue that they’re the same thing, with the only difference being the data they’re analyzing and the questions they’re answering.)

Since I couldn’t get solid numbers for analysis of the entire market, I worked over the latest Top500 list to see what’s what at the high end of HPC. As a system vendor, IBM has more systems (193, or 39%) on the Top500 list than any of their competitors. HP is in second place with 146 systems for a 29% share. IBM’s lead gets larger when you look at performance: the aggregate performance of IBM’s systems on the list comes to 66.216 Pflop/s vs. Cray’s second-place showing of 28.189 Pflop/s.
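The performance tally works the same way as the system count, except you sum Rmax instead of counting rows. A minimal sketch, again assuming a local top500.csv export with "Manufacturer" and "Rmax" columns and Rmax reported in Tflop/s (the real download's column names and units may differ):

```python
# Sketch: total Rmax per vendor from a downloaded Top500 list.
# Assumes "top500.csv" with "Manufacturer" and "Rmax" (Tflop/s) columns;
# adjust names and units to match the real file.
import pandas as pd

top500 = pd.read_csv("top500.csv")

pflops_by_vendor = (
    top500.groupby("Manufacturer")["Rmax"].sum() / 1000  # Tflop/s -> Pflop/s
).sort_values(ascending=False)

print(pflops_by_vendor.head(5))  # top five vendors by aggregate performance
```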

Read More