Radio Free HPC: Two-Fer Tuesday!

First up: this podcast. The Radio Free HPC team discusses Lustre and LUG 2013 with Brent Gorda. Now part of Intel in their High Performance Data Division, Gorda was CEO of Whamcloud when the company was acquired last summer. Gorda…

Read More

The IBM x-ectomy: HPC Impact

The increasing drumbeat tapping out news of IBM unloading all or part of their System x (x86-based servers) business onto Lenovo got me wondering about the HPC implications. What’s the possible impact on IBM, and the HPC market, if Big Blue dumps the x86 end of their system business?

These systems are big business, albeit at margins ranging from low to extremely low. As our pal TPM points out here, IBM has never been good at low-margin lines of business. However, IBM is pretty good at HPC, and HPC is a market where bunches of low-cost boxes are combined to build the largest, and some of the most expensive, systems in the world. And the innovations that drive HPC performance eventually find their way into enterprise and even consumer tech products.

It’s impossible to find good numbers on exactly how many HPC systems and individual models are in use today. In fact, it’s increasingly difficult to differentiate an HPC workload vs. an enterprise compute-intensive or Big Data workload. (I’d argue that they’re the same thing, with the only difference being the data they’re analyzing and the questions they’re answering.)

Since I couldn’t get solid numbers for analysis of the entire market, I worked over the latest Top500 list to see what’s what at the high end of HPC. As a system vendor, IBM has more systems (193, or 39%) on the Top500 list than any of their competitors. HP is in second place with 146 systems for a 29% share. IBM’s lead gets larger when you look at performance: IBM’s systems on the list total 66.216 Pflop/s, versus a second-place showing of 28.189 Pflop/s from Cray.
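As a quick sanity check on those shares, here’s a minimal sketch using only the figures quoted above (193 and 146 systems out of 500, plus the IBM and Cray Pflop/s totals; the variable names are just illustrative):

```python
# Sanity check on the Top500 vendor shares quoted above.
TOTAL_SYSTEMS = 500

system_counts = {"IBM": 193, "HP": 146}
pflops_totals = {"IBM": 66.216, "Cray": 28.189}

for vendor, count in system_counts.items():
    share = 100 * count / TOTAL_SYSTEMS
    print(f"{vendor}: {count} systems, {share:.1f}% of the list")

# IBM's aggregate-performance lead over second-place Cray, in Pflop/s
lead = pflops_totals["IBM"] - pflops_totals["Cray"]
print(f"IBM leads Cray by {lead:.3f} Pflop/s")
```

Rounded, that comes out to the 39% and 29% shares cited above.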

Read More

China Cluster Combat Concluded

The first annual Asia Student Cluster Challenge (ASCC) has raised the bar: a monster LINPACK score and the first-ever three-peat overall champion! Join the conversation at

Read More

GTC 2013: Car Crazy: State-of-the-Art Sat Nav and more

Is it easier to drive on a cartoon map or one with real 3D roads, buildings, and sometimes even people? Audi (and Google) believe that more real is more better, so they’ve incorporated Google Earth into Audi’s sat nav. (This feature is also available in some VW models.) It’s not a completely new feature, but it’s been upgraded substantially over the past few years to be much more than just nav with satellite photos.

In this short video, Dave Anderson, NVIDIA’s Automotive Applications Manager, walks us through the nav feature on an Audi A7 that was parked on the show floor at GTC 2013. In our exercise, we charted a course from the San Jose convention center to downtown Detroit – a 2,400-mile trip.

Read More

GTC 2013: Car 2.0; Software-defined driving

What’s your next car dashboard going to look like? The answer may well be: however you want it to look. If NVIDIA has its way, car manufacturers and owners will have a much wider range of choices when it comes to dashboard displays, navigation sophistication, and personalizing car to owner.

The computer hardware foundation is NVIDIA’s Visual Computing Module (VCM), a small computer on a card with a Tegra processor, RAM, and solid state memory. This module is the brain of the car-to-user interface: it manages the instrument cluster, entertainment system, climate control, communications, and navigation tasks.

If NVIDIA’s VCM catches on, it could give rise to a new wrinkle in the auto market: an upgrade path. In the past, cool new features were introduced to the motoring public in new models every year or two. Sure, you could upgrade the entertainment and nav features with aftermarket products, but installing a fully integrated solution can be an expensive proposition. Many drivers take the cheap route by using their smartphones or other devices for directions or entertainment storage.

According to Dave Anderson, NVIDIA’s Automotive Applications Manager, NVIDIA is planning to keep the same packaging (including the pinout) for successive VCMs, so they’d have socket compatibility from generation to generation. Improvements could include more memory, more processing power, and new features/functions that the faster hardware would enable.
Read More

IBM Elective X-ectomy? Partial or Full?

Is IBM selling all or part of their x86 server biz to PC giant Lenovo? Nothing has been confirmed by either party, but it sure looks like there’s something in the works. So what might this deal look like? Begin speculation mode now…

I don’t see IBM selling off the entire System x division to Lenovo. What they’ll sell is the most commoditized and least differentiated boxes in their product line. This means goodbye to their single- and dual-socket tower servers for sure. It also means that they’ll probably bid farewell to almost all of the models in their M3 and M4 rackmount server lineup. Selling off this part of System x removes sixteen SKUs. So what will IBM keep?

Read More

Radio Free HPC: The Joy of RDMA?

In this follow-up podcast to the GPU Technology Conference, the RFHPC team mulls over a talk by GE’s Dustin Franklin, GPU app specialist. Dustin’s topic was GPUDirect RDMA; was this a first look at real-world RDMA with GPU-to-GPU communications? Follow…

Read More

Radio Free HPC: FPGAs and Stuff

In this episode of Radio Free HPC, Rich, Dan, and Henry discuss the recent buzz surrounding FPGAs. After being sidelined by accelerators, they’re increasingly being used in appliances. Big vendors are talking about FPGAs not only for appliances but for…

Read More

ASC13: Shanghai Smackdown

The first annual Asia Student Cluster Challenge (ASCC) culminates this week with a final round of competition that brings ten university teams to Shanghai for a live cluster-off. Learn about the challenge and meet the teams here, and join the conversation at

Read More

GTC 2013: Audi Drives Auto Technology; Steers course for safety/convenience

Do you hate driving in the city? Do you fly into a rage when you can’t find a parking spot after spending hours in heavy traffic on the way to your destination? Do you have trouble deciphering SatNav directions while paying attention to your surroundings? Or do you get easily distracted while driving and become a safety threat to all around you? Does the fact that your car can’t predict the future piss you off?

Audi, along with US-based research universities, has been studying just such problems. They provided a peek at their findings and solutions at the recently concluded GTC 2013 show in San Jose. The first finding is that more than 80% of the US population lives in some sort of urbanized area – a 12% increase over the last decade or so. City driving has become more difficult, and most accidents happen in cities – although, in the US at least, rural accidents account for most fatalities.

Audi also found that driver frustration and distraction play a large role in both urban and rural accidents. (I think they’d classify the “driving with their head up their ass” driver as “distracted.”) They did deep research into this topic, breaking accident statistics down into twelve categories such as “forward into decelerating object,” “road depart right/left,” “sideswipe left/right,” etc. These findings and others shaped their solution: “Audi Urban Intelligent Assist.”

The guys from Ingolstadt (Audi HQ) did a good job of laying out their vision for future Audi drivers. It starts, modestly enough, by giving Audi vehicles the ability to predict the future. They’ll also be personalizing cars for individual drivers, improving controls, and providing systems that give more and better automated assistance. These are great goals, but what do they really mean when the rubber meets the road?

Read More