Square Kilo Scope Pushes Limits

Ever hear of the Square Kilometer Array? It’s a plan to build the largest radio telescope in the world: some 3,000 15-meter dishes whose combined collecting area adds up to a full square kilometer.

Right now, they’re putting the finishing touches on the plan and figuring out where to build it; South Africa and Western Australia are on the short list. They expect to begin preconstruction (ordering parts and stuff) in 2012. Actual construction should begin in 2016, and full operation in 2024.

When complete, it’s going to be 10,000 times more sensitive than the best radio telescope today, so it’s expected to generate some profound discoveries. The Big Questions that the SKA will help answer include the origins of the universe; the nature of Dark Matter and Dark Energy (which kind of creeps me out); and whether Einstein got it right with his General Theory of Relativity – we’ll know if space is truly bendy or not.

Read More

HP Gets Analytical; Opening another front against Oracle?

With all of the attention focused on the war raging between Oracle and Hewlett-Packard on the server front, a significant HP announcement in late June seemed to slip under the collective radar of the industry press.

On June 20, the company announced general availability of Vertica 5.0, the newest version of the Vertica Analytics Platform, along with some integrated appliance-like bundles combining Vertica with HP hardware. HP purchased the company earlier this year (Register story here), and it looks like Vertica is going to be HP’s key play in the burgeoning ‘big data’ market.

The foundation of the Vertica platform is its columnar database which, as the name implies, stores data by column rather than by row. This column-centric design can yield huge advantages over traditional row-oriented databases in certain situations – primarily read-centric data warehouses.
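To make the idea concrete, here’s a toy Python sketch (nothing like Vertica’s actual engine, and the table and column names are made up for illustration) of why a column layout pays off when an analytic query only needs one or two columns out of a wide table:

```python
# Toy illustration (NOT Vertica's actual engine) of why column storage
# helps read-heavy analytics: an aggregate over one column only touches
# that column's values instead of dragging every full row off disk.

# Row-oriented layout: each record is stored together, OLTP-style.
rows = [
    {"order_id": 1, "region": "EMEA", "amount": 250.0},
    {"order_id": 2, "region": "APAC", "amount": 120.0},
    {"order_id": 3, "region": "EMEA", "amount": 310.0},
]

# Column-oriented layout: each column is stored as its own array.
columns = {
    "order_id": [1, 2, 3],
    "region":   ["EMEA", "APAC", "EMEA"],
    "amount":   [250.0, 120.0, 310.0],
}

# A query like "SELECT SUM(amount)" against the row store has to walk
# every field of every record...
total_from_rows = sum(r["amount"] for r in rows)

# ...while the column store scans one contiguous, same-typed array,
# which is also far easier to compress.
total_from_columns = sum(columns["amount"])

assert total_from_rows == total_from_columns == 680.0
```

The same property is what lets a column store compress aggressively and skip I/O entirely for columns a query never asks for – which is exactly the access pattern of a read-centric data warehouse.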

Read More

King K Super; Are GPUs Still GPU-riffic?

It’s been an eventful ISC. The Japanese sprang their K Computer on an unsuspecting HPC world, throwing down 8.162 Pflops on the table and raising the high-water performance mark by a factor of three. Just as surprising was the fact that they did it the old-fashioned way – with semi-proprietary processors, a custom interconnect, and no fancy accelerators.

Was it only six months ago that the Chinese, with their 2.56 Pflop Tianhe-1A system, appeared to have locked down the top spot for a year or more with heavy use of GPU accelerators? This led many pundits (myself included) to say that the age of hybrid HPC was upon us, and that we probably wouldn’t see another non-hybrid system topping the chart anytime soon.

So is the K computer a signpost pointing to the resurgence of traditional CPU plus custom interconnect HPC? Or is it an aberration on the road to our hybrid future?

Read More

2010-11 Unix Vendor Preference Survey: Vendor Face-Off

Gabriel Consulting Unveils Key Findings of 2010-11 Unix Server Vendor Research
IT Pros Judge Major Vendors on Technology, Support, and Customer Satisfaction Criteria
BEAVERTON, Oregon – June 22, 2011 — Gabriel Consulting Group (GCG), an independent analyst firm, today released…

Read More

Webcast: Top500 Founders from ISC’11

As many of you know by now, there’s a new computer at the tippy top of the just-published Top500 list. I found out a bit earlier than most, via a webcast I was recording at 3:00 a.m. my local time on Sunday. It was a bit of a struggle to record it; their net connection kept breaking, and the sound quality on my side was like a Dixie cup and string, but I heard enough to know it was big news.

This system was a surprise in a lot of ways. No one expected it this soon, or expected it to triple the performance record. On the Register webcast (here), I have a conversation with Rich Brueckner (from InsideHPC.com) along with Top500 co-founders Dr. Hans Meuer and Dr. Jack Dongarra. We talk about the evolution of the list, the trends driving it and, of course, the K Computer.

Read More