Thus it was very neat to see an interesting article in HPCwire by Michael Feldman about the US National Science Foundation and Microsoft agreeing to offer researchers free access to the Windows Azure cloud platform.
“One of the major goals here is freeing scientists and engineers from being tied to local datacenters for their computational work. In general, NSF-funded researchers at universities are reliant on local systems – desktops, clusters and full-blown supercomputers – which, themselves, are often funded by the agency. But a lot of scientific applications are too big for desktops and too small for supercomputers, which means researchers are dependent upon compute and storage clusters housed in university facilities. The problem is that these institutions are not in the IT infrastructure business, so there is strong motivation to offload the procurement and management of these systems to someone else. . . .
“The typical application profile would be one that was data-heavy, highly-parallel, but didn't require tight communication between compute nodes or top 10 supercomputing-level capability. A lot of scientific computation falls into this category, especially that which is based on parallel algorithms for mining large datasets.”

And, according to Dan Reed, who leads Microsoft's eXtreme Computing Group, echoing Richard Hamming's famous dictum: “The purpose of computing is insight, not numbers.”
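To make that workload profile a bit more concrete, here's a toy sketch (my own illustration, not from the article) of what "highly parallel but loosely coupled" looks like in practice: each chunk of a dataset is mined independently, so workers never need to talk to each other, and partial results are simply combined at the end.

```python
# Toy illustration of the "data-heavy, highly parallel, loosely coupled"
# profile described above: each data chunk is processed independently,
# with no communication between workers until results are merged.
from multiprocessing import Pool

def mine_chunk(chunk):
    """Stand-in for per-chunk analysis (here: count values divisible by 7)."""
    return sum(1 for x in chunk if x % 7 == 0)

def main():
    # A stand-in "large dataset", split into independent chunks.
    data = list(range(1_000_000))
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool() as pool:
        partials = pool.map(mine_chunk, chunks)  # no inter-worker communication
    print(sum(partials))

if __name__ == "__main__":
    main()
```

This is exactly the shape of job the article argues fits a cloud platform well: no tight node-to-node coupling, so it scales out without supercomputer-class interconnects.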
Read the full article on HPCwire.com.