Friday, April 11, 2014

Call for Papers for CARLA 2014 Conference - First HPCLaTAM - CLCAR Joint Conference in Valparaíso, Chile.

Call for Papers - CARLA 2014:
Latin America High Performance Computing Conference 
First HPCLATAM – CLCAR Joint Conference
Organized by CCTVal (USM) & NLHPC (UCHILE)
20 - 22 October 2014, Valparaíso, Chile

Paper submission deadline: 15 June 2014
Notification of acceptance: 15 August 2014
Camera-ready papers due: 15 September 2014
Conference dates: 20 - 22 October 2014

Building on the success of previous editions of the HPCLATAM and CLCAR conferences, in 2014 the two major Latin American HPC meetings will join forces as CARLA 2014, to be held in Valparaíso, Chile. The conference also includes the Third High Performance Computing School, ECAR 2014 (13-17 October 2014).

The main goal of the CARLA 2014 conference is to provide a regional forum fostering the growth of the HPC community in Latin America through the exchange and dissemination of new ideas, techniques, and research in HPC. The conference will feature invited talks from academia and industry, plus short- and full-paper sessions presenting both mature work and new ideas in research and industrial applications. Suggested topics of interest include, but are not restricted to:
- Parallel Algorithms and Architectures
- Parallel Computing Technologies
- High Performance Computing Applications
- Distributed systems
- GPU Computing: Methods, Libraries and Applications
- Grid and Cloud Computing
- Data Management, Visualization, and Software Tools
- Tools and Environments for High Performance System Engineering

Authors are invited to submit full and short papers written in English, following the Springer LNCS style. Full papers must be between 10 and 15 pages; short papers between 4 and 6 pages. Papers must not have been previously published or submitted elsewhere. Submissions will be handled electronically via the EasyChair system. All submissions will be carefully reviewed by at least two experts, with reviews returned to the author(s). The authors of accepted papers must guarantee that their paper will be presented at the conference.
All accepted full papers will be included in the CARLA 2014 proceedings, to be published in Springer's CCIS series, Communications in Computer and Information Science. In addition, authors of accepted full papers will be invited to submit extended versions to a special issue of Cluster Computing (impact factor: 0.776).
All accepted short papers will be published electronically in the “Book of Ongoing Works”.

Tuesday, February 25, 2014

Cloud for a smart economy and smart society at Cloudscape VI

Cloudscape VI came to a close just moments ago. The event, which was held at the Microsoft Centre in Brussels, Belgium, featured much discussion of the legal aspects of cloud computing, as well as the potential benefits of cloud computing for both business and scientific research. Bob Jones, head of CERN openlab, also gave an important update on Helix Nebula at the event and we’ll have full coverage of this, as well as the rest of the conference highlights, in next week’s issue of iSGTW.

Ken Ducatel, head of software and services, cloud computing at the European Commission, spoke at Cloudscape VI about the variety of business models still evolving around cloud computing. “There are a lot of business models and they’re very complex: there’s no one-size-fits-all solution,” he says.

Meanwhile, Linda Strick, coordinator of the Cloud for Europe project, spoke at the event about how Europe's patchwork of nation states can make the provision of public services via the cloud particularly difficult. “We need to initiate dialogues between public sector and industry, and address concerns on data protection, security, legal, and contractual aspects,” says Strick.

In addition, the environmental impacts of cloud computing were discussed at the event. “ICT is enabling energy reduction through optimization, but ICT also consumes a lot of energy,” says Kyriakos Baxevanidis, deputy head of the European Commission’s Smart Cities and Sustainability unit within DG CONNECT. “If the cloud were a country, it would rank fifth in the world in terms of energy consumption.”

Other highlights of the event included a brief presentation from Simon Woodman of the University of Newcastle, UK, on e-Science Central, as well as an introduction to the EU Brazil Cloud Connect project, which kicked off earlier this month. You can read more about e-Science Central in our in-depth interview, and we’ll have more on EU Brazil Cloud Connect in next week’s issue of iSGTW.

Finally, it was announced at the event that the popular Cloudscape conference series will soon be heading away from Brussels for the first time, with the first Cloudscape Brazil conference set to be held later this year. Again, we’ll have more on this in next week’s issue…

Thursday, August 29, 2013

A Latin America Collage in High Performance and Large Scale Computing

Speakers and Contributions in CLCAR 2013
CLCAR 2013, in San José, Costa Rica, has shown an interesting "collage" of high-performance and large-scale computing activity across the continent. A diversity of users, important scientific contributions, and one thing in common: collaboration.

Collaboration among scientific and academic institutions allows new ideas and proposals to be developed, and interaction between Europe and Latin America, and between North America and Latin America, is increasingly open.

Some of the contributions at this edition of CLCAR address global interests, aiming to resolve technical problems and open questions in related areas. This year in Costa Rica, inspired by the greenest country on the continent, bioinformatics and biodiversity are common themes across most of the projects.

On the other hand, researchers and the Latin American HPC community have met, supported by RedCLARA and European institutions such as CETA-CIEMAT and the Barcelona Supercomputing Center (BSC), to continue developing the Advanced Computing Service for Latin America and the Caribbean, SCALAC (from its Spanish/Portuguese acronym). This important meeting is the second face-to-face gathering aimed at building a large-scale advanced computing facility, drawing on the experience of the past projects EELA, EELA-2, and GISELA.

CLCAR 2013 continues until tomorrow. GridCast and iSGTW have been media partners of this important Latin American activity since 2007.

Tuesday, August 27, 2013

PURA VIDA from Costa Rica: Starting CLCAR 2013 with Tutorials

CLCAR 2013 in San José Costa Rica
Pura vida is the traditional expression for many "good" things in Costa Rica: gratitude, friendship, harmony, happiness...

This year, the tutorial sessions opened CLCAR 2013 with many "good topics": exploiting GPGPU architectures with OpenCL and CUDA, the BioCLCAR tutorial on HPC for the biosciences, and the BOINC with LEGION tutorial.

Attendees of CLCAR 2013 Tutorials 
Students and instructors from different countries across Latin America joined to share knowledge of technologies, methodologies, and collaboration experiences, from students just beginning with large-scale architectures to participants getting their first taste of HPC opportunities.

Tomorrow, CLCAR 2013 continues with the second day of the BioCLCAR tutorial and further subjects related to the exploitation of large-scale architectures, e-science, and collaboration. Since 2007, CLCAR has been the conference for high-performance and scientific computing in Latin America.

The first day of CLCAR was a "pura vida" day of collaboration, friendship, and e-science.

Saturday, July 27, 2013

A spotlight on the gut at XSEDE'13

Possibly one of the most literally ‘in-depth’ talks I’ve attended at a computing conference came from Larry Smarr of the J. Craig Venter Institute. He has become involved in biomedical research in the most personal way possible: by presenting his microbiome, or microbial profile, to be scrutinised in minute detail by researchers. The body has 10 times as many microbe cells as human cells. In fact, 99% of the DNA in your genome is in microbial cells, not human cells. “Topologically we’re a torus,” said Smarr. “Effectively, the bugs are all on the outside of your body, including your guts.”

Smarr’s interest in the microbiome started with increasingly severe but misdiagnosed stomach problems. Smarr was not impressed with the guesswork involved in treating his symptoms. He thought DNA sequencing should give a much more accurate picture of what was going on, eventually leading to a diagnosis of early Crohn's disease, an inflammatory bowel condition. With the cost of sequencing a genome having fallen from 4 billion dollars to 4,000 dollars, financing the research is not so much the issue; it’s picking out the microbial DNA from the human, and then identifying what you have left. Research using the Gordon supercomputer at SDSC still found a significant proportion of DNA that was ‘unknown’.

What is clear though is that Crohn’s disease has a devastating effect on the gut bacteria: the equivalent of a ‘mass extinction event’, in Smarr’s words. The traditional medical approach of waging war on the gut microbes using heavy-duty drugs was not going to help; the better approach is to switch from full-frontal attack to ‘gardening’. This means using new therapeutic tools to manage the microbiome and encourage ‘good’ bacteria. Diet is also an important factor. “Change the feedstock and you’ll change the shape of the garden,” advised Smarr. As he’s still donating all sorts of raw materials to the research programme on a regular basis, he should know.

(You can read more about microbial gardening on the CNN website; even the Economist has got in on the act with its cover article ‘Microbes Maketh the Man’.)

Biosciences Day at XSEDE’13 closed with a lively panel session featuring speakers from a range of high-tech biomedical companies, including at least one lawyer. This is not as strange as it first sounds, because many of the issues affecting biomedical research come down to the ethics of sharing very personal data. “If you surprise a bioethicist with a question, the answer is always no,” said Glen Orter of Dell. Alex Dickinson of Illumina Inc looked at the role of cloud computing in genomics, including two killer apps, whole-genome assembly and variant calling, as well as single-click sharing of very large data sets. His wish list included cloud bioinformatics built on openness and collaboration. “Interpretation is now the key, since analysis is no longer the bottleneck,” said Dickinson. This means thinking big and looking at phenotypes (how genes are expressed in the body), not just genotypes. “We want whole genomes and lots of them,” he announced.

Donald Jones of Qualcomm Life talked about connected data, wherever it originates: inside the body, outside the body, or from instruments. To bring in a Star Trek reference, this is the ‘tricorder challenge’: finding simple-to-use multimeters for monitoring the body, like the blood glucose meters already used by diabetics. In the future, we’re likely to see increasing numbers of health-related apps.
Darryl Leon from Life Technologies advised us to think holistically and also address the environmental costs of intensive computing: can we have low-energy high-performance computing? Nicholas Schork from the Scripps Institute said that sequencing genomes might be cheap at a few thousand dollars, but interpretation can cost many times that amount. There are 4-6 million variants per person to explore using algorithms. “This is computationally intensive, and they may not do anything in the end or contribute to disease,” said Schork. There are a host of so-called idiopathic diseases where the cause is still unknown. Increasingly, the “exposome” will become important as well as the basic genome: the exposome is everything you’re exposed to in your life, including environmental factors. The more data we share on the exposome, the more information can theoretically be gleaned to lead to new treatments.

The panel speculated that we might eventually see the equivalent of Facebook for sharing personal medical data. Would you share your genome as blithely as your holiday photos? And what about genetically identical twins where only one wants to publish their data, and the other prefers privacy? Data might be compressed and encrypted, but this doesn't necessarily offer full protection. Some people with Crohn's, like Larry Smarr, publish all their data in the hope of helping to find a cure. But others want total privacy; the challenge for the future will be to find a way to tread both these paths.

Biosciences Day at XSEDE'13

After the excitement of the XSEDE’13 kick-off next door to Comic Con, and the glamour of a 1940s-themed Chair’s dinner aboard the USS Midway, we moved into Biosciences Day. Touring the Midway, we squeaked across the blue and white checked lino of the mess deck (hearty hot meals were served 23 hours out of every 24) and ducked into the cramped confines of the sick bay with its three-tier stacked bunk beds. The operating theatre and dental surgery were right next door to the machine shop; to the untrained eye, the equipment in all three looked broadly similar. (Given the gender bias of the crew, I’ll leave the identity of the two most requested elective operations to your imagination. They weren’t on the teeth.) The basic nature of the medical treatment on offer to the crew, however, was a timely reminder of how far medicine has come in the last half century, particularly now that high-performance computing has joined the medic’s armoury.

David Toth, a Campus Champion at the University of Mary Washington, talked us through the role XSEDE resources have played in finding potential drug treatments for histoplasmosis (a fungal infection), tuberculosis, and HIV. In the days of the USS Midway, crew with a positive HIV test were immediately airlifted to shore, with a future behind a desk rather than at sea ahead of them. Toth’s group screened 4.1 million molecules using XSEDE resources, a task that would have taken two years on a single desktop PC. Some drug candidates are now being tested in the wet lab, with several showing promise in inhibiting cell growth. “Supercomputers do a great job of outpacing the biologists,” said Toth. One enterprising feature of the work was that students each screened 10,000 molecules as part of their course, with one student striking lucky with a potential drug candidate.
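Virtual screening of this kind is embarrassingly parallel: each molecule can be scored independently, which is why the job splits so cleanly across supercomputer nodes, or across students screening 10,000 molecules each. A minimal sketch of the pattern in Python, with a purely hypothetical scoring function standing in for the real docking code:

```python
from multiprocessing import Pool

def score(molecule_id: int) -> float:
    """Hypothetical stand-in for a docking/scoring computation.

    A real screen would run a docking engine here; we just derive a
    deterministic pseudo-score in [0, 1) from the molecule's ID.
    """
    return (molecule_id * 2654435761 % 1000) / 1000.0

def screen(molecule_ids, workers: int = 4, threshold: float = 0.99):
    """Score molecules in parallel, keeping only promising candidates."""
    with Pool(workers) as pool:
        scores = pool.map(score, molecule_ids)
    return [m for m, s in zip(molecule_ids, scores) if s >= threshold]

if __name__ == "__main__":
    hits = screen(range(100_000))
    print(f"{len(hits)} of 100,000 molecules pass the threshold")
```

Because the tasks share no state, each extra worker buys a near-linear speed-up; scale the same pattern across thousands of cores and a multi-year desktop job collapses into days.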

Looking at biomedical challenges more broadly, Claudiu Farcas from the University of California, San Diego, summarised some of the issues posed by data. For a start, there are many different kinds of data, from the genotype, RNA, and proteome, on to biomarkers and phenotypes, right up to population-level data, all with their own sets of acronyms. Public repositories are often poorly annotated, mostly unstructured, and governed by limited and complicated data-use agreements. “Researchers are encouraged to share but are not always enabled to do so,” warned Farcas.

A particularly thorny issue for biomedical data analysis is how to deal with sensitive personally identifiable information (PII). Researchers need to protect the privacy of patients by anonymising their data. It also needs to be cleaned up, compressed, and aggregated so it can be analysed efficiently. But how best to do this? Bill Barnett of Indiana University said earlier that biologists really don’t care what they use, as long as it works. Cloud computing can be tempting, but institutional clouds are often still at the early stages of being set up, and commercial offerings might have “questionable” privacy protection.

The iDASH (Integrating Data for Analysis, Anonymization and SHaring) portal allows biologists to focus on the biology, not the data. It includes add-ons such as anonymisation of data, annotation, processing pipelines, natural language processing, and the possibility to preferentially select speedier GPGPU resources. According to Farcas, iDASH offers a secure infrastructure plus a platform for development, services, and software, including privacy protection.

Wednesday, July 24, 2013

XSEDE’13 in San Diego – when superheroes met supercomputers

There aren’t too many conferences where you meet Batman, Dr Who, and the Wicked Witch of the West while still checking into the hotel. XSEDE’13 in San Diego this year overlapped with the famous Comic Con event right next door, so caped superheroes marching past in the lobby were apparently part of the deal. Comic Con attracts over 120,000 participants every year; XSEDE slightly fewer, at 700. But this is an ever-rising number year on year, as project leader John Towns was pleased to point out. And I have a suspicion that the categories of ‘comic book nerds’ and ‘IT geeks’ are perhaps not entirely mutually exclusive sets…

XSEDE, the Extreme Science and Engineering Discovery Environment, is a National Science Foundation-supported project that brings together supercomputers, data collections, and computational tools and services to support science and engineering research and education. The annual XSEDE conference focuses on the science, education, outreach, software, and technology used by the XSEDE and other related communities worldwide.

The programme kicked off with a day of well attended tutorials – with a 200-strong host of students at the event, the training opportunities were well appreciated, as was the opportunity to showcase their work in poster sessions and competitions. Even younger participants were catered for by a morning robotics class, “Ready, Set, Robotical” which I was more than tempted to join.

Training is always a strong theme at XSEDE, and the challenges of providing online courses in parallel computing were discussed, as well as developing undergraduate and graduate programmes in computational science. Campus champions are the backbone of outreach on a regional basis, and XSEDE is now looking to expand the scheme into subject specific champions. This emerged as a theme for future collaboration between XSEDE and the European Grid Infrastructure in the Birds of a Feather session. EGI recently set up an EGI Champions scheme, including representatives from fields as disparate as life sciences, environmental sciences and maths. Other areas where EGI and XSEDE expect to work together include federated high throughput data analysis, federated clouds and user support. One use case already in progress covers HPC-HTC workflows for the computational chemistry community. This was one of the joint use cases that emerged in the call issued at the beginning of the year. So there's lots to watch out for, even now the caped crusaders have (mostly) left town.

Sunday, July 21, 2013

High Performance Computing, e-Science and Scalable Architectures Next to a Volcano

Thirty-six students from Mexico and several Central American countries have begun participating in the 2013 edition of the Supercomputing and Distributed Systems Camp (SCCamp). This year, SCCamp is being held at the Instituto Tecnológico de Colima, in the beautiful town of Villa de Álvarez in Colima State, Mexico. As in 2011, the camp takes place near a volcano: the Colima Volcano.

SCCamp is an initiative of researchers to offer undergraduate and master's students state-of-the-art knowledge of high-performance and distributed computing topics. In the 2013 edition, a seven-day summer school, students will cover topics ranging from large-scale infrastructures (cluster, grid, cloud) to CUDA programming. Instructors come from several countries around the world: Brazil, Colombia, France, Germany, Greece, Mexico, and Venezuela.

Past editions of SCCamp were held in Colombia (2010), Costa Rica (2011), and Venezuela (2012). Every year an interesting special topic is proposed and selected; this year's is the use of reconfigurable architectures for scientific applications.

SCCamp 2013 is supported by several international partners. GridTalk and iSGTW are media partners of SCCamp.

Thursday, July 18, 2013

iMENTORS goes live!

Mapping ICT across Sub-Saharan Africa
Brussels, 18 July 2013: iMENTORS goes live and is one step closer to becoming the most comprehensive crowdsourcing map of ICT infrastructures in Sub-Saharan Africa! Users are now able to register, create their profile on the platform, and start sharing their knowledge and data directly on the map.
Co-funded by the European Commission’s DG CONNECT under the Seventh Framework Programme (FP7), iMENTORS is designed to enhance the coherence and effectiveness of international actors involved in e-infrastructure development projects and initiatives in Sub-Saharan Africa. Launched in April 2012 by Stockholm University and Gov2u, iMENTORS is a web-based platform serving as a knowledge repository for sharing and aggregating data on e-infrastructure projects throughout sub-Saharan Africa.
e-Infrastructures (electronic research infrastructures) are collections of ICT-based resources and services used by the worldwide research and education community to conduct collaborative projects and to generate, exchange, and preserve knowledge.
iMENTORS strengthens the development of European policy on research infrastructures by providing donors and policy makers with a comprehensive tool to test hypotheses and understand trends in e-infrastructure development assistance across Sub-Saharan African countries. It increases the value of data by providing more descriptive information about international e-infrastructure development, and strengthens efforts to improve donors' and recipients' strategic planning and coordination.
By identifying and mapping the majority of e-infrastructure projects in Sub-Saharan Africa, the project provides valuable knowledge of what is currently on the ground, which in itself represents the first step towards more informed future choices and decision-making on e-infrastructure deployment. Additionally, the software tool deployed helps policy-makers and donor agencies use the information more effectively, by filtering it and associating it with other variables to identify trends in aid flows and areas of interest, allowing them to make more informed decisions.
The platform also pushes for stronger cooperation between the different e-infrastructure stakeholder groups, hence providing decisive support in their efforts to develop globally connected and interoperable e-infrastructures.
The dissemination activities planned in this project have raised the visibility of e-infrastructure activity in the Sub-Saharan African region among wider audiences, especially the research communities of EU countries and international development aid agencies. By engaging key e-infrastructure stakeholders at the recipient-country level, the project triggers debates on the relative effectiveness of donors and creates the impetus to move to more effective coordination mechanisms.

For more information visit:

Monday, July 1, 2013

Asynchronous Parallel Computing Programming School in Bucaramanga, Colombia

With more than 70 students from different South American countries and the support of the Barcelona Supercomputing Center (BSC), Spain, a school on asynchronous parallel programming with MPI/OmpSs for hybrid architectures is under way in Bucaramanga, Colombia. The school is organized by the High Performance and Scientific Computing Laboratory of the Universidad Industrial de Santander (SC3 UIS) in Bucaramanga and continues until Friday 5 July.

The school seeks to spread specific programming competences among the researchers, engineers, and students who interact with the Latin American and Caribbean Advanced Computing Service (SCALAC, from its Spanish/Portuguese acronym), particularly on hybrid architectures such as GUANE-1, the main HPC platform of SC3 UIS.
Several applications to be tested at the school relate to particular uses in science and engineering, for example weather modelling, bioinformatics and computational chemistry, astrophysics, condensed matter, energy, and seismics. SCALAC brings together the grid and advanced computing experience gained from projects developed over the last 10 years in Latin America.
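The asynchronous, task-based style the school teaches (declare units of work plus their data dependencies, and let the runtime overlap data movement with computation) can be loosely illustrated even in plain Python with a thread pool. The names and toy kernels below are hypothetical, not taken from the school's material:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(block: int) -> list:
    """Hypothetical stand-in for a data transfer (e.g. an MPI receive)."""
    return list(range(block * 10, block * 10 + 10))

def compute(data: list) -> int:
    """Hypothetical stand-in for the numerical kernel run on each block."""
    return sum(x * x for x in data)

def run(blocks: int = 8) -> int:
    with ThreadPoolExecutor(max_workers=4) as pool:
        # Submit all transfers up front; as each block's data arrives,
        # its kernel is submitted, so transfers and computation overlap
        # rather than waiting at a single global barrier.
        transfers = [pool.submit(fetch, b) for b in range(blocks)]
        results = [pool.submit(compute, t.result()) for t in transfers]
        return sum(r.result() for r in results)

print(run())
```

Broadly speaking, OmpSs expresses the same idea with task pragmas and in/out dependency clauses, letting the runtime build the dependency graph and schedule tasks across CPU cores and accelerators.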

More information: 

Friday, June 28, 2013

iSGTW teams up with NUANCE to increase coverage of Africa

iSGTW is extremely pleased to announce that it has signed a memorandum of understanding with NUANCE, allowing the limited sharing of some content between the two publications.

NUANCE stands for ‘The Newsletter of the UbuntuNet Alliance: Networks, Collaboration, Education’ and is a publication we at iSGTW hold in high regard for its excellent coverage of national research and education networks (NRENs) in Africa.

At iSGTW, we hope that this exciting new partnership will allow us to increase our coverage of this region, where many exciting developments in the world of e-infrastructures are currently taking place.

You can read the latest edition of NUANCE on the UbuntuNet Alliance website.

Wednesday, June 19, 2013

"Moore's Law is alive and well" — but is that enough?

On Monday, with the announcement of the new Top 500 list of the world's fastest supercomputers, we wrote briefly about the challenges computer scientists across the globe face in achieving exascale supercomputers by the end of the decade. To put the scale of this challenge into perspective, China's Milky Way 2 supercomputer, the fastest in the world today by a significant margin, is capable of reaching 34 petaFLOPS. Plus, there's the small matter of energy efficiency still to tackle if exascale supercomputers are going to become a realistic proposition.

Yesterday evening, Stephen S. Pawlowski of Intel gave a keynote speech at ISC'13 entitled 'Moore's Law 2020'. "People are always saying that Moore's Law is coming to an end, but transistor dimensions will continue to scale two times every two years and improve performance, reduce power and reduce cost per transistor," he says. "Moore's Law is alive and well."

"But getting to Exascale by 2020 requires a performance improvement of two times every year," Pawlowski explains. "Key innovations were needed to keep us on track in the past: many core, wide vectors, low power cores, etc."

"Going forward, scaling will be as much about material and structure innovation as dimension scaling." He cites potential future technologies, such as graphene, 3D chip stacking, nanowires, and photonics, as ways of achieving this.

Pawlowski argues for less focus on achieving a good score on the Top 500 list by optimising performance for the Linpack benchmark. Instead, he says, there needs to be more focus on creating machines suited to running scientific applications. "Moore's Law continues, but the formula for success is changing," concludes Pawlowski.
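The scale of the challenge Pawlowski describes is easy to sanity-check. Taking Milky Way 2's roughly 34 petaFLOPS as the 2013 baseline and 1,000 petaFLOPS as exascale (both figures as quoted in these posts), a few lines of back-of-envelope Python show why doubling every year, rather than every two years, is needed to get there by 2020:

```python
import math

baseline_pflops = 34      # Milky Way 2's Linpack performance, 2013
exascale_pflops = 1000    # 1 exaFLOPS = 1000 petaFLOPS

gap = exascale_pflops / baseline_pflops
print(f"Exascale is about {gap:.0f}x the fastest 2013 machine")

# Doubling performance every year (Pawlowski's requirement):
years_at_2x_per_year = math.log2(gap)
print(f"At 2x per year: ~{years_at_2x_per_year:.1f} years, i.e. before 2020")

# Doubling every two years (the classic Moore's-law cadence):
years_at_2x_per_2yr = 2 * math.log2(gap)
print(f"At 2x every two years: ~{years_at_2x_per_2yr:.1f} years, i.e. past 2020")
```

The roughly 29-fold gap closes in about five years of annual doubling, but takes nearly a decade at the traditional two-year cadence, which is exactly why Pawlowski argues that dimension scaling alone will not be enough.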

Monday, June 17, 2013

Top of the FLOPS at ISC’13

This week, almost 2,500 experts from industry, research, and academia have gathered in the German city of Leipzig for International Supercomputing Conference ’13 (ISC’13). The event played host today to the announcement of the new TOP500 list of the fastest supercomputers in the world. Milky Way 2 (also known as Tianhe-2), located at the National University of Defense Technology (NUDT) in Changsha, China, was announced as the new winner. “The Milky Way 2 project lasted three years and required the work of more than 400 team members,” says Kai Lu, vice dean of the School of Computer Science at NUDT. Boasting over 3 million cores and a performance of around 34 petaFLOPS on the Linpack benchmark, Milky Way 2 is nearly twice as fast as the previous winning supercomputer, Titan, at Oak Ridge National Laboratory, US. Titan has now slipped to the number two spot on the list, with another US-based supercomputer, Sequoia, located at Lawrence Livermore National Laboratory, completing the top three. JUQUEEN at the Jülich Supercomputing Centre in Germany was ranked as the fastest machine in Europe.

“Our projections still point towards reaching exascale systems by around 2019,” says Erich Strohmaier of the US Department of Energy’s Lawrence Berkeley National Laboratory, who gave an overview of the highlights of the new Top 500 list. Strohmaier, however, warns that increasing the power efficiency of supercomputing systems will continue to be a major challenge over the coming years: “If we don’t start to have some new ideas about how to build supercomputers, we will truly be in trouble by the end of the decade.”

“If you actually look at what people want to do, an exaflop is still not enough,” says Bill Dally of NVIDIA and Stanford University, California, US. He capped off this morning’s programme with a keynote speech on the future challenges of large-scale computing. “The appetite for performance is insatiable,” he says, citing work in a number of research fields as evidence that performance is currently still the limiting factor in terms of the exciting science which can potentially be done. “If we provide increased performance, people will always find interesting things to do with it.”

Latin American High Performance and Grid Computing Community calls for contributions to CLCAR 2013 in San José, Costa Rica

The Latin American Conference on High Performance Computing (CLCAR, from its Spanish acronym) will be held this year in San José, Costa Rica. Since 2007, CLCAR has been an event for students, scientists, and researchers in the areas of high-performance computing, high-throughput computing, parallel and distributed systems, e-science, and applications, in a global context but with a special focus on Latin American proposals. GridCast is a media partner of this Latin American activity.

The program and scientific committees are formed of experts and researchers from different countries and related domains, who will carry out the evaluation of the proposals. CLCAR 2013 will be held in San José, Costa Rica, on August 26-30.

The official languages of CLCAR 2013 are English, Portuguese, and Spanish. Proposals may be presented in two main forms, oral presentations (full paper) and posters (extended abstract), with submissions due by Sunday, June 23.

This year, two activities are proposed within the conference: the first, bio-CLCAR, proposed by bioinformatics and biochemistry researchers; and the second, the CLCAR scientific visualization challenge.

Proposals (full papers and extended abstracts) may be submitted in English, Portuguese, or Spanish only. Papers written in Spanish or Portuguese should also include the title and abstract in English. Oral presentations may be given in any official CLCAR language, but slides must be in English, and selected posters from extended abstracts must also be shown in English.

For more information about CLCAR 2013, please visit the official site:

Praise for PRACE and the importance of building expertise in HPC

Yesterday, the PRACE Scientific Conference was held in Leipzig, Germany. It is one of several satellite events taking place alongside ISC'13, which gets underway in full today.

After a brief welcome address from Kenneth Ruud, chairman of the PRACE Scientific Steering Committee, Kostas Glinos, head of the European Commission's eInfrastructures unit, spoke about the vision for HPC in Horizon 2020.

"HPC has a fundamental role in driving innovation, leading to societal impact through better solutions for societal challenges and increased industrial competitiveness," says Glinos. "It's not just about exascale hardware and systems, but about the computer science needed to have a new generation of ICT."

"Only very few applications using HPC really take advantage of current petaFLOPS systems," he adds. "New computational methods and algorithms must be developed, and new applications must be reprogrammed in radically new ways." In addition, Glinos highlighted the importance of public procurement of commercial systems for developing the next generation of IT infrastructures, which you can read more about in the recent iSGTW article ‘Golden opportunities for e-infrastructures at the EGI Community Forum’.

Finally, he spoke about the conclusions of the recent EU council for competitiveness: "HPC is an important asset for the EU... and the council acknowledges the very good achievements of PRACE over the years." For Horizon 2020, Glinos says: "We want to build on PRACE's achievements to advance further integration and sustainability." He argues for the importance of an EU-level policy in HPC addressing the entire HPC ecosystem, saying that the sum of national efforts is not enough – "we need to exchange and share priorities."

The conclusions of the EU council for competitiveness were also highlighted by Sergi Girona, chair of the PRACE board of directors. "We have to work together because we want to support science and industry, the development of HPC in Europe, and the development and training of persons," he says.

During his talk, Girona also gave an overview of PRACE in numbers: with its 25 member countries, PRACE has a budget of €530m for 2010-2015, including €70m of funding from the European Union. Girona explains that PRACE has now awarded more than 5 billion computation hours since 2010 and is currently providing resources of nearly 15 petaFLOPS.

However, he emphasises that PRACE is about much more than simply providing access to HPC resources. "We don't just want to give access to computing resources; we want to support users at all stages – it is key to train people," he says. "We have created six training centres in Europe and have approved a curriculum with 71 PRACE advanced training centre courses for this year."

The importance of training was also highlighted by Glinos: "We need more expertise, so we intend to support a limited number of centres of excellence. Topics may relate to scientific or industrial domains, such as climate modelling or cancer research for example, or they may be 'horizontal', addressing wider challenges which exist in HPC. These centres of excellence need to be led by the needs of the users and the application owners."

Following Girona's talk, Wolfgang Eckhardt of the Technical University of Munich, Germany, gave a presentation on his research in the field of molecular dynamics. He and his colleagues have been selected as winners of the PRACE ISC Award for their paper entitled '591 TFLOPS Multi-Trillion Particles Simulation on SuperMUC'. The award ceremony is set to take place later today.

The remainder of the conference consisted of a series of exciting presentations on research conducted using PRACE resources, ranging from high-resolution global climate models to molecular simulation, and from astrophysics to better understanding the building blocks of matter. You can read more about these on the PRACE Scientific Conference website.

Be sure to check back later this week for further updates from ISC'13.

Tuesday, June 11, 2013

IT as a Utility in Emerging Economies

Mobile is critical for IT in emerging economies.
(CC-BY-NC-SA AdamCohn, Flickr)
The ITaaUN+[1] workshop on IT as an infrastructure in emerging economies attracted social activists-cum-academics, academics-cum-industrial consultants, linguists, digital humanists, and technology visionaries to the Association of Commonwealth Universities (ACU), where ACU Secretary General John Wood played host to the compact but vocal group. The agenda was to discuss the challenges and opportunities of IT, seen as a utility, in the majority world. John Wood is himself a veteran of e-infrastructures in the UK, having been Chief Executive of the Council for the Central Laboratory of the Research Councils, where he was responsible for RAL and Daresbury. Later, he held positions at Imperial, first as Principal of the Engineering Faculty and later as Senior International Advisor. He now sits on the boards of JISC and the British Library, and has advised numerous governmental and corporate organisations across the globe. But it is his experience at the ACU that gives him a unique perspective on the infrastructures already in place in Commonwealth countries that are also developing countries (assuming that the provision of computational infrastructure in higher-education institutions is a reasonable barometer of infrastructure elsewhere in a country, which to some degree it usually is).

Why the service/utility distinction though?

Jeremy Frey, a physical chemist at the University of Southampton and one of the minds behind the Chemistry 2.0 application CombeChem, explained that there is a natural progression as a technology becomes part of the fabric of our lives. This transition (Revelation > Innovation > Specialist Tool/Resource > Service > Utility) is one that the utilities forming the infrastructure of daily activity, such as electricity and telecommunications, have already completed. IT, and especially networked IT, sits somewhere between the last two stages (and there will continue to be debate about whether, and to what extent, such things should be run as services or as utilities, depending on the economic model in place). But, as the UN’s World Summit on the Information Society (WSIS) has suggested, internet access is beginning to be established by governments as a basic human right, and so it will increasingly be considered a utility rather than a service: something we need rather than just something we want. And the impact of this will be felt nowhere more keenly than in the emerging economies of the developing world.

In 1965 the centre of gravity of global wealth was located on the plains of La Mancha, in Spain. This was, however, the last time that this topographical El Dorado[2] lay in Europe. As the years have passed, the centre of international wealth has meandered out across the Mediterranean, and zig-zagged through northern Algeria and Tunisia. Having bounced off Malta and back towards Tunis, it is now poised to zoom eastward once more, skirting past northern Libya to settle in the middle East some time around the middle of this century (remaining oil reserves notwithstanding).
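The method behind such maps (described in footnote [2]) is essentially a weighted centroid: treat each nation's GDP as a point mass located at its capital and average the positions. A minimal sketch of that calculation, using invented GDP figures for a handful of European capitals purely for illustration, might look like this (the published maps use particular datasets and projections, so real results will differ):

```python
import math

# Invented, order-of-magnitude GDP weights (trillions of dollars) for a
# few European capitals -- illustration only, not the Kharas dataset.
capitals = [
    # (city, latitude, longitude, GDP weight)
    ("Madrid", 40.4, -3.7, 1.4),
    ("Paris",  48.9,  2.4, 2.9),
    ("Berlin", 52.5, 13.4, 4.2),
    ("Rome",   41.9, 12.5, 2.1),
    ("London", 51.5, -0.1, 3.1),
]

def centre_of_gravity(points):
    """GDP-weighted centroid on the sphere: turn each capital into a unit
    vector, average the vectors weighted by GDP, then project the result
    back onto the surface as a latitude/longitude pair."""
    x = y = z = total = 0.0
    for lat, lon, gdp in points:
        phi, lam = math.radians(lat), math.radians(lon)
        x += gdp * math.cos(phi) * math.cos(lam)
        y += gdp * math.cos(phi) * math.sin(lam)
        z += gdp * math.sin(phi)
        total += gdp
    x, y, z = x / total, y / total, z / total
    return (math.degrees(math.atan2(z, math.hypot(x, y))),
            math.degrees(math.atan2(y, x)))

lat, lon = centre_of_gravity([(la, lo, g) for _, la, lo, g in capitals])
print(f"Centre of gravity: {lat:.1f}N, {lon:.1f}E")
```

Working on unit vectors rather than averaging raw latitudes and longitudes avoids the usual pitfalls near the 180° meridian, which matters once the weights span continents rather than one region.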

The economic and social conditions that led to Europe being the dominant force in the world over the last few centuries can be attributed to a combination of factors: a great abundance of natural resources – particularly wood, coal and iron – suited to the manufacture of ever-more-complex tools; the concentration of historically competitive cultures within Europe itself (leading to wars, but also leading to a wealth of art, science and advanced technology); and the suitability for domestication and farming of indigenous fauna and flora. This last blessing, in combination with the large Eurasian landmass whose principal axis is East-West and therefore at the same latitude, allowed successful farming methods to spread very easily over a large area[3], dramatically increasing population size in Europe and therefore the means to successfully colonise new areas of the globe.

With war and recession, and with growing social justice (helped in part by the ease with which injustices can be brought to the world’s attention through social media), the old colonialism has crumbled and many of its worst crimes are behind us. However, some corporations exploit the cheap labour and (often) rare natural materials of developing nations in much the same way as colonial powers once did[4]. Sometimes this is for food: growing soya for cattle feed, for example, but potentially at the expense of rainforest. Sometimes it is for palm oil, a food and a fuel, but at the expense of rainforest and peat bogs, potentially leading to local and global environmental problems. And in many African nations, landscapes are plundered for rare metals destined for today’s mobile devices.

Today’s mobile devices are not, however, solely owned by rich people in rich countries. Competition between mobile device manufacturers, and an increased acceptance of their ubiquity in our everyday lives, has led to dramatic price drops in the developed world. Together with greater prosperity in the developing world, such devices are now becoming more affordable for a growing global middle class (mainly urban, and often involved in the knowledge economy in some way). People in these countries are leapfrogging the PC/browser paradigm, and for many the first personal device they own that can access the web and email is a webphone or smartphone.

Although there is still tremendous disparity between rich and poor in the world, developing countries are experiencing this growth of a middle class (defined as having $10-$100 of disposable income per capita per day in PPP terms), exactly at a time when those people are being empowered with devices that help them to access information and to make decisions. The pace of change is rapid. New technologies are initiating and catalysing social change (in the manner of the Arab spring), and opening up new possibilities in the realms of education and research, communications, and e-governance. In many ways, ever-cheaper technologies are helping to correct the economic unevenness that led Europe (and North America) to become unduly dominant over the last few centuries. Mobile web as a globally democratising tool, if you will.

For that reason, the potential for IT to bring real change to emerging economies has attracted visionaries and projects from the developed world, eager to demonstrate the power of technology to enact measurable social change quickly. E-governance, for example, is an area where both the potential for positive change and the motivation to pursue it, especially in newly democratic countries, are very strong. But at the ITaaUN+ workshop, individuals with direct experience of participating in such programmes suggested that this approach can seem like neo-colonialism. Why implement (or, as it might seem, impose) e-voting tools, for example, when similar innovations haven’t always met with great success in long-established democracies? It could smack of using developing countries as test cases for technologies that haven’t been tried and tested in the developed world.

There are, however, good examples of IT making positive changes to people’s everyday lives. Take microcredit, for instance, which allows people to send small amounts of currency using their mobile phones. By using SMS as the underlying technology, microcredit builds on a simple, tried-and-tested service that, though perhaps lacking the bells and whistles of wifi, is more reliable in sub-Saharan Africa, where internet connections cannot always be guaranteed. Distributed systems, whether for power generation through smaller, local solar and wind installations, or for computer networks themselves, might offer solutions that bring specific benefits to emerging economies, where geographical isolation can be a barrier to the wider adoption of some technologies.
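The appeal of SMS here is that the whole transaction protocol can fit in a short text message. As a purely hypothetical sketch (the command format, amounts, and phone numbers below are invented; real services define their own protocols and wrap them in PIN authentication, fees, and confirmation flows), the receiving side of such a system might parse transfer requests like this:

```python
import re

# Hypothetical command format for an SMS money-transfer service:
#   "SEND <amount> TO <international phone number>"
# Everything travels over plain GSM text messages, so no data
# connection is required on either end.
TRANSFER = re.compile(r"^SEND\s+(\d+)\s+TO\s+(\+\d{9,15})$", re.IGNORECASE)

def parse_transfer(sms_body):
    """Parse 'SEND <amount> TO <number>' and return (amount, recipient),
    or None if the message is not a well-formed transfer request."""
    match = TRANSFER.match(sms_body.strip())
    if not match:
        return None
    return int(match.group(1)), match.group(2)

print(parse_transfer("SEND 500 TO +254700000001"))
```

The design point is reliability through simplicity: a 160-character message over ordinary GSM signalling keeps working in places where mobile data coverage is patchy or absent.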

The discussion next led on to 3D printing. The 3D4D challenge, for example, looks at the benefits this rapidly developing technology could bring to emerging economies. From simple and complex devices (including medical tools) to its role as an enabler of innovation, 3D printing could have a huge impact on emerging economies, although so far much of the interest has come from the developed world. The ability to make new parts, and eventually more complex components, could mean that emerging economies, which already embrace a more sustainable approach to technology adoption through frugal innovation, manage to avoid some of the wasteful consumer habits prevalent in the West, such as buying every new model. As the resources needed to make new mobile devices are depleted, the developed world could learn a thing or two from the emerging economies.

The link again for ITaaUN+ is:

...they're running workshops and events all the time, so it's worth taking a look.

[1] Information Technology (ok, you probably knew that bit) as a Utility Network-plus
[2] There are several maps tracking the movement of the centre of economic gravity over a number of time-scales. I’m using one by Homi Kharas of the OECD Development Centre: working paper #285, “The Emerging Middle Class in Developing Countries”. It simplifies the calculation by assuming that the centre of each nation’s GDP is located at its capital city, but this is a fairly reasonable assumption (better, in my opinion, than assuming GDP is co-located with the geographic centre of each country). There are, of course, countries where the capital is not the wealthiest city, but in many cases the two are close enough geographically (e.g. New York and Washington; Toronto and Ottawa) not to make too much difference. Sydney and Canberra are ‘close enough’ at their distance from the centre; Milan-Rome and Köln-Düsseldorf-Berlin present more of a problem because of their proximity to the centre, but this is still the best solution I have seen.
[3] This argument is from Jared Diamond’s Guns, Germs and Steel. Crops are generally sensitive to north-south changes in latitude, but will grow happily when spread east-west along the same latitude.
[4] What corporations cannot do, however, is maintain control over entire nations; so, provided governments in developing countries offer some degree of freedom to their citizens, individuals and groups are still able to act in an entrepreneurial manner.