Many people around the world have heard of the Large Hadron Collider, located at CERN near Geneva. Although the existence of this massive, scientifically significant megastructure is widely known, very few have considered one question:
If the Large Hadron Collider collects enormous amounts of data for better understanding physics, what kind of supercomputer manages all that data?
The answer to that question resides in grid computing. A grid is not a supercomputer; it is a super virtual computer. In a supercomputer, resources such as processors are connected via a high-speed system bus, whereas in a super virtual computer they are connected via high-speed networks.
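The key property that makes a networked grid possible is that jobs are self-contained: each one can run on a distant machine with no shared memory or bus. As a minimal sketch of that idea (not WLCG software; the pool of local workers merely stands in for independent, network-connected sites):

```python
from concurrent.futures import ThreadPoolExecutor

def analyse_chunk(chunk):
    # Toy "analysis" of one self-contained data chunk: sum of squares.
    return sum(x * x for x in chunk)

def run_on_grid(chunks):
    # Each task needs nothing from the others, so it could just as well
    # run on a remote site reached over a network -- the grid principle.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(analyse_chunk, chunks))

if __name__ == "__main__":
    data = [list(range(i, i + 100)) for i in range(0, 1000, 100)]
    print(sum(run_on_grid(data)))
```

A real grid middleware additionally handles job scheduling, data transfer and authentication across sites; the sketch keeps only the "independent tasks, no shared bus" core of the idea.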
What is the Worldwide LHC Computing Grid?
The Worldwide LHC Computing Grid (WLCG) is a distributed computing infrastructure that provides the production and analysis environments for the LHC experiments.
It is managed and operated by a worldwide collaboration between the experiments and the participating computer centres. The resources are distributed across the world for funding and sociological reasons.
WLCG is therefore a distributed, or grid-based, infrastructure - the most effective solution for meeting the data analysis challenge on this unprecedented scale.
Currently, the WLCG is made up of more than 140 computing centres in 35 countries that process, analyse and store data produced by the LHC, making it equally available to all partners, regardless of their physical location.
Tiers
The WLCG is composed of four levels, or “Tiers”, which are made up of the computer centres. The tiers are called Tier 0, Tier 1, Tier 2 and Tier 3. Between them, these tier sites process, store and analyse all the LHC data.
Components
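The tiers form a tree: data recorded at the top fans out to lower-tier centres, which in turn serve the sites below them. A hedged sketch of that hierarchy (the site names are invented for illustration, not real WLCG centres):

```python
# Illustrative model of a tiered distribution tree, not WLCG software.
# Tier 0 fans out to Tier 1 centres, which fan out to Tier 2 sites.
TIERS = {
    "Tier 0": ["Tier 1 site A", "Tier 1 site B"],
    "Tier 1 site A": ["Tier 2 site A1", "Tier 2 site A2"],
    "Tier 1 site B": ["Tier 2 site B1"],
}

def distribution_paths(site="Tier 0", path=()):
    """Yield every path a dataset copy could take down the tiers."""
    path = path + (site,)
    children = TIERS.get(site, [])
    if not children:          # a leaf site: one complete path
        yield path
    for child in children:    # otherwise recurse into each child
        yield from distribution_paths(child, path)

if __name__ == "__main__":
    for p in distribution_paths():
        print(" -> ".join(p))
```

Walking the tree this way makes the division of labour visible: every lower-tier site receives its data through exactly one chain of higher tiers.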
Massive multi-petabyte storage systems and computing clusters with thousands of nodes connected by high-speed networks are the building blocks of the WLCG centres.
Data processing
Analysing the volume of data produced at the LHC is an immense task. Two-stage processing using dedicated algorithms, which are in continuous development, reduces data 'noise' and helps focus on the most important data, which could lead to new physics discoveries.
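The two-stage idea can be sketched as a cheap, coarse filter followed by a slower, finer selection on the survivors. The event fields and thresholds below are invented for illustration; they are not the experiments' actual algorithms:

```python
# Hedged sketch of two-stage noise reduction (invented cuts, not the
# LHC experiments' real selection code).
def stage_one(events, energy_cut=50.0):
    # Fast, coarse filter: discard obvious noise with a simple cut.
    return [e for e in events if e["energy"] > energy_cut]

def stage_two(events, quality_cut=0.9):
    # Slower, finer selection applied only to the surviving events.
    return [e for e in events if e["quality"] > quality_cut]

def select(events):
    return stage_two(stage_one(events))

if __name__ == "__main__":
    events = [
        {"energy": 120.0, "quality": 0.95},  # passes both stages
        {"energy": 30.0, "quality": 0.99},   # rejected at stage one
        {"energy": 80.0, "quality": 0.50},   # rejected at stage two
    ]
    print(len(select(events)))
```

The point of staging is economy: the cheap first cut shrinks the dataset so the expensive second pass only runs on candidates worth the effort.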
Sadly, there is not much information about the collider's super virtual computer, since details are kept confidential so that competing organizations cannot reproduce such a scientific marvel. Recently, however, the collider's committee decided to ask for public help, inviting people to connect their PCs to the grid for extra resources. You can find the article here.