![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhx2TqAWo4-UU3t1wLRcMtDEpWG08YV1X6vQBWXIuizLltNCwUqqHH7-Yz_apHQTDRDscThZjSgDb9OSBnd9ykgBgNC8zfMC5gCPsLXy2cWtG871r1WgTFUXnMB0basZs0ud3OL_qHYUCAo/s320/Slide1.png)
The CMS physics group at Notre Dame has created Lobster, a data analysis system that runs on O(10K) cores to process data produced by the CMS experiment at the LHC. Lobster uses Work Queue to dispatch tasks to thousands of machines, Parrot with CVMFS to deliver the complex software stack from CERN, XRootD to deliver the LHC data, and Chirp and Hadoop to manage the output data.
By combining these technologies, Lobster can harness arbitrary machines and bring the CMS computing environment along wherever it runs. At peak, Lobster at ND delivers capacity equal to that of a dedicated CMS Tier-2 facility!
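At its core, Lobster follows the classic master/worker pattern: a master enumerates independent work units (blocks of CMS data) and hands each one to whichever worker is free, collecting results as they complete. A minimal sketch of that pattern, using Python's standard `concurrent.futures` as a stand-in for Work Queue (the `analyze` payload here is a toy placeholder, not the real CMS task):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def analyze(block_id):
    # Stand-in for one CMS analysis task; the real Lobster task runs
    # the CERN software stack (delivered via Parrot/CVMFS) over an
    # input block fetched with XRootD.
    return block_id, sum(range(block_id))

def run_master(num_blocks=8, num_workers=4):
    # The master submits every independent block, then harvests
    # results in whatever order workers finish them.
    results = {}
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        futures = [pool.submit(analyze, b) for b in range(1, num_blocks + 1)]
        for fut in as_completed(futures):
            block_id, value = fut.result()
            results[block_id] = value
    return results

if __name__ == "__main__":
    print(run_master())
```

Because the blocks are independent, the same loop scales from a handful of local workers to the thousands of Work Queue workers Lobster drives in production.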