Friday, May 1, 2015

CMS Analysis on 10K Cores with Lobster

The CMS physics group at Notre Dame has created Lobster, a data analysis system that runs on O(10K) cores to process data produced by the CMS experiment at the LHC. Lobster uses Work Queue to dispatch tasks to thousands of machines, Parrot with CVMFS to deliver the complex software stack from CERN, XRootD to deliver the LHC data, and Chirp and Hadoop to manage the output data. Together, these technologies allow Lobster to harness arbitrary machines and bring the CMS computing environment along wherever it runs. At peak, Lobster at ND delivers capacity equal to that of a dedicated CMS Tier-2 facility! (read more here)
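For readers unfamiliar with Work Queue, the sketch below shows the general manager/worker pattern an application uses to dispatch tasks through the cctools Python bindings. The port, command, and file names are purely illustrative, not taken from Lobster's actual implementation.

import work_queue as wq

# Start a manager on an illustrative port; workers connect to it to receive tasks.
q = wq.WorkQueue(port=9123)

# Describe one task: the command and files below are placeholders.
t = wq.Task("./analyze.sh input.root output.root")
t.specify_input_file("analyze.sh", cache=True)
t.specify_input_file("input.root", cache=False)
t.specify_output_file("output.root", cache=False)

q.submit(t)

# Collect results as workers finish.
while not q.empty():
    t = q.wait(5)
    if t:
        print("task %d finished with status %d" % (t.id, t.return_status))

In Lobster's case, the workers run on opportunistic campus resources, and each task executes under Parrot so that the CMS software stack is fetched on demand from CVMFS rather than preinstalled on the machine.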
