Advanced Grid Computing Technologies in ATLAS Data Management

A. Vaniachine

Argonne National Laboratory, 9700 S Cass Ave, Argonne, IL 60439, USA

In 2006, ATLAS, a general-purpose experiment at the Large Hadron Collider (LHC) at CERN, will start cosmic-run data taking at the full DAQ rate. After the LHC turns on, data taking at the full rate will produce unprecedented experimental data volumes of 10 PB/year.

To meet these unprecedented data management challenges, ATLAS is adopting the emerging Grid computing technologies on a planetary scale. In a recent world-wide collaborative effort spanning 56 prototype Tier centers in 21 countries on four continents, ATLAS produced more than 60 TB of simulated and reconstructed data for High Level Trigger studies. This effort provided a testbed for the integration and testing of advanced Grid computing components in a production environment.

Several novel Grid technologies were used in ATLAS data production and data management for the first time. My presentation will review the new Grid technologies introduced in the HEP production environment: the Chimera Virtual Data System, which automates data derivation; Virtual Data Cookbook services, which manage templated production recipes; the delivery of virtual database services for reconstruction on Grid clusters behind closed firewalls; and efficient Grid certificate authorization technologies for production database access control.