r17 - 10 Jun 2009 - 18:25:31 - NurcanOzturk



From FacilityWGAPMinutesMar10 (cf: FacilityWGAP in the IntegrationProgram), Nurcan has proposed the following suite of job types:

| *Job type* | *Athena release* | *Input dataset* | *Notes* |
| SusyValidation | 14.5.0 | mc08 AOD | we will start with this as daily tests; tested |
| D3PD making with TopPhysTools | 14.5.0 | mc08 AOD | will be used at the BNL Jamboree in March; tested |
| TAG selection | 14.5.0 | FDR2-c TAG | TAGs have not yet been made for the reprocessed dataset; tested |
| DB access job | | mc08 AOD | from the tauDPDMaker package; tested |
| AthenaRootAccess | 14.5.0 | mc08 AOD | can also read DPD made from AODtoDPD making; to be set up |
| Data reprocessing | | data08_cos*, data08_1beam* DPD | also uses the dbReleases option; to be set up |

The following sections provide additional instructions/details for each of the job types.

Tier 2 stress tests from the ANL ASC

Instructions for setting up jobs

SUSY validation job

Instructions to set up and run a stress-test job from the BNL acas machines using the SusyValidation package, Athena release 14.5.0, and mc08 datasets:

1. Create directories:

mkdir StressTest
cd StressTest
mkdir 14.5.0
mkdir cmthome
mkdir mypanda
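The same layout can also be created in one go with mkdir -p (equivalent to the separate mkdir calls above):

```shell
# Sketch: create the StressTest working directories in a single command.
mkdir -p StressTest/14.5.0 StressTest/cmthome StressTest/mypanda
ls StressTest
```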

2. Copy the requirements file from my area and set up CMT:

cd cmthome
cp /usatlas/u/nurcan/StressTest/cmthome/requirements .
source /afs/cern.ch/sw/contrib/CMT/v1r20p20080222/mgr/setup.sh
cmt config

4. Install panda-client (always use the latest version of the client):

cd ../mypanda
wget https://twiki.cern.ch/twiki/pub/Atlas/PandaTools/panda-client-0.1.37.tar.gz
tar xvfz panda-client-0.1.37.tar.gz
cd panda-client-0.1.37
python setup.py install --prefix=~/StressTest/mypanda
cd ../..

5. Setup athena, panda:

source cmthome/setup.sh -tag=14.5.0,setup,32
source mypanda/etc/panda/panda_setup.sh
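As a quick sanity check (a hypothetical helper, not part of the original recipe), you can confirm that the two setup scripts put pathena on your PATH:

```shell
# Sketch: report whether the panda client setup took effect in this session.
if command -v pathena >/dev/null 2>&1; then
    echo "pathena found"
else
    echo "pathena not found - redo step 5"
fi
```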

6. Create WorkArea:

cd 14.5.0
mkdir -p WorkArea/cmt WorkArea/run
cd WorkArea/cmt
cmt bro cmt config
cd ../run

7. Get jobOption file for SusyValidation package from release:

get_files -jo SUSYValidation_jobOptions.py

(if you would like to skip step 8, copy the AOD file from my area instead)

8. IN A SECOND WINDOW, check the input dataset availability:

source /afs/usatlas.bnl.gov/i386_redhat72/opt/dcache/dcache_root_client_config.sh
source /afs/usatlas.bnl.gov/lcg/current/etc/profile.d/grid_env.sh
source /afs/usatlas.bnl.gov/Grid/Don-Quijote/DQ2Clients/setup.sh
voms-proxy-init -voms atlas
dq2-list-dataset-replicas-container mc08.105403.SU3_jimmy_susy.recon.AOD.e352_s462_r541/

Copy a file locally to try athena first:

cd 14.5.0/WorkArea/run
dq2-get -f AOD.026552._00001.pool.root.1 mc08.105403.SU3_jimmy_susy.recon.AOD.e352_s462_r541/

Note from Shuwei Ye: it is inconvenient to keep two separate sessions to run athena and the grid/dq2 tools, due to the Python conflict between the athena and grid environments. The following script helps resolve this conflict and lets you run the dq2 tools and athena in the SAME session. After setting up the grid environment, run: source /opt/usatlas/bin/removePythonFromGrid.sh

9. Go back to the first window and try a job locally first, from the 14.5.0/WorkArea/run directory.

- Make an InputCollections.py file with this line in it:
ServiceMgr.EventSelector.InputCollections = ["AOD.026552._00001.pool.root.1"]
- Run athena: athena SUSYValidation_jobOptions.py InputCollections.py
- Check that the output file, SUSYValidation.root, is created; you can inspect it in ROOT if you like.

10. Run pathena on one AOD file in the container dataset for testing purposes:

pathena --inDS mc08.105403.SU3_jimmy_susy.recon.AOD.e352_s462_r541/ --outDS user09.YourGridName.testSusyVal.`uuidgen` --nfiles 1 --burstSubmit ANALY_SWT2_CPB,ANALY_MWT2,ANALY_AGLT2,ANALY_SLAC,ANALY_NET2 SUSYValidation_jobOptions.py

11. Next time you log in just do step 5.

A stress test is run on the following 4 datasets for this job:

mc08.105401.SU1_jimmy_susy.recon.AOD.e352_s462_r541/  (40 files - makes 3 jobs)
mc08.105403.SU3_jimmy_susy.recon.AOD.e352_s462_r541/  (40 files - makes 3 jobs)
mc08.106400.SU4_jimmy_susy.recon.AOD.e352_s462_r604/  (479 files - makes 25 jobs)
mc08.105404.SU6_jimmy_susy.recon.AOD.e352_s462_r541/  (39 files - makes 3 jobs)
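Submitting to all four containers can be scripted; the sketch below only prints the pathena command for each dataset (drop the echo to actually submit — the loop scaffolding is an assumption, not part of the original instructions):

```shell
# Sketch: print one pathena invocation per stress-test container.
for DS in \
    mc08.105401.SU1_jimmy_susy.recon.AOD.e352_s462_r541/ \
    mc08.105403.SU3_jimmy_susy.recon.AOD.e352_s462_r541/ \
    mc08.106400.SU4_jimmy_susy.recon.AOD.e352_s462_r604/ \
    mc08.105404.SU6_jimmy_susy.recon.AOD.e352_s462_r541/
do
    echo pathena --inDS "$DS" \
        --outDS "user09.YourGridName.stressSusyVal.$(uuidgen)" \
        SUSYValidation_jobOptions.py
done
```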

D3PD making job with TopPhysTools

Reference : https://twiki.cern.ch/twiki/bin/view/Atlas/JamboreeMarch2009

1. If you have already set up the SUSYValidation job as above, just do step 5; otherwise repeat the steps up to step 7. Always use the latest version of the panda client; panda-client-0.1.41 was needed to be able to use the --processingType option.

2. Go to the 14.5.0 directory and check out the necessary packages:

cd 14.5.0
# EventView dependency
cmt co -r EventViewBuilderUtils-14-05-00 PhysicsAnalysis/EventViewBuilder/EventViewBuilderUtils
cmt co -r EventViewInserters-14-05-01 PhysicsAnalysis/EventViewBuilder/EventViewInserters
cmt co -r EventViewTrigger-14-05-05 PhysicsAnalysis/EventViewBuilder/EventViewTrigger
cmt co -r EventViewUserData-14-05-01 PhysicsAnalysis/EventViewBuilder/EventViewUserData
cmt co -r EventViewConfiguration-14-05-00 PhysicsAnalysis/EventViewBuilder/EventViewConfiguration

# D23PD maker utilities
cmt co -r EventViewD23PDUtils-00-00-02 PhysicsAnalysis/D23PDMakerUtils/EventViewD23PDUtils

# egamma tools
cmt co -r egammaUtils-00-04-09 Reconstruction/egamma/egammaUtils
cmt co -r egammaInterfaces-00-00-16 Reconstruction/egamma/egammaInterfaces

#Check out the following for the latest LumiCalc
cmt co LumiBlockComps LumiBlock/LumiBlockComps

3. Compile the packages in one go:

cd WorkArea/cmt
cmt bro cmt config
cmt bro gmake -j2

4. Go to your run area and get the jobOption file:

cd ../run
get_files -jo EventViewD23PDUtils/D3PDMaker_topOptions.py
5. Try it locally first before submitting to pathena; edit your jobOption file to specify an input file:

if not "InputCollections"   in dir():      InputCollections=["/direct/usatlas+scratch/akira/user.RichardHawkings.0108173.topmix_Muon.AOD.v2/user.RichardHawkings.0108173.topmix_Muon.AOD.v2._00151.pool.root"]

6. Run athena:

athena D3PDMaker_topOptions.py

7. Check that the output file, D3PD.aan.root, is created.
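A hypothetical one-liner for that check (the file name is taken from step 7 above; this helper is not part of the original recipe):

```shell
# Sketch: fail early if the expected D3PD output file is missing or empty.
if [ -s D3PD.aan.root ]; then
    echo "D3PD output OK"
else
    echo "D3PD output missing - check the athena log"
fi
```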

8. I tried to run on the following input datasets:

- user.RichardHawkings.0108173.topmix_Egamma.AOD.v2 (158 files, not available at NET2)

- mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r579_tid028664 (1005 files, available at all sites; the container dataset, however, has 2 missing tid datasets at NET2)

9. Submit to pathena:

pathena --inDS mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r579_tid028664 --outDS user09.YourGridName.testD3PDMaker.`uuidgen` --nfiles 100 --site ANALY_SLAC D3PDMaker_topOptions.py

10. Notes: A sub-job with 20 input files from the T1 dataset takes about 12 hours. A sub-job with 6 files from the topmix sample takes about 5 hours. D3PDMaker_topOptions.py provides an extensive example including trigger information, cluster information, track information and more, as noted in the wiki page of this job linked from the table.

A stress test is run on the following datasets for this job:

mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r579_tid028664 (1005 files - makes 52 jobs)
mc08.105200.T1_McAtNlo_Jimmy.recon.AOD.e357_s462_r579/ (2109 files - makes 107 jobs)

TAG selection job

Reference: http://atlaswww.hep.anl.gov/twiki/bin/view/ASC/TAGJobANLASC1, see also https://twiki.cern.ch/twiki/bin/view/Atlas/EventTagTutorials

1. Set up athena and the panda client as in step 5 of the SUSY validation job.

2. Go to 14.5.0/WorkArea/run directory and get the jobOption file:

get_files AnalysisSkeleton_topOptions.py

3. Get a TAG file:

dq2-get -f fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27._0001.1 fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27

4. Edit AnalysisSkeleton_topOptions.py file to add:

ServiceMgr.EventSelector.InputCollections = [ "fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27._0001.1"]
ServiceMgr.EventSelector.RefName= "StreamAOD"
ServiceMgr.EventSelector.CollectionType = "ExplicitROOT"
ServiceMgr.EventSelector.Query = "NLooseElectron>1"
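Before running athena it may help to confirm that the TAG file from step 3 actually landed in the run directory (a hypothetical check, not part of the original recipe):

```shell
# Sketch: verify the TAG file referenced in InputCollections exists locally.
TAGFILE="fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27._0001.1"
if [ -f "$TAGFILE" ]; then
    echo "TAG file present"
else
    echo "TAG file missing - rerun the dq2-get from step 3"
fi
```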

5. Insert the associated AODs into PoolFileCatalog.xml:

dq2-ls -L BNL-OSG2_DATADISK -P -R srm://dcsrm.usatlas.bnl.gov^dcap:// fdr08_run2.0052300.physics_Egamma.merge.AOD.o3_f48_m27

6. Run athena:

athena AnalysisSkeleton_topOptions.py

7. If successful, run with pathena:

pathena --inDS fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27 --outDS user09.YourGridName.testTAG.`uuidgen` --site ANALY_SLAC --nfiles 1 AnalysisSkeleton_topOptions.py

A stress test is run on the following datasets for this job:

fdr08_run2.0052300.physics_Egamma.merge.TAG.o3_f48_m27 (2 files - makes 2 jobs (1 build, 1 runAthena))
fdr08_run2.0052300.physics_Bphys.merge.TAG.o3_f48_m27    (1 file - makes 2 jobs (1 build, 1 runAthena))
fdr08_run2.0052300.physics_Jet.merge.TAG.o3_f48_m27         (12 files - makes 2 jobs (1 build, 1 runAthena))
fdr08_run2.0052300.physics_Minbias.merge.TAG.o3_f48_m27   (4 files - makes 2 jobs (1 build, 1 runAthena))
fdr08_run2.0052300.physics_Muon.merge.TAG.o3_f48_m27      (2 files - makes 2 jobs (1 build, 1 runAthena))

DB access job

Reference: a job from Katharina Fiekas, referred by Sasha Vanyashin. This job uses AtlasProduction- and the TauDPDMaker package. From Sasha: this job makes a dozen database connections to access various conditions data from the Tier-1 Oracle server. The geometry is read from the SQLite replica, since we do not yet replicate the Geometry DB to the Tier-1 Oracle servers. However, use of the IOV service is limited, since this is an analysis job on simulated data.

1. Setup release:

cd StressTest
cp /usatlas/u/nurcan/StressTest/setup.sh .
source setup.sh

Here is the setup.sh file:

source ${KitBasePath}/cmtsite/setup.sh -tag=,AtlasProduction,runtime
source ${KitBasePath}/AtlasProduction/
export CVSROOT=/afs/usatlas/software/cvs
source ../mypanda/etc/panda/panda_setup.sh

2. Check out the necessary packages:

cmt co -r EventViewUserData-14-05-00-16 PhysicsAnalysis/EventViewBuilder/EventViewUserData
cmt co -r TauDPDMaker-00-04-19 PhysicsAnalysis/TauID/TauDPDMaker

3. Compile:

cd WorkArea/cmt
cmt bro cmt config
cmt bro gmake -j2

4. Go to your run area and get the jobOption file:

cd ../run
get_files -jo aodtodpd_ganga_example.py

5. Run athena: first make a file called InputCollections.py with the following line in it (you can use the same AOD file as in the previous jobs above):

InputCollections = ["/usatlas/u/nurcan/StressTest/14.5.0/WorkArea/run/SU1file/AOD.026553._00023.pool.root.1"]

and run athena:

athena aodtodpd_ganga_example.py InputCollections.py

6. If the athena run is successful (output TertiaryTauDPD_DefaultName.root created), then run with pathena:

pathena --inDS mc08.105200.T1_McAtNlo_Jimmy.merge.AOD.e357_s462_r635_t53/ --outDS user09.YourGridName.testDBaccess --site ANALY_AGLT2 --nfiles 1 aodtodpd_ganga_example.py

7. This job runs successfully at ANALY_AGLT2, ANALY_MWT2, ANALY_NET2, and ANALY_BNL_ATLAS_1; however, it fails at ANALY_SLAC and ANALY_SWT2_CPB. The remedy was to use PyUtils/AthFile.py to correctly process files with root:// and dcap:// URLs. Sebastien Binet made a tag that can be used for Release 14 jobs: Tools/PyUtils-00-03-56-02. Below are the instructions to get this package; it is currently being tested at SWT2 and SLAC.

8. This package is available in SVN, not in CVS:

export SVNROOT=svn+ssh://lxplusAccountName@svn.cern.ch/reps/atlasoff
svn co $SVNROOT/Tools/PyUtils/tags/PyUtils-00-03-56-02  Tools/PyUtils

9. Then go through the same steps as above starting from step #3.

A stress test is run on the following dataset for this job:

(200 files - makes 201 jobs (1 build, 200 runAthena), each job uses one input file)

AthenaRootAccess job (not set up yet)

Reference: https://twiki.cern.ch/twiki/bin/view/AtlasProtected/PatTutorial140210

Data reprocessing job (not set up yet)

Reference: https://twiki.cern.ch/twiki/bin/view/Atlas/DataPreparationReprocessing

-- RobertGardner - 16 Mar 2009
