r12 - 08 Jun 2006 - 14:32:30 - VivekJain

U.S. ATLAS

Muon AOD Tutorial

Note that this tutorial is based on the various tutorials that Ketevi has given over the years. Here we will focus on how to do analysis on the AOD with the most recent releases of Atlas software, in preparation for the CSC/DC3 data analysis, looking specifically at muons in the AOD. We will also cover the idea of the so-called "Athena-aware" ntuples, as well as the Atlantis event display.

In this tutorial we will take a very practical approach. That is, I am assuming that most people do not want to learn every detail of how the Atlas framework works, but would rather get started doing some analysis. In my experience one could spend years studying the framework and still not understand it fully. If you really must know, here are some links to more information. But proceed with caution ...

Computing Information - Welcome to the Jungle

For more information on the Atlas Framework there are several resources:

And General Computing Information Can be found at:

For information on mailing lists and how to subscribe to them check out Mailing Lists. I suggest that you subscribe to at least the following lists:

There are of course many many others you can join. Take a look at what you might be interested in and join them. Yes, you will get a lot of noise but it only takes a few minutes a day to quickly scan the email to see if it has any relevance to you. Plus now it seems as if the archives are actually searchable!

Logging into BNL

> ssh atlasgw00.bnl.gov -l username

> rterm -i

Preparing your account to use the ATLAS Releases

ATLAS software is divided into packages, and these are managed through a configuration management tool, CMT. CMT is used to copy ("check out") code from the main ATLAS repository and to handle linking and compilation. Prepare your account in the following way:

> cd $HOME

 Vivek's comments: Since we will be running release 11.0.41 and working in an area called workarea/DC3, 
at this point, do the following before anything else

> mkdir workarea
> mkdir workarea/DC3
> mkdir workarea/DC3/11.0.41 

> mkdir cmthome
CMT configures your working environment through a "requirements file". To have a clean directory structure, I recommend keeping your requirements file in a directory like the one created above: cmthome. Keep in mind that this location isn't really required - the file can live anywhere under your home directory, since what really matters are the script files it sources and the environmental variables it sets.

Here is an example of a requirements file:

# Set the site where you are working. For BNL set:
set CMTSITE BNL
set SITEROOT /afs/usatlas.bnl.gov
# For portables:
#set CMTSITE EXTSITE
# Other possible values are: LBNL, BNL
 
# Set the base location of all release for your site:
macro ATLAS_DIST_AREA ${SITEROOT}/software/dist
 
# Set the location of your preferred development area, where packages
# will be checked out, (change "${HOME}/MyTest" to be any directory
# path or soft link under your home directory):
macro ATLAS_TEST_AREA "" \
   11.0.41    "$HOME/workarea/DC3"
 
# Choose to run with optimized code by default
apply_tag opt

# Set standard Atlas environment
use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)

Let's go through the file line by line. The first two lines simply set the environmental variables CMTSITE and SITEROOT to the appropriate values: we are working at BNL, and the root directory of Atlas software at BNL is /afs/usatlas.bnl.gov . The next line sets the base location of all the various releases; if you look in that area you can find all the releases that are available at BNL. The macro ATLAS_TEST_AREA helps set your local working area - in my case I have a directory called workarea/DC3/ off my home area. The apply_tag opt line sets the default type of build (optimized) that will occur when making your code. The final line sets the various standard Atlas environmental variables.

In order to setup your environment for the FIRST time, setup the most recent version of CMT:

Vivek's comments: Copy the requirements file from Kevin :

> cd cmthome
> cp ~black/cmthome/requirements .

> source /afs/usatlas.bnl.gov/cernsw/contrib/CMT/v1r18p20060301/mgr/setup.csh

> cmt config

this will create setup files in your cmthome area. To set up a particular release and working area:

> source ~/cmthome/setup.csh -tag=11.0.41,opt

This line sets up release 11.0.41 and your working area. Look at the requirements file and see where 11.0.41 is mentioned - this is what selects the working directory and the release you will be working with. To make sure that everything is kosher, do the following:

To complete setting up your run-time environment, you need to do the following (Vivek put this up here):

> source $AtlasArea/Control/AthenaRunTime/*/cmt/setup.csh

> echo $CMTPATH

/direct/usatlas+workarea/black/DC3/11.0.41:
/afs/usatlas.bnl.gov/software/dist/11.0.41:
/afs/usatlas.bnl.gov/offline/external/Gaudi/0.16.1.8
:/afs/usatlas.bnl.gov/offline/external/LCGCMT/LCGCMT_36_2

and


> echo $LD_LIBRARY_PATH
/usr/lib:/lib:/direct/usatlas+workarea/black/DC3/11.0.41/InstallArea/i686-slc3-gcc323-opt/lib:
/afs/usatlas.bnl.gov/software/dist/11.0.41/InstallArea/i686-slc3-gcc323-opt/lib
:/afs/usatlas.bnl.gov/offline/external/Gaudi/0.16.1.8/InstallArea/i686-slc3-gcc323-opt/lib
......

There will be many more lines after this. The key things to notice are that CMTPATH now includes your working directory, and that the library path includes the InstallArea under your working area. Basically, this means that if you now check out and build code, the libraries will go into that area on your LD_LIBRARY_PATH, and when you run Athena it will look in your local area first and only then in the global release area.

Understanding this is essential to understanding how the Atlas framework works. The Atlas framework has ONE program, and it is called Athena. There are many, many different packages - modules that get run by the framework - and the code is built into a series of sharable libraries. In fact, to simply run Atlas code you don't need to check out anything at all: just set up the Atlas environment and supply a "jobOptions" file which tells Athena which packages and algorithms to run.

When you check out a piece of code, modify it, and compile it, your local area sits at the front of the library path, so Athena will pick up your local copy rather than the one that comes with the release.

AOD Structure and More, or Everything You Ever Wanted to Know About the AOD but Were Afraid to Ask

Event Types

The AOD is one file type in the Atlas hierarchy. According to the Atlas TDR, the rough model goes something like this:

  • RAW Data - Events as output by the Event Filter (the final level of the Atlas trigger). The model assumes that the events will have an average size of about 1.5 MB, arriving at a rate of about 200 Hz. As indicated by the name, this is all the data that is fit to print. The data format is a "byte-stream" - that is, an organized set of packed bits containing all the information about the event.

  • ESD (Event Summary Data) - This is the output of reconstruction (large format) in a "high-level" C++ object-oriented format. It is intended that enough information be stored in the ESD so that reconstruction could be rerun from it. The target size is about 0.5 MB. The "final format" has yet to be decided: one would like to keep as much information as possible, to be able to rerun reconstruction with new calibration and alignment constants, but it is unclear whether this is doable within the space limitation. Most likely it will end up storing hits on tracks, perhaps hits near tracks, and calorimeter cells in or near physics objects. However, this is still open to debate....

  • AOD (Analysis Object Data) - This is the reduced event representation output of reconstruction, suitable for most physics analysis. The basic idea is that the average end user (analyzer) will not need nor want to know about hits near physics objects, but rather just the final event data objects used for analysis. It is envisioned that this will be the starting point for most physics analysis - about 100 kB per event.

  • TAG - This is event-level metadata: a thumbnail that gives efficient identification and selection of events of interest to a given analysis. These will be stored in a relational database. The basic idea is that if you are only interested in events with two or more muons, there is no need to unpack every AOD and check; instead the TAG holds a simple list of the objects in each event so that event selection can be done very efficiently - about 1 kB per event.
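As a back-of-the-envelope check on these numbers, the event rate and per-format sizes translate directly into daily data volumes. Here is a short Python sketch; the figures are the nominal ones quoted above, and the arithmetic is the only thing this adds:

```python
# Nominal figures from the data model above (assumptions, not measurements).
RATE_HZ = 200                                        # events/s out of the Event Filter
SIZES_MB = {"RAW": 1.5, "ESD": 0.5, "AOD": 0.1, "TAG": 0.001}

def daily_volume_tb(size_mb, rate_hz=RATE_HZ, seconds=86400):
    """Data volume per day of continuous running, in TB."""
    return size_mb * rate_hz * seconds / 1e6

for fmt, size in SIZES_MB.items():
    print(f"{fmt}: {daily_volume_tb(size):8.3f} TB/day")
```

At the nominal 200 Hz, RAW alone is roughly 26 TB per day, which is why the successively smaller ESD/AOD/TAG formats exist at all.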

AOD Structure

A detailed description of the different "containers" that are available from the AOD can be found at AOD Keys. Here we will focus on what is available for the muons. In this tutorial we will be looking at the CSC/DC3 AODs, which have been restructured from the Rome-era AODs.

Muons in the AOD

There are various reconstruction programs which can produce muons. The ones of importance to us here are the "high Pt" algorithm Moore and the low Pt algorithm; these muons are merged and placed inside one MuonContainer. There is also a container for muons created with another set of reconstruction algorithms - Muonboy, MuTag, and Staco - which we won't discuss here. At any rate, once inside the AOD the muons are all stored in the same class, so the author doesn't matter when writing code to analyze the output.

The class that stores the muons can be found at: AOD Muon Class.

Here is a list of some of the most important functions and what they return:

  • px()

  • py()

  • pz()

  • pt()

  • eta()

  • phi()

  • isCombinedMuon() - This method returns true if the muon is the result of a combined fit of an inner detector track and a Muon Spectrometer standalone track AND it is matched to the inner detector track with the best matching chi2

  • isStandAlone() - Returns true if it was a track found by the Muon Spectrometer standalone tracking

  • isLowPtReconstructedMuon() - Returns true if it is a muon that was found by the lowPt reconstruction algorithm

  • numberOfMDTHits()

  • numberOfCSCEtaHits()

  • numberOfCSCPhiHits()

  • numberOfRPCEtaHits()

  • numberOfRPCPhiHits()

  • numberOfTGCEtaHits()

  • numberOfTGCPhiHits()

  • numberOfBLayerHits()

  • numberOfPixelHits()

  • numberOfSCTHits()

  • numberOfTRTHits()

  • numberOfTRTHighThresholdHits()

  • energyLoss() - returns the energy deposition in the calorimeter and its error

  • matchChi2() - If matched to an inner detector track the matching Chi2

  • matchNumberDoF() - Number of degrees of freedom in the track matching Chi2

  • fitChi2()- chi2 of the track fit

  • fitNumberDoF() - Number of degrees of freedom in the track fit

Test to see if there are available associated track objects

  • hasInDetTrackParticle()

  • hasMuonSpectrometerTrackParticle()

  • hasMuonExtrapolatedTrackParticle()

  • hasCombinedMuonTrackParticle()

Access to the Track Particles

  • inDetTrackParticle()

  • muonSpectrometerTrackParticle()

  • muonExtrapolatedTrackParticle()

  • combinedMuonTrackParticle()

The information available from the Track Particle Class can be found at TrackParticle
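To make the accessor list concrete, here is a minimal sketch of a typical combined-muon selection. It is plain Python with a stand-in Muon class rather than the Athena C++ class above; the field names mirror the accessors, and the cut values are purely illustrative assumptions, not recommendations:

```python
from dataclasses import dataclass

@dataclass
class Muon:
    """Stand-in for the AOD muon class; fields mirror the accessors listed above."""
    pt: float            # pt() in MeV
    eta: float           # eta()
    is_combined: bool    # isCombinedMuon()
    match_chi2: float    # matchChi2()
    match_ndof: int      # matchNumberDoF()

def passes_selection(mu, pt_cut=20000.0, chi2_per_ndof_cut=10.0):
    """Keep combined muons in the tracker acceptance with a decent ID match."""
    if not mu.is_combined:
        return False
    if mu.pt < pt_cut or abs(mu.eta) > 2.5:
        return False
    # Reduced chi2 of the inner-detector / spectrometer track match
    return mu.match_chi2 / max(mu.match_ndof, 1) < chi2_per_ndof_cut

good = Muon(pt=45000.0, eta=1.1, is_combined=True, match_chi2=12.0, match_ndof=5)
bad  = Muon(pt=45000.0, eta=1.1, is_combined=False, match_chi2=12.0, match_ndof=5)
print(passes_selection(good), passes_selection(bad))  # True False
```

In the real C++ the same logic would sit inside the loop over the MuonContainer, calling the accessors directly on each muon.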

Checking out an Example Analysis Package

Today we will use the sample analysis package that Ketevi has setup which acts as a skeleton. It has a bit of information inside of the code already and an example of how to extract information for the electrons. We will use this as our starting point and then add muons into the mix.

First check out the package. Go to your working directory (if you can't remember it, do an echo $CMTPATH; the first directory on the list - which should be in your local area - is your working directory). In this example, the working directory should be $HOME/workarea/DC3/11.0.41

Note: In this example we are checking out a specific version of the package, namely 00-05-11. If you were to simply check out the package, you would get the head version, which in general is not compatible with the release you are using.

cmt co -r UserAnalysis-00-05-11 PhysicsAnalysis/AnalysisCommon/UserAnalysis

Now compile

cd PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-05-11/cmt
Edit the "requirements" file and comment out this line

use AnalysisRunTime            AnalysisRunTime-*      PhysicsAnalysis/AnalysisCommon

Furthermore, add this line just above the first line that says: apply_pattern

apply_tag  ROOTMathLibs

Then do this:

cmt config
source setup.csh
cmt broadcast gmake

We would also like to be able to produce files which are readable by the Atlantis event display. It turns out that for this particular release and the information we would like to display, we need to check out one more package. In a perfect world we wouldn't have to check this package out, since we don't intend to modify it. However, it so happens that this particular release comes with a version of this package which does not allow us to produce the type of objects we need in the xml files for Atlantis. So we must do:

Return to your working directory

cmt co -r JiveXML-00-04-46-03 graphics/JiveXML

cd graphics/JiveXML/JiveXML-00-04-46-03/cmt
cmt config
source setup.csh
cmt broadcast gmake

Vivek's comments: The last step takes a while. Get some coffee

If you're wondering whether we could have compiled these two packages at the same time - the answer is yes! In this case, since we edited the requirements file for one of the packages, I wanted to keep the instructions as simple as possible. However, one can check out multiple packages, then go to the cmt area of one of them and do the cmt broadcast gmake. Note that to compile just the package whose directory structure you are in, it is sufficient to type the usual gmake command. The cmt broadcast gmake command takes advantage of the cmt system and your path variables to search for makefiles under the working-area directory and to build them all. If you are checking out a bunch of packages this is clearly more efficient.

To make sure everything worked try:

Vivek's comments: In your working directory make a sub-directory called run.

> cd $HOME/workarea/DC3/11.0.41
> mkdir run

get_files PDGTABLE.MeV
get_files HelloWorldOptions.py
athena.py -b HelloWorldOptions.py

Stuff like this should be printed to the screen:

HelloWorld           INFO execute()
HelloWorld           INFO An INFO message
HelloWorld           WARNING A WARNING message
HelloWorld           ERROR An ERROR message
HelloWorld           FATAL A FATAL error message
AthenaEventLoopMgr   INFO   ===>>>  end of event 9    <<<===
HistorySvc           INFO Service finalised successfully
ChronoStatSvc.f...   INFO  Service finalized succesfully
ToolSvc              INFO Removing all tools created by ToolSvc
ApplicationMgr       INFO Application Manager Finalized successfully
ApplicationMgr       INFO Application Manager Terminated successfully

Vivek: Now go to $HOME/workarea/DC3/11.0.41/PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-05-11/share

Now edit your jobOptions file (AnalysisSkeleton_jobOptions.py), found in the share directory, to run over some AODs:

Vivek: Remove the line that is currently there and add one or more of these files

EventSelector.InputCollections = [
  "/direct/usatlas+workarea/black/DC3/files/csc11.005601.Zprime_mumu_pythia_SSM1000.recon.AOD.v11004103._00002.pool.root.1",
  "/direct/usatlas+workarea/black/DC3/files/csc11.005601.Zprime_mumu_pythia_SSM1000.recon.AOD.v11004103._00003.pool.root.2",
  "/direct/usatlas+workarea/black/DC3/files/csc11.005601.Zprime_mumu_pythia_SSM1000.recon.AOD.v11004103._00004.pool.root.1",
  "/direct/usatlas+workarea/black/DC3/files/csc11.005601.Zprime_mumu_pythia_SSM1000.recon.AOD.v11004103._00019.pool.root.5",
                                 ]

Now go to the src directory: PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-05-11/src/

and the header directory: PhysicsAnalysis/AnalysisCommon/UserAnalysis/UserAnalysis-00-05-11/UserAnalysis

There is one piece of code: AnalysisSkeleton.cxx and AnalysisSkeleton.h

Let's take a look at them:

AnalysisSkeleton.cxx

AnalysisSkeleton.h

Note that there are four public functions (besides the constructor and destructor)

  • CBNT_initialize()
  • CBNT_finalize()
  • CBNT_execute()
  • CBNT_clear()

The initialize function is executed only once every time you run Athena. Let's take a look at what it does. First it sets up access to the "StoreGate" service - the fancy way in which Athena reads and writes objects to and from memory. It also sets up a few other services: the histogramming service and the AnalysisTools service. Finally, it adds the branches to the ntuple we will create and defines some histograms.

The finalize method essentially does nothing, while the clear function initializes the branches and variables in our root tuple.

Currently the execute method does only one thing: it calls a private function, "electron". Take a good look at what is done in that method; we will be using it as the basis for a similar method for muons.

First it retrieves two containers, the "TruthParticleContainer" and the "ElectronContainer". If it is able to retrieve both, it loops through the electron container and extracts the information from the AOD, filling the histograms and ntuple.
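The control flow of that method can be sketched in plain Python. StoreGate, the container keys, and the histogram are all mocked here to keep the sketch self-contained; the real code is C++ and checks StatusCode values instead of None:

```python
# Mock event store: maps StoreGate keys to containers (plain lists here).
# The key names are illustrative stand-ins, not guaranteed AOD keys.
event_store = {
    "TruthContainerKey": ["truth particle", "truth particle"],
    "ElectronContainerKey": [{"pt": 32000.0}, {"pt": 8000.0}],
}

def retrieve(store, key):
    """Stand-in for a StoreGate retrieve(): the container, or None on failure."""
    return store.get(key)

def electron_method(store, hist):
    truth = retrieve(store, "TruthContainerKey")
    electrons = retrieve(store, "ElectronContainerKey")
    if truth is None or electrons is None:
        return False                       # in Athena: warn and return a StatusCode
    for el in electrons:                   # loop the container, fill histograms/ntuple
        hist.setdefault("el_pt", []).append(el["pt"])
    return True

hist = {}
electron_method(event_store, hist)
print(hist["el_pt"])
```

The muon method you will write in the exercises follows exactly this pattern, just with the muon container instead of the electron one.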

Exercises

  • Run the package as is and make sure it executes without crashing. To do so, make a run directory off your working area (Vivek: We have already done this). Copy the AnalysisSkeleton_jobOptions.py file that you edited above to this area. Note that there are two ways to do this: simply copy it explicitly, or use the get_files command. In this case using get_files is overkill, but it's a nice utility for finding files when you know they exist somewhere in the release but not where.

  • Write a new function which accesses the muon container "MuidMuonCollection", and add variables to the ntuple and a few histograms. See if you can fill them and run over a hundred events or so. You will have to add to both the header and source file of the UserAnalysis package and recompile.

  • Compute the invariant mass of dimuon pairs in the muon AOD. Look, you have just found the Z'. Now repeat with data and win fame and glory (or rather wait for me to do it first and then confirm my results).

  • Use the TruthParticleContainer in a similar way to what Ketevi has done for the electrons. There is a fancy tool Ketevi has developed which, as part of its functionality, can match a reconstructed particle to its Monte Carlo truth particle in simulation. Here this is a delta-R matching procedure which, if you come from hadron collider physics, you have written many, many times yourself.

  • Now that you have matched the so-called "truth" muons to reconstructed muons, make a few plots studying the efficiency, fake rate, and resolutions. To make things simple: in this sample there are two real high-Pt muons from the Z'. For our purposes just assume that these are the only two real muons. Of course there are many different sources of fakes: the so-called "real fakes" (this is actually pseudo-official muon performance terminology, so don't blame me) - mostly muons from pion/kaon decay and heavy-quark semileptonic decays - the so-called "fake fakes" like showers and punch-through, and finally pattern recognition failures.

  • Turn on the production of XML files for the Atlantis Event display. We will take a look at what muons look like in this display in a couple different views. You can do this by adding the following to the end of your jobOptions file:
         include( "JiveXML/JiveXML_jobOptionBase.py" )
         include( "JiveXML/DataTypes_AOD.py" )
         ToolSvc.EventData2XML.Muonkey = "MuidMuonCollection"
         include( "JiveXML/DataTypes_Reco.py" )
         
    If you are so inclined, you can go look up exactly what these are doing. But basically this is telling Athena to run the JiveXML package, specifying that you are looking at an AOD, and telling it that you would like to look at the MuidMuonCollection for the muons as well as the other reconstructed quantities. Run over a number of events and produce some xml files.
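Two bits of kinematics come up repeatedly in the exercises above: the dimuon invariant mass and the delta-R matching between truth and reconstructed muons. Here is a self-contained Python sketch of both; the muon mass is neglected relative to the momenta, and the 0.3 matching window is just an illustrative choice:

```python
import math

def delta_r(eta1, phi1, eta2, phi2):
    """Delta-R separation, with the phi difference wrapped into [-pi, pi]."""
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi
    return math.hypot(eta1 - eta2, dphi)

def inv_mass(p1, p2):
    """Invariant mass of a pair given (px, py, pz, E) four-vectors:
    m^2 = (E1+E2)^2 - |p1+p2|^2."""
    px, py, pz, e = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

# Back-to-back 500 GeV muons (massless approximation) -> m near the Z' peak
mu1 = (500.0, 0.0, 0.0, 500.0)   # GeV
mu2 = (-500.0, 0.0, 0.0, 500.0)
print(inv_mass(mu1, mu2))                   # 1000.0
print(delta_r(0.0, 0.1, 0.0, -0.1) < 0.3)   # inside an illustrative match window
```

In the exercise you would loop over all pairs of selected muons filling a mass histogram, and call delta_r between each truth muon and each reconstructed muon, taking the closest match below your window.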

Atlantis Event Display

There are at least three general event displays that one can use to visualize events in the detector. The most portable, lightweight, and most user-friendly is called Atlantis. It turns out that you can run it on your laptop on a linux, windows, or mac platform. Because it is decoupled from the Atlas framework, you don't need anything related to Atlas software on your computer other than the program itself and the xml files that it reads. This makes it very portable, at least an order of magnitude faster than any Atlas framework module, and user friendly (i.e. it will take you only a few minutes to get it running and half an hour to start using it usefully). It does have one disadvantage - it is decoupled from the Atlas framework. Most importantly, the geometry model of Atlantis is simple. Really simple. Really, really simple. This is great because it means that the graphics rendering is lightning fast (though it does remind me of playing Pong), but not so great because sometimes the geometry model is so basic that detector hits are not actually on the detector elements themselves. Given the complexity of the Atlas muon spectrometer, this can be frustrating. However, I would still maintain that for looking at events from a "physics point of view" it is the best that we have, and even for looking at reconstruction objects - segments and tracks - it is still pretty good. If you are deeply involved in the core muon software and need to know the difference between layer 1 and layer 4 of foam, or the exact position of the cable that reads out the CSM in BOL sector 12, then you are out of luck. But it will certainly serve our purposes here.

  • Unpack the file

  • Startup Atlantis

  • The Muons from the AOD appear in a few different views. In the default view (looking head on) they appear as red blobs in the muon system. In the r-phi plane you can see them as red circles. And finally, in the lego plot they appear as thin red lines

  • Play around with the display for a while. See if you can change views, zoom, and click on objects

  • Check out the Wiki and the "static" webpage Static

-- KevinBlack - 08 May 2006
