
MinutesDataManageJul21

Introduction

Minutes of the US ATLAS Data Management meeting, July 21, 2009
  • Previous meetings and background: IntegrationProgram
  • Coordinates: Tuesdays, Noon Central
    • (309) 946-5300, Access code: 735188; Dial *6 to mute/un-mute.

Attending

  • Meeting attendees: Shawn, Armen, Charles, John, Michael, Wensheng, Patrick, Hiro
  • Apologies: Pedro, Wei
  • Guests:

Topics for this week

  • Local site movers - Charles
    • UC, IU, NE, ITB: working well
    • AGLT2: under development for Chimera
    • BNL: Hiro and Paul working on using pnfsid
  • Storage status - Armen
    • BNL: space on all space tokens looks OK; a new extension to the Thorrs is being commissioned, possibly adding 800 TB by end of week
    • Continuing work on Bestman reporting
  • BNL worker node storage retirement plan - Michael
    • Going according to plan; in one week all acas storage will be gone when the batch nodes are upgraded to SL5
    • Armen, Pedro working on details
  • BNL deletion rate
    • Central deletion should be limited to 1 Hz
    • Hiro's deletion should be limited to 2 Hz
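  The deletion-rate caps above (1 Hz central, 2 Hz for Hiro's tool) amount to throttling deletion calls to a fixed frequency. A minimal sketch of such a throttle is below; `delete_fn` and `items` are hypothetical stand-ins, not the actual deletion client.

```python
import time

def rate_limited(delete_fn, items, max_hz=1.0):
    """Call delete_fn on each item no faster than max_hz calls per second.

    Hypothetical sketch of a fixed-rate throttle: after each call, sleep
    for whatever remains of the 1/max_hz interval.
    """
    interval = 1.0 / max_hz
    results = []
    for item in items:
        start = time.time()
        results.append(delete_fn(item))
        elapsed = time.time() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)
    return results
```

  Setting `max_hz=1.0` enforces the 1 Hz central-deletion cap; `max_hz=2.0` would correspond to the 2 Hz limit.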
  • Incorrect GUIDs - Hiro
    • Some Fast Reprocessing files had wrong GUID from pilot job recovery
    • Hiro fixed some files manually
    • Pilot will be fixed in next release
    • Kaushik will let sites know which files are important to fix; otherwise the problem can be ignored
  • Dataset consistency
    • Action item from last week - run Charles's ccc.py at all Tier 2 sites and review the results, to decide whether we need to take this issue up with the DQ2 team
      Tier 2 | Result/comment | Link to log file
      AGLT2 | Standard run at AGLT2. Note: only COMPLETE datasets are checked (see hot issue below). | run_ccc.log and ccc_20090720_005616.log; run_ccc_jul21.log and ccc_20090721_005553.log
      NET2 | |
      MWT2 | | http://www.mwt2.org/sys/ccc
      SLACT2 | Found ccc.py a little hard to use outside the dCache world, so converting our own disaster-recovery tools to do the LFC => storage consistency check. | SLACMissingFilesJul21
      SWT2 | |
    • Discussion AGLT2 - increasing number of LFC orphans (in pnfs but not LFC), and some ghosts (in LFC but not dcache)
    • Discussion MWT2 - using a new version, better html output, few problems found
    • Charles - 30 min time cutoff to avoid transient issues, seems to be enough for now
    • Charles - central deletion is making this complicated, can we get status from DQ2 while dataset is being deleted?
    • Kaushik - will follow up on the issues found via email to DDM team
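  The orphan/ghost classification above, together with Charles's 30-minute cutoff for transient (in-flight) files, is essentially two set differences with a recency filter. A hedged sketch, assuming the catalog and storage listings are already available as sets (the real ccc.py queries the LFC and walks pnfs/dCache itself):

```python
import time

def classify(lfc_entries, storage_entries, mtimes, cutoff_s=30 * 60):
    """Compare catalog vs. storage listings.

    lfc_entries, storage_entries: sets of file paths (hypothetical inputs).
    mtimes: path -> modification time; files newer than cutoff_s are
    skipped so transfers still in flight are not flagged.
    Returns (orphans, ghosts): on storage but not in the LFC, and in the
    LFC but not on storage, respectively.
    """
    now = time.time()
    recent = {p for p, m in mtimes.items() if now - m < cutoff_s}
    orphans = (storage_entries - lfc_entries) - recent  # on disk, not in catalog
    ghosts = (lfc_entries - storage_entries) - recent   # in catalog, not on disk
    return orphans, ghosts
```

  Central deletion complicates this picture, as noted: a dataset mid-deletion can legitimately appear in only one of the two listings, which is why status from DQ2 during deletion was requested.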
  • Missing subscriptions - Hiro, Shawn
    • AGLT2 - the cause was not understood, but it is working now; possibly hanging connections to CERN?
    • Action item - continue debugging, follow up next week
  • Hot issues
    • AGLT2 has a posted problem about a user dataset (see https://rt-racf.bnl.gov/rt/Ticket/Display.html?id=13599). The user was unable to get dq2-get to work on dataset group09.PhotonAnalysis.mc08.105802.JF17_pythia_jet_filter.merge.AOD.e347_s462_d153_r643_t53.NTUP.rel15301.PAU-00-00-10.v1. We would like to understand whether there is a DQ2 problem here; dq2-list-file-replicas for this dataset also fails.
  • AOB


-- KaushikDe - 20 Jul 2009



Attachments


run_ccc.log (587.3K) | ShawnMckee, 20 Jul 2009 - 15:43 | Top-level AGLT2 run log for ccc.py
ccc_20090720_005616.log (2.2K) | ShawnMckee, 20 Jul 2009 - 15:43 | Today's logfile from running ccc.py at AGLT2
run_ccc_jul21.log (129.8K) | ShawnMckee, 21 Jul 2009 - 11:23 | Run ccc log from July 21
ccc_20090721_005553.log (2.2K) | ShawnMckee, 21 Jul 2009 - 11:24 | ccc.py log from July 21