Commissioning TAG Skim

Introduction

These notes document attempts to run a TAG-based skim of commissioning ESD data, following:
  • https://twiki.cern.ch/twiki/bin/view/Atlas/CommissioningTag#Using_TAG_COMM_for_spring_09_rep
  • https://twiki.cern.ch/twiki/bin/view/Atlas/CosmicCommissioningReconstructionStatus#Using_TAG_to_Read_from_ESD

Setup


First failed attempt

Unable to use release 14.2.23; the setup fails:
. /osg/app/atlas_app/atlas_rel/14.2.23/cmtsite/setup.sh -tag=14.2.23.2,AtlasTier0,32,opt
configuration error:
#CMT> The tag 14.2.23.2 is not used in any tag expression. Please check spelling
AtlasLogin: Configuration problem - /nfs/osg/app/atlas_app/atlas_rel/14.2.23/AtlasTier0/not_numbered non-existent
https://twiki.cern.ch/twiki/bin/view/Atlas/AtlasLogin

. /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh -tag=14.2.24.4,AtlasTier0,32,opt
. /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh

export CMTPATH=$CMTPATH:`pwd`

export PATHENA_GRID_SETUP_SH=/share/wlcg-client/setup.sh
source /ecache/marco/test_pathena/pandabin/etc/panda/panda_setup.sh

pathena --fileList=TAG.root --shipInput --outDS=user09.MarcoMambelli.test.COMM_TAG.090601.1 readesdusingtag.py                                                                                             
pathena returns an error:
  test_pathena test090601 (in /ecache/marco/test_pathena/test090601)                                                                                                 
 (current) P=LCGCMT
 ERROR : cmt gave wrong info

Second attempt

From Sergey Panitkin:
This is a suggestion from Tadashi.
"First, .../AtlasTier0RunTime/cmt/setup.sh
needs to be sourced after changing CMTPATH. Second, the current directory
needs to be added to the top of CMTPATH. Otherwise, cmt doesn't setup environment variables to use the current directory."

  . /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh -tag=14.2.24.4,AtlasTier0,32,opt
  export CMTPATH=`pwd`:$CMTPATH
  . /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh
  export PATHENA_GRID_SETUP_SH=/share/wlcg-client/setup.sh
  source /ecache/marco/test_pathena/pandabin/etc/panda/panda_setup.sh
  pathena --fileList=TAG.root --shipInput --outDS=user09.MarcoMambelli.test.COMM_TAG.090602.1 readesdusingtag.py
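Tadashi's ordering requirement can be sanity-checked after the setup: the work directory must be the first colon-separated entry in CMTPATH. A minimal sketch (the release path below is a stand-in, not the real kit location):

```shell
# Sketch: verify the current directory leads CMTPATH after the prepend.
# The release path is a stand-in for the real ATLAS kit location.
CMTPATH=/osg/app/atlas_app/atlas_rel/14.2.24
export CMTPATH="$(pwd):$CMTPATH"
# The first colon-separated entry should be the work directory.
first=${CMTPATH%%:*}
[ "$first" = "$(pwd)" ] && echo "CMTPATH order OK"
```

If the work directory is appended instead of prepended, cmt resolves packages from the release first and ignores the local checkout.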

This time pathena starts, but fails because it cannot extract an output stream from the joboptions:
[uct3-edge5] /ecache/marco/test_pathena/test090601 > pathena --fileList=TAG.root --shipInput --outDS=user09.MarcoMambelli.test.COMM_TAG.090602.1 readesdusingtag.py 
INFO : extracting run configuration
INFO : ConfigExtractor > Input=COLL
INFO : ConfigExtractor > InputFiles TAG.root 
ERROR : No output stream was extracted from jobOs or --trf. If your job defines an output without Athena framework (e.g., using ROOT.TFile.Open instead of THistSvc) please specify the output filename by using --extOutFile. Or if you define the output with a relatively new mechanism please report it to Savannah to update the automatic extractor

The output file names were suggested by Jack, who ran the job locally at CERN. I then forgot to fix the query line (which was failing in local execution as well):
http://panda.cern.ch:25980/server/pandamon/query?job=1009435754

EventSelector.sysInitialize()                       FATAL  Standard std::exception is caught 
EventSelector.sysInitialize()                       ERROR Could not construct TTreeFormula object, probably because of incorrect ROOT predicate syntax in expression `PixelTracks==1&&SCTTracks==1&&TRTTracks==1&&TileMuonFitter>0&&MooreTracks>0&&ConvertedMBoyTracks>0&&NpixSPs>2' ( POOL : "RootCollectionQuery::execute" from "RootCollection" )
ServiceManager                                      ERROR Unable to initialize service "EventSelector"
ProxyProviderSvc                                    ERROR ServiceLocatorHelper::createService: can not create service EventSelector of type EventSelectorAthenaPool
ProxyProviderSvc                                    ERROR  getting Address Provider EventSelectorAthenaPool/EventSelector
JobOptionsSvc                                       ERROR Unable to set the property 'ProviderNames' of 'ProxyProviderSvc'. Check option and algorithm names, type and bounds.

New submission after fixing the formula. Note that because getGUIDfromColl is not supported, pathena cannot locate the actual input files from the TAG, so:
  • I have to check where the ESDs are and send the job there: ANALY_BNL_ATLAS_1
  • there may be jobs on more files than the ones needed (they will produce no output)
[uct3-edge5] /ecache/marco/test_pathena/test090601 > pathena --site=ANALY_BNL_ATLAS_1 --fileList=TAG.root --shipInput --outDS=user09.MarcoMambelli.test.COMM_TAG.090604.3 --extOutFile myMonitoringESD.root readesdusingtag.py 
INFO : extracting run configuration
INFO : ConfigExtractor > Input=COLL
INFO : ConfigExtractor > InputFiles TAG.root 
INFO : archiving source files
INFO : archiving InstallArea
INFO : checking symbolic links
INFO : uploading source/jobO files
WARNING : getGUIDfromColl is not supported in 14.2.24
INFO : submit to ANALY_BNL_ATLAS_1
===================
 JobID  : 2437
 Status : 0
  > build
    PandaID=1009435662
  > run
    PandaID=1009435663-1009435754

Local Run

Download

Use dq2-get to copy the input ESDs from BNL after checking the dataset size (25 GB). The transfer took less than 14 minutes:
> time dq2-get -L UCT3 -s BNL-OSG2_DATADISK -P data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52
Querying DQ2 central catalogues to resolve datasetname data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52
Datasets found: 1
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52: Querying DQ2 central catalogues for replicas...
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52: Using complete replica at given site
Querying DQ2 central catalogues for files in dataset...
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52: Complete replica available
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52: Using site BNL-OSG2_DATADISK
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52: Querying local file catalogue of site BNL-OSG2_DATADISK...
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo05._0032.1: Getting SRM metadata for srm://dcsrm.usatlas.bnl.gov:8443/srm/managerv2?SFN=/pnfs/usatlas.bnl.gov/BNLT0D1/data08_cos/ESD/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo05._0032.1
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo03._0027.1: Getting SRM metadata for srm://dcsrm.usatlas.bnl.gov:8443/srm/managerv2?SFN=/pnfs/usatlas.bnl.gov/BNLT0D1/data08_cos/ESD/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo03._0027.1
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo05._0032.1: is cached.
...
data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52._lb0000._sfo05._0004.1: validated
Processing data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52 with PoolFileCatalog.xml
Finished

real    13m45.830s
user    4m56.476s
sys     3m53.510s
Insert the files into the POOL file catalog (PoolFileCatalog.xml):
for i in ../data08_cos.00087140.physics_RPCwBeam.recon.ESD.o4_f52/*; do pool_insertFileToCatalog $i; done
Adding the 185 files took about 18-19 minutes (roughly 6 s per file).

Failed local attempts

Errors:
EventSelector                                        INFO ----- EventSelectorAthenaPool Initialized Properly
EventSelector                                        INFO EventSelection with query PixelTracks==1&&SCTTracks==1&&TRTTracks==1&&TileMuonFitter>0&&MooreTracks>0&&ConvertedMBoyTracks>0&&NpixSPs>2
RootCollection     Info Closing open collection 'TAG.root'
RootCollection     Info Opening Collection File TAG.root in mode: READ
RootCollection     Info File TAG.root opened
RootCollection     Info Root collection opened, size = 183527
Error in <TTreeFormula::Compile>: Bad numerical expression : "NpixSPs"
EventSelector.sysInitialize()                       FATAL  Standard std::exception is caught 
EventSelector.sysInitialize()                       ERROR Could not construct TTreeFormula object, probably because of incorrect ROOT predicate syntax in expression `PixelTracks==1&&SCTTracks==1&&TRTTracks==1&&TileMuonFitter>0&&MooreTracks>0&&ConvertedMBoyTracks>0&&NpixSPs>2' ( POOL : "RootCollectionQuery::execute" from "RootCollection" )
ServiceManager                                      ERROR Unable to initialize service "EventSelector"
ProxyProviderSvc                                    ERROR ServiceLocatorHelper::createService: can not create service EventSelector of type EventSelectorAthenaPool

CORAL/Services/ConnectionService  Warning Failure while attempting to connect to "sqlite_file:sqlite200/ALLP200.db": CORAL/RelationalPlugins/sqlite ( CORAL : "Connection::connect" from "/ecache/marco/test_pathena/test090601b/sqlite200/ is not writable" )
CORAL/Services/ConnectionService     Info  Connection to service "/nfs/osg/app/atlas_app/atlas_rel/14.2.24/DBRelease/6.1.1/sqlite200/ALLP200.db" established. Id=17d6f8a2-4fab-11de-913d-00e08143888c
CORAL/Services/ConnectionService     Info New session on connection to service "/nfs/osg/app/atlas_app/atlas_rel/14.2.24/DBRelease/6.1.1/sqlite200/ALLP200.db" started for user "". Connection Id=17d6f8a2-4fab-11de-913d-00e08143888c
RalSessionMgr     Info Start a read-only transaction active for the duration of the database connection
RelationalDatabase     Info Instantiate a R/O RalDatabase for 'COOLONL_INDET/COMP200'
RelationalDatabase     Info Delete the RalDatabase for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Delete the RalSessionMgr for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Commit the read-only transaction active for the duration of the database connection
RalSessionMgr     Info Disconnect from the database server
IOVDbConnection                                      INFO *** COOL  exception caught: The database does not exist
IOVDbConnection                                      INFO Create a new conditions database: COOLONL_INDET/COMP200
RalSessionMgr     Info Instantiate a R/W RalSessionMgr for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Connect to the database server
IOVDbConnection                                     ERROR *** COOL  exception caught: No physical Update connection for "COOLONL_INDET" is available. ( CORAL : "ReplicaCatalogue::replicasForConnection" from "CORAL/Services/ConnectionService" )
IOVDbConnection                                     ERROR Coudn't create a new conditions database: COOLONL_INDET/COMP200
IOVDbSvc                                            FATAL dbConnection is not correctly initialized. Stop.
ServiceManager                                      ERROR Unable to initialize service "DetectorStore"
GeoModelSvc                                         ERROR ServiceLocatorHelper::createService: can not create service DetectorStore
GeoModelSvc                                         FATAL DetectorStore service not found!
ServiceManager                                      ERROR Unable to initialize Service: GeoModelSvc
Py:Athena            INFO leaving with code 33: "failure in initialization"
ApplicationMgr                                      FATAL finalize: Invalid state "Configured"
ApplicationMgr                                       INFO Application Manager Terminated successfully

Successful local execution

Running at MWT2_UC, using the locally installed kit, with COMP200.db copied and renamed as ALLP200.db:
. /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh -tag=14.2.24.4,AtlasTier0,32,opt
export CMTPATH=`pwd`:$CMTPATH
. /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh
athena readesdusingtag.py >& log090609a &        
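The database preparation for the successful run can be sketched as follows. The files here are empty stand-ins (the real COMP200.db is ~1.6 GB and is copied from AFS); the two points that matter are that ALLP200.* exposes COMP200.* under the name Athena looks for, and that the local sqlite200 directory is writable (the CORAL "is not writable" failure above made Athena fall back to the release copy).

```shell
# Sketch of the local conditions-DB layout (stand-in files).
mkdir -p sqlite200
touch sqlite200/COMP200.db sqlite200/COMP200.tags
# Athena looks for ALLP200.*, so expose COMP200.* under that name.
ln -sf COMP200.db   sqlite200/ALLP200.db
ln -sf COMP200.tags sqlite200/ALLP200.tags
# CORAL needs the directory writable, or it silently uses the release copy.
chmod u+w sqlite200 sqlite200/COMP200.db
ls sqlite200/
```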

Hi Elizabeth,                                                                                                                                                             
I did not see any loadCondDBSetup (I grepped for DBSetup).                                                                                                                
                                                                                                                                                                          
I'm running locally out of the distribution kit:                                                                                                                          
. /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh  -tag=14.2.24.4,AtlasTier0,32,opt                                                                                 
export CMTPATH=`pwd`:$CMTPATH                                                                                                                                             
. /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh                                                                                
athena readesdusingtag.py >& log &                                                                                                                                        
                                                                                                                                                                          
Beside the readesdusingtag.py (joboption), I copied in the local directory the 2 subdirectories described in the instructions:                                            
InDetRecExample and RecExCond, containing 1 python file each                                                                                                              
                                                                                                                                                                          
Then I copied locally the databases as suggested in the instructions (ALLP200.* is a link to COMP200.* as requested):                                                     
> ls sqlite200/                                                                                                                                                           
ALLP200.db  ALLP200.tags  COMP200.db  COMP200.tags                                                                                                                        
> ls geomDB/                                                                                                                                                              
geomDB_sqlite_7.1  larHV_sqlite                                                                                                                                           
                                                                                                                                                                          
The only difference from the example that I can think of is that I'm using 14.2.24 instead of 14.2.23 (it was not working with the 14.2.23 distribution kit).
                                                                                                                                                                          
The lines (in the joboption) that may determine the conditions chosen could be:
globalflags.DetDescrVersion.set_Value_and_Lock('ATLAS-GEO-03-00-00')                                                                                                      
globalflags.ConditionsTag.set_Value_and_Lock('COMCOND-ES1C-000-00')                                                                                                       
                                                                                                                                                                          
Should I change these?                                                                                                                                                    
                                                                                                                                                                          
                                                                                                                                                                          
David,                                                                                                                                                                    
I'll try your line on the pathena execution. At the moment I was trying the local one.
                                                                                                                                                                          
Thanks,                                                                                                                                                                   
Marco                                                                                                                                                                     
 

> Hi,                                                                                                                                                                     
> I have a job using commissioning tags to extract events from ESD. It is described here:
> https://twiki.cern.ch/twiki/bin/view/Atlas/CommissioningTag#Using_TAG_COMM_for_spring_09_rep
> https://twiki.cern.ch/twiki/bin/view/Atlas/CosmicCommissioningReconstructionStatus#Using_TAG_to_Read_from_ESD
>                                                                                                                                                                         
> I was able to run locally using COMP200 following the instructions here (and some help):                                                                                
> https://twiki.cern.ch/twiki/bin/view/Atlas/CoolTroubles and
> https://twiki.cern.ch/twiki/bin/view/Atlas/AthenaDBAccess#Switching_to_use_local_SQLite_fi
>                                                                                                                                                                         
> To run locally I copied all the input files (TAG, ESD) and the database files:                                                                                          
> mkdir sqlite200                                                                                                                                                         
> cp /afs/cern.ch/user/a/atlcond/coolrep/sqlite200/COMP200.db sqlite200/ALLP200.db                                                                                        
> cp -r /afs/cern.ch/atlas/project/database/DBREL/packaging/DBRelease/current/geomDB geomDB                                                                               
>                                                                                                                                                                         
>                                                                                                                                                                         
> In Pathena I tried different --dbRelease options but all failed:                                                                                                        
> --dbRelease=ddo.000001.Atlas.Ideal.DBRelease.v06060101:DBRelease-6.6.1.1.tar.gz                                                                                         
> --dbRelease=ddo.40000.Atlas.conditions.CDRelease.v0000001:CDRelease.41620-44519.v0000.tar.gz                                                                            
> --dbRelease=ddo.20000.Atlas.conditions.CDRelease.v0000001:CDRelease.28940-28997.v0000.tar.gz                                                                            
> --dbRelease=ddo.000001.Atlas.Ideal.DBRelease.v060901:DBRelease-6.9.1.tar.gz (running now)                                                                               
>                                                                                                                                                                         
> Some of the failed jobs:                                                                                                                                                
> http://panda.cern.ch:25980/server/pandamon/query?job=*&jobDefinitionID=2439&user=Marco%20Mambelli&days=3
> http://panda.cern.ch:25980/server/pandamon/query?job=*&jobDefinitionID=2440&user=Marco%20Mambelli&days=3
> e.g:                                                                                                                                                                    
> http://panda.cern.ch:25980/server/pandamon/query?job=1010750339                                                                                                         
>                                                                                                                                                                         
>                                                                                                                                                                         
> Any idea on how to add COMP200 (instead of ALLP200) in Pathena?                                                                                                         
>                                                                                                                                                                         
> Thank you,                                                                                                                                                              
> Marco                                                                                                                                                                   
>                                                                                                                                                                         

Hi Elizabeth,                                                                                                                                                             
I had one offline suggestion from David but I'm having problems and I'm not understanding if it will solve the db access:                                                 
https://twiki.cern.ch/twiki/bin/view/Atlas/DataPreparationReprocessing                                                                                                    
                                                                                                                                                                          
All the examples I found or received in suggestions (including David's) use the DBRelease files, which are quite small and, I think, do not include the COMP200 database (1.6 GB).
                                                                                                                                                                          
If I understood a previous email correctly, a workaround would be to package my own custom (huge) DBRelease and send it to the execution site with dq2.
Is this the only solution?                                                                                                                                                
It is doable, but I may encounter limitations on the disk space available to analysis jobs.
                                                                                                                                                                          
Is there any way to configure the job to access the Oracle DB?
My jobs will run at BNL (the ESDs are there) and I think there is a local replica of the Oracle DB.
Is there anyone I could contact to check this?
                                                                                                                                                                          
Thank you,                                                                                                                                                                
Marco                                                                                                                                                                     
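The workaround discussed in the email above, packaging a custom DBRelease-style tarball containing the COMP200 copy, could be sketched like this. All names are illustrative, the stand-in db file replaces the real 1.6 GB COMP200 copy, and the dq2 registration step is only indicated in a comment:

```shell
# Hypothetical packaging of a custom DBRelease tarball (illustrative names;
# the stand-in db file replaces the real 1.6 GB COMP200 copy).
mkdir -p DBRelease/custom/sqlite200
touch DBRelease/custom/sqlite200/ALLP200.db
tar czf DBRelease-custom.tar.gz DBRelease
# Inspect the first entry of the archive.
tar tzf DBRelease-custom.tar.gz | head -n 1
# The tarball would then be registered as a dataset (e.g. with dq2) and
# passed to pathena as --dbRelease=<dataset>:DBRelease-custom.tar.gz.
```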

Summary for Fred

Job definition

If I understood correctly, the job uses commissioning tags to extract events from ESD. Input, joboption and extra files are defined in these twiki pages:
  • https://twiki.cern.ch/twiki/bin/view/Atlas/CommissioningTag#Using_TAG_COMM_for_spring_09_rep
  • https://twiki.cern.ch/twiki/bin/view/Atlas/CosmicCommissioningReconstructionStatus#Using_TAG_to_Read_from_ESD

Those pages instruct to copy the content of the attached option-extra.tgz into the run directory: it contains two subdirectories, InDetRecExample and RecExCond. They then instruct to use the joboption attached as readesdusingtag.py.orig. I modified it; for the Pathena job I'm using the one attached as pathena_readesdusingtag.py.txt, and for the local execution (the one that works) I'm using the one attached as local_readesdusingtag.py.txt.

The TAG file I'm using is data08_cos.00087140.physics_RPCwBeam.merge.TAG.o4_f52_m19._0001.1, attached as TAG.root.

Local execution

I was able to run locally using COMP200 following the instructions here (and some help):
  • https://twiki.cern.ch/twiki/bin/view/Atlas/CoolTroubles and
  • https://twiki.cern.ch/twiki/bin/view/Atlas/AthenaDBAccess#Switching_to_use_local_SQLite_fi
In short, to run locally I copied all the input files (TAG, ESD) and the database files into the run directory:
mkdir sqlite200                                                                                                                                                         
cp /afs/cern.ch/user/a/atlcond/coolrep/sqlite200/COMP200.db sqlite200/ALLP200.db                                                                                        
cp -r /afs/cern.ch/atlas/project/database/DBREL/packaging/DBRelease/current/geomDB geomDB                                                                               

Then to run the job:
. /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh -tag=14.2.24.4,AtlasTier0,32,opt
export CMTPATH=`pwd`:$CMTPATH
. /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh
athena readesdusingtag.py >& log090609a &

And the joboption is the attached: local_readesdusingtag.py.txt

Grid execution

I have been trying different combinations and so far nothing works:

Currently the joboption, including your suggested change, is the attached pathena_readesdusingtag.py.txt.

Setup like:
cd /ecache/marco/test_pathena/test090601
. /osg/app/atlas_app/atlas_rel/14.5.2/cmtsite/setup.sh -tag=14.5.2.4,AtlasProduction,32,opt
export CMTPATH=`pwd`:$CMTPATH
. /osg/app/atlas_app/atlas_rel/14.5.2/AtlasProduction/14.5.2.4/AtlasProductionRunTime/cmt/setup.sh 
export PATHENA_GRID_SETUP_SH=/share/wlcg-client/setup.sh
source /ecache/marco/test_pathena/pandabin/etc/panda/panda_setup.sh
pathena --site=ANALY_BNL_ATLAS_1 --fileList=TAG.root --extOutFile myMonitoringESD.root --shipInput  --dbRelease=ddo.000001.Atlas.Ideal.DBRelease.v06060101:DBRelease-6.6.1.1.tar.gz readesdusingtag.py --outDS=user09.MarcoMambelli.test.COMM_TAG.090612.6
pathena --site=ANALY_BNL_ATLAS_1 --fileList=TAG.root --extOutFile myMonitoringESD.root --shipInput readesdusingtag.py --outDS=user09.MarcoMambelli.test.COMM_TAG.090612.8
Produced the failed job:
Py:Athena            INFO continued trace of "RecExCond/RecExCommon_flags.py"
 -+- 27 631 A       except Exception:
 -+- 27 632 A          logRecExCommon_flags.warning("Exception extracting streams from input file %s " % firstValidFi )
Py:RecExCommon_flags WARNING Exception extracting streams from input file TAG.root.sub_collection_4.root 
 -+- 27 633   
 -+- 27 634   
 -+- 27 635 A       logRecExCommon_flags.info("Extracted streams %s from input file " % streams )
Py:Athena            INFO continued trace of "RecExCommon/RecExCommon_topOptions.py"
Py:Athena            INFO continued trace of "./readesdusingtag.py"
Shortened traceback (most recent user call last):
  File "/direct/usatlas+OSG/atlas_app/atlas_rel/14.5.2/AtlasCore/14.5.2/InstallArea/share/bin/athena.py", line 485, in <module>
    include( script )
  File "./readesdusingtag.py", line 105, in <module>
    include ("RecExCommon/RecExCommon_topOptions.py")
  File "/direct/usatlas+OSG/atlas_app/atlas_rel/14.5.2/AtlasProduction/14.5.2.4/InstallArea/jobOptions/RecExCommon/RecExCommon_topOptions.py", line 66, in <module>
    include ( "RecExCond/RecExCommon_flags.py" )
  File "/direct/usatlas+OSG/atlas_app/atlas_rel/14.5.2/AtlasEvent/14.5.2/InstallArea/jobOptions/RecExCond/RecExCommon_flags.py", line 636, in <module>
    logRecExCommon_flags.info("Extracted streams %s from input file " % streams )
NameError: name 'streams' is not defined
Py:Athena            INFO leaving with code 8: "an unknown exception occurred"
Exception exceptions.TypeError: "argument of type 'NoneType' is not iterable" in  ignored

Setup like:
. /osg/app/atlas_app/atlas_rel/14.2.24/cmtsite/setup.sh -tag=14.2.24.4,AtlasTier0,32,opt
export CMTPATH=`pwd`:$CMTPATH
. /osg/app/atlas_app/atlas_rel/14.2.24/AtlasTier0/14.2.24.4/AtlasTier0RunTime/cmt/setup.sh
export PATHENA_GRID_SETUP_SH=/share/wlcg-client/setup.sh
source /ecache/marco/test_pathena/pandabin/etc/panda/panda_setup.sh
pathena --site=ANALY_BNL_ATLAS_1 --fileList=TAG.root --extOutFile myMonitoringESD.root --shipInput --dbRelease=ddo.000001.Atlas.Ideal.DBRelease.v06060101:DBRelease-6.6.1.1.tar.gz readesdusingtag.py --outDS=user09.MarcoMambelli.test.COMM_TAG.090612.5
pathena --site=ANALY_BNL_ATLAS_1 --fileList=TAG.root --extOutFile myMonitoringESD.root --shipInput readesdusingtag.py --outDS=user09.MarcoMambelli.test.COMM_TAG.090612.7
Produced the failed job:
EventSelector                                        INFO ----- EventSelectorAthenaPool Initialized Properly
EventSelector                                        INFO EventSelection with query PixelTracks==1&&SCTTracks==1&&TRTTracks==1&&TileMuonFitter>0&&MooreTracks>0&&ConvertedMBoyTracks>0
RootCollection     Info Closing open collection 'TAG.root'
RootCollection     Info Opening Collection File TAG.root in mode: READ
RootCollection     Info File TAG.root opened
RootCollection     Info Root collection opened, size = 183527
IOVDbSvc                                             INFO  
RalDatabaseSvc     Info Instantiate the RalDatabaseSvc
RalSessionMgr     Info Instantiate a R/O RalSessionMgr for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Connect to the database server
CORAL/Services/ConnectionService     Info Loading default plugin for coral::IRelationalService: CORAL/Services/RelationalService
CORAL/Services/RelationalService     Info Found plugin for RDBMS technology "frontier" with native implementation
CORAL/Services/RelationalService     Info Found plugin for RDBMS technology "mysql" with native implementation
CORAL/Services/RelationalService     Info Found plugin for RDBMS technology "oracle" with native implementation
CORAL/Services/RelationalService     Info Found plugin for RDBMS technology "sqlite" with native implementation
CORAL/Services/RelationalService     Info Default implementation for RDBMS technology "frontier" is native
CORAL/Services/RelationalService     Info Default implementation for RDBMS technology "mysql" is native
CORAL/Services/RelationalService     Info Default implementation for RDBMS technology "oracle" is native
CORAL/Services/RelationalService     Info Default implementation for RDBMS technology "sqlite" is native
CORAL/Services/ConnectionService     Info Loading default plugin for coral::ILookupService: CORAL/Services/XMLLookupService
CORAL/Services/ConnectionService     Info Loading default plugin for coral::IAuthenticationService: CORAL/Services/XMLAuthenticationService
CORAL/Services/ConnectionService  Warning Failure while attempting to connect to "sqlite_file:sqlite200/ALLP200.db": CORAL/RelationalPlugins/sqlite ( CORAL : "Connection::connect" from "/home/tmp/Panda_Pilot_19570_1244843073/PandaJob_1011362423_1244843077/workDir/sqlite200/ is not writable" )
CORAL/Services/ConnectionService     Info  Connection to service "/home/tmp/Panda_Pilot_19570_1244843073/PandaJob_1011362423_1244843077/workDir/DBRelease/6.6.1/sqlite200/ALLP200.db" established. Id=7f455b40-579a-11de-acfa-00a0d1e7fd38
CORAL/Services/ConnectionService     Info New session on connection to service "/home/tmp/Panda_Pilot_19570_1244843073/PandaJob_1011362423_1244843077/workDir/DBRelease/6.6.1/sqlite200/ALLP200.db" started for user "". Connection Id=7f455b40-579a-11de-acfa-00a0d1e7fd38
RalSessionMgr     Info Start a read-only transaction active for the duration of the database connection
RelationalDatabase     Info Instantiate a R/O RalDatabase for 'COOLONL_INDET/COMP200'
RelationalDatabase     Info Delete the RalDatabase for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Delete the RalSessionMgr for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Commit the read-only transaction active for the duration of the database connection
RalSessionMgr     Info Disconnect from the database server
IOVDbConnection                                      INFO *** COOL  exception caught: The database does not exist
IOVDbConnection                                      INFO Create a new conditions database: COOLONL_INDET/COMP200
RalSessionMgr     Info Instantiate a R/W RalSessionMgr for 'COOLONL_INDET/COMP200'
RalSessionMgr     Info Connect to the database server
IOVDbConnection                                     ERROR *** COOL  exception caught: No physical Update connection for "COOLONL_INDET" is available. ( CORAL : "ReplicaCatalogue::replicasForConnection" from "CORAL/Services/ConnectionService" )
IOVDbConnection                                     ERROR Coudn't create a new conditions database: COOLONL_INDET/COMP200
IOVDbSvc                                            FATAL dbConnection is not correctly initialized. Stop.
ServiceManager                                      ERROR Unable to initialize service "DetectorStore"
GeoModelSvc                                         ERROR ServiceLocatorHelper::createService: can not create service DetectorStore
GeoModelSvc                                         FATAL DetectorStore service not found!
ServiceManager                                      ERROR Unable to initialize Service: GeoModelSvc
Py:Athena            INFO leaving with code 33: "failure in initialization"
ApplicationMgr                                      FATAL finalize: Invalid state "Configured"
ApplicationMgr                                       INFO Application Manager Terminated successfully
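The failure chain in the log is self-consistent: the local `sqlite200/` copy is not writable, COOL then decides the conditions database "does not exist" and tries to create it, but creating requires an update connection, and the replica catalog evidently lists only read replicas for `COOLONL_INDET` — hence "No physical Update connection". A sketch of how an XMLLookupService-style replica lookup produces exactly this outcome (the catalog fragment below is a hypothetical example in the `dblookup.xml` style, not the file shipped in the DBRelease):

```python
import xml.etree.ElementTree as ET

# Hypothetical dblookup.xml fragment (XMLLookupService-style replica
# catalog); the real catalog ships inside the DBRelease.
DBLOOKUP = """
<servicelist>
  <logicalservice name="COOLONL_INDET">
    <service name="frontier://ATLF/()/COOLONL_INDET" accessMode="read" />
    <service name="sqlite_file:sqlite200/ALLP200.db" accessMode="read" />
  </logicalservice>
</servicelist>
"""

def replicas(logical_name, mode):
    """Return physical replicas of a logical service matching accessMode."""
    root = ET.fromstring(DBLOOKUP)
    for svc in root.iter("logicalservice"):
        if svc.get("name") == logical_name:
            return [s.get("name") for s in svc.iter("service")
                    if s.get("accessMode") == mode]
    return []

print(replicas("COOLONL_INDET", "read"))    # two read replicas found
print(replicas("COOLONL_INDET", "update"))  # [] -> "No physical Update connection"
```

Since only read replicas exist, the "Create a new conditions database" fallback can never succeed; the real problem is the earlier unwritable `sqlite200/` path in the pilot work directory, which is what the `--dbRelease` option was meant to work around.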

-- MarcoMambelli - 02 Jun 2009

  • TAG.root: tag file data08_cos.00087140.physics_RPCwBeam.merge.TAG.o4_f52_m19._0001.1
Attachments (all uploaded by MarcoMambelli):
  • TAG.root (7 MB, 15 Jun 2009) — tag file data08_cos.00087140.physics_RPCwBeam.merge.TAG.o4_f52_m19._0001.1
  • local_readesdusingtag.py.txt (3 K, 12 Jun 2009)
  • option-extra.tgz (2 K, 15 Jun 2009) — extra files used in joboption
  • pathena_readesdusingtag.py.txt (4 K, 12 Jun 2009)
  • readesdusingtag.py.orig (3 K, 15 Jun 2009) — original joboption
  • readesdusingtag.py.txt (3 K, 12 Jun 2009)
Topic revision: r5 - 15 Jun 2009, MarcoMambelli