Meeting October 3
Meeting
Tuesday 10/3 at 2:30 pm at Argonne
Phone conference, ESnet call: 510-665-5437
When: October 03, 2006, 02:30 PM America/Chicago
Meeting ID: 22333
Agenda
- Review the status of activities and the difficulties in moving forward
- DB deployment
- Data transfer
- Distributed analysis testing
- Current ATLAS situation
- Plan of action
Notes
Attending: Robert, Marco, Jack, Jerry, Ed
Discussion
In this section I tried to summarize the discussion; feel free to correct or add to it. The most important part is the Actions section.
Jack did tests on the DB that worked correctly. He created the tables and recorded some sample TAGs.
Data transfers were problematic for several reasons: the transfer tools had glitches, files on NorduGrid were not accessible, and some files were outside our cloud (area of visibility).
Some testing was done by Jack and the results are here:
UsingPathena (last update Aug 15)
DDM user tools are currently under development. They do not always work reliably, but they will improve, and the DDM Operations (support) group can help with day-to-day needs. Available resources are:
To request that DDM files be made available at MWT2, the default procedure for users is to submit a BNL RT ticket (
https://rt-racf.bnl.gov/rt/ ) to the MWTier2 queue. MWT2 support will take care of it and report back. Developers in this group can also use the direct way, documented in
DQ2Subscriptions
We want to avoid being penalized or blocked in our development by tools that do not work. So we should use the DQ2 tools to the extent that they work, block out spots for production services, and develop something that is site-specific.
E.g. get some Tier2-hosted AODs and make a proof of principle on those few files.
Ed's suggestions for handling this:
- our way, developing our own tools
- checking which working tools/scripts are used by local physicists like Erik, Ambreesh, ...
- Farm out some work through collaborations. Tech-X in Colorado has a data-skimming project for CMS, currently in Phase 2. We could start a new Phase 1 project to extend to ATLAS what they are doing for CMS.
Robert objected that new projects take time, add overhead and introduce external dependencies, even when extending existing components. We need something quickly, developed by ourselves, starting from the scripts that Erik, Ambreesh, ... use.
[Ed/Robert] It is good to start completely locally, running Athena, and maybe extend later.
[Robert] We can leverage our experience in packaging basic, standalone, self-contained components for Tier2 or Tier3: servers and likewise client pieces, plus a middle framework to look at a dataset and loop over it. Focus on selecting events and getting the files.
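As a purely hypothetical sketch (Python, not existing ATLAS or DDM code), such a middle framework could be structured as below; lookup_dataset_files() and read_events() are placeholders standing in for the site-specific catalog lookup and Athena/POOL I/O.
<verbatim>
# Hypothetical sketch only: look at a dataset, loop over it, select events
# and collect the files containing them. The two helpers below are
# placeholders, not real ATLAS/DDM calls.

def lookup_dataset_files(dataset_name):
    """Placeholder: return the local file paths belonging to a dataset."""
    raise NotImplementedError("replace with a site-specific catalog lookup")

def read_events(file_path):
    """Placeholder: yield (event_id, event_data) pairs from one AOD file."""
    raise NotImplementedError("replace with Athena/POOL based reading")

def select_files(dataset_name, selection):
    """Loop over a dataset and return the files that contain at least one
    event passing the selection function."""
    selected = []
    for path in lookup_dataset_files(dataset_name):
        if any(selection(event) for _, event in read_events(path)):
            selected.append(path)
    return selected

# Example use (hypothetical dataset name and selection):
# files = select_files("csc11.some.AOD.dataset", lambda ev: ev["met"] > 50.0)
</verbatim>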
For Data Service purposes we should use the UC_VOB DDM server, so as not to interfere with production.
Some interesting resources:
Actions
- Assess the CSC11 Datasets (AOD files) to transfer (Jack)
- Perform the transfer 1 (Marco)
- DS subscription
- Follow up with DDM Operations, or use a direct transfer, if the previous step fails within a couple of days
- Generate Tags and populate DB (Jack)
- Reassess the Rome Datasets (AODs and TAGs) to transfer (Jack)
- Perform the transfer 2 (Marco)
- DS subscription
- Follow up with DDM Operations, or use a direct transfer, if the previous step fails within a couple of days
- Flesh out an Athena algorithm for skimming (and document its use) (Jack)
- Provide a user tool to easily register local files to a local SE (and DDM server) (Marco)
- Perform a full test as suggested by Robert: (Marco and Jack)
- perform a query against this tag database to define an event collection to skim.
- get example Athena code which can take as input the event collection (Jack)
- write a script based on the above to do the skim, storing the output somewhere.
- summarize in some manner the result of the skim (see the sketch after this list):
- how many files and events were in the original dataset.
- how many events were in the collection
- how many events made it to the output dataset.
- ideally, create a histogram or ntuple demonstrating this process (it is possible to copy an example from Erik or Ambreesh):
- set of plots created from the initial dataset
- set of plots created from the skimmed output
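A rough, hypothetical sketch (Python) of the bookkeeping step of this full test, i.e. the summary counts listed above; count_events() is a placeholder for whatever event counting the real skim job provides, not a real Athena/DDM call.
<verbatim>
# Hypothetical sketch of the skim summary step only.

def count_events(file_paths):
    """Placeholder: return the total number of events in the given files."""
    raise NotImplementedError("replace with Athena/POOL event counting")

def summarize_skim(input_files, collection_size, output_files):
    """Print the counts requested in the action item above."""
    input_events = count_events(input_files)
    output_events = count_events(output_files)
    print("input dataset   : %d files, %d events" % (len(input_files), input_events))
    print("event collection: %d events" % collection_size)
    print("skim output     : %d files, %d events" % (len(output_files), output_events))
    if input_events:
        print("fraction kept   : %.1f%%" % (100.0 * output_events / input_events))
</verbatim>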
I (Marco) assigned the tasks mainly to Marco and Jack, as agreed during the meeting. Feel free to add your names.
You can add sections later in this document (after Followups) or links to new documents. This will be more lasting than documenting the work done via email.
Followups
--
MarcoMambelli - 03 Oct 2006