OSG WN Client 3.1 Tarball

OSG now has a relocatable release of OSG Worker Node Client 3.1 via a tarball distribution.

This is important because it allows a site to deploy the OSG WN Client 3.1 via NFS or CVMFS to all worker nodes. The alternative is an RPM-based distribution which must be installed on every node at the site.

The OSG tarball clients can be found at OSG Worker Node Client Tarball. This includes both the Full Client as well as the Worker Node Client, which is a subset of the Full Client distribution.

At MWT2 the tarball is installed in CVMFS on the MWT2 stratum-0 server, uct2-cvmfs.mwt2.org, in the repository osg.mwt2.org. The worker nodes are reconfigured to use this deployment for wn-client by changing the link at /share/wn-client.

Installation of WN-Client in CVMFS

The MWT2 Stratum-0 server is uct2-cvmfs.mwt2.org, with Stratum-1 servers at iut2-cvmfs.mwt2.org and mwt2-cvmfs.campuscluster.illinois.edu. This server hosts one repository, osg.mwt2.org (shadow location /cvmfs/osg.mwt2.org). Currently the $OSG_APP, CA, and WN-Client 1.2 releases are stored in this repository at /cvmfs/osg.mwt2.org/[app, CA, wn-client].

The WN-Client 3.1 needs to be installed in the same repository but in a way that does not affect the currently installed products.

The locations chosen were /cvmfs/osg.mwt2.org/osg/sw/[CA, osg-client, osg-wn-client] and /cvmfs/osg.mwt2.org/mwt2/[app, sw, wn-client].

The following steps were performed on the MWT2 Stratum-0 server, uct2-cvmfs.mwt2.org.

Install location

The tarballs were installed into /cvmfs/osg.mwt2.org/osg/sw. There are several steps involved in downloading, unpacking, and configuring these products for deployment, all of which are documented on the OSG tarball twiki page, OSG Worker Node Client Tarball.

Since OSG releases newer versions of the client over time, the following structure was chosen to ease deployment of multiple versions:

/cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/$EL/$ARCH/$VERSION

Where $EL is the RedHat EL release (el5 or el6), $ARCH is the architecture (i386 or x86_64), and $VERSION is the tarball version (such as 3.1.18-1). A soft-link, current, will be placed at the same level and will usually link to the most recent version.

Visually the structure looks like

                    el5/i386/current --> 3.1.18-1
                    el5/x86_64/current --> 3.1.18-1
                    el6/i386/current --> 3.1.18-1
                    el6/x86_64/current --> 3.1.18-1

This structure also allows numerous test releases to be installed without disturbing older releases. Should a problem develop with a new release, it is easy to revert to a previously known working version by simply re-pointing the current soft-link. One can also manually select older (or newer) releases.
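As a sketch, the layout and the revert step can be expressed in a few shell commands. The base path below defaults to a scratch directory so the snippet is safe to try; the real base is /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client.

```shell
# Build the $EL/$ARCH/$VERSION layout and mark a release "current".
# BASE defaults to a scratch directory for illustration only.
BASE="${BASE:-/tmp/osg-wn-client-demo}"
VERSION=3.1.18-1

for EL in el5 el6; do
    for ARCH in i386 x86_64; do
        mkdir -p "$BASE/$EL/$ARCH/$VERSION"
        # -sfn replaces any existing "current" link in one step
        ln -sfn "$VERSION" "$BASE/$EL/$ARCH/current"
    done
done

# Reverting to an older release is just re-pointing the link, e.g.:
#   ln -sfn 3.1.17-1 "$BASE/el6/x86_64/current"
```

Because current is a relative symlink inside each $EL/$ARCH directory, changing it never touches the installed version trees themselves.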

An MWT2 installer script was written to keep the download and install process consistent. The attached script osg-wn-client.sh will perform all the steps necessary to install a new release and mark it current. The base location for the installation is defined at the beginning of the script.


Local modifications to the installed product

This script performs several other useful tasks.

The OSG installation contains a version/architecture specific setup.[sh,csh] which includes a call-out to a local "setup-local.[sh,csh]".

The MWT2 installer script creates these local setup files. Currently these local setups only contain a call-out to a "global" setup-local via

[ -f $myPath/osg-wn-client/setup-local.sh ] && source $myPath/osg-wn-client/setup-local.sh

Bug fix in OSG 3.1.18-1

Currently there is a typo in the version-specific setup.[sh,csh] files provided by the OSG installation. These files make additions to $PYTHONPATH so that the LFC python bindings can be found. The entries for "site_packages" must be changed to "site-packages". An OSG ticket has been filed for this error, and it will be corrected in the next release, OSG 3.1.19-1. Until then, one needs to manually edit the setup files. These files are located at


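The edit itself is a one-line sed. A minimal sketch, using a hypothetical helper name and a scratch file so it can be tried safely (the real targets are the version-specific setup.sh/setup.csh files in the tarball tree):

```shell
# Hypothetical helper: fix the site_packages -> site-packages typo
# in a setup file, in place.
fix_setup_typo() {
    sed -i 's/site_packages/site-packages/g' "$1"
}

# Demonstrate against a scratch copy of an affected line
printf 'PYTHONPATH=$OSG_LOCATION/usr/lib64/python2.6/site_packages\n' > /tmp/setup.sh.demo
fix_setup_typo /tmp/setup.sh.demo
cat /tmp/setup.sh.demo
```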
Global setup-local.[sh,csh]

The MWT2 installer script creates a global version of setup-local.[sh,csh]. The intent of the global setup-local is to hold local modifications which will not get overwritten and lost during upgrades, reinstalls, and reconfigurations. For MWT2, it is necessary to change the X509 definitions so that the site-wide Certificate Authority can be found. By default, the tarball releases are set up to use a CA contained within the WN Client installation.

The contents of the MWT2 global setup-local.sh which redirect to a central CA are

export X509_CERT_DIR="/etc/grid-security/certificates"
export X509_VOMS_DIR="/etc/grid-security/vomsdir"

Master setup.sh

To help select an installed product, a "Master" setup.sh was created at the "osg-wn-client" level. When source'd, this script will determine the EL release of the platform, the architecture, a selected version (current is the default) and then source the appropriate $EL/$ARCH/$VERSION/setup.sh, setup-local.sh and global setup-local.sh. Thus one would only need to source a single file to properly setup OSG WN Client 3.1 on a node out of the repository.

The source chain that is invoked is

source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup.sh <version>

where the default version is "current". Other options might be "3.1.18-1", etc.

In turn the other setups are source'd. The source chain would be in order

  1. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup.sh
  2. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/$el/$arch/$version/setup.sh
  3. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/$el/$arch/$version/setup-local.sh
  4. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup-local.sh
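The platform detection and source chain above can be sketched as follows. This is a hypothetical reimplementation for illustration only, not the attached MWT2 Master setup.sh:

```shell
# Hypothetical sketch of a Master setup.sh: detect platform, pick a
# version, then walk the source chain described above.
myPath=/cvmfs/osg.mwt2.org/osg/sw/osg-wn-client
version="${1:-current}"          # default to the "current" soft-link

# Detect the EL major release (el5 or el6) from the kernel string
case "$(uname -r)" in
    *.el5*) el=el5 ;;
    *)      el=el6 ;;
esac

# Detect the architecture
arch="$(uname -m)"
[ "$arch" = "x86_64" ] || arch=i386

# Source chain; each file is checked before being sourced
for f in "$myPath/$el/$arch/$version/setup.sh" \
         "$myPath/$el/$arch/$version/setup-local.sh" \
         "$myPath/setup-local.sh"; do
    [ -f "$f" ] && . "$f"
done
```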

Full Client

There is also a tarball release for the Full Client package. The attached MWT2 installer script osg-client.sh will install and set up the Full Client in the same manner as the WN Client. This package is a good replacement for the older WLCG-Client 1.2 distribution.

The Master setup.sh file for the Full Client can be found at setup.sh. It is invoked with

source /cvmfs/osg.mwt2.org/osg/sw/osg-client/setup.sh <version>

The source chain will be identical to osg-wn-client, except that it invokes setup scripts in the osg-client tree.


When a job starts on a node, the pilot sources $OSG_GRID/setup.sh. $OSG_GRID is defined on an OSG gatekeeper by the value of "grid_dir" in /etc/osg/config.d/10-storage.ini. On MWT2 this value is

grid_dir = /share/wn-client

On each node in the MWT2 cluster, /share/wn-client is soft-linked to /cvmfs/osg.mwt2.org/mwt2/wn-client. The attached setup file, setup.sh, is then deployed via the osg.mwt2.org repository.
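Re-pointing the per-node link is a single ln command. The link path is parameterized below so the snippet can be tried anywhere; on a real MWT2 node it would be /share/wn-client:

```shell
# Create or re-point the per-node wn-client link.  LINK defaults to a
# scratch path for illustration; on a real node: LINK=/share/wn-client.
TARGET=/cvmfs/osg.mwt2.org/mwt2/wn-client
LINK="${LINK:-/tmp/wn-client-demo}"

# -sfn: symbolic, force-replace any existing link, do not dereference it
ln -sfn "$TARGET" "$LINK"
```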

This setup file performs a few checks (such as only running once per job, verifying that $PATH is set up correctly, etc.) and then sources the Master OSG WN Client setup.sh, specifying the current release.

The source chain would be in order

  1. source /share/wn-client/setup.sh --> /cvmfs/osg.mwt2.org/mwt2/wn-client/setup.sh
  2. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup.sh current
  3. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/$el/$arch/current/setup.sh
  4. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/$el/$arch/current/setup-local.sh
  5. source /cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup-local.sh
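The run-once guard mentioned above can be sketched like this. The marker variable name OSG_WN_SETUP_DONE is hypothetical; the attached setup.sh may use different checks:

```shell
# Sketch of a run-once guard for the per-node wn-client setup.sh;
# the marker variable name is hypothetical.
if [ -z "$OSG_WN_SETUP_DONE" ]; then
    export OSG_WN_SETUP_DONE=1
    # Source the Master setup, selecting the "current" release
    master=/cvmfs/osg.mwt2.org/osg/sw/osg-wn-client/setup.sh
    [ -f "$master" ] && . "$master" current
fi
```

Because the marker is exported, any later sourcing of the same file within the job is a no-op.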

AtlasWN replacement

The AtlasWN package was created to provide DQ2 Client 2.x to the pilot on SL5 systems. This package included an installation of DQ2 2.3. Since DQ2 2.x requires python 2.6, a system-installed python 2.6 was linked into the caller's $PATH in a way that kept the rest of the job in a python 2.4 environment.

SL6 is fully python2.6 compliant, so the AtlasWN package is no longer needed. However, the pilot still requires the DQ2 Client for ANALY jobs.

On MWT2-SL6, the AtlasWN package was completely removed; however, the pilot call-out to AtlasWN is still used to set up the DQ2 Client.

The pilot invokes


The simplest setup file invokes the DQ2 Client setup in the atlas.cern.ch repository. It would contain the lines


# Fetch a DQ2 Client
source /cvmfs/atlas.cern.ch/repo/sw/ddm/2.4.0/setup.sh

For MWT2, because of backward support for SL5 nodes, the setup.sh is a bit more complicated. The attached setup.sh is used on MWT2.


Currently there is a bug in runGen.py used by user analysis jobs. This script attempts to use the LFC bindings outside the pilot environment. Unfortunately, $LD_LIBRARY_PATH has been modified by "asetup", so an attempt is made to use a version of libstdc++.so.6 from the Atlas release which is not compatible with the "lfc-python" binding additions. It is currently necessary to pre-load the system library to avoid using the incompatible version. This is accomplished with a simple environment variable

export LD_PRELOAD="/usr/\$LIB/libstdc++.so.6"

On MWT2 the attached setup.sh was put into the MWT2 CVMFS repository at /cvmfs/osg.mwt2.org/mwt2/sw/Atlas/setup.sh. It is then invoked from the attached AtlasWN setup.sh described above.

Once runGen.py has been updated, this setup script can be removed.

Atlas requirements for running on SL5

Atlas requires that the LFC bindings be present on a node for the pilot to run properly. However, on EL5 systems this can be a problem. The DQ2 package required by the pilot must be 2.4 or higher, and this release of DQ2 requires a python 2.6 environment. However, the pilot and supporting components must run in a python 2.4 environment.


The LFC bindings are available in the packages lfc-python and lfc-python26. These packages must come from the OSG repositories. Those contained in other repositories such as EPEL might not work.

The OSG repositories are installed and setup with

rpm -Uvh http://dl.fedoraproject.org/pub/epel/5/i386/epel-release-5-4.noarch.rpm
yum -y install yum-priorities
rpm -Uvh http://repo.grid.iu.edu/osg-el5-release-latest.rpm

After which you can install the lfc-python packages

yum -y install lfc-python lfc-python26


The pilot requires access to the Atlas DQ2Client commands. It uses both the command line features (such as dq2-ls, dq2-list-files, etc) and imports the DQ2 bindings (such as dq2clientapi.py) for access to such items as the "TierOfAtlas".

The DQ2Client package required is 2.3 or later. Conveniently such a product exists within the "atlas.cern.ch" repository. It is located at


However, on an SL5 system python2.4 is the default and is needed by the Atlas pilot, but DQ2 Client 2.x requires python2.6. To work around this issue, the DQ2 commands must be "wrapped" in a python 2.6 environment.


As a workaround on SL5 systems only (SL6 is python2.6 compliant), the DDMwrap script was developed. The script is designed to intercept any DQ2 command, set up DQ2 Client 2.x, execute the command in a python2.6 environment, and then return to the caller in the original python2.4 environment.

Attached are two scripts, DDMlink.sh and DDMwrap.sh

To use DDMwrap, you first create a directory to hold these scripts and the softlinked DQ2 commands. Download and store both scripts into this directory. The script DDMlink.sh creates the softlinks for the various DQ2 commands to the DDMwrap script.
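The dispatch-by-softlink idea can be sketched as below. This is a hypothetical simplification, not the attached DDMwrap.sh: each DQ2 command is a softlink to the one wrapper script, so the name it was invoked under selects the behavior, and all environment changes happen in a subshell so the caller keeps its python2.4 context. The DQ2_ROOT path and python26 interpreter name are illustrative assumptions.

```shell
# Hypothetical sketch of DDMwrap's dispatch-by-name idea.
DQ2_ROOT="/path/to/dq2-client"   # illustrative location, not the real one

run_wrapped() {
    cmd="$1"; shift
    case "$cmd" in
        ddm-python)
            # Run python inside the wrapped (python2.6) context
            python26 "$@"
            ;;
        dq2-*)
            # Environment changes live only in this subshell, so the
            # caller keeps its original python2.4 environment
            ( export PYTHONPATH="$DQ2_ROOT/lib:$PYTHONPATH"
              python26 "$DQ2_ROOT/bin/$cmd" "$@" )
            ;;
        *)
            echo "DDMwrap: unknown command: $cmd" >&2
            return 1
            ;;
    esac
}

# In the real script the dispatch would be driven by the softlink name:
#   run_wrapped "$(basename "$0")" "$@"
```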

DDMwrap deployment

At MWT2, the DDMwrap utility was deployed in the MWT2 CVMFS repository osg.mwt2.org in the location /cvmfs/osg.mwt2.org/mwt2/sw/DDMwrap

[root@uct2-cvmfs sw]# cd /cvmfs/osg.mwt2.org/mwt2/sw
[root@uct2-cvmfs sw]# mkdir DDMwrap
[root@uct2-cvmfs sw]# cd DDMwrap

(download both files into DDMwrap)

[root@uct2-cvmfs DDMwrap]# sh DDMlink.sh
[root@uct2-cvmfs DDMwrap]# ll
total 8
lrwxrwxrwx 1 root root 10 Apr 21 07:58 ddm-dq2pythonpath -> DDMwrap.sh
lrwxrwxrwx 1 root root 10 Apr 21 07:58 ddm-dq2version -> DDMwrap.sh
lrwxrwxrwx 1 root root 10 Apr 21 07:58 ddm-ldlibrarypath -> DDMwrap.sh
.
.
.
lrwxrwxrwx 1 root root 10 Apr 21 07:58 dq2-sources -> DDMwrap.sh
lrwxrwxrwx 1 root root 10 Apr 21 07:58 dq2-usage -> DDMwrap.sh
lrwxrwxrwx 1 root root 10 Apr 21 07:58 dq2-whoami -> DDMwrap.sh

DDMwrap testing

To test DDMwrap, its installation location must be added to $PATH

[ddl@uct2-int ~]$ export PATH=$PATH:/cvmfs/osg.mwt2.org/mwt2/sw/DDMwrap

When any DQ2 command located in this directory is invoked, DDMwrap is invoked in its place.

[ddl@uct2-int ~]$ python -V
Python 2.4.3
[ddl@uct2-int ~]$ dq2-whoami
[ddl@uct2-int ~]$ dq2-ls
Usage: dq2-ls [ -h/--help | options ] <PATTERN>
[ddl@uct2-int ~]$ ddm-python -V
Python 2.6.5
[ddl@uct2-int ~]$ python -V
Python 2.4.3

In the above example, the process starts in a python2.4 environment. The dq2-whoami and dq2-ls commands are executed in a python2.6 environment. ddm-python is an internal DDMwrap command which runs python inside the DDMwrap context; as shown, it runs python2.6. Even after running the various commands, the process always keeps its original python2.4 context.

The full list of internal DDMwrap commands is

ddm-version        Version of DDMwrap
ddm-dq2version     Version of the DQ2 Client that DDMwrap invokes
ddm-dq2pythonpath  $PYTHONPATH additions for the DQ2 Client
ddm-ldlibrarypath  $LD_LIBRARY_PATH additions for the DQ2 Client
ddm-python         Invoke any python command within DDMwrap (i.e. python 2.6)
ddm-pythonpath     $PYTHONPATH for python 2.6

Attachment              Size       Date                 Who
DDMlink.sh              2 K        29 May 2013 - 15:25  DaveLesny
DDMwrap.sh              3 K        29 May 2013 - 15:25  DaveLesny
osg-client.sh           4 K        29 May 2013 - 14:47  DaveLesny
osg-wn-client.sh        4 K        29 May 2013 - 14:46  DaveLesny
setup-local.sh          183 bytes  29 May 2013 - 15:25  DaveLesny
setup.sh.Atlas          136 bytes  06 Jun 2013 - 19:43  DaveLesny
setup.sh.atlaswn        1 K        06 Jun 2013 - 19:44  DaveLesny
setup.sh.osg-client     563 bytes  06 Jun 2013 - 19:44  DaveLesny
setup.sh.osg-wn-client  568 bytes  06 Jun 2013 - 19:37  DaveLesny
setup.sh.wn-client      591 bytes  06 Jun 2013 - 19:10  DaveLesny
Topic revision: r10 - 10 Jun 2013, DaveLesny