Tucson Test Stand (TTS)

This page provides details on various aspects of the TTS. It will continue to be expanded as functionality and usage increase.

Data Repositories and Policy

The TTS has the repositories for generated data set up in exactly the same manner as at the summit. When a frame is successfully simulated, a new file is created and ingested into the butler. Generated frames follow the standard simulation-environment data policy: all data older than 30 days can be deleted without notice.

Also contained in the same repository are special collections of on-sky data that are used for unit testing of scripts and/or software. These data do not follow the 30-day retention policy and remain until they are no longer needed.
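As an illustration of this retention rule, the following is a minimal Python sketch, not part of any TTS tooling: the function name, paths, and the protected-collection list are hypothetical. It selects files old enough to fall under the 30-day policy while skipping the exempt test-data collection.

```python
import time
from pathlib import Path

RETENTION_SECONDS = 30 * 24 * 3600  # the 30-day retention window


def expired_files(root, protected=("LATISS-test-data-tts",), now=None):
    """Return files under ``root`` older than 30 days, skipping any file
    inside a protected collection directory (exempt from the policy)."""
    now = time.time() if now is None else now
    expired = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        if any(part in protected for part in path.parts):
            continue  # exempt from the 30-day policy
        if now - path.stat().st_mtime > RETENTION_SECONDS:
            expired.append(path)
    return expired
```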

Adding New Data to the LATISS Repository

The butler test data collection for LATISS is named LATISS-test-data-tts.

The raw data files are accessed from lsst-login.ncsa.illinois.edu and organized by date in /lsstdata/offline/instrument/LATISS/storage/.

There are multiple ways one could perform the data transfer and ingestion. The following steps describe a highly manual example.

  1. Starting from a terminal, ssh into lsst-login.ncsa.illinois.edu using your NCSA Kerberos credentials.

The following steps copy the desired data from /lsstdata/offline/instrument/LATISS/storage/ to a local scratch directory (e.g. /scratch/srp/LATISS-test-data-tts/YYYY-MM-DD).

  1. In a new tab, ssh into auxtel-archiver.tu.lsst.org using your Rubin SSO credentials, then change directories to where the data will be stored:

    cd /data/lsstdata/TTS/auxtel/oods/gen3butler/raw/LATISS-test-data-tts
    

    Note that writing files to this directory requires the proper privileges. One can also use the saluser account.

  2. Use secure copy (scp) to bring the data from NCSA to the TTS.

    scp -r <NCSA_username>@lsst-login.ncsa.illinois.edu:/scratch/<NCSA_username>/LATISS-test-data-tts/* .
    

    This will copy the data from NCSA into the directory /data/lsstdata/TTS/auxtel/oods/gen3butler/raw/LATISS-test-data-tts/YYYY-MM-DD
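Before moving on to ingestion, it can be worth a quick sanity check that the transfer completed. A minimal sketch, with a hypothetical helper name; compare the result against the same count computed on the NCSA side:

```python
from pathlib import Path


def count_fits(directory):
    """Count .fits files anywhere under ``directory``."""
    return sum(1 for _ in Path(directory).rglob("*.fits"))


# e.g. count_fits("/data/lsstdata/TTS/auxtel/oods/gen3butler/raw/LATISS-test-data-tts")
```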

The next series of steps performs the ingestion of the data into the butler repository and shows how to verify that it was ingested properly.

  1. Pull a recent T&S development (base) container, which contains the DM stack, so the butler utilities are available. The difference between this and the SQuaRE-provided container is the use of the saluser uid/gid.

    docker pull lsstts/base-sqre:develop
    
  2. Run the container, passing the data directory into the container, then set up the lsst_distrib tools:

    docker run -ti --volume /data:/data --volume /repo:/repo lsstts/base-sqre:develop
    source /opt/lsst/software/stack/loadLSST.bash
    setup lsst_distrib
    
  3. Ingest the data into the TTS butler repository (/repo/LATISS). Note that warning messages such as “Dark time less than exposure time. Setting dark time to the exposure time.” may appear and should be evaluated to determine whether they are expected.

    butler ingest-raws -t symlink /repo/LATISS /data/lsstdata/TTS/auxtel/oods/gen3butler/raw/LATISS-test-data-tts/2022*
    
  4. Associate the files to the LATISS-test-data-tts collection. For a small number of files this can be done manually very quickly.

    butler associate /repo/LATISS LATISS-test-data-tts -d raw --where "exposure.day_obs=20220316 AND instrument='LATISS'"
    
  5. Check that the files are now part of the collection.

    butler query-datasets /repo/LATISS --collections LATISS-test-data-tts
    
  6. Query all collections to verify that the LATISS-test-data-tts collection is visible.

    butler query-collections /repo/LATISS
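For a larger number of observing days, the associate step (step 4 above) can be scripted rather than typed once per day. A minimal Python sketch that builds the command lines; the day list is illustrative, and the resulting commands could be executed with subprocess.run inside the container:

```python
def associate_commands(days, repo="/repo/LATISS",
                       collection="LATISS-test-data-tts"):
    """Build one ``butler associate`` command per observing day."""
    template = (
        "butler associate {repo} {collection} -d raw "
        "--where \"exposure.day_obs={day} AND instrument='LATISS'\""
    )
    return [template.format(repo=repo, collection=collection, day=day)
            for day in days]


# Illustrative day_obs values only:
for cmd in associate_commands([20220315, 20220316]):
    print(cmd)
```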