Check out our poster on Neal’s passive seismic network for a general overview.
As of June 2012 we have collected data sets every 45 days or so since May 2011. That's a lot of data, so we need to perform a series of processing steps to get the data to where it belongs. Ultimately, we want our data to be transferred from each flash drive (2 per station) to our augustine network here at BSU and to IRIS's Data Management Center. Below is a list of the steps that need to be taken:
- For a description of how to perform a service date, go here.
- After a service date has been performed we should have 2 flash drives for each station that was serviced. These flash drives need to be written to IRIS's Mac OS X laptop using the provided card reader and the Neo software. Neo converts the data into zip files and stores them on the Mac in the directory /Volumes/Data/NealHotSprings/PSprocessed/Dump/. Once the zip files are complete, make a directory within /Dump corresponding to the given service date; for example, typing the command mkdir June2012 in /Dump makes a directory called June2012. Then move the zip files into that directory using the command mv *.zip June2012. Be sure that the number of zip files in the new directory matches the number of flash drives that were processed, and check that the file sizes are consistent with the service sheets. If a file is only, say, 13KB (really small), there may have been a problem when Neo created that zip file. Also, within /Dump, make a directory called servicesheets; this is where the digitized service sheets should go.
- Next, the zip files must be transferred to augustine. To do this, first log in to augustine and navigate to /pal/fieldcamp/2011/passive_seismic/data. Create a directory within data corresponding to the service date of interest. We use the convention that the directory is named yyyymmdd, so for a June 16, 2012 dataset we would name the directory 20120616. Next, create directories within 20120616 (for example) called raw and servicesheets. Once the directories on augustine have been created, go back to the Mac's directory /Volumes/Data/NealHotSprings/PSprocessed/Dump/June2012 (for example) and send the zip files to augustine using the command scp *.zip $user@augustine:/pal/fieldcamp/2011/passive_seismic/data/$servicedate/raw, where '$user' is your augustine username and '$servicedate' refers to whichever service date directory we are processing; in this case it would be 20120616. Also use scp to transfer the digitized service sheets to augustine.
- Now the data is safe: it is located on the Mac OS X laptop, on augustine (which creates backups), and on the flash drives (at least until the next service date). Next, perform a flow of processing steps to (1) add the data to our database for research purposes, and (2) convert the zip files into the format IRIS wants. The script for this step is at /pal/fieldcamp/2011/passive_seismic/codes for augustine users, or click here.
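The laptop-side housekeeping and sanity checks in the steps above can be sketched as a small shell script. This is a minimal sketch, not the real workflow: it uses a temporary mock directory in place of /Volumes/Data/NealHotSprings/PSprocessed/Dump so it can run anywhere, the 1 MB "suspiciously small" threshold is an assumed cutoff, and $user in the commented scp line is a placeholder for an augustine account.

```shell
#!/bin/sh
set -e

# Mock stand-in for /Volumes/Data/NealHotSprings/PSprocessed/Dump
DUMP=$(mktemp -d)
SERVICEDATE=20120616

# Pretend Neo already wrote two zip files (one per flash drive).
# These 4-byte mocks are deliberately tiny so the size check below fires.
printf 'data' > "$DUMP/drive_A.zip"
printf 'data' > "$DUMP/drive_B.zip"

# 1. Make the service-date directory and move the zips into it,
#    plus the servicesheets directory for the digitized sheets.
mkdir "$DUMP/$SERVICEDATE"
mv "$DUMP"/*.zip "$DUMP/$SERVICEDATE/"
mkdir "$DUMP/servicesheets"

# 2. Sanity checks: the zip count must match the number of flash
#    drives processed, and nothing should be suspiciously small.
NZIPS=$(ls "$DUMP/$SERVICEDATE"/*.zip | wc -l)
echo "zip files in $SERVICEDATE: $NZIPS"
find "$DUMP/$SERVICEDATE" -name '*.zip' -size -1M \
    -exec echo "WARNING: small file, check against service sheet: {}" \;

# 3. The transfer itself would then be (placeholder account, not run here):
# scp "$DUMP/$SERVICEDATE"/*.zip \
#     $user@augustine:/pal/fieldcamp/2011/passive_seismic/data/$SERVICEDATE/raw
```

Comparing $NZIPS against the service sheets before running scp is the cheap way to catch a flash drive that Neo silently skipped or truncated.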
Finally, we need to FTP (File Transfer Protocol) the data to IRIS's Data Management Center. Open a shell on augustine and type gui_DoFTP. Select Full Operation and navigate to the day_volumes directory within the particular service date directory. Select OK (use the default "FTP passive mode").
After completing the steps above we can analyze the data using Datascope, the Antelope relational database system.