Joint Archive for Sea Level of the University of Hawaii Sea Level Center and the US National Oceanographic Data Center Dept. of Oceanography, University of Hawaii at Manoa 1000 Pope Rd. MSB 307 Honolulu, Hawaii 96822 USA


In connection with the IOC-UNEP-WMO pilot Monitoring Activity on Sea Level Changes and Associated Coastal Impacts in the Indian Ocean (CMAS), a workshop was proposed by Dr Satish Shetye of the National Institute of Oceanography in Goa, India, to the IOC Group of Experts on the Global Sea Level Observing System (GLOSS) at Bordeaux, France, in February 1995. A primary component of the workshop was the "hands-on" training session (HOTS) for technical staff of regional data collection and analysis centres in the Indian Ocean. The proposal was approved and the meeting was set for 21 November to 1 December 1995 at the Survey of India's National Tidal Data Center in Dehra Dun, India. Since a complete report of the workshop is in preparation by Dr Shetye, this review focuses only on the HOTS.

The participants were invited from countries with active CMAS programs, and eight Indian Ocean countries were able to attend: Bangladesh, India, Kenya, Madagascar, Tanzania, Malaysia, the Republic of Maldives, and Mauritius (Appendix 1). They were well chosen based on their backgrounds in science and computing.

The HOTS focused on the use of the TOGA Sea Level Center/National Oceanographic Data Center's sea level processing software for IBM PC compatible microcomputers. This GLOSS-endorsed, user-friendly package was completed in 1991, is available in English and Spanish, and has been distributed to over 120 recipients. The package is available on diskette or through anonymous FTP on the Internet. For more information, contact Mr. Patrick Caldwell.

A series of lectures was given on calibration of sea level data, tidal analysis and prediction, quality control, decimation to daily and monthly values, and oceanographic interpretation. The lectures included many examples of problems and solutions encountered while reducing data sets at the University of Hawaii. Each lecture was usually followed by the HOTS, during which the participants worked on data sets from their own regions. For those who did not bring their own hourly data, the TOGA/PSMSL CD-ROM prepared by the US NODC was available for extracting desired data sets. Software was provided to convert from the TOGA archive format of the CD-ROM to the TOGA processing format as used by the package.
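As an illustration of the decimation step, the sketch below reduces hourly heights to daily values. It is not part of the distributed package: the package applies a proper low-pass filter before decimation, whereas the simple daily mean here is only a stand-in, and all names are illustrative.

```python
# Illustrative stand-in for decimating hourly sea level heights to daily
# values.  The TOGA/NODC package uses a low-pass filter; this sketch takes a
# simple daily mean and propagates the missing-data flag (9999) to any day
# with incomplete observations.

MISSING = 9999

def daily_means(hourly):
    """Average each complete 24-hour block; days with any 9999 become 9999."""
    days = []
    for i in range(0, len(hourly) - len(hourly) % 24, 24):
        block = hourly[i:i + 24]
        if MISSING in block:
            days.append(MISSING)
        else:
            days.append(round(sum(block) / 24))
    return days

two_days = [1500] * 24 + [1500] * 12 + [9999] + [1500] * 11
print(daily_means(two_days))  # -> [1500, 9999]
```

A day containing even one missing hour is flagged whole, mirroring the package's conservative treatment of gaps before monthly values are formed.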

Although the participants had varying backgrounds in computer usage and data analysis, they all were able to proceed through the primary steps of the package. A few unanticipated technical glitches normally arise while learning to use the package, so it was advantageous to have the author on hand for answering questions and making suggestions on various file management techniques.


In the lectures on calibration, two main points were emphasised: the importance of geodetic surveys and the necessity of linking the gauge values to a fixed datum using tide staff readings, switch information, or other calibration methods.

First, a case study was presented on how geodetic levelling data and comparisons with the nearest tide gauges were used to remove a false trend caused by a sinking pier on which a tidal station resided. Within this example, the dangers of misinterpreting the secular sea level trend from time series of inadequate length were emphasised. It was also pointed out how conventional float and well gauges, used in redundancy at a given site, provide low cost, high quality time series with minimal gaps over very long time spans, since the simplicity of the instrumentation requires minimal technical repairs and minimal technical expertise to maintain.

Second, specific instructions were given on how to use the TOGA/NODC software for preparing scatter diagrams of the tide staff readings and corresponding gauge data, and for reducing this information to determine the temporary calibration adjustment constant. The necessity of applying a single calibration constant, based on the monthly temporary constants, to extended time periods without physical change to the gauge was highlighted. The participants successfully tested the scatter diagram and adjustment constant calculation programs. A supplementary plotting program displays up to 31 days of data and plots consecutive days even across month boundaries, which the primary plot program, HOURYR.EXE, does not.
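The staff-versus-gauge reduction can be sketched as follows. This is not the package's own code: the temporary calibration constant is taken here simply as the mean offset between tide staff readings and simultaneous gauge values, and the function and variable names are illustrative.

```python
# Sketch of the staff-vs-gauge reduction: the temporary calibration constant
# is estimated as the mean (staff - gauge) offset over paired simultaneous
# readings.  The TOGA/NODC package's exact reduction may differ.

def calibration_constant(staff_readings, gauge_values):
    """Mean (staff - gauge) offset over paired simultaneous readings, in the
    units of the input (millimetres in the package's default)."""
    pairs = [(s, g) for s, g in zip(staff_readings, gauge_values)
             if s is not None and g is not None]
    if not pairs:
        raise ValueError("no simultaneous staff/gauge pairs")
    return sum(s - g for s, g in pairs) / len(pairs)

# Example: the gauge reads consistently about 25 mm below the staff.
staff = [1525, 1740, 1893, 2011]
gauge = [1500, 1716, 1868, 1985]
print(round(calibration_constant(staff, gauge)))  # -> 25
```

In practice the monthly temporary constants would be inspected on the scatter diagram first, and a single constant adopted only for periods with no physical change to the gauge.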

Technical problems

Several glitches arose during use of the TOGA/NODC package; they are described here to help future users avoid such problems. If users have encountered other difficulties not mentioned below, please contact Mr. Patrick Caldwell.

The primary problem is getting the originator's data into the format defined by the package: the TOGA processing format for hourly data. If a user is having problems, the best check is to run the program CHUNIT.EXE within its subdirectory. This program changes the units of the data and will bomb when a problem in the format is encountered. Thus, the user can look at the output file to find the date on which the problem occurred, then go back and fix the original file. At the Joint Archive for Sea Level, programs for converting foreign formats into the TOGA processing format are plentiful. Thus, if one would like an automated method, please send Mr. Caldwell a sample of the data format with a description of the file naming methodology, and a conversion program can be provided.
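The diagnosis described above, locating the record on which CHUNIT.EXE bombs, can be automated along the following lines. The real TOGA processing format is not reproduced here; this sketch assumes a simplified record layout (a YYYYMMDD date followed by twelve integer hourly heights), so it shows the checking idea rather than the actual format.

```python
# Minimal format checker in the spirit of the CHUNIT.EXE diagnosis: scan a
# file of records and report the first malformed one.  The assumed layout
# (YYYYMMDD date plus 12 integer hourly heights per record) is illustrative,
# not the actual TOGA processing format.

def first_bad_record(lines):
    """Return (record_number, record) of the first malformed record, else None."""
    for n, line in enumerate(lines, start=1):
        fields = line.split()
        try:
            if len(fields) != 13:          # date + 12 hourly values expected
                raise ValueError("wrong field count")
            int(fields[0])                 # YYYYMMDD date
            for f in fields[1:]:
                int(f)                     # hourly heights (9999 = missing)
        except ValueError:
            return n, line.rstrip()
    return None

records = [
    "19950101 " + " ".join(["1500"] * 12),
    "19950102 15o2 " + " ".join(["1510"] * 11),  # corrupt value "15o2"
]
print(first_bad_record(records))  # reports record 2
```

With a report like this in hand, the user can fix the original file directly instead of waiting for the units-conversion step to fail.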

The tidal analysis program, TIDEANL.BAT, will bomb if a grossly wild data point exists in the observed data for the time period chosen for the analysis. If this occurs, either replace the wild point with a missing data flag (9999) or choose another time period.
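The first workaround, replacing a grossly wild point with the missing-data flag, can be sketched as below. The 500 mm departure-from-median threshold is an illustrative choice, not a value prescribed by the package.

```python
# Sketch of the workaround for TIDEANL.BAT: replace grossly wild points with
# the package's missing-data flag (9999) before running the tidal analysis.
# The 500 mm threshold about the series median is an illustrative choice.

MISSING = 9999

def flag_wild_points(hourly, threshold=500):
    """Replace values farther than `threshold` from the median with 9999."""
    valid = sorted(v for v in hourly if v != MISSING)
    median = valid[len(valid) // 2]
    return [v if v == MISSING or abs(v - median) <= threshold else MISSING
            for v in hourly]

series = [1500, 1520, 9999, 1490, 98765, 1510]   # one wild spike
print(flag_wild_points(series))  # -> [1500, 1520, 9999, 1490, 9999, 1510]
```

Existing 9999 flags pass through untouched, so the routine can be applied repeatedly to a partially cleaned series.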

A problem can arise with the gap interpolation program, GAPFALL.BAT, if the package's default units of millimetres are not used and the configuration is not adjusted accordingly. When running the tidal prediction software, note that the file PRDVP.DIN on \SLPRC\TIDE\PRD must be edited to reflect the units chosen; the other options are centimetres and feet implied to the hundredths. The filtering program, FILTHR.EXE, requires the input hourly data to be in complete blocks of one year of either data or missing data flags. This requirement caught several users off guard. The program FILLVM.EXE on \SLPRC\FILT is available for making monthly blocks of missing data flags, which can be inserted into a yearly file lacking the given time span.
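The padding role played by FILLVM.EXE can be sketched as follows. A flat dictionary of month number to hourly list stands in for the package's file layout, which is an assumption here; only the idea of completing a year with flag blocks is taken from the text above.

```python
# Sketch of FILLVM.EXE's role: pad a year that lacks some months with blocks
# of missing-data flags (9999) so that the filter sees complete one-year
# blocks.  The month -> hourly-list dictionary is an illustrative stand-in
# for the package's actual file layout.

import calendar

MISSING = 9999

def complete_year(year, months):
    """Return months 1-12; absent months become hour-by-hour blocks of 9999."""
    out = {}
    for m in range(1, 13):
        ndays = calendar.monthrange(year, m)[1]
        out[m] = months.get(m, [MISSING] * (24 * ndays))
    return out

data = {1: [1500] * (24 * 31)}            # only January observed
full = complete_year(1995, data)
print(len(full), len(full[2]))            # -> 12 672  (Feb 1995 = 24*28 hours)
```

Leap years are handled automatically by `calendar.monthrange`, so February receives the correct number of flag hours in any year.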

The input date to program EXPLOT.EXE must exactly match the date format as seen in the input file of residuals. For example, if the year-month-day to start the time axis appears in the residual file as 950101, then the zeros must be typed when interactively queried during execution of EXPLOT.EXE.

A minor point is that when any of the executables are run, one must be sure that the output files they create do not already exist prior to initiating the job. Within the MS-DOS FORTRAN environment, a pre-existing output file causes the program to abort.

Final remarks

The technical workshop was very successful due to the high enthusiasm among its participants and faculty, the excellent computing facilities and hospitality provided by the Survey of India, the financial and administrative support of IOC (and of NOAA for Mr. Caldwell), and a good deal of luck (very long travel distances, and timing between temporary shutdowns of both the US Government and the Survey of India). The participants maintained a serious attitude during the HOTS and stayed longer than suggested on most days. They have taken home skills that can provide higher quality data from their regions, for their own applications and for contributions to the international data banks and thus the scientific community. This enhancement of expertise is a stepping stone to learning additional analytical techniques, which in turn will upgrade the CMAS activities of each region.