M.R. Rosa¹, R. Albrecht¹, W. Freudling, and R.N. Hook
Space Telescope European Coordinating Facility, European Southern Observatory, D-85748 Garching, Germany
¹Affiliated to the Astrophysics Division of the Space Science Department of the European Space Agency
The volume of data flowing from the Next Generation Space Telescope (NGST) will, in all likelihood, exceed that currently received from HST by a large factor. At the same time, the anticipated operational scenarios will enforce an even greater reliance upon calibration pipelines, automatic data analysis pipelines, and software-supported observation planning at the user level.
The NGST core science program serves as an example of the type of science exposures to be obtained regularly with such an instrument. This program implements the recommendations of the HST-and-Beyond (Dressler) Report, with emphasis on targets at high redshift. The main observing modes will be very deep imaging and low resolution multi-object (several hundred objects) spectroscopy. Targets are typically galaxies, barely resolved stellar clusters, and luminous point-like sources such as supernovae in high redshift galaxies. Embedded sources and faint cool objects in the immediate neighborhood of bright stars, i.e., planets and very low mass stars, will certainly also have a share in the schedule. The study assumed a 10% mission time overhead for calibrations, implying maximum retrieval of information from the raw data. This low overhead and these stringent demands can only be met if lossless data combination and analysis are combined with robust, noise-free calibration strategies.
The budget to be allocated for NGST operations, including observation planning, data calibration, and data analysis, is ultimately linked to the complexity of the observational procedures and to the requirements imposed by operational aspects. To name a few: field rotation and plate scale changes between successive re-observations of the same field, field-position dependent PSFs, the non-availability of calibration reference sources during certain periods, and low ceilings on the overheads available for calibration observing time.
We present below three areas of software development for current HST data exploitation which lend themselves directly to the support of optimum NGST science output without charging NGST's budget.
The combination of the rather broad PSF in the IR, the faint limit of NGST, and the fact that the deep images will be crowded with foreground and background galaxies will make photometry with NGST image data difficult.
We have developed a preliminary NGST PSF generator based on Tiny Tim, the widely used PSF generator for HST (Krist & Hook 1997). At this time the PSF is generated for the NGST ``reference design''. Under operational conditions such PSFs will be predicted using future developments of the code, relying on information about the wavefront errors. Such highly accurately simulated PSFs will allow the application of techniques which have been developed for, and are successfully in use with, HST data, in particular the image restoration and photometry techniques of Hook & Lucy (1993, 1994) and Lucy (1994).
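For illustration, the sketch below computes a monochromatic diffraction PSF as the squared modulus of the Fourier transform of the pupil function, the Fraunhofer-optics core on which simulators of the Tiny Tim family build. It is not the actual Tiny Tim code; the unobscured circular aperture, the perfect wavefront, and all numeric values are simplifying assumptions.

    import numpy as np

    def simulate_psf(wavelength=2e-6, diameter=8.0, n_pupil=256, oversample=4):
        """Monochromatic Fraunhofer PSF of an unobscured circular aperture,
        computed as |FFT(pupil)|^2. Returns the normalized PSF image and
        its angular pixel scale in radians."""
        # Pupil plane: unit transmission inside the aperture, zero outside.
        x = np.linspace(-1.0, 1.0, n_pupil)
        xx, yy = np.meshgrid(x, x)
        pupil = (xx**2 + yy**2 <= 1.0).astype(complex)

        # A measured wavefront-error map phi (radians) would enter here as
        # pupil *= exp(1j * phi); we assume a perfect wavefront in this sketch.

        # Zero-padding the pupil oversamples the focal plane.
        n_fft = n_pupil * oversample
        padded = np.zeros((n_fft, n_fft), dtype=complex)
        padded[:n_pupil, :n_pupil] = pupil
        psf = np.abs(np.fft.fftshift(np.fft.fft2(padded)))**2

        pixel_scale = wavelength / (diameter * oversample)  # rad per PSF pixel
        return psf / psf.sum(), pixel_scale

    # Illustrative numbers only: an 8 m aperture observed at 2 microns.
    psf, scale = simulate_psf()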
One of the most important scientific motivations for NGST is the spectroscopy of very faint galaxies detected in previous deep imaging exposures. Current instrument plans include a multi-object spectrograph fed by a micro-mirror array. This device will provide flexible, software-controlled apertures of arbitrary shapes and locations within the imaged field.
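As a toy illustration of such software-controlled apertures, the following sketch represents the micro-mirror array as a boolean grid and opens a small aperture around each target; the array dimensions and the square aperture shape are hypothetical.

    import numpy as np

    # Hypothetical dimensions for the micro-mirror array; True means a
    # mirror tilted to send light into the spectrograph.
    N_ROWS, N_COLS = 1024, 1024

    def configure_apertures(targets, half_size=1):
        """Open a small square aperture around each target (row, col);
        arbitrary shapes would be built the same way, mirror by mirror."""
        mask = np.zeros((N_ROWS, N_COLS), dtype=bool)
        for row, col in targets:
            r0, r1 = max(row - half_size, 0), min(row + half_size + 1, N_ROWS)
            c0, c1 = max(col - half_size, 0), min(col + half_size + 1, N_COLS)
            mask[r0:r1, c0:c1] = True
        return mask

    mask = configure_apertures([(100, 200), (512, 512), (900, 40)])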
The operational concept will routinely make use of the deep survey images of the field for subsequent multi-object spectroscopy of objects selected according to specific criteria (e.g., blue color drop-outs). Because of the huge number of sources to be selected and identified, and because the field will in general be accessible for only a few weeks, object selection and the generation of configuration data for the micro-mirror array have to be performed within a very short time (days at most). This can only be achieved by completely automatic image analysis and subsequent object classification.
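A minimal sketch of one such automatic selection step is given below, assuming a detection catalog with hypothetical mag_blue/mag_red fields and placeholder thresholds; real drop-out criteria would be tuned to the actual survey filters.

    import numpy as np

    def select_dropout_candidates(catalog, color_cut=1.5, mag_cut=28.0):
        """Flag 'drop-out' candidates in a detection catalog: objects that
        are faint or absent in the bluer band but well detected in the
        redder one. Field names and threshold values are placeholders."""
        color = catalog['mag_blue'] - catalog['mag_red']
        keep = (color > color_cut) & (catalog['mag_red'] < mag_cut)
        return catalog[keep]

    # A toy catalog with the assumed fields.
    catalog = np.array([(25.0, 24.8, 101, 220), (29.4, 26.1, 512, 480)],
                       dtype=[('mag_blue', 'f8'), ('mag_red', 'f8'),
                              ('x', 'i4'), ('y', 'i4')])
    candidates = select_dropout_candidates(catalog)  # keeps the red drop-out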
The software requirements are very similar to those for batch processing of slitless spectroscopy, which uses direct imaging for object detection, wavelength calibration, and weighting of the spectrum extraction by the size and orientation of the objects (Freudling 1997). A completely automatic ``pipeline'' to process such data from the HST NICMOS camera has been developed (Freudling & Thomas 1997), and this program (Calnic-C) could very easily be adapted to become both the micro-mirror configuration program and the spectral extraction code.
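The extraction weighting can be illustrated with a simplified, Horne-style profile-weighted sum, in which the cross-dispersion profile (carrying the object's size and orientation as measured on the direct image) supplies the weights. Uniform noise is assumed, and this is a sketch rather than the Calnic-C algorithm itself.

    import numpy as np

    def extract_spectrum(cutout, profile):
        """Profile-weighted extraction of a slitless spectrum.
        cutout  : 2-D array with dispersion along axis 1.
        profile : 1-D cross-dispersion profile from the direct image.
        Returns one flux estimate per wavelength bin."""
        w = profile / profile.sum()
        # Weighted sum across the cross-dispersion axis; the division by
        # sum(w**2) makes the estimator unbiased for this profile shape.
        return (cutout * w[:, None]).sum(axis=0) / (w ** 2).sum()

    # Toy example: a Gaussian cross-dispersion profile over 11 pixels.
    y = np.arange(11)
    profile = np.exp(-0.5 * ((y - 5.0) / 1.5) ** 2)
    true_flux = np.linspace(1.0, 2.0, 50)
    cutout = (profile / profile.sum())[:, None] * true_flux
    spectrum = extract_spectrum(cutout, profile)   # recovers true_flux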
The multi-object spectroscopy concept mentioned above serves the scientific requirements very well. However, such highly flexible instrument configurations are very demanding on calibration. It is obvious that classical calibration concepts will not be able to cope with the almost unlimited variety of on/off target apertures that can be constructed across the field. Clearly, one cannot possibly hope to obtain useful sensitivity curves by observing standard stars through even a limited subset of these slit-lets. Each new aperture-mirror configuration for a field to be studied will destroy the previous ``instrument setup'', which was unique to a particular position on the sky. Along the same lines, dispersion relations and wavelength zero points will become obsolete as soon as the micro-mirrors are reorganized.
The predictive calibration methods currently being developed for HST (Rosa 1994) and ESO VLT (Rosa 1995) instrumentation are, however, ideally suited to this situation. Their kernels are instrument software models based on first principles (e.g., grating equations). Configuration data, usually measurable engineering quantities, are regularly verified against dedicated calibration exposures for a few selected instrument configurations. Such models demonstrably permit very accurate predictions of dispersion relations and sensitivity curves for modes not actually covered by calibration data (Rosa 1997; Ballester & Rosa 1997).
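As a minimal example of such a first-principles kernel, the sketch below predicts detector positions from the grating equation m·λ = d·(sin α + sin β) plus a simple linear camera mapping. All engineering values are illustrative, and a real instrument model would add distortions and higher-order terms.

    import numpy as np

    def predicted_position(wavelength, order=1, grooves_per_mm=600.0,
                           alpha_deg=10.0, focal_length_mm=200.0,
                           pixel_size_mm=0.018):
        """Detector x-position (pixels) of a wavelength (metres) for a
        simple grating spectrograph: grating equation plus a linear
        camera mapping relative to a 500 nm reference wavelength."""
        d = 1e-3 / grooves_per_mm                      # groove spacing [m]
        alpha = np.radians(alpha_deg)
        beta = np.arcsin(order * wavelength / d - np.sin(alpha))
        beta_ref = np.arcsin(order * 5e-7 / d - np.sin(alpha))
        return focal_length_mm * np.tan(beta - beta_ref) / pixel_size_mm

    # Predicted dispersion relation on a grid of wavelengths.
    wavelengths = np.linspace(4e-7, 7e-7, 7)
    pixels = predicted_position(wavelengths)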
Once ``calibrated'' on empirical calibration data, the software models can be made integral parts of the data calibration and data analysis pipelines in two ways: as predictive calibration, where the models generate the calibration reference data (e.g., dispersion relations and sensitivity curves) for any requested configuration; and as forward analysis, where model-generated data are fitted directly to the raw observations.
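The forward-analysis idea can be sketched as follows: an instrument-plus-source model with free configuration parameters is fitted directly to the raw data, rather than the data being ``calibrated away'' first. The Gaussian line, the linear dispersion, and all parameter values below are stand-ins, not a real NGST or STIS model.

    import numpy as np
    from scipy.optimize import least_squares

    def model(params, pix):
        """Raw-frame model: a Gaussian emission line seen through a linear
        dispersion relation whose zero point is a free parameter."""
        zero_point, amp, center_wl, sigma = params
        wl = zero_point + 0.1 * pix          # assumed 0.1 nm/pixel dispersion
        return amp * np.exp(-0.5 * ((wl - center_wl) / sigma) ** 2)

    def fit_forward(pix, data, guess):
        """Adjust model parameters so the predicted raw frame matches the
        observed one (residuals in data space, not calibrated space)."""
        return least_squares(lambda p: model(p, pix) - data, guess).x

    # Simulated "observation" and a recovery of its parameters.
    pix = np.arange(200, dtype=float)
    truth = (500.0, 10.0, 508.0, 0.5)
    data = model(truth, pix) + np.random.default_rng(1).normal(0, 0.1, pix.size)
    best = fit_forward(pix, data, guess=(499.5, 8.0, 507.5, 0.8))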
Predictive calibration and forward analysis are currently being explored for HST's STIS. Using these methods, NGST operations in multi-object spectroscopy mode will need to allocate only very small amounts of time for specific calibrations. Only wavelength zero-point shifts need to be verified observationally for a given instrumental setup. Aperture-specific sensitivity curves require only infrequent checks of the configuration data, obtained from standard stars for a suitable subset of the micro-mirrors.
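The observational verification of a zero-point shift can be done, for instance, by cross-correlating a model-predicted calibration spectrum against the observed one. The sketch below locates the cross-correlation peak with parabolic sub-pixel refinement; this is one standard technique, not the specific procedure foreseen for NGST.

    import numpy as np

    def zero_point_shift(reference, observed):
        """Estimate the zero-point shift (in pixels) of `observed` with
        respect to `reference`; both are 1-D, equally sampled spectra."""
        ref = reference - reference.mean()
        obs = observed - observed.mean()
        corr = np.correlate(obs, ref, mode='full')
        k = corr.argmax()
        # Parabolic interpolation around the peak for sub-pixel precision.
        if 0 < k < corr.size - 1:
            y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
            k = k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
        return k - (reference.size - 1)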
Ballester, P., & Rosa, M.R., 1997, A&AS, in press (ESO preprint No. 1220)
Freudling, W., 1997, ST-ECF Newsletter, 24, 7
Freudling, W., & Thomas, R., 1997, http://ecf.hq.eso.org/nicmos/calnicc/calnicc.html
Hook, R.N., & Lucy, L.B., 1993, ST-ECF Newsletter, 19, 6
Hook, R.N., & Lucy, L.B., 1994, in ``The Restoration of HST Images and Spectra'', eds. R.J. Hanisch & R.L. White (Baltimore: STScI), 86
Krist, J.E., & Hook, R.N., 1997, ``The Tiny Tim Users Manual'' (Baltimore: STScI)
Lucy, L.B., 1994, in ``The Restoration of HST Images and Spectra'', eds. R.J. Hanisch & R.L. White (Baltimore: STScI), 79
Rosa, M.R., 1994, CAL/FOS-127 (Baltimore: STScI)
Rosa, M.R., 1995, in ``Calibrating and Understanding HST and ESO Instruments'', ed. P. Benvenuti (Garching: ESO), 43
Rosa, M.R., 1997, ST-ECF Newsletter, 24, 14