The 2008 Project
ISOCAM Parallel Point Source Catalogue
Tutor: Stephan Ott
The ISOCAM Parallel Point Source Catalogue was compiled from observations by ISO (the Infrared Space Observatory) and consists of 16640 sources observed in the infrared at 6.7 microns.
My work consisted of comparing this catalogue with other point source catalogues at different wavelengths. The first step was to compare it with 2MASS (J, H and K bands), from which the flux at 6.7 microns could be extrapolated using the Kurucz model. I then compared the catalogue with the fluxes obtained through the Spitzer Flux Estimator for Stellar Point Sources and, as a final step, with the actual Spitzer fluxes at 5.6 microns and 8.0 microns.
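Catalogue comparisons of this kind start with a positional cross-match. A minimal sketch of such a match is given below; the function names, the catalogue layout as (RA, Dec) tuples in degrees, and the 1-arcsecond match radius are illustrative assumptions, not details taken from the project:

```python
import math

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees between two sky positions given in
    degrees (haversine formula, numerically stable for small separations)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((d2 - d1) / 2) ** 2
         + math.cos(d1) * math.cos(d2) * math.sin((r2 - r1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cross_match(cat_a, cat_b, radius_arcsec=1.0):
    """Match each (ra, dec) source in cat_a to its nearest neighbour in
    cat_b within radius_arcsec; returns a list of (index_a, index_b) pairs.
    Brute-force search: fine for a sketch, too slow for large catalogues."""
    matches = []
    for i, (ra_a, dec_a) in enumerate(cat_a):
        best_j, best_sep = None, radius_arcsec / 3600.0
        for j, (ra_b, dec_b) in enumerate(cat_b):
            sep = angular_sep_deg(ra_a, dec_a, ra_b, dec_b)
            if sep <= best_sep:
                best_j, best_sep = j, sep
        if best_j is not None:
            matches.append((i, best_j))
    return matches
```

A real pipeline would replace the double loop with a spatial index, but the matching criterion stays the same.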
Duilio Farina
Università degli Studi di Torino
Università degli Studi di Lecce
Study of the RGS spectrum of NGC 4051 observed by XMM-Newton
Tutors: Matteo Guainazzi & Achille Nucita
According to the well-established standard model for AGNs, the main properties of the energy output from the central part of an active galaxy are driven by accretion onto a supermassive black hole. The observed radiation is influenced by the regions surrounding the black hole, i.e. the geometry of the accretion flow, the density of matter, and the presence of gas clouds, an accretion torus, a halo and an accretion disk. The line of sight plays an important role because, depending on it, different signatures may be seen. By comparing the analysis of observed spectra with theoretical CLOUDY models, the overall scenario can be understood. In our work, we reduced XMM-Newton RGS data of NGC 4051 taken when it was in an unusually low flux state (November 2002) and used the XSPEC package to fit them. From the XMM-Newton observation (EPIC and OM data) of the high state (May 2001), several observables can be obtained. All this information enters the CLOUDY code, producing grids of models depending on U, n_e and n_H. The resulting grid and a comparison with the observed lines allow a fit to the data (a single gas component was used as a first approximation). We searched for the model that best fits our data and compared the observed lines with those expected from CLOUDY. A single ionization component fits the data quite well. The next step is a fit of the overall predicted model to the data. From the best-fit parameters, we can obtain information about both the size and the location of the emitting clouds.
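The grid-comparison step described above amounts to a chi-square search over the (U, n_e, n_H) model grid. A minimal sketch, assuming a hypothetical data layout in which observed and predicted line fluxes are dictionaries keyed by line name (the line names and grid values below are illustrative, not taken from the analysis):

```python
def best_grid_model(observed, errors, model_grid):
    """Pick the grid point whose predicted line fluxes best match the data.

    observed, errors: dicts mapping line name -> measured flux and 1-sigma
    error.  model_grid: dict mapping a (U, n_e, n_H) tuple -> dict of
    predicted line fluxes.  Returns (best_params, best_chi2)."""
    best_params, best_chi2 = None, float("inf")
    for params, predicted in model_grid.items():
        # Standard chi-square over all observed lines
        chi2 = sum(((observed[line] - predicted[line]) / errors[line]) ** 2
                   for line in observed)
        if chi2 < best_chi2:
            best_params, best_chi2 = params, chi2
    return best_params, best_chi2
```

The real fit would interpolate between grid points and propagate uncertainties, but the selection criterion is the same.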
Study of Root Finding Algorithms for Science Planning
Tutors: Miguel Almeida & David Frew
In planetary science planning, the geometry of the observations plays a key role. In most cases, these geometries cannot be described using analytical functions. Instead, a set of pre-simulated data is used to identify when specific conditions occur. In order to allocate the pointing resources needed to fulfil the observation requests, the required geometries are translated into geometrical events that can later be used to drive the science operations. Each of these events, along with the data required to compute it, can be considered a numerical function that is not known a priori, and therefore an event-finding application is required. This application uses different root-finding algorithms, with different computational costs and accuracies. In order to optimize the computational cost, a root quality estimator is required. This tool would provide not only the percentage of roots that each method finds, but also the computational cost of finding them.
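The scan-then-refine approach described above can be sketched as follows; bisection is used here as the simplest of the candidate root-finding algorithms, and the scan step and tolerance are illustrative assumptions:

```python
def find_events(f, t0, t1, step, tol=1e-9):
    """Scan f on [t0, t1] at the given step; wherever f changes sign,
    refine the bracketed root by bisection to tolerance tol.
    Returns the list of event times found."""
    roots = []
    n = int((t1 - t0) / step)
    ts = [t0 + i * step for i in range(n + 1)]
    if ts[-1] < t1:
        ts.append(t1)  # make sure the interval end is sampled
    for a, b in zip(ts, ts[1:]):
        fa, fb = f(a), f(b)
        if fa * fb > 0.0:
            continue  # no sign change in this scan interval
        # Bisection: halve the bracket until it is narrower than tol
        while b - a > tol:
            m = 0.5 * (a + b)
            fm = f(m)
            if fa * fm <= 0.0:
                b = m
            else:
                a, fa = m, fm
        roots.append(0.5 * (a + b))
    return roots
```

Bisection is robust but slow (one function call per halving); a quality estimator of the kind described would compare it against faster methods such as secant or Brent iteration on the same brackets.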
Victor Bayona Revilla
Universidad Autónoma de Madrid
Technical Educational Institute of Crete
Applying web technologies and GRID architectures to XMM-Newton scientific data reduction software
Tutors: Carlos Gabriel & Aitor Ibarra
The Scientific Analysis System (SAS) is the XMM-Newton data reduction software, able to convert raw data into scientifically useful products such as calibrated event lists, images, spectra and source lists. Mainly coded in C++ and Fortran 90, SAS is a complex software application dealing with data from diverse X-ray and optical/UV instruments, designed to run on a standalone computer, and heavily dependent on third-party software and libraries. It is used more or less regularly by almost 2000 astronomers and has supported the scientific analysis behind the vast majority (if not all) of the 1600 refereed scientific publications based on XMM-Newton data so far.
The Remote Interface for Science Analysis (RISA) is an interface (still under development) to all SAS functionalities, through a Client/Server application, running SAS workflows in a Grid Architecture.
The project I am working on consists of the design and implementation of new functionalities within the RISA application. For this purpose, new technologies (GRID technology, the Virtual Observatory…) are needed to produce a reliable and useful astrophysical application, pioneering in several aspects.
X-ray binary light-curve analysis in optical and X-ray
Tutors: Maria Diaz & Arvind Parmar
A low-mass X-ray binary is a system in which either a neutron star or a black hole accretes from a low-mass companion star via an accretion disk. About 10 of these systems show dips in their light curves, recurring at the orbital period of the system, and share the property of being observed at high inclination. The observation angle makes the dippers particularly interesting since a number of characteristics (e.g. complex spectral changes during the dips or the occasional disappearance of dips) are observed which are not easily explained. The aim of this project was to analyse the simultaneous optical and X-ray data sets available for all the dipping sources observed with XMM-Newton. Simultaneous optical and X-ray data are rare, and thus these data sets provide a unique opportunity to understand the complex relations between the dips, the variability of the source and the reprocessing of emission from X-rays to the optical.
Alex Kolodzig
Universidad Complutense de Madrid
University of Leicester
XMM-Newton EPIC contamination monitoring
Tutors: Martin Stuhlinger & Matteo Guainazzi
The XMM-Newton Science Operation Center (XMM-SOC) at ESAC currently operates three scientific instruments on board XMM-Newton. Starting from the reception of the observer's proposal to the final delivery of calibrated scientific products to the observer (and, in general, to the astronomical community), the activities of the SOC comprise all the steps necessary to ensure the high quality and reliability of the scientific data. This includes day-to-day operations, calibration of the instruments, user support and scientific research. One of the main instruments on board XMM-Newton is the European Photon Imaging Camera (EPIC). EPIC consists of two MOS and one pn-CCD camera. Both camera systems operate in the energy range 0.2-15 keV and provide spatial, energy and timing information on the detected X-ray photons. The European EPIC Consortium carries out the calibration of the EPIC cameras. The EPIC Instrument Dedicated Team (EPIC-IDT) at ESAC, coordinated by the EPIC Calibration Scientist, participates in the calibration efforts and transfers all important calibration information into Current Calibration Files (CCF) and/or software products for the Scientific Analysis System (SAS), the major scientific analysis software package needed by observers for the optimum scientific exploitation of the XMM-Newton observations performed. In order to track possible contamination of the cameras, a tool was developed to monitor upper limits on contamination using the isolated neutron star RX J1856-3754.
Design and implementation of an S/C attitude checker for planetary missions
Tutors: Jorge Diaz del Rio & David Frew
Attitude data are fundamental for scientific data analysis. Currently, Flight Dynamics only provides the predicted attitude for all phases of a planetary mission. In-flight experience has shown that this prediction is, in most cases, accurate enough for analysis purposes, but deviations from the measured attitude have sometimes been observed. In these cases, a reconstruction has been provided. Attitude reconstruction is not a standardized practice and requires the identification of those periods where it is needed, implying additional work for Flight Dynamics. The Science Ground Segments (SGSs), with the support of the instrument teams, should guarantee that the requirements on pointing accuracy and stability are met for all scientific and calibration observations. In this coordinating and supervising role, the SGSs should detect when these deviations occur and request the reconstruction from the Flight Dynamics team on behalf of the scientific teams. This project involved the design and implementation of a prototype to validate the S/C attitude, based on the predictions provided by Flight Dynamics and the measured attitude obtained from telemetry. In addition, the system will use the instrument timelines to identify the periods when the instruments were operating and compare them to the observed deviations.
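The core of such a checker is the rotation angle between the predicted and measured attitude at each telemetry sample. A minimal sketch, assuming scalar-last (x, y, z, w) unit quaternions and an illustrative threshold; neither convention nor threshold is specified in the project:

```python
import math

def attitude_deviation_deg(q_pred, q_meas):
    """Rotation angle in degrees between two attitude quaternions given as
    (x, y, z, w) tuples.  q and -q describe the same attitude, hence the
    absolute value of the dot product; min() guards against rounding > 1."""
    dot = abs(sum(p * m for p, m in zip(q_pred, q_meas)))
    return math.degrees(2.0 * math.acos(min(1.0, dot)))

def flag_deviations(pred_series, meas_series, threshold_deg=0.01):
    """Return the sample indices where predicted and measured attitude
    differ by more than threshold_deg (hypothetical per-sample comparison)."""
    return [i for i, (qp, qm) in enumerate(zip(pred_series, meas_series))
            if attitude_deviation_deg(qp, qm) > threshold_deg]
```

The flagged intervals would then be intersected with the instrument timelines to decide whether a reconstruction request is justified.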
Juan Gonzalez
Universidad de Vigo
The speed of the hurricanes in hot stars
Tutor: Andy Pollock
The biggest and hottest stars in the Galaxy have very fast winds flying from their surfaces in all directions. The winds produce broad X-ray lines that can be used to measure how fast the winds are blowing, by measuring the widths of the lines and using the Doppler effect. The winds are very fast, with speeds sometimes exceeding 2000 km/s. This project involved the analysis of several stars with high-resolution X-ray data, whose lines were measured to find out how fast these very strong winds are blowing.
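The width-to-speed conversion uses the classical Doppler relation v = c·Δλ/λ. A minimal sketch; the example line (O VIII Lyα near 18.97 Å) and the width value are illustrative, not measurements from the project:

```python
C_KM_S = 299792.458  # speed of light in km/s

def wind_speed_km_s(rest_wavelength_A, line_width_A):
    """Wind speed implied by a Doppler line width: v = c * (dlambda / lambda).
    Both wavelengths in Angstroms; result in km/s."""
    return C_KM_S * line_width_A / rest_wavelength_A

# Illustrative example: a width of 0.127 A on a line at 18.97 A
# corresponds to a wind speed of roughly 2000 km/s.
```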
Unveiling obscured AGNs in close pairs of similar sized galaxies
Tutor: Nora Loiseau
The main objective of this analysis is the study of close pairs of similarly sized interacting galaxies, deriving their physical properties from X-ray images and spectra in order to interpret a possible relationship between the interaction and the existence of active nuclei in these structures, or rather to confirm that the activation distance (between about 10 and 100 kpc) is instrumental in the activation of quiescent black holes during this intermediate stage of the merging evolution. The confirmation of such an activation distance is also relevant for categorizing binary quasars at higher redshifts that could otherwise be misinterpreted as gravitational lenses. In this way, we would obtain a method to discover binary AGN and to characterize the activity of the galaxies according to the pair separation, their morphology, and other parameters that indicate the stage of the merging process.
Domingo Valero
Universidad Complutense de Madrid
Universidad Complutense de Madrid
Reddening determination from ultraviolet spectra
Tutors: Carmen Morales, Angelo Cassatella & Enrique Solano
Probably the most accurate method of determining the reddening correction is to use the interstellar dust feature around 2200 Angstroms. However, the accuracy of the colour excess E(B-V) obtained from the 2200 A bump with traditional methods is usually no better than ±0.05 dex. Such an uncertainty implies a large uncertainty on the de-reddened fluxes, of about 51 percent at 1300 A and 18 percent at 3000 A. Last year the group developed a method of reddening correction that does not require prior knowledge of the spectral type, and applied it to 317 UV spectra from INES, the IUE low-resolution archive. The aim of this project was to enlarge the sample as much as possible with INES data and to analyse the reasons why some stars deviate from the method (spectral peculiarities, binarity, directions with an anomalous extinction law, etc.). The method will be included in the Virtual Observatory for use on any other UV spectra, as will the stars analysed and corrected for interstellar reddening.
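The quoted flux uncertainties follow from propagating the E(B-V) error through an extinction law of the form A(λ) = k(λ)·E(B-V): the de-reddening factor is 10^(0.4·A(λ)), so an error ΔE(B-V) scales the flux by 10^(0.4·k(λ)·ΔE(B-V)). A minimal sketch; the k(λ) values below are back-solved from the quoted 51 and 18 percent figures for illustration, not taken from any particular extinction curve:

```python
def flux_uncertainty(k_lambda, delta_ebv=0.05):
    """Fractional uncertainty on the de-reddened flux implied by an error
    delta_ebv on E(B-V), for an extinction law A(lambda) = k_lambda * E(B-V).
    Returns 10**(0.4 * k * dE) - 1, i.e. the one-sided flux error."""
    return 10.0 ** (0.4 * k_lambda * delta_ebv) - 1.0

# Back-solved illustrative values of k = A(lambda)/E(B-V):
# k ~ 9.0 at 1300 A gives ~51%, and k ~ 3.6 at 3000 A gives ~18%.
```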
INTEGRAL analysis of Galactic Centre faint sources compared to the famous microquasar Cygnus X-1
Tutors: Marion Cadolle Bel & Peter Kretschmar
This project involved the systematic analysis of faint Galactic Centre sources and of Cygnus X-1 data. When some of the faint sources entered outburst, light curves and spectra were extracted and modelled, analysing their evolution with time and comparing them with results from the archetypal source Cygnus X-1. The skills developed were in the INTEGRAL data analysis tools for two detectors (IBIS and JEM-X), as well as in IDL and other software packages commonly used in high-energy astronomy.
Bjoern Lehnert
Universidad Autónoma de Madrid
Universidad Complutense de Madrid
Molecular gas associated with Galactic LBV stars
Tutor: Ricardo Rizzo
The molecular gas associated with LBV stars is an observational field of increasing interest. Recent results show that we can learn about some aspects of this brief evolutionary stage by observing low- and mid-J mm and submm lines of both simple and rather complex molecules. Based on already acquired data, this project involved the reduction and further analysis of the data. The study also investigated new observational tasks, both in the mid-IR and submm ranges.
Preparation of AGIS validation with Nano-Jasmine data
Tutor: Uwe Lammers
The ESAC Gaia team is implementing the Astrometric Global Iterative Solution (AGIS) in collaboration with other European groups that are part of the Gaia Data Processing and Analysis Consortium (DPAC). AGIS is the system that shall generate the core astrometric mission products of Gaia from the ~100 TB of raw observational data collected over the 5-6 years of operations. Until the launch of Gaia in late 2011, AGIS is being scientifically and technically validated through large-scale testing campaigns that use simulated data. With a foreseen launch in 2009, the Japanese astrometry mission Nano-Jasmine is a precursor to Gaia (~2011) and Jasmine (~2014). The small (40x40x40 cm) satellite shall provide astrometry at the level of Hipparcos in terms of accuracy (milliarcseconds) and total number of stars (a few hundred thousand). Nano-Jasmine's payload, optical design and observing strategies are similar to Gaia's. This project involved investigating the possibility of using AGIS for the reduction of Nano-Jasmine data and thus of validating AGIS with real flight data before Gaia is launched. The necessary changes to AGIS were assessed.
Michelle Picardo
Trinity College Dublin