Abel de Burgos
Complutense University of Madrid
Preparation of Science Cases for the CESAR Education Project
Tutors: Michel Breitfellner, Manuel Castillo and Javier Ventura
The aim of my traineeship was to develop several astrophysics science cases built around the optical and solar observatories operated by the CESAR project. I first developed a set of science cases for the CESAR ESAC Solar Observatory to study, among other features, the Sun's chromosphere and photosphere and its differential rotation. I also developed a case that combines planetary observations with different planetary image acquisition procedures to obtain scientific results at any level of study. For example, using the CESAR Robledo Observatory (optical), students can calculate the mass of Jupiter, or the speed of light from the eclipses of Jupiter's moons as Ole Rømer did back in 1676. Finally, I developed a science case on exoplanets for the optical telescope, which lets students detect planets outside our Solar System.
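The Rømer-style speed-of-light estimate can be sketched in a few lines. This is a toy calculation with an assumed, illustrative delay value, not the CESAR lesson material: eclipses of Jupiter's moon Io appear delayed when Earth is on the far side of its orbit, and that accumulated delay equals the light travel time across the orbital diameter.

```python
# Toy Roemer-style estimate (the ~16.7-minute delay is an assumed,
# illustrative value; students would measure it from eclipse timings).
AU_KM = 1.496e8                  # astronomical unit in km
observed_delay_s = 16.7 * 60     # delay across Earth's orbit, in seconds

orbital_diameter_km = 2 * AU_KM
speed_of_light_kms = orbital_diameter_km / observed_delay_s
print(f"c is roughly {speed_of_light_kms:.0f} km/s")
```

With real student data, the delay would be measured from the drift of predicted versus observed eclipse times over about half a year.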
Carlos III University of Madrid
Photometric redshifts within crowded galaxy cluster fields
Tutors: Bruno Altieri, Tim Rawle, Ivan Valtchanov and Herve Bouy
Modern distance measurements of deep-universe galaxies rely on redshift. However, spectroscopy demands substantial resources for each sky source, which motivates the use of photometric redshift estimation methods. This project consisted of applying such methods to massive galaxy clusters and contributing to the development of new ones. The galaxies studied came from two different populations: lensed galaxies and cluster galaxies. All clusters and galaxies involved in the project were previously included in the Herschel Lensing Survey, an international imaging survey of massive galaxy clusters in the far-infrared and sub-millimetre based on data from ESA's Herschel Space Observatory. Three redshift estimation methods were used: two first-generation methods based on template fitting to spectral energy distributions (SEDs), and one from the newer family of machine learning techniques. Finally, after an evaluation process, a new software tool with a graphical user interface was implemented.
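As an illustration of the first-generation approach, here is a minimal template-fitting sketch. It uses a toy single-bump SED and synthetic, noiseless photometry, not the actual survey pipeline: the rest-frame template is redshifted over a grid and the trial redshift minimising chi-square is kept.

```python
import numpy as np

def template_flux(wavelength_um):
    """Toy rest-frame SED: a single emission bump (purely illustrative)."""
    return np.exp(-0.5 * ((wavelength_um - 0.4) / 0.05) ** 2)

bands = np.array([0.45, 0.55, 0.65, 0.80])      # band effective wavelengths (um)
true_z = 0.5
observed = template_flux(bands / (1 + true_z))  # mock photometry of the source
errors = np.full_like(observed, 0.01)           # assumed photometric errors

# Scan a redshift grid and keep the chi-square minimum.
z_grid = np.linspace(0.0, 2.0, 401)
chi2 = [np.sum(((observed - template_flux(bands / (1 + z))) / errors) ** 2)
        for z in z_grid]
z_phot = z_grid[int(np.argmin(chi2))]
print(f"photometric redshift estimate: {z_phot:.2f}")
```

Real codes fit libraries of galaxy templates in many bands and return a full redshift probability distribution rather than a single minimum.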
Complutense University of Madrid
Search for bright nearby M dwarfs with Virtual Observatory tools
Tutors: Enrique Solano, José Antonio Caballero, Francisco Jiménez and Rosario Lorente
M dwarfs are the most common stars in the solar neighbourhood (66% of the stars at d < 10 pc). In particular, they are of interest for the exoplanet community looking for Earth-like planets in habitable zones. In this work I used Virtual Observatory tools to cross-match the Carlsberg Meridian 15 and the 2MASS Point Source catalogues to select candidate M dwarfs. The search was limited to bright targets to ease the spectroscopic follow-up with current and future instrumentation (e.g. CARMENES). VO tools were also used to derive effective temperatures and surface gravities from spectral energy distribution fitting and proper motions from different catalogues.
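The core of such a catalogue cross-match can be sketched in plain Python/NumPy. This uses toy coordinates and a flat-sky approximation, whereas real VO tools use proper spherical geometry: for each source in catalogue A, find the nearest source in catalogue B and keep pairs closer than a matching radius.

```python
import numpy as np

def cross_match(ra_a, dec_a, ra_b, dec_b, radius_arcsec):
    """Nearest-neighbour match of catalogue A against catalogue B."""
    ra_b, dec_b = np.asarray(ra_b), np.asarray(dec_b)
    matches = []
    for i, (ra, dec) in enumerate(zip(ra_a, dec_a)):
        d_ra = (ra_b - ra) * np.cos(np.radians(dec))  # flat-sky approximation
        d_dec = dec_b - dec
        sep = np.hypot(d_ra, d_dec) * 3600.0          # degrees -> arcsec
        j = int(np.argmin(sep))
        if sep[j] <= radius_arcsec:
            matches.append((i, j, float(sep[j])))
    return matches

# Toy catalogues: two genuine counterparts and one unmatched source.
ra_cmc, dec_cmc = [10.000, 10.010, 10.500], [20.000, 20.020, 21.000]
ra_2m, dec_2m = [10.0001, 10.0101], [20.0001, 20.0199]
pairs = cross_match(ra_cmc, dec_cmc, ra_2m, dec_2m, radius_arcsec=2.0)
print(pairs)
```

The candidate M dwarfs would then be the matched pairs whose optical-to-infrared colours fall in the expected range for late spectral types.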
Polytechnic University of Madrid
"Where on Mars will the ExoMars Rover land?" - An interactive web map visualisation of the ExoMars Rover landing sites
Tutors: Nicolas Manaud and David Heather
The ExoMars 2018 mission will deliver a European rover and a Russian surface platform to the surface of Mars. Armed with a drill that can bore 2 metres into rock, the ExoMars rover will travel across the Martian surface to search for signs of life, past or present. But where to land? - The surface area of Mars is approximately 145 million square kilometres, almost the same area as the Earth's land masses. Imagine having to choose just one spot to land on and call home. Selecting the right place could mean the difference between meeting the scientific goals and failure.
The search for a suitable ExoMars rover landing site began in December 2013, when the planetary science community was asked to propose candidates. Eight proposals were considered during a workshop held by the ExoMars Landing Site Selection Working Group (LSSWG). By the end of the workshop, there were four clear front-runners. Following additional review, the four sites have now been formally recommended for further detailed analysis: Mawrth Vallis, Oxia Planum, Hypanis Vallis and Aram Dorsum. Scientists will continue working on the characterisation of these four sites until they provide their final recommendation in October 2017.
My project aimed to draw the attention of the general public to, and increase their interest in, the scientific and robotic exploration of Mars in Europe. The goal was to design and build an interactive web map visualisation of the four recommended ExoMars landing sites. The visualisation was based on ESA and NASA planetary imagery data and additional geospatial information used by the Landing Site Selection Working Group. It was designed to be engaging for a non-expert public and to facilitate the understanding of a few key concepts behind the selection of the landing sites, including scientific and engineering constraints.
I worked with the support of ESA's Planetary Science Archive (PSA) team at ESAC and the CartoDB team based in Madrid and New York, and was also in contact with the ExoMars Landing Site Selection Working Group. My work mainly consisted of designing and prototyping the front-end web interface of the interactive map visualisation. It also included supporting the identification and preparation of all the geospatial and mapping data products needed for ingestion into the system back-end.
Ezquiel Lopez Lopez
University of Granada
Implementation of a data mining tool on the Cluster Science Archive and its application to the study of the Auroral Acceleration Region
Tutors: Arnaud Masson, Philippe Escoubet, Harri Laakso
A beta version of a data mining tool was developed in the framework of the Cluster Active Archive (CAA). This tool, a set of IDL routines, was coded and tested by the CAA technical manager, Dr. Perry (RAL, UK). However, it was never put online on the main CAA website, where more than 1700 users are registered. Public access to the CAA website was closed by the end of October 2014.
However, interest in a (simplified) version of this tool remains high in the community. Hence, the goal of my project was to adapt a simplified version of the tool to the framework of the Cluster Science Archive (CSA). This meant porting a subset of the IDL routines, chosen before the start of the project, and developing online documentation and FAQs to ease the use of the tool.
The second part of the project was to use this tool. The scientific focus was on the 2009-2012 auroral acceleration data campaign of Cluster: mining it for a list of events where density cavities were observed on multiple spacecraft together with parallel electron beams propagating along the Earth's magnetic field lines. Once this was done, statistics were compiled to better understand the topology of these density cavities, which remains largely unknown.
University of Stuttgart
Estimating the flux of micro-meteoroids in the Solar System using the IMEM tool
Tutors: Nico Altobelli
The ESA Interplanetary Meteoroid Model (IMEM) computes the micro-meteoroid (dust) environment that a spacecraft would encounter on a given interplanetary trajectory.
This model is fed by data from various past missions which carried in-situ micro-meteoroid detectors to different locations in the Solar System, as well as by modelling the dynamical evolution of dust grains released by known sources, such as comets or asteroid collisions.
The reliability of this model depends on the knowledge of the micro-meteoroid sources and on the available data. In the outer Solar System, at and beyond Jupiter's orbit, the dust populations are believed to be significantly different from those in the inner Solar System, and the data describing them are scarce. As new in-situ data have been acquired in the outer Solar System by the Cassini-CDA dust detector, a comparison between the model predictions and these measurements is necessary.
My initial task was to familiarize myself with the IMEM tool, not only with the graphical user interface but also with the command line interface, which allows the tool to be run from higher-level programs. My job was then to produce runs of the IMEM model for a given spacecraft trajectory and different locations in the Solar System, in particular Jupiter and Saturn, predicting the flux and size distribution of micro-meteoroids. I also participated in the processing of the Cassini-CDA data. Additionally, I worked on the visualization of spacecraft trajectories using SPICE and Celestia for outreach purposes.
University of Turku
X-ray bursts and the connection to burst onset conditions
Tutors: Jari Kajava, Celia Sanchez-Fernandez and Erik Kuulkers
More than 100 binary systems comprising a neutron star and a low-mass companion star are known to produce type I X-ray bursts. These are bright X-ray flashes produced by the thermonuclear burning of helium and/or hydrogen into heavier elements deep within the atmospheres of the neutron stars. Observationally, there is a large diversity of X-ray burst properties among the roughly 100 known burst sources. The physical reasons behind this diversity are still largely unknown, which is why two large databases of X-ray bursts were built, using data from the ESA INTEGRAL mission as well as the NASA RXTE mission. My job was to perform data mining and other analysis tasks on these data, essentially acting as a young Sherlock Holmes, with the aim of detecting dependencies between the various physical parameters that can be extracted from the X-ray burst data, the conditions that prevailed before the onset of the bursts, and other burst-independent parameters (such as the binary orbital period, spin period, neutron star magnetic field strength, chemical composition of the companion star, etc.).
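The kind of dependency search described here can be illustrated with a toy example, using entirely synthetic values rather than actual burst measurements: correlate two extracted burst parameters and quantify the strength of the relation.

```python
import numpy as np

# Synthetic "database" of 50 bursts with a built-in anti-correlation:
# lower pre-burst accretion level -> longer burst (toy model only).
rng = np.random.default_rng(42)
persistent_flux = rng.uniform(0.1, 1.0, 50)
burst_duration = 30.0 / persistent_flux + rng.normal(0.0, 2.0, 50)

# Pearson correlation coefficient between the two burst parameters.
r = np.corrcoef(persistent_flux, burst_duration)[0, 1]
print(f"Pearson r = {r:.2f}")
```

On real data, rank-based statistics (e.g. Spearman) are often preferred, since physical relations between burst parameters need not be linear.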
Ramon Arquimedes Gomez Moya
ENSEEIHT INP Toulouse
Visualisation of data from Gaia's astrometric core solution
Tutors: Jose Hernandez and Uwe Lammers
Gaia's main result will be a highly accurate and complete catalogue of no fewer than one billion stars, generated with the Astrometric Global Iterative Solution (AGIS) system running at ESAC. The detailed output of AGIS consists of numerous scalar quantities computed for every object. As part of the catalogue validation, the correctness and consistency of these data must be checked, for which the creation of statistical plots (e.g. histograms and scatter plots) is an indispensable element. From the range of available astronomical open-source data analysis/plotting frameworks, two were inspected for their suitability in this regard: Glue and TOPCAT.
My first central task was to have Glue/TOPCAT read AGIS output data as seamlessly as possible. After this was achieved, I generated a number of standard plots with the option to produce publication-quality output. I also investigated how the performance scales with input data volume.
The outcome was the demonstrated feasibility, documented in a report, of generating statistical plots from AGIS data with Glue/TOPCAT.
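A minimal sketch of such a validation histogram, with synthetic data standing in for an AGIS scalar column (the actual checks were done interactively in Glue/TOPCAT):

```python
import numpy as np

# Synthetic stand-in for one AGIS scalar quantity per object: a toy
# parallax column for 100,000 sources (not real Gaia data).
rng = np.random.default_rng(0)
parallax_mas = rng.normal(2.0, 0.5, 100_000)

# Histogram the column and locate the mode; validation plots of this
# kind reveal outliers, skew, or unexpected secondary peaks.
counts, edges = np.histogram(parallax_mas, bins=50)
peak_bin = int(np.argmax(counts))
peak_center = 0.5 * (edges[peak_bin] + edges[peak_bin + 1])
print(f"mode of distribution: {peak_center:.2f} mas")
```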
University of Pisa
Mobile application development for visualization of astronomical data
Tutors: Jonathan Cook, Jorgo Bakker, Álvar García, Bruno Merin
As part of ongoing outreach projects at ESAC, we have compiled and curated a collection of multi-wavelength images of iconic sky regions, all registered to show exactly the same part of the sky as observed by different space telescopes. This collection has enormous value for outreach to the general public, but was only available through a web interface.
My project consisted of the development of a simple app for iOS and Android that allows users to compare these images in real time by moving their mobile devices. An example of such an app, already developed for other maps of astronomical data, is http://mqq.io/cmbmaps/.
I was also asked to contribute to the development of the published Herschel QuickLook App, bringing it in line with a new look and feel and fixing a few known bugs.
I worked together with a team of three engineers and a scientist at ESAC, and the outcome of my work was a new app, published in the App Store and on Google Play.
Masaryk University, Brno
X-ray observations of the active galaxy NGC 985 with XMM-Newton
Tutors: Jacobo Ebrero and Maria Santos-Lleo
Active galactic nuclei (AGN) are powered by accretion of matter onto the supermassive black hole that resides in their centre. Their large bolometric output is powerful enough to drive ionized winds into the interstellar medium of the host galaxy and even into the intergalactic medium. The enormous amounts of matter and energy thus injected may affect the evolution of the central black hole and the galaxy in a process called cosmic feedback. These ionized outflows, also known as warm absorbers (WA), are detected through the absorption lines of ionized species, typically blueshifted with respect to the rest frame of the source, in the X-ray/UV continuum of the AGN. In spite of their potential importance, the origin, location, and impact of these winds are still a subject of debate and a hot topic in modern extragalactic astrophysics.
The project I was involved in studied X-ray data of NGC 985, one of the brightest AGN in the sky, obtained with XMM-Newton. X-rays provide a unique view of the violent processes that take place in the innermost regions of AGN and therefore they are essential to understand their physics. NGC 985 is known to host a multi-component WA with distinct ionization phases. The goal of this project was to reduce and make a preliminary analysis of the high-resolution X-ray spectrum of NGC 985 (proposal approved by the OTAC in AO13 and to be observed in January 2015), in order to obtain a global picture of the physical properties of the WA. Later the project was also extended to look for variability in the WA parameters in response to variations in the ionizing flux to constrain the location of the absorbing gas, something that has been achieved only in a handful of cases so far.
The project helped me to gain experience in scientific research and X-ray data analysis and the understanding of the physical processes in an AGN at high energies.
University of Zaragoza
e[PY]s: a python library for planetary science planning
Tutors: Jonathan McAuliffe and Mauro Casale
e[PY]s http://johnnycakes79.github.io/epys/ is a software library for the support of science operations preparation, simulation and analysis for BepiColombo and other planetary missions. The library consists of a number of python packages that reduce the time and effort needed for many of the data preparation and analysis tasks of advanced science operations planning. The functionality is spread across six different package modules.
- Draw makes pretty orbit graphics.
- Events provides a series of time/date utilities.
- Maps does things with maps and images.
- Orbit processes mission analysis orbit files.
- Read parses EPS output into useable data structures.
- Utils provides some more useful bits n' pieces.
I worked with the BepiColombo Science Ground Segment at ESAC, Madrid, on the following tasks:
- Expanding the python classes and manipulation routines for working with instrument, timeline and timeline prototype models.
- Compiling a set of procedural/tutorial IPython Notebooks for training others in the use of the library.
- Reviewing and completing the current in-line documentation (thereby gaining detailed knowledge on the inner-workings of the software as well as the main steps required in Science Operations Analysis for a planetary mission).
- Writing automated test scripts and ensuring that they run successfully in the context of the continuous integration and deployment environment.
- Re-factoring BepiColombo-specific code to be mission independent.
- Re-factoring as-yet-unincorporated routines into the library format and including them in the library.
- Identifying new functionality to expand the library.
AGH University of Science and Technology, Krakow
Predicting the Gaia Downlink Data Rate
Tutors: Hassan Siddiqui, Uwe Lammers, Neil Cheek, Emmanuel Joliet, Jose Osinde
The Gaia mission was launched at the end of 2013 to map the positions and motions of ~10^9 stars. It typically downlinks data volumes of 50-150 gigabytes daily; the range depends significantly on which area of the sky Gaia scans on a particular day. To ensure that the ground station coverage is adequate to download the data in sufficient time, the Gaia Science Operations Centre at ESAC has to provide the Mission Operations Centre, which in essence manages the ground station coverage, with estimates of the daily downlink rate. A software simulator has been developed for this purpose, but it needs to be 'calibrated' - and the software model improved - based on the actual data volumes acquired so far.
Therefore the aim of my project was to calibrate the simulator based on actual observations, and produce reliable downlink rate estimates.
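The calibration idea can be sketched with toy numbers, not real telemetry: fit a single scale factor between simulated and actually downlinked daily volumes by least squares, then apply it to new simulator output.

```python
import numpy as np

# Toy daily data volumes in GB: simulator predictions vs. what was
# actually downlinked (illustrative values only).
simulated_gb = np.array([60.0, 90.0, 120.0, 150.0, 80.0])
actual_gb = np.array([66.0, 97.0, 133.0, 164.0, 89.0])

# Least-squares scale factor k minimising sum (actual - k*simulated)^2.
k = np.dot(simulated_gb, actual_gb) / np.dot(simulated_gb, simulated_gb)
print(f"calibration factor k = {k:.3f}")

# Apply the factor to a new raw simulator prediction.
predicted = k * 100.0
print(f"calibrated prediction: {predicted:.1f} GB")
```

The real calibration involves improving the underlying software model as well, not just rescaling its output; this single-factor fit only illustrates the principle.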
During the course of this project I gained knowledge of the functioning of an operational science mission, became accustomed to the interactions between the Mission Operations Centre and the Science Operations Centre, and gained experience of software development in a professional environment.
Pedro Manuel Vallejo
University of Granada
CubeSat: Science mission proposal and AOCS demonstrator
Tutors: Julio Gallegos, Fernando Martin Porqueras, Tim Lock and Xavier Dupac
A CubeSat is a nano-satellite originally developed at Stanford and CalPoly (San Luis Obispo) universities, proposed as a vehicle to support hands-on university-level space education and opportunities for low-cost space access. At its most fundamental level, a CubeSat can be defined as a discrete but scalable 1 kg, 100x100x100 mm cuboid spacecraft unit, now commonly referred to as a 1U CubeSat; units can be combined to produce larger mass/volume, up to 3U, and even 6U designs have been proposed.
All CubeSat missions to date may be considered to have had technological objectives to some degree, be it the demonstration of devices and system architectures developed in-house, or the demonstration of Non-Space-Rated (NSR) Commercial-Off-The-Shelf (COTS) component performance.
The prime objective of my project was to propose a scientific mission for a CubeSat (or a constellation of them) related to ESAC research and missions. Within the worldwide CubeSat community there are several proposals for the use of CubeSats in scientific missions; I had access to this community through Julio Gallegos (associate professor at Universidad Europea de Madrid), and the opportunity to integrate my proposed mission into it.
As part of this project I designed an AOCS demonstrator and built a prototype. The demonstrator was autonomous (with a pre-programmed control law loaded via USB) and could also be commanded remotely. Although on the hardware side I focused on the AOCS, other subsystems still had to be developed using NSR-COTS components: the on-board computer, propulsion, communications and structure.
Laura Susana Alvarez Mera
University of Seville
Improving the legacy value of Herschel SPIRE FTS data
Tutors: Ivan Valtchanov, Eva Verdugo, Tanya Lim
Herschel's Fourier Transform Spectrometer (FTS) observed thousands of sources during the 4-year lifetime of the observatory. The targets of these observations have significant emission in the far-infrared (FIR, 200-700 µm) and included Solar System bodies, Galactic regions with cold dust, and nearby as well as extremely distant galaxies at the edge of the visible Universe. The Herschel Science Archive contains the final legacy spectra observed with the FTS and the derived spectral feature catalogue, all freely available for astronomical research. It is vital that the spectra in the archive are corrected for all of the imperfections and effects introduced by the instrument and telescope, and that the improved products and catalogues are easy to search and identify. My project was designed to work with the set of FTS spectra and line catalogues in the archive to improve their legacy value. It was an excellent opportunity to get to grips with a space instrument and its calibration issues, making use of a fascinating range of astronomical objects from across the Universe.
University of Tuebingen
Long-term activity of Be X-ray binaries
Tutors: Emilio Salazar and Peter Kretschmar
Be X-ray Binaries (BeXRB) are one of the most notable classes of transient X-ray sources. During their outbursts they are often among the brightest sources in the X-ray and soft gamma-ray energy range, and are frequently the subject of multi-wavelength observing campaigns.
In 2014 a successful trainee project created the BeXRB Activity Monitor, a tool that automatically obtains data from X-ray monitors and indicates if they seem to be active. While working well in general, this tool implements very simple algorithms to classify the source activity.
The idea behind my project was to take the existing software framework, mainly written in Python, and enhance it with more elaborate methods to identify source activity and its actual significance, even in the case of noisy data including gaps. Using the enhanced tools I then studied the full available set of monitoring data and characterized the past source behaviour systematically.
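A minimal version of such an activity-significance test might look like this, with a synthetic light curve containing an injected outburst and a data gap (the actual tool uses more elaborate methods):

```python
import numpy as np

# Synthetic monitoring light curve: quiescent noise, a 20-bin outburst
# at the end, and a NaN-marked data gap (all values illustrative).
rng = np.random.default_rng(1)
rate = rng.normal(0.0, 1.0, 200)
rate[180:] += 5.0          # injected outburst in the last 20 bins
rate[50:60] = np.nan       # a data gap

# Compare the recent window against the quiescent baseline, in units
# of the standard error of the window mean (NaN-safe statistics).
baseline = rate[:150]
window = rate[-20:]
stderr = np.nanstd(baseline) / np.sqrt(np.sum(~np.isnan(window)))
significance = (np.nanmean(window) - np.nanmean(baseline)) / stderr
print(f"activity significance: {significance:.1f} sigma")
```

A threshold on this significance (e.g. 5 sigma) would then flag the source as active; handling red noise and slowly varying baselines requires more careful modelling.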
Technical University of Berlin
Mapping Solar activity across the Solar System
Tutors: Erik Kuulkers and Olivier Witasse
In the context of Space Weather studies, it is interesting to investigate how Solar (wind) events affect planetary upper atmospheres. In particular, with the Mars Express spacecraft we study how the atmosphere of Mars reacts to a Solar wind event, such as a coronal mass ejection. This requires precise knowledge of the Solar wind propagation from the Sun to Mars. The goal of my project was to gather information on the propagation of Solar events in the Solar System by analysing data from radiation monitors on board ESA satellites (such as INTEGRAL and XMM-Newton). Together with data from satellites that study the Sun, the radiation monitor data helped to confirm and identify new Solar events that eventually hit Mars.
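For orientation, a constant-speed, back-of-the-envelope propagation estimate (real events decelerate in the solar wind, so operational models are considerably more sophisticated):

```python
# Travel time of a Solar wind disturbance from the Sun to Mars, assuming
# a constant speed and an illustrative CME velocity of 800 km/s.
AU_KM = 1.496e8            # astronomical unit in km
mars_distance_au = 1.52    # mean Mars heliocentric distance
cme_speed_kms = 800.0      # assumed CME speed

travel_time_h = mars_distance_au * AU_KM / cme_speed_kms / 3600.0
print(f"arrival at Mars after roughly {travel_time_h:.0f} hours")
```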
Joao Manuel da Silva Santos
University of Porto
THROES: A caTalogue of HeRschel Observations of Evolved Stars
Tutors: Carmen Sánchez Contreras and Pedro García-Lario
I had the opportunity to collaborate on the THROES project, recently funded by ESA for one year (project kick-off was September 19th, 2014). THROES is a comprehensive and systematic study of the far-infrared properties of low- and intermediate-mass (1-8 Msun) evolved stars using Herschel archival data, with the ultimate goal of better understanding the important physical processes and the dramatic chemical and morphological changes that take place in these stars at the end of their evolution. As a first step, I concentrated my analysis on the spectroscopic information obtained by Herschel/PACS in the 55-210 micron range for more than 200 individual sources, which cover the whole range of possible evolutionary stages in this short-lived phase of stellar evolution: from the less evolved Asymptotic Giant Branch (AGB) stars to the oldest planetary nebulae (PN), including transition objects such as optically bright post-AGB stars, heavily obscured OH/IR stars and proto-planetary nebulae (pPN). These objects display a wide variety of dust and gas chemical properties at optical and near- to mid-infrared wavelengths.
The main objectives of my project were:
- to compile an initial catalogue of all the evolved stars with Herschel/PACS range spectroscopy data,
- to uniformly re-process the whole set of such observations interactively,
- to extend this catalogue with information available in the literature on the sources selected for study, including ancillary data taken from other ground-based and space-based observatories at other wavelengths.
Ultimately, we intended to analyze the data and to interpret the results from the scientific point of view.
The project was a great opportunity to learn the PACS data reduction steps, and since by summer 2015 a significant part of the data was fully reduced, I was able to concentrate on identifying and classifying the different spectral features typically present in these objects, namely: 1) solid state (dust) features, 2) molecular lines, and 3) atomic/ionized gas emission lines. I contributed to the generation of a catalogue containing the main parameters of the spectral features found in the PACS dataset (such as central wavelength, feature width, peak intensity and integrated flux) and to the identification of the main feature carriers. I also had the chance to collaborate in the search for trends and correlations between these parameters and other nebular or stellar properties of the targets, as well as systematic changes of the features that might correlate with the evolutionary stage of the object. I was also introduced to dust emission and (molecular/atomic) line excitation and radiation theory, which are needed for a full scientific exploitation of the data.
University of Applied Sciences Erfurt
An Integrated Research Collaboration Application
Tutors: Richard Saxton, Aitor Ibarra, Carlos Gabriel and Erik Kuulkers
Collaborations between researchers are today much more efficient and effective than they were 30 years ago. Scientists working on the same topic can exchange ideas by email, in blogs, on wikis and attending conferences which are ever better served by the new social media. The advent of mobile internet, if harnessed correctly, could take this interaction to a new level, giving a more ergonomic, intensive connection between scientists and facilitating the group analysis of new data and ideas in real time.
Some bespoke tools have already been designed for mobile devices, e.g. EXOPLANET and publication access tools such as Mendeley and Scholarly. Nevertheless, there was no general multi-function platform that would allow a group to seamlessly access scientific articles, results, archived data and a knowledge base, and act as a real discussion platform.
Therefore the idea of my project was to develop an application, based on mobile web technologies, that integrates existing tools in an easy-to-use mobile format. As a test case we developed the application for the Tidal Disruption Event research group, a community of 50-100 observational and theoretical astronomers.
The application addressed the following issues:
- Present a simple, intuitive, easy-to-read interface to the user.
- Store and maintain a database of relevant publications.
- Host a discussion forum (blog or similar).
- Maintain a database of the properties of all sources/events which have been detected; give access to this database and link to associated publications.
- Give easy access to database results, Vizier, NED, SDSS, 6DF and Catalina, and provide a presentation tailored for mobile devices.
- Provide real-time access to Gaia alerts, astronomical telegrams etc.
- Give sound or text notifications when something new arrives.
University of Oslo
The mystery of the [NII] emission in the Herschel/HIFI data
Tutors: David Teyssier and Anthony Marston
The HIFI instrument (Heterodyne Instrument for the Far-Infrared) was the high-resolution spectrometer on board Herschel. After almost 4 years of activity in orbit, a unique scientific database has been collected, offering a mine of information for both scientific and instrumental investigation. One of the intriguing outcomes of the mission was the surprising lack of noticeable [NII] emission throughout the HIFI observations. [NII] is expected to be ubiquitous in diffuse and ionised environments such as HII regions. The very limited detection rate of this ion leads to the suspicion that it is cancelled out by the observing procedure, whereby data taken on the target and slightly off-target are subtracted from each other. If this is true, off-target data should also contain significant [NII] information. Interestingly, [CII], which is also very widespread in our Galaxy, was detected without any problem.
The goal of my project was to investigate the reasons for the apparent lack of [NII] detections in the HIFI data, by means of separately analysing the various observing phases of the data. Comparison with similar data collected by the SPIRE instrument on board Herschel allowed cross-calibration of the findings and helped to guide the search. The analysis of the [NII] emission allowed an estimate of the amount of warm ionized gas in our Galaxy, and a study of its kinematics in comparison with [CII] and other tracers. The HIFI data holding the [NII] emission fall in an instrumental band where peculiar data artefacts are present, so another challenge was to understand and characterise those features in order to discriminate them from the real data. For this I used several existing tools developed in the HIPE software analysis package, and performed data mining and processing of large datasets.
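The suspected cancellation mechanism is easy to demonstrate with a toy ON-OFF subtraction using synthetic spectra (not HIFI data): if the emission is extended enough to fill both the ON and OFF positions, the standard subtraction removes it entirely.

```python
import numpy as np

# Synthetic spectrum: a Gaussian emission line on a flat instrumental
# baseline (all numbers illustrative).
channels = np.arange(100)
line = 4.0 * np.exp(-0.5 * ((channels - 50) / 3.0) ** 2)
baseline = 10.0

# Case 1: compact source, so the line only enters the ON position.
on_compact = baseline + line
off_compact = baseline + np.zeros_like(line)
# Case 2: extended emission, so the OFF position also sees the line.
on_extended = baseline + line
off_extended = baseline + line

print(np.max(on_compact - off_compact))     # line survives the subtraction
print(np.max(on_extended - off_extended))   # line cancels completely
```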
University of Bari Aldo Moro
Pile-up correction for the XMM-Newton pn X-ray camera
Tutors: Norbert Schartel, Richard Saxton and Maria Santos-Lleo
XMM-Newton carries three high-throughput X-ray telescopes with an unprecedented effective area. Each telescope has an X-ray CCD camera at its focal plane: pn, MOS-1 and MOS-2. Together these cameras comprise the European Photon Imaging Camera (EPIC). In addition, XMM-Newton is equipped with two Reflection Grating Spectrometers and an optical monitor for simultaneous X-ray imaging, spectroscopy and UV/optical measurements. The large collecting area and the ability to make long uninterrupted exposures provide highly sensitive observations.
As many astronomical sources are highly variable in X-rays, XMM-Newton sometimes observes its target at a source flux significantly higher than expected. Such observations often show pile-up within the EPIC cameras. Pile-up means that two or more photons arrive in the same camera pixel during one read-out cycle. The camera then combines these into a single piled-up photon, which creates distorted spectra that seriously hamper the scientific interpretation of celestial sources.
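Assuming photon arrivals are Poisson-distributed, the fraction of read-out cycles affected by pile-up in a given pixel can be estimated as follows (illustrative rates, not an EPIC-pn calibration):

```python
import math

def pileup_fraction(mu):
    """Probability of >= 2 photons per pixel per read-out cycle,
    given a Poisson mean of mu photons: 1 - P(0) - P(1)."""
    return 1.0 - math.exp(-mu) * (1.0 + mu)

# Fraction of piled-up frames for a few assumed per-pixel rates.
for mu in (0.01, 0.1, 0.5):
    print(f"mu = {mu}: P(>=2 photons) = {pileup_fraction(mu):.4f}")
```

Even a modest per-pixel rate of 0.1 photons per cycle already piles up roughly half a percent of frames in that pixel, which is enough to distort the spectrum of a bright source.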
For the EPIC pn camera the XMM-Newton science operations centre developed the concept of a new pile-up correction method, never before used in astrophysics. The method was motivated by techniques applied in nuclear physics and was implemented within the XMM-Newton science analysis software. I was asked to test and explore the new pile-up correction method and to evaluate its limitations. This was done with simulations and with real data taken by the spacecraft. Along with the validation of the method, the true X-ray spectra of a few selected bright objects were derived and their spectral characteristics and physical parameters determined.