KEYWORDS: Data processing, Data storage, Software development, Algorithm development, System on a chip, Seaborgium, Prototyping, Space operations, Information technology, Lead
Euclid is an ESA mission aimed at understanding the nature of dark energy and dark matter by using two probes simultaneously (weak lensing and baryon acoustic oscillations). The mission will observe galaxies and clusters of galaxies out to z~2, in a wide extra-galactic survey covering 15,000 deg², plus a deep survey covering an area of 40 deg². The payload is composed of two instruments, an imager in the visible domain (VIS) and an imager-spectrometer (NISP) covering the near-infrared. The launch is planned in Q4 of 2020. The elements of the Euclid Science Ground Segment (SGS) are the Science Operations Centre (SOC) operated by ESA and nine Science Data Centres (SDCs) in charge of data processing, provided by the Euclid Consortium (EC), which is formed by over 110 institutes spread over 15 countries. The SOC and the EC established a tight collaboration several years ago in order to design and develop a single, cost-efficient and truly integrated SGS. The distributed nature of the system, the size of the data set, and the needed accuracy of the results are the main challenges expected in the design and implementation of the SGS. In particular, the huge volume of data (not only Euclid data but also ground-based data) to be processed in the SDCs will require distributed storage to avoid data migration across SDCs. This paper describes the management challenges that the Euclid SGS is facing while dealing with such complexity. The main aspect is the organisation of a geographically distributed software development team: algorithms and code are developed in a large number of institutes, while data are actually processed at fewer centres (the national SDCs) where the operational computational infrastructures are maintained. The software produced for data handling, processing and analysis is built within a common development environment defined by the SGS System Team, common to the SOC and the EC SGS, which has already been active for several years. The code is built incrementally through different levels of maturity, going from prototypes (developed mainly by scientists) to production code (engineered and tested at the SDCs). A number of incremental challenges (infrastructure, data-processing and integrated challenges) have been included in the Euclid SGS test plan to verify the correctness and accuracy of the developed systems.
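The data-locality principle mentioned above (distributed storage so that processing happens where the data already reside) can be illustrated with a small sketch; the class and function names below are invented for illustration and are not part of the Euclid SGS software.

```python
# Minimal sketch (not Euclid SGS code): route a processing task to the Science
# Data Centre that already stores the required data product, so that bulk data
# migration across SDCs stays the exception. All names are hypothetical.

from dataclasses import dataclass, field

@dataclass
class SDC:
    name: str
    stored_products: set = field(default_factory=set)
    free_slots: int = 10

def route_task(product_id: str, sdcs: list) -> SDC:
    """Prefer an SDC that already holds the product; otherwise pick the
    least-loaded SDC and schedule a (costly) transfer."""
    local = [s for s in sdcs if product_id in s.stored_products and s.free_slots > 0]
    if local:
        return max(local, key=lambda s: s.free_slots)
    target = max(sdcs, key=lambda s: s.free_slots)
    target.stored_products.add(product_id)   # data migration happens here
    return target

# Example: the VIS exposure is processed where it is already stored.
sdcs = [SDC("SDC-IT", {"VIS-0001"}), SDC("SDC-FR", {"NIR-0042"})]
print(route_task("VIS-0001", sdcs).name)   # -> SDC-IT
```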
KEYWORDS: Data processing, Galactic astronomy, Space operations, Telescopes, Point spread functions, K band, Sensors, Image quality, Data archive systems, Calibration
Euclid is a space-based optical/near-infrared survey mission of the European Space Agency (ESA) to investigate the
nature of dark energy, dark matter and gravity by observing the geometry of the Universe and the formation of
structures over cosmological timescales. Euclid will use two probes of the signature of dark matter and energy: Weak
gravitational Lensing, which requires the measurement of the shape and photometric redshifts of distant galaxies, and
Galaxy Clustering, based on the measurement of the 3-dimensional distribution of galaxies through their spectroscopic
redshifts. The mission is scheduled for launch in 2020 and is designed for 6 years of nominal survey operations. The
Euclid Spacecraft is composed of a Service Module and a Payload Module. The Service Module comprises all the
conventional spacecraft subsystems, the instruments' warm electronics units, the sun shield and the solar arrays. In
particular the Service Module provides the extremely challenging pointing accuracy required by the scientific objectives.
The Payload Module consists of a 1.2 m three-mirror Korsch type telescope and of two instruments, the visible imager
and the near-infrared spectro-photometer, both covering a large common field of view and enabling the survey of more than
35% of the entire sky. All sensor data are downlinked using K-band transmission and processed by a dedicated ground
segment for science data processing. The Euclid data and catalogues will be made available to the public at the ESA
Science Data Centre.
In June 2012, Euclid, ESA's cosmology mission, was approved for implementation. Afterwards, the industrial contracts for the payload module and the spacecraft prime were signed, and the mission requirements were consolidated. We present the status of the mission in the light of the design solutions adopted by the contractors. The performances of the spacecraft in operation, of the telescope assembly, of the scientific instruments, as well as of the data processing, have been carefully budgeted to meet the demanding scientific requirements. We give an overview of the system and, where necessary, of the key items at the interfaces between the subsystems.
KEYWORDS: Space operations, Calibration, System on a chip, Sensors, Control systems, Satellites, Mathematical modeling, Visible radiation, Seaborgium, Data processing
Euclid is a future ESA mission, mainly devoted to cosmology. Like WMAP and Planck, it is a
survey mission, to be launched in 2019 and injected into an orbit far from the Earth, for a nominal
lifetime of 7 years. Euclid has two instruments on-board, the Visible Imager (VIS) and the Near-
Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active
thermal control and a high-performance Data Processing Unit, and requires periodic in-flight calibrations
and instrument parameter monitoring. To fully exploit the capabilities of the NISP, a careful control
of systematic effects is required. From previous experiments, we have built the concept of an
integrated instrument development and verification approach, where the scientific, instrument and
ground-segment expertise have strong interactions from the early phases of the project. In particular,
we discuss the strong integration of test and calibration activities with the Ground Segment, starting
from early pre-launch verification activities. Here we report the expertise acquired by the
Euclid team in previous missions, citing the literature only for detailed reference, and indicate how it
is applied in the Euclid mission framework.
KEYWORDS: Seaborgium, Data processing, Data storage, Data modeling, Space operations, System on a chip, Software development, Calibration, Device simulation, Data centers
The Scientific Ground Segment (SGS) of the ESA M2 Euclid mission, foreseen to be launched in the fourth quarter of
2019, is composed of the Science Operations Center (SOC) operated by ESA and a number of Science Data Centers
(SDCs) in charge of data processing, provided by a Consortium of 14 European countries. Many individuals, scientists
and engineers, are and will be involved in the SGS development and operations. The distributed nature of the data
processing and of the collaborative software development, the data volume of the overall data set, and the needed
accuracy of the results are the main challenges expected in the design and implementation of the Euclid SGS. In
particular, the huge volume of data (not only Euclid data but also ground based data) to be processed in the SDCs will
require a distributed storage to avoid data migration across SDCs. The leading principles driving the development of the
SGS are expected to be simplicity of system design, component-based software engineering, virtualization, and a
data-centric approach to the system architecture where quality control, a common data model and the persistence of the
data model objects play a crucial role. ESA/SOC and the Euclid Consortium have developed, and are committed to
maintain, a tight collaboration in order to design and develop a single, cost-efficient and truly integrated SGS.
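As a loose illustration of the data-centric approach described above, the sketch below shows a persisted data-model object that carries its own metadata and quality-control flags while the bulk data stay in place; the field names and the JSON persistence are assumptions for illustration, not the actual Euclid common data model.

```python
# Assumed sketch (not the Euclid common data model): the data product object
# carries metadata and quality information, and the object itself is persisted;
# the bulk pixel data are only referenced, not copied.

import json
from dataclasses import dataclass, asdict

@dataclass
class DataProduct:
    product_id: str
    instrument: str        # e.g. "VIS" or "NISP"
    release: str           # processing release tag
    quality_flags: dict    # outcome of quality control
    payload_uri: str       # reference to the bulk data, which stay put

    def persist(self, path: str) -> None:
        # Persist the data-model object so every SDC reads the same structure.
        with open(path, "w") as f:
            json.dump(asdict(self), f, indent=2)

product = DataProduct("VIS-0001", "VIS", "R1",
                      {"saturated": False}, "file:///data/vis/0001.fits")
product.persist("VIS-0001.json")
```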
Euclid is a space-borne survey mission developed and operated by ESA. It is designed to understand the origin of the
Universe's accelerating expansion. Euclid will use cosmological probes to investigate the nature of dark energy, dark
matter and gravity by tracking their observational signatures on the geometry of the Universe and on the history of
structure formation. The mission is optimised for the measurement of two independent cosmological probes: weak
gravitational lensing and galaxy clustering. The payload consists of a 1.2 m Korsch telescope designed to provide a large
field of view. The light is directed to two instruments provided by the Euclid Consortium: a visual imager (VIS) and a
near-infrared spectrometer-photometer (NISP). Both instruments cover a large common field of view of 0.54 deg², to be
able to survey at least 15,000 deg² for a nominal mission of 6 years. An overview of the mission will be presented: the
scientific objectives, payload, satellite, and science operations. We report on the status of the Euclid mission with a
foreseen launch in 2019.
Since the very beginning of 2008, the Large Binocular Telescope (LBT) has been officially equipped with its first binocular
instrument ready for science observations: the Large Binocular Camera (LBC). This is a double CCD imager, installed at
the prime focus stations of the two 8.4m telescopes of LBT, able to obtain deep and wide field images in the whole
optical spectrum from UV to NIR wavelengths.
We present here the overall architecture of the instrument, a brief hardware review of the two imagers, and notes on how
observations are carried out. Finally, we report preliminary results on the performance of the instrument along with
some images obtained during the first months of observations in binocular mode.
A. Mennella, B. Aja, E. Artal, M. Balasini, G. Baldan, P. Battaglia, T. Bernardino, M. Bersanelli, E. Blackhurst, L. Boschini, C. Burigana, R. Butler, B. Cappellini, F. Colombo, F. Cuttaia, O. D'Arcangelo, S. Donzelli, R. Davis, L. De La Fuente, F. Ferrari, L. Figini, S. Fogliani, C. Franceschet, E. Franceschi, T. Gaier, S. Galeotta, S. Garavaglia, A. Gregorio, M. Guerrini, R. Hoyland, N. Hughes, P. Jukkala, D. Kettle, M. Laaninen, P. Lapolla, D. Lawson, R. Leonardi, P. Leutenegger, G. Mari, P. Meinhold, M. Miccolis, D. Maino, M. Malaspina, N. Mandolesi, M. Maris, E. Martinez-Gonzalez, G. Morgante, L. Pagan, F. Pasian, P. Platania, M. Pecora, S. Pezzati, L. Popa, T. Poutanen, M. Pospieszalski, N. Roddis, M. Salmon, M. Sandri, R. Silvestri, A. Simonetto, C. Sozzi, L. Stringhetti, L. Terenzi, M. Tomasi, J. Tuovinen, L. Valenziano, J. Varis, F. Villa, A. Wilkinson, F. Winder, A. Zacchei
In this paper we present the test results of the qualification model (QM) of the LFI instrument, which is being
developed as part of the ESA Planck satellite. In particular we discuss the calibration plan which has defined
the main requirements of the radiometric tests and of the experimental setups. Then we describe how these
requirements have been implemented in the custom-developed cryo-facilities and present the main results. We
conclude with a discussion of the lessons learned for the testing of the LFI Flight Model (FM).
The Large Binocular Telescope is currently equipped with a pair of wide-field Prime Focus cameras. The two cameras are optimized for, respectively, the blue and the red portions of the visible spectrum. The history of this project is sketched here and the current status is shown. The Blue channel is currently working on the telescope and provided what has been named the first light of the telescope in single-eye configuration.
The Prime Focus instruments for the Large Binocular Telescope are a pair of Prime Focus stations, each equipped with four 4k×2k CCDs and a six-lens corrector with an aspheric surface, the first lens being roughly 800 mm in diameter. These cameras will cover almost half a degree of field of view on an 8 m-class telescope with an unprecedented speed of F/1.4. The two units are optimized for the red and blue portions of the visible spectrum respectively, and an extension to the J and H bands is additionally foreseen. An overview of the project, including the optomechanics, the cryogenics, the electronics, and the software, is given, along with a preliminary account of lessons learned and of how much the second unit, the Red one, whose schedule is shifted with respect to the Blue one by several months, will benefit from the experience gained during the assembly and integration of the Blue unit.
The International Virtual Observatory Alliance (IVOA: http://www.ivoa.net) represents 14 international projects working in coordination to realize the essential technologies and interoperability standards necessary to create a new research infrastructure for 21st century astronomy. This international Virtual Observatory will allow astronomers to interrogate multiple data centres in a seamless and transparent way, will provide new powerful analysis and visualisation tools within that system, and will give data centres a standard framework for publishing and delivering services using their data. The first step for the IVOA projects is to develop the standardised framework that will allow such creative diversity. Since its inception in June 2002, the IVOA has already fostered the creation of a new international and widely accepted, astronomical data format (VOTable) and has set up technical working groups devoted to defining essential standards for service registries, content description, data access, data models and query languages following developments in the grid community. These new standards and technologies are being used to build science prototypes, demonstrations, and applications, many of which have been shown in international meetings in the past two years. This paper reviews the current status of IVOA projects, the priority areas for technical development, the science prototypes and planned developments.
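The VOTable format mentioned above is an XML-based table format defined by the IVOA; purely as an illustration (the reader library is not part of the IVOA standards themselves), such a file can be read in Python with astropy. The file name below is a placeholder.

```python
# Illustrative only: reading an IVOA VOTable with astropy.
from astropy.io.votable import parse_single_table

table = parse_single_table("catalogue.vot").to_table()  # -> astropy Table
print(table.colnames)   # column metadata carried by the VOTable
print(table[:5])        # first few rows
```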
KEYWORDS: Software development, Prototyping, Standards development, Data processing, Calibration, Data modeling, Interfaces, Data archive systems, Data communications, Device simulation
A geographically distributed software project needs a well-defined software integration & development plan to avoid extra work in the pipeline creation phase. Here we describe the rationale in the case of the Planck/LFI DPC project and what was designed and developed to build the integration and testing environment.
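As a hedged sketch of what such an integration and testing environment enforces, the snippet below checks that a pipeline module respects an agreed input/output contract before it is plugged into the pipeline; the contract, module and function names are invented and are not taken from the Planck/LFI DPC code.

```python
# Hypothetical interface check run in a common integration environment:
# every pipeline stage must accept the agreed input structure and return a
# dict of products that carries its processing history.

def check_stage(stage, sample_input):
    out = stage(sample_input)
    assert isinstance(out, dict), "stage must return a dict of products"
    assert "history" in out, "stage must append its processing history"
    return out

def demodulate(timeline):
    # Toy stage standing in for a real module developed at a remote institute.
    return {"sky": [x * 0.5 for x in timeline["raw"]],
            "history": timeline.get("history", []) + ["demodulate"]}

check_stage(demodulate, {"raw": [1.0, 2.0], "history": []})
```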
The Large Binocular Camera (LBC) is the double optical imager that will be installed at the prime foci of the Large Binocular Telescope (2x8.4 m). Four Italian observatories are cooperating in this project: Rome (CCD camera), Arcetri-Padua (optical corrector) and Trieste (software). LBC is composed of two separate large-field (27 arcmin FOV) cameras, one optimized for the UBV bands and the second for the VRIZ bands. An optical corrector balances the aberrations induced by the fast (F#=1.14) parabolic primary mirror of LBT, assuring that 80% of the PSF encircled energy falls inside one pixel over more than 90% of the field. Each corrector uses six lenses, the first having a diameter of 80 cm and the third an aspherical surface. Two filter wheels allow the use of 8 filters. The two channels have similar optical designs satisfying the same requirements, but differ in the lens glasses: fused silica for the "blue" arm and BK7 for the "red" one. The two focal plane cameras each use an array of four Marconi 4290 chips (4.5k x 2k pixels), optimized for maximum quantum efficiency (85%) in each channel. The sampling is 0.23 arcseconds/pixel. The arrays are cooled by LN2 cryostats assuring 24 hours of operation. Here we present a description of the project and its current status, including a report on the Blue camera and its laboratory tests. This instrument is planned to be the first-light instrument of LBT.
KEYWORDS: Lithium, Data mining, Astronomy, Databases, Stereolithography, Data analysis, Digital Light Processing, Palladium, Heads up displays, Observatories
The International Virtual Observatory will pose unprecedented problems to data mining. We briefly discuss the effectiveness of neural networks as aids to the decisional process of the astronomer, and present the AstroMining package. This package was written in Matlab and C++ and provides a user-friendly interactive platform for various data mining tasks. Two applications are also briefly outlined: the derivation of photometric redshifts for a subsample of objects extracted from the Sloan Digital Sky Survey Early Data Release, and the evaluation of systematic patterns in the telemetry data of the Telescopio Nazionale Galileo (TNG).
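The AstroMining package itself was written in Matlab and C++; purely as an illustration of the photometric-redshift use case mentioned above, the sketch below trains a small neural-network regressor on synthetic magnitudes with scikit-learn. The data, network size and feature choice are assumptions, not the package's actual configuration.

```python
# Illustrative photometric-redshift regression (not the AstroMining code):
# a small MLP maps toy multi-band magnitudes to redshift.

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
z = rng.uniform(0.0, 1.0, n)                          # "true" redshifts
mags = np.column_stack([20 + 2.5 * z + 0.1 * rng.standard_normal(n)
                        for _ in range(5)])           # toy u,g,r,i,z magnitudes

X_train, X_test, z_train, z_test = train_test_split(mags, z, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(X_train, z_train)
print("rms error:", np.sqrt(np.mean((model.predict(X_test) - z_test) ** 2)))
```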
The success of the Planck mission heavily relies on careful planning, design and implementation of its ground segment facilities. Among these, two Data Processing Centres (DPCs) are being implemented, which are operated by the consortia responsible for building the instruments forming the scientific payload of Planck. The two DPCs, together with the Mission Operations Centre (MOC) and the Herschel Science Centre (HSC), are the major elements of the Herschel/Planck scientific ground segment.
The Planck DPCs are responsible for the operation of the instruments, and for the production, delivery and archiving of the scientific data products, which can be considered as the final results of the mission:
· Calibrated time series data, for each receiver, after removal of systematic features and attitude reconstruction.
· Photometrically and astrometrically calibrated maps of the sky in the observed bands.
· Sky maps of the main astrophysical components.
· Catalogs of sources detected in the sky maps of the main astrophysical components.
· CMB power spectrum coefficients.
During the development phase, the DPCs are furthermore responsible for producing data that realistically simulate the behaviour of the instruments in flight, and for supporting instrument testing activities.
In this paper, some aspects related to the control of the Planck instruments, to the data flow and to the data processing for Planck are described, and an overview of the activities being carried out is provided.
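As an illustration of the last data product listed above (CMB power spectrum coefficients), the sketch below simulates a sky map from a toy spectrum and recovers the angular power spectrum with healpy; this is not Planck DPC code, and the resolution and input spectrum are arbitrary.

```python
# Illustrative only: from a toy angular power spectrum to a simulated sky map
# and back to the recovered C_l coefficients, using healpy.

import numpy as np
import healpy as hp

nside, lmax = 128, 256
cl_in = np.full(lmax + 1, 1e-5)               # flat toy input spectrum
cl_in[:2] = 0.0                               # no monopole/dipole power
sky_map = hp.synfast(cl_in, nside, lmax=lmax) # simulate a sky map
cl_out = hp.anafast(sky_map, lmax=lmax)       # recover C_l from the map
print(cl_out[:5])
```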
The Italian National "Galileo" Telescope (Telescopio Nazionale "Galileo" - TNG) is a 3.5m telescope located at La Palma, in the Canary islands, which has seen first light in 1998. Available TNG subsystems include four first-generation instruments, plus adaptive optics, meteo and seeing towers; the control and data handling systems are tightly coupled allowing a smooth data flow while preserving integrity. As a part of the data handling systems, the production of a local "Archive at the Telescope" (AaT) is included, and the production of database tables and hard media for the TNG Long-Term Archive (LTA) is supported. The implementation of a LTA prototype has been recently terminated, and the implementation of its operational version is being planned by the Italian National Institute for Astrophysics (INAF).
A description of the AaT and prototype LTA systems is given, including their data handling/archiving and data retrieval capabilities. A discussion of system features and lessons learned is also included, with particular reference to the issues of completeness and data quality. These issues are of particular importance in the perspective of the preparation of a national facility for the archives of data from ground-based telescopes, and of its possible inclusion as a data provider in the Virtual Observatory framework.
The Large Binocular Telescope is currently in the pre-erection phase. The instrument has already been funded and its first light is expected shortly after that of the LBT. Given the peculiarity of the telescope optics, we designed two prime focus cameras with two five-lens refractive correctors, optimized for the blue side and the red side of the visible spectrum respectively; this allows the coatings to be optimized independently. The detectors also reflect this choice, being optimized separately. We present the most relevant features of the instrument, the optical design, as well as the structural and mechanical layout. Each of the two Prime Focus cameras gathers light from a very fast, F/1.14 parabolic primary mirror. The field is corrected over roughly half a degree in size, allowing optical performances in terms of 80 percent Encircled Energy better than approximately 0.3 arcsec. The focal length is slightly increased in order to provide a better sampling using chips with 13.5 micrometer pixels. The CCD array is made up of four EEV 42-90 chips per channel, to obtain an equivalent 6000 by 6000 pixels, with the AR coating optimized for the U,B,V and V,R,I,Z bands respectively. The array will be read out in 10 seconds using a 1 Megapixel/second controller with four video channels. The cryostat will use a state-of-the-art dewar to reach a holding time of several days using a limited amount of liquid nitrogen. The whole mechanical design has been modelled using Finite Element analysis in order to check for mechanical flexures of the mounting tube and of the optical components themselves. A brief overview of the informatics facilities to be provided with the instrument and of a few science case studies that can be addressed with this instrument is also given.
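A quick consistency check of the numbers quoted in these LBC abstracts (13.5 µm pixels, 0.23 arcsec/pixel sampling, an 8.4 m primary) shows that the implied effective focal length corresponds to roughly F/1.4, the focal ratio quoted earlier; the short computation below assumes only those quoted values.

```python
# Consistency check of the quoted LBC numbers: pixel scale [arcsec/pixel]
# = 206265 * pixel_size / focal_length, so the quoted sampling implies the
# effective focal length and hence the focal ratio.

pixel_m = 13.5e-6          # pixel size [m]
sampling = 0.23            # plate scale [arcsec/pixel]
aperture = 8.4             # primary mirror diameter [m]

focal_length = 206265.0 * pixel_m / sampling              # [m]
print(f"effective focal length ~ {focal_length:.1f} m")   # ~12.1 m
print(f"focal ratio ~ F/{focal_length / aperture:.2f}")   # ~F/1.44
```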
Kevin Bennett, Fabio Pasian, Jean-Francois Sygnet, Anthony Banday, Matthias Bartelmann, Richard Gispert, Adam Hazell, William O'Mullane, Claudio Vuerli
KEYWORDS: Commercial off the shelf technology, Prototyping, Software development, Data processing, Document management, Databases, Space operations, Data storage, Calibration, Process control
During all phases of the Planck mission (Design, Development, Operations and Post-operations), it is necessary to guarantee proper information management among many Co-Is, Associates, engineers and technical and scientific staff (the estimated number of participants is over 200), located in countries throughout Europe and North America. Information concerning the project ranges from instrument information (technical characteristics, reports, configuration control documents, drawings, public communications, etc.), to the analysis of the impact on science implied by specific technical choices. For this purpose, an Integrated Data and Information System (IDIS) will be developed to allow proper intra-Consortium and inter-Consortia information exchange. A set of tools will be provided, maximizing the use of Commercial Off-The-Shelf (COTS) or reliable public domain software, to allow distributed collaborative research to be carried out. The general requirements for IDIS and its components have been defined; the preparation of software requirements and COTS selection is being carried out. A prototype IDIS is expected to be available in spring 2000.
Massimo Callegari, Antonio Panciatici, Fabio Pasian, Mauro Pucillo, Paolo Santin, Simo Aro, Peter Linde, Maria Duran, Jose Rodriguez, Francois Genova, Francois Ochsenbein, J. Ponz, Antonio Talavera
KEYWORDS: Telecommunications, Control systems, Telescopes, Astronomy, Software development, Data storage, Space telescopes, Plasma physics, Human-machine interfaces, Data archive systems
The REMOT (Remote Experiment Monitoring and conTrol) project was financed in 1996 by the European Community in order to investigate the possibility of generalizing remote access to scientific instruments. After the feasibility of this idea was demonstrated, the DYNACORE (DYNAmically COnfigurable Remote Experiment monitoring and control) project was initiated as a REMOT follow-up. Its purpose is to develop software technology to support scientists in two different domains, astronomy and plasma physics. The resulting system allows (1) simultaneous multiple-user access to different experimental facilities, (2) dynamic adaptability to different kinds of real instruments, (3) exploitation of the communication infrastructure features, (4) ease of use through intuitive graphical interfaces, and (5) additional inter-user communication using off-the-shelf products such as video-conference tools, chat programs and shared blackboards.
KEYWORDS: Telescopes, Control systems, Active optics, Local area networks, Human-machine interfaces, Interfaces, Operating systems, Data archive systems, Databases, CCD cameras
Since March 1997, the TNG telescope has been in its Commissioning phase. In this paper, we describe the structure of the control software of TNG and the ongoing activity of the software integration team. The Telescope Communication Network has been completely installed, the control software has been set up and the integration phase is currently in progress. The TNG control software has been designed with the needs of a modern telescope control system in mind: it is based on stable and widespread industry standards; its architecture is fully modular and intrinsically open in order to allow future enhancements and/or modifications of its components. Moreover, the code was written paying particular attention to its portability. All these characteristics make the TNG control system open to future technology evolutions, both hardware- and software-wise. The TNG control software provides a coherent environment where the information flow is constantly guided and controlled along its path across the system. Despite the multiplicity and non-homogeneity of the different subsystems, TNG provides the operator with a common framework from the raw data gathering, to the real-time applications, up to the operator interface and archiving system. This was achieved by designing and building a set of layers of increasing abstraction that were mapped onto the various physical components. A brief description of the steps followed during the integration of a number of subsystems is also given.