Romain Laugier, Nick Cvetojevic, Frantz Martinache
Aug 18, 2020·astro-ph.IM

The use of interferometric nulling for the direct detection of extrasolar planets is in part limited by the extreme sensitivity of the instrumental response to tiny optical path differences between apertures. The recently proposed kernel-nuller architecture attempts to alleviate this effect with an all-in-one combiner design that enables the production of observables inherently robust to residual optical path differences (≪ λ). Until now, a single kernel-nuller design had been proposed, in an ad hoc manner, for a four-beam combiner. We examine the properties of this original design and generalize them for an arbitrary number of apertures. We introduce a convenient graphical representation of the complex combiner matrices that model the kernel nuller and highlight the symmetry properties that enable the formation of kernel nulls. The analytical description of the nulled outputs we provide demonstrates the properties of a kernel nuller and outlines a systematic way to build one for an arbitrary number of apertures. Designs for 3- and 6-input combiners are presented along with the original 4-input concept. Combiners grow in complexity with the square of the number of apertures. While this complexity can be mitigated by multiplexing nullers working independently over smaller numbers of sub-apertures, an all-in-one kernel nuller recombining a large number of apertures appears to be the most efficient way to characterize a high-contrast complex astrophysical scene. Kernel nullers can be designed for an arbitrary number of apertures so as to produce observable quantities robust to residual perturbations. The designs we recommend are lossless and take full advantage of all the available interferometric baselines. They are complete, produce as many kernel nulls as the theoretically expected number of closure phases, and are optimized to require as few outputs as possible.
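As a hedged illustration of how such kernel nulls arise, the toy below uses a 3-input combiner built from the unitary 3-point DFT matrix (an assumed, illustrative matrix choice, not necessarily the paper's exact design): one output is bright, two are nulled, and the difference of the two null intensities cancels piston errors to first and second order.

```python
import numpy as np

# Toy 3-input nuller: the combiner is the unitary 3-point DFT matrix,
# giving one bright output and two nulled outputs for co-phased inputs.
# (Illustrative choice; normalization and sign conventions are assumptions.)
w = np.exp(2j * np.pi / 3)
M = np.array([[1, 1,    1],
              [1, w,    w**2],
              [1, w**2, w]]) / np.sqrt(3)

def outputs(piston_phases):
    """Output intensities for unit-amplitude inputs with piston errors (rad)."""
    e_in = np.exp(1j * np.asarray(piston_phases))
    return np.abs(M @ e_in) ** 2

rng = np.random.default_rng(0)
phi = 1e-2 * rng.standard_normal(3)   # small residual pistons (radians)

i_bright, i_null_a, i_null_b = outputs(phi)
kernel = i_null_a - i_null_b          # the kernel-null observable

# The raw nulls leak at second order in the piston errors, while their
# difference (the kernel) only leaks at third order.
print(i_null_a, i_null_b, kernel)
```

For co-phased inputs the outputs are exactly [3, 0, 0], and because the matrix is unitary the total flux is conserved, so the combiner is lossless in the sense described above.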
Romain Laugier, Frantz Martinache, Nick Cvetojevic, David Mary, Alban Ceau, Mamadou N'Diaye, Jens Kammerer, Julien Lozi, Olivier Guyon, Coline Lopez
To reach its optimal performance, Fizeau interferometry requires that instrumental biases be resolved through calibration. One common technique in high-contrast imaging is angular differential imaging, which calibrates the point spread function and flux leakage using a rotation in the focal plane. Our aim is to experimentally demonstrate and validate the efficacy of angular differential kernel phases, a new method for self-calibrating interferometric observables that operates similarly to angular differential imaging while retaining the statistical properties of the observables. We used linear algebra to construct new observables that evolve outside the subspace spanned by static biases. On-sky observations of a binary star with the SCExAO instrument at the Subaru Telescope were used to demonstrate the practicality of this technique, and a classical approach was applied to the same data to compare the effectiveness of the method. The proposed method shows smaller and more Gaussian residuals than classical calibration methods, while retaining compatibility with the available statistical tools. We also provide a measurement of the stability of the SCExAO instrument that is relevant to the application of the technique. Angular differential kernel phases provide a reliable method for calibrating biased observables. Although the sensitivity at small separations is reduced for small field rotations, the calibration is effectively improved and the number of subjective choices is reduced.
Romain Laugier, Denis Defrère, Michael Ireland, Germain Garreau, Olivier Absil, Alexis Matter, Romain Petrov, Philippe Berio, Peter Tuthill, Marc-Antoine Martinod, Lucas Labadie
Jul 24, 2024·astro-ph.IM

To leverage the angular resolution of interferometry at high contrast, one must employ specialized beam-combiners called interferometric nullers. Nullers discard part of the astrophysical information to optimize the recording of light present in the dark fringe of the central source. Asgard/NOTT will deploy a beam-combination scheme offering good instrumental noise rejection when phased appropriately, but for which information is degenerate on the outputs, prompting a dedicated tuning strategy using the science detector. The dispersive effect of water vapor can be corrected with prisms forming a variable thickness of glass. But observations in the L band suffer from an additional and important chromatic effect due to longitudinal atmospheric dispersion coming from a resonance of CO2 at 4.3 micron. To compensate for this effect efficiently, a novel type of compensation device will be deployed leveraging a gas cell of variable length at ambient pressure. After reviewing the impact of water vapor and CO2, we present the design of this atmospheric dispersion compensation device and describe a strategy to maintain this tuning on-sky.
Romain Laugier, Julien Woillez, Denis Defrère, Benjamin Courtney-Barrer, Muhammad Salman, Babak Sedghi, Roberto Abuter, Azzurra Bigioli, Maximilian Fabricius, Frank Eisenhauer, Frédéric Gonté, Nicolas Schuhler, Dieter Lutz, Miguel Riquelme, Pierre Bourget, Philippe Neuville, Sylvestre Lacour, Mathias Nowak
Jul 24, 2024·astro-ph.IM

Scaling up interferometry to 8 m collectors should smooth out the optical piston perturbations and allow a slow fringe tracker to obtain a high-precision correction on faint targets. In practice, the GRAVITY fringe tracker still observes high-frequency OPD components that limit the exposure time, its precision, and its limiting magnitude. These perturbations seem to come from mechanical vibrations in the train of mirrors. As part of the GRAVITY+ efforts, accelerometers were added to all the mirrors of the coudé train to compensate the optical path in real time using the main delay lines. We show their effectiveness on vibration peaks between 40 and 200 Hz and outline prospects for the upgrade of the deformable mirrors and the beam-compressor differential delay lines.
Romain Laugier, Denis Defrère, Benjamin Courtney-Barrer, Felix A. Dannert, Alexis Matter, Colin Dandumont, Simon Gross, Olivier Absil, Azzurra Bigioli, Germain Garreau, Lucas Labadie, Jérôme Loicq, Marc-Antoine Martinod, Alexandra Mazzoli, Gert Raskin, Ahmed Sanny
Nov 17, 2022·astro-ph.IM

Context: NOTT (formerly Hi-5) is a new high-contrast L' band (3.5-4.0 µm) beam combiner for the VLTI with the ambitious goal of being sensitive to young giant exoplanets down to 5 mas separation around nearby stars. The performance of nulling interferometers at these wavelengths is affected both by fundamental noise from the background and by the contributions of instrumental noises, which motivates the development of end-to-end simulations to optimize these instruments. Aims: To enable the performance evaluation and inform the design of such instruments on current and future infrastructures, taking into account the different sources of noise and their correlation. Methods: SCIFYsim is an end-to-end simulator for single-mode filtered beam combiners, with an emphasis on nulling interferometers. It is used to compute a covariance matrix of the errors. Statistical detection tests based on likelihood ratios are then used to compute compound detection limits for the instrument. Results: With the current assumptions on the performance of the wavefront correction systems, the errors are dominated by correlated instrumental errors down to stars of magnitude 6-7 in the L band, beyond which thermal background from the telescopes and relay system becomes dominant. Conclusions: SCIFYsim is suited to anticipating some of the challenges of design, tuning, operation, and signal processing for integrated-optics beam combiners. The detection limits found for this early version of the NOTT simulation with the unit telescopes are compatible with detections at contrasts up to 10^5 in the L band at separations of 5 to 80 mas around bright stars.
Alban Ceau, David Mary, Alexandra Greenbaum, Frantz Martinache, Anand Sivaramakrishnan, Romain Laugier, Mamadou N'Diaye
The James Webb Space Telescope will offer high angular resolution observing capability in the near-infrared with masking interferometry on NIRISS and coronagraphic imaging on NIRCam and MIRI. Full-aperture kernel-phase-based interferometry complements these observing modes, probing for companions at small separations while preserving the telescope throughput. Our goal is to derive both theoretical and operational contrast detection limits for the kernel-phase analysis of JWST NIRISS full-pupil observations using tools from hypothesis testing theory, applied to observations of faint brown dwarfs with this instrument; the tools and methods introduced here are applicable in a wide variety of contexts. We construct a statistically independent set of observables from aberration-robust kernel phases. Three detection tests based on these observable quantities are designed and analysed, all guaranteeing a constant false alarm rate for small phase aberrations. One of these tests, the likelihood ratio or Neyman-Pearson test, provides a theoretical performance bound for any detection test. The operational detection method considered here is shown to exhibit only marginal power loss with respect to this theoretical bound. In principle, for a test set to a false alarm probability of 1%, companions at contrasts reaching 10^3 at separations of 200 mas around objects of magnitude 14.1 are detectable. With JWST NIRISS, contrasts of up to 10^4 at separations of 200 mas could ultimately be achieved, barring significant wavefront drift. The proposed detection method is close to the ultimate bound and offers guarantees on the probability of making a false detection for binaries, as well as on the error bars for the estimated parameters of the binaries detectable by JWST NIRISS. This method is applicable not only to JWST NIRISS but to any imaging system with adequate sampling.
Marc-Antoine Martinod, Denis Defrère, Michael Ireland, Stefan Kraus, Frantz Martinache, Peter Tuthill, Azzurra Bigioli, Julia Bryant, Sorabh Chhabra, Benjamin Courtney-Barrer, Fred Crous, Nick Cvetojevic, Colin Dandumont, Germain Garreau, Tiphaine Lagadec, Romain Laugier, Daniel Mortimer, Barnaby Norris, Gordon Robertson, Adam Taras
Jan 16, 2023·astro-ph.IM

The Very Large Telescope Interferometer is one of the most proficient observatories in the world for high angular resolution. Since its first observations, it has hosted several interferometric instruments operating in various infrared bandwidths. As a result, the VLTI has yielded countless discoveries and technological breakthroughs. Here, we introduce a new concept for the VLTI, Asgard: an instrumental suite comprising four natively collaborating instruments: BIFROST, a combiner whose main science case is studying the formation processes and properties of stellar and planetary systems; NOTT, a nulling interferometer dedicated to imaging young nearby planetary systems in the L band; HEIMDALLR, an all-in-one instrument performing both fringe tracking and stellar interferometry with the same optics; and Baldr, a Strehl optimiser. These instruments share common goals and technologies. The goals cover diverse astrophysical cases: the study of the formation and evolution of binary systems, exoplanetary systems, and protoplanetary disks; the characterization of orbital parameters and spin-orbit alignment of multiple systems; the characterization of exoplanets; and the study of exozodiacal disks. The idea of this suite is thus to make the instruments interoperable and complementary, delivering unprecedented sensitivity and accuracy from the J to M bands to meet these goals. The interoperability of the Asgard instruments and their integration into the VLTI are major challenges for this project.
Iva Laginja, Óscar Carrión-González, Romain Laugier, Elisabeth Matthews, Lucie Leboulleux, Axel Potier, Alexis Lau, Olivier Absil, Pierre Baudoz, Beth Biller, Anthony Boccaletti, Wolfgang Brandner, Alexis Carlotti, Gaël Chauvin, Élodie Choquet, David Doelman, Kjetil Dohlen, Marc Ferrari, Sasha Hinkley, Elsa Huby, Mikael Karlsson, Oliver Krause, Jonas Kühn, Jean-Michel Le Duigou, Johan Mazoyer, Dino Mesa, Michiel Min, David Mouillet, Laurent M. Mugnier, Gilles Orban de Xivry, Frans Snik, Daniele Vassallo, Arthur Vigan, Pieter de Visser
Mar 17, 2025·astro-ph.IM

The Habitable Worlds Observatory (HWO) will enable a transformative leap in the direct imaging and characterization of Earth-like exoplanets. For this, NASA is focusing on early investment in technology development prior to mission definition and is actively seeking international partnerships earlier than for previous missions. The "R&D for Space-Based HCI in Europe" workshop, held in March 2024 at Paris Observatory, convened leading experts in high-contrast imaging (HCI) to discuss European expertise and explore potential strategies for European contributions to HWO. This paper synthesizes the discussions and outcomes of the workshop, highlighting Europe's critical contributions to past and current HCI efforts, the synergies between ground- and space-based technologies, and the importance of laboratory testbeds and collaborative funding mechanisms. Key conclusions include the need for Europe to invest in technology development in areas such as deformable mirrors and advanced detectors, and to establish or enhance laboratory facilities for system-level testing. Emphasizing the urgency of aligning with the HWO timeline, the participants called for an open affirmation by the European Space Agency (ESA) that a European contribution to HWO is clearly anticipated, to signal national agencies and unlock funding opportunities at the national level. Based on the expertise demonstrated through R&D, Europe is poised to play a pivotal role in advancing global HCI capabilities, contributing to the characterization of temperate exoplanets and fostering innovation across domains.
Lorenzo Cesario, Tim Lichtenberg, Eleonora Alei, Óscar Carrión-González, Felix A. Dannert, Denis Defrère, Steve Ertel, Andrea Fortier, A. García Muñoz, Adrian M. Glauser, Jonah T. Hansen, Ravit Helled, Philipp A. Huber, Michael J. Ireland, Jens Kammerer, Romain Laugier, Jorge Lillo-Box, Franziska Menti, Michael R. Meyer, Lena Noack, Sascha P. Quanz, Andreas Quirrenbach, Sarah Rugheimer, Floris van der Tak, Haiyang S. Wang, Marius Anger, Olga Balsalobre-Ruza, Surendra Bhattarai, Marrick Braam, Amadeo Castro-González, Charles S. Cockell, Tereza Constantinou, Gabriele Cugno, Jeanne Davoult, Manuel Güdel, Nina Hernitschek, Sasha Hinkley, Satoshi Itoh, Markus Janson, Anders Johansen, Hugh R. A. Jones, Stephen R. Kane, Tim A. van Kempen, Kristina G. Kislyakova, Judith Korth, Andjelka B. Kovacevic, Stefan Kraus, Rolf Kuiper, Joice Mathew, Taro Matsuo, Yamila Miguel, Michiel Min, Ramon Navarro, Ramses M. Ramirez, Heike Rauer, Berke Vow Ricketti, Amedeo Romagnolo, Martin Schlecker, Evan L. Sneed, Vito Squicciarini, Keivan G. Stassun, Motohide Tamura, Daniel Viudez-Moreiras, Robin D. Wordsworth, the LIFE Collaboration
Oct 17, 2024·astro-ph.EP

The increased brightness temperature of young rocky protoplanets during their magma ocean epoch makes them potentially amenable to atmospheric characterization out to distances from the solar system far greater than thermally equilibrated terrestrial exoplanets, offering observational opportunities for unique insights into the origin of secondary atmospheres and the near-surface conditions of prebiotic environments. The Large Interferometer For Exoplanets (LIFE) mission will employ a space-based mid-infrared nulling interferometer to directly measure the thermal emission of terrestrial exoplanets. Here, we assess the capabilities of various instrumental design choices of the LIFE mission concept for the detection of cooling protoplanets with transient high-temperature magma ocean atmospheres, in young stellar associations in particular. Using the LIFE mission instrument simulator (LIFEsim), we assess how specific instrumental parameters and design choices, such as wavelength coverage, aperture diameter, and photon throughput, facilitate or hinder the detection of protoplanets. We focus on the observational sensitivities to the distance of the observed planetary system, the protoplanet brightness temperature under a blackbody assumption, and the orbital distance of the potential protoplanets around both G- and M-dwarf stars. Our simulations suggest that LIFE will be able to detect (S/N ≥ 7) hot protoplanets in young stellar associations up to distances of ≈100 pc from the solar system for reasonable integration times (up to a few hours). Detection of an Earth-sized protoplanet orbiting a solar-sized host star at 1 AU requires less than 30 minutes of integration time, and M dwarfs generally need shorter integration times. The contribution from wavelength regions below 6 µm is important for decreasing the detection threshold and discriminating emission temperatures.
Felix A. Dannert, Philipp A. Huber, Thomas Birbacher, Romain Laugier, Markus J. Bonse, Emily O. Garvin, Adrian M. Glauser, Veronika Oehl, Sascha P. Quanz
Jun 25, 2025·astro-ph.IM

With the astrophysics community working towards the first observations and characterizations of Earth-like exoplanets, interest in space-based nulling interferometry has been renewed. This technique promises unique scientific and technical advantages by enabling direct mid-infrared observations. However, concept studies of nulling interferometers often overlook the impact of systematic noise caused by instrument perturbations. Earlier research introduced analytical and numerical models to address instrumental noise and, building on these results, we reproduce key simulations and report that the noise in the differential output of nulling interferometers follows a non-Gaussian distribution. The presence of non-Gaussian noise challenges the validity of classical hypothesis tests in detection performance estimates, as their reliance on Gaussian assumptions leads to overconfidence in detection thresholds. For the first time, we derive the true noise distribution of the differential output of a dual Bracewell nulling interferometer, demonstrating that it follows iterative convolutions of Bessel functions. Understanding this noise distribution enables a refined formulation of hypothesis testing in nulling interferometry, leading to a semi-analytical prediction of detection performance. This computationally efficient instrument model, implemented in a publicly available codebase, is designed for integration into science yield predictions for nulling interferometry mission concepts. It will play a key role in refining key mission parameters for the Large Interferometer For Exoplanets (LIFE).
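A hedged toy of why Gaussian thresholds become overconfident here (this is not the paper's instrument model): second-order cross terms in a nulled output behave like products of independent small Gaussian perturbations, and the product of two standard normals follows a heavy-tailed Bessel-K0 density rather than a Gaussian.

```python
import numpy as np
from scipy.stats import kurtosis

# Toy illustration: the product of two independent standard normals has a
# Bessel-K0 density -- sharply peaked and heavy-tailed, clearly non-Gaussian.
rng = np.random.default_rng(2)
n = 1_000_000
d = rng.standard_normal(n) * rng.standard_normal(n)

# Excess kurtosis is 0 for a Gaussian but ~6 for this product distribution,
# so detection thresholds derived under a Gaussian assumption would sit far
# too low in the tails.
print(kurtosis(d))
```

The heavy tails mean that a threshold set from a fitted Gaussian underestimates the true false-alarm rate, which is the failure mode the abstract describes.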
GRAVITY+ Collaboration, Roberto Abuter, Patricio Alarcon, Fatme Allouche, Antonio Amorim, Christophe Bailet, Helen Bedigan, Anthony Berdeu, Jean-Philippe Berger, Philippe Berio, Azzurra Bigioli, Richard Blaho, Olivier Boebion, Marie-Lena Bolzer, Henri Bonnet, Guillaume Bourdarot, Pierre Bourget, Wolfgang Brandner, Cesar Cardenas, Ralf Conzelmann, Mauro Comin, Yann Clénet, Benjamin Courtney-Barrer, Yigit Dallilar, Ric Davies, Denis Defrère, Alain Delboulbé, Françoise Delplancke-Ströbele, Roderick Dembet, Tim de Zeeuw, Antonia Drescher, Andreas Eckart, Clemence Édouard, Frank Eisenhauer, Maximilian Fabricius, Helmut Feuchtgruber, Gert Finger, Natascha M. Förster Schreiber, Eloy Fuenteseca, Enrique Garcia, Paulo Garcia, Feng Gao, Eric Gendron, Reinhard Genzel, Juan Pablo Gil, Stefan Gillessen, Tiago Gomes, Frédéric Gonté, Carole Gouvret, Patricia Guajardo, Ivan Guidolin, Sylvain Guieu, Ronald Guzmann, Wolfgang Hackenberg, Nicolas Haddad, Michael Hartl, Xavier Haubois, Frank Haußmann, Gernot Heißel, Thomas Henning, Stefan Hippler, Sebastian Hönig, Matthew Horrobin, Norbert Hubin, Estelle Jacqmart, Laurent Jocou, Andreas Kaufer, Pierre Kervella, Jean-Paul Kirchbauer, Johan Kolb, Heidi Korhonen, Laura Kreidberg, Peter Krempl, Sylvestre Lacour, Stephane Lagarde, Olivier Lai, Vincent Lapeyrère, Romain Laugier, Jean-Baptiste Le Bouquin, James Leftley, Pierre Léna, Steffan Lewis, Dieter Lutz, Yves Magnard, Felix Mang, Aurelie Marcotto, Didier Maurel, Antoine Mérand, Florentin Millour, Nikhil More, Hugo Nowack, Matthias Nowak, Sylvain Oberti, Francisco Olivares, Thomas Ott, Laurent Pallanca, Thibaut Paumard, Karine Perraut, Guy Perrin, Romain Petrov, Oliver Pfuhl, Nicolas Pourré, Sebastian Rabien, Christian Rau, Miguel Riquelme, Sylvie Robbe-Dubois, Sylvain Rochat, Muhammad Salman, Malte Scherbarth, Markus Schöller, Joseph Schubert, Nicolas Schuhler, Jinyi Shangguan, Pavel Shchekaturov, Taro Shimizu, Silvia Scheithauer, Arnaud Sevin, Christian Soenke, Ferreol Soulez, Alain
Spang, Eric Stadler, Christian Straubmeier, Eckhard Sturm, Calvin Sykes, Linda Tacconi, Helmut Tischer, Konrad Tristram, Frédéric Vincent, Sebastiano von Fellenberg, Sinem Uysal, Felix Widmann, Ekkehard Wieprecht, Erich Wiezorrek, Julien Woillez, Şenol Yazıcı, Gérard Zins
Frantz Martinache, Alban Ceau, Romain Laugier, Jens Kammerer, Mamadou N'Diaye, David Mary, Nick Cvetojevic, Coline Lopez
Kernel phase is a data-analysis method based on a generalization of the notion of closure phase, invented in the context of interferometry but applicable to well-corrected, diffraction-dominated images produced by an arbitrary aperture. The linear model upon which it relies theoretically leads to the formation of observable quantities robust against residual aberrations. In practice, detection limits reported thus far seem to be dominated by systematic errors induced by calibration biases not sufficiently filtered out by the kernel projection operator. This paper focuses on the impact the initial modeling of the aperture has on these errors and introduces a strategy to mitigate them, using a more accurate aperture transmission model. The paper first uses idealized monochromatic simulations of a nontrivial aperture to illustrate the impact modeling choices have on calibration errors. It then applies the outlined prescription to two distinct datasets of images whose analyses have previously been published. The use of a transmission model to describe the aperture results in a significant improvement over the previous type of analysis: the reprocessed datasets generally lead to more accurate results, less affected by systematic errors. As kernel-phase observing programs become more ambitious, accuracy in the aperture description is becoming paramount to avoid situations where contrast detection limits are dominated by systematic errors. The prescriptions outlined in this paper will benefit any attempt at exploiting kernel phase for high-contrast detection.
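A minimal sketch of the linear model and kernel projection at the heart of the method, with random stand-in matrices rather than a real aperture model (sizes and names here are illustrative assumptions): measured Fourier phases are approximately the object's phases plus A times the pupil aberrations, and rows spanning the left null space of A yield aberration-robust kernel phases.

```python
import numpy as np

# Stand-in phase-transfer matrix A: maps n_pupil pupil-phase modes to
# n_uv measured Fourier phases. In a real analysis A comes from the
# aperture (transmission) model; here it is random for illustration.
rng = np.random.default_rng(3)
n_uv, n_pupil = 20, 8
A = rng.standard_normal((n_uv, n_pupil))

# The kernel operator K spans the left null space of A, so K @ A = 0.
u, sing, vt = np.linalg.svd(A)
K = u[:, n_pupil:].T                 # shape (n_uv - n_pupil, n_uv)

object_phases = rng.standard_normal(n_uv)
aberration = rng.standard_normal(n_pupil)
measured = object_phases + A @ aberration

# Kernel phases of the measurement equal those of the object alone:
assert np.allclose(K @ measured, K @ object_phases)
```

This also makes the paper's point concrete: if the real transfer matrix differs from the modeled A, the kernel projection no longer annihilates the aberration term, and the residual appears as a calibration bias.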
Olivier Lai, Kanoa Withington, Romain Laugier, Mark Chun
Dome seeing is a known source of image-quality degradation, but despite tremendous progress in wavefront control with the development of adaptive optics, and in environmental control through the implementation of dome venting, surprisingly little is known about it quantitatively. We have found evidence of non-Kolmogorov dome turbulence in our observations with the imaka wide-field adaptive optics system: PSFs seem to indicate an excess of high spatial frequencies, and turbulence profiles reveal turbulence at negative conjugations. This has motivated the development of a new type of optical turbulence sensor called AIR-FLOW (Airborne Interferometric Recombiner: Fluctuations of Light at Optical Wavelengths). It is a non-redundant-mask imaging interferometer that samples the optical turbulence passing through a measurement cell and measures the two-dimensional optical phase structure function. This is a useful tool to characterise different types of turbulence (e.g. Kolmogorov, diffusive turbulence, etc.): by fitting different models, we can determine parameters such as Cn², r0, and L0, or the deviation from fully developed turbulence. The instrument was tested at the Canada-France-Hawaii Telescope, at the University of Hawaii 2.2-meter telescope (UH88) and at the Observatoire de la Côte d'Azur. It is ruggedised and sensitive enough to detect changes with different dome-vent configurations, as well as slow local variations of the index of refraction in the UH88 telescope tube. The instrument is portable enough to be used to locate sources of turbulence inside and around domes, but it can also be used in an operational setting, without affecting observations, to characterise the local optical turbulence responsible for dome seeing. It could thus be used in real-time observatory control systems to configure vents and air handlers to effectively reduce dome seeing. We believe it could also be a tool for site surveys to evaluate dome-seeing mitigation strategies in situ.
Marc-Antoine Martinod, Denis Defrere, Romain Laugier, Steve Ertel, Olivier Absil, Barnaby Norris, Bertrand Mennesson
May 13, 2025·astro-ph.IM

Nulling interferometry is a powerful observing technique to study exoplanets and circumstellar dust at separations too small for direct imaging with single-dish telescopes. With recent photonics developments and near-future ground-based instrumental projects, it bears the potential to detect young giant planets near the snow lines of their host stars. The observable quantity of a nulling interferometer is called the null depth, and its precise measurement and calibration against instrumental and atmospheric noise remain challenging. Null self-calibration is a method that models the statistical distribution of the nulled signal; it has proven more sensitive and accurate than average-based data reduction methods in nulling interferometry. The variety of existing and upcoming nullers raises the issues of the consistency of the calibration process, the structure of the data, and the ability to reduce archived data over the long term. It has also led to many different implementations of the null self-calibration method. In this article, we introduce GRIP: the first open-source toolbox to reduce nulling data with enhanced statistical self-calibration methods from any nulling interferometric instrument, within a single and consistent framework. Astrophysical results show good consistency with two published GLINT and LBTI datasets and confirm nulling precision down to a few 10^-4.
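As a hedged toy of the statistical self-calibration idea (a simplified small-phase model fitted by its moments; GRIP's actual likelihood machinery is richer than this): for small differential-phase jitter the instantaneous null depth is roughly the astrophysical null plus a quadratic leakage term, and fitting the distribution of the samples recovers the astrophysical null even though it lies well below the mean leakage.

```python
import numpy as np

# Simplified model (an assumption for illustration): for differential-phase
# jitter phi ~ N(0, sigma), the instantaneous null depth is
#   n = n_astro + phi**2 / 4,
# i.e. a shifted, scaled chi-square with one degree of freedom.
rng = np.random.default_rng(4)
n_astro_true, sigma_true = 2e-4, 0.05
phi = sigma_true * rng.standard_normal(100_000)
samples = n_astro_true + phi**2 / 4

# Moment matching: mean = n_astro + sigma**2/4 and var = sigma**4/8,
# so two sample moments determine both parameters.
sigma2_fit = np.sqrt(8.0 * samples.var())
n_astro_fit = samples.mean() - sigma2_fit / 4

# The fit recovers the astrophysical null even though the mean
# instrumental leakage (sigma**2/4) is several times larger.
print(n_astro_fit, np.sqrt(sigma2_fit))
```

Simply averaging the samples would return the null depth biased by the full leakage term; exploiting the shape of the distribution is what removes that bias, which is the sense in which the method self-calibrates.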
Marc-Antoine Martinod, Denis Defrère, Romain Laugier, Steve Ertel, Olivier Absil, Barnaby Norris, Germain Garreau, Bertrand Mennesson
Jul 11, 2024·astro-ph.IM

Nulling interferometry is a powerful observing technique to reach exoplanets and circumstellar dust at separations too small for direct imaging with single-dish telescopes and too large for indirect methods. With near-future instrumentation, it bears the potential to detect young, hot planets near the snow lines of their host stars. A future space mission could detect and characterize a large number of rocky, habitable-zone planets around nearby stars at thermal-infrared wavelengths. Null self-calibration is a method that models the statistical distribution of the nulled signal. It has proven more sensitive and accurate than average-based data reduction methods in nulling interferometry. This statistical approach opens the possibility of designing a GPU-based Python package to reduce the data from any of these instruments, by simply providing the data and a simulator of the instrument. GRIP is a toolbox to reduce nulling and interferometric data based on the statistical self-calibration method. In this article, we present the main features of GRIP as well as applications to real data.
Drinor Cacaj, Daniel Angerhausen, Prabal Saxena, Romain Laugier, Jens Kammerer, Eleonora Alei, Sascha P. Quanz
One of the primary objectives in modern astronomy is to discover and study planets with characteristics similar to Earth. This pursuit involves analyzing the spectra of exoplanets and searching for biosignatures. Contamination of spectra by nearby objects (e.g., other planets and moons in the same system) is a significant concern and must be addressed for future exo-Earth search missions. Our aim is to estimate, for habitable planets, the probability of spectral contamination by other planets within the same star system. This investigation focuses on the Large Interferometer for Exoplanets (LIFE). Since the Rayleigh criterion is inapplicable to interferometers such as those proposed for LIFE, we present new criteria based on the principle of parsimony, which take into account two types of issues: contamination or blending of point sources, and cancellation of point sources due to destructive interference. We define a new spatial-resolution metric associated with contamination or cancellation that generalizes to a broader family of observing instruments. In the current baseline design, LIFE is an X-array-architecture nulling interferometer. Our investigation reveals that its transmission map introduces the potential for two point sources to appear as one, even if they are not in close proximity. We find that LIFE has a spatial resolution comparable to that of a traditional telescope with a diameter of D = 600 m observing at λ = 4 μm. Our survey of a star-system population shows that, of 73.4 expected habitable planets detected, 71.3 on average are not contaminated.
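As a hedged illustration of why fringe-type transmission maps defeat the Rayleigh criterion, the toy below uses a single-baseline (Bracewell-like) map with assumed numbers, not LIFE's actual X-array response: the on-axis star sits on a null, and two sources a full fringe period apart see identical transmission, so they can blend into one apparent signal despite being well separated.

```python
import numpy as np

# Toy single-baseline transmission map: T(theta) = sin^2(pi * B * theta / lam).
# B and lam are illustrative assumptions, not LIFE design values.
B = 100.0                          # baseline length (m), assumed
lam = 4e-6                         # observing wavelength (m), assumed
mas = np.pi / (180 * 3600 * 1000)  # one milliarcsecond in radians

def transmission(theta_mas):
    """Off-axis transmission at angular offset theta (in mas)."""
    return np.sin(np.pi * B * theta_mas * mas / lam) ** 2

# The central star (theta = 0) is nulled outright, and sources one fringe
# period apart (~8.25 mas for these numbers) are indistinguishable in
# transmission -- the kind of blending the new criteria account for.
period = lam / B / mas
print(transmission(0.0), transmission(2.0), transmission(2.0 + period))
```

A source that happens to land on any null of this map is cancelled entirely, which is the "cancellation of point sources due to destructive interference" issue named above.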
Gael Chauvin, Oscar Carrion Gonzalez, Iva Laginja, Daniel Dicken, Sebastiaan Haffert, Markus Kasper, Olivier Absil, Jens Kammerer, Axel Potier, Herve Le Coroller, Elisabeth Matthews, Romain Laugier, Denis Defrere, Jonah Hansen, Oliver Krause, Pieter de Visser, Mikael Karlsson, Thomas Henning, Marc Ferrari, Jonas Kuehn, Markus Janson, Feng Zhao, Celia Desgrange
Apr 16, 2026·astro-ph.IM

The European Research and Development for Space-based High Contrast Imaging II Workshop, held at MPIA in May 2025, advanced Europe's strategic coordination in support of future exoplanet imaging missions such as the Habitable Worlds Observatory and the Large Interferometer for Exoplanets mission. Building on the first 2024 workshop, this meeting defined concrete priorities across eight technical areas, including wavefront sensing, coronagraphs, post-processing, nulling interferometry, deformable mirrors, detectors, and telescope design. Discussions emphasized Europe's strengths in adaptive optics, ground-based facilities, and interferometry, while identifying key gaps, particularly the need for a dedicated European vacuum testbed for high-contrast imaging. The community highlighted near-infrared or UV coronagraphy as a promising domain for European leadership and called for the joint development of advanced data reduction algorithms, detectors, and cross-mission coordination with HWO and LIFE. The workshop outcomes establish a collaborative roadmap to strengthen Europe's technological readiness, foster agency partnerships, and ensure its continued leadership in the next generation of space-based exoplanet exploration.
Nick Cvetojevic, Frantz Martinache, Peter Chingaipe, Romain Laugier, Katarzyna Ławniczuk, Ronald G. Broeke, Roxanne Ligi, Mamadou N'Diaye, David Mary
Jun 10, 2022·astro-ph.IM

The use of interferometric nulling for the direct characterization of extrasolar planets is an exciting prospect, but one that faces many practical challenges when deployed on telescopes. The largest limitation is the extreme sensitivity of nullers to any residual optical path differences between the incoming telescope beams, even after adaptive optics or fringe-tracker correction. The recently proposed kernel-nulling architecture attempts to alleviate this by producing the destructive interference required for nulling in a scheme whereby self-calibrated observables can be created efficiently, in effect canceling out residual atmospheric piston terms. Here we experimentally demonstrate, for the first time, the successful creation of self-calibrated kernel-null observables for nulling interferometry in the laboratory. We achieved this through the use of a purpose-built photonic integrated device containing a multimode interference coupler that creates one bright and two nulled outputs when injected with three co-phased telescope beams. The device produces the nulled outputs in such a way that the subtraction of the measured output fluxes creates a single self-calibrated kernel null. We experimentally demonstrate the extraction of kernel nulls for up to 200 nm of induced piston error using a laboratory test bench at a wavelength of 1.55 µm. Further, we empirically demonstrate the kernel-null behaviour when injecting a binary-companion analogue equivalent to a 2.32 mas separation at a contrast of 10^-2, under 100 nm RMS upstream piston residuals.
Taro Matsuo, Felix Dannert, Romain Laugier, Sascha P. Quanz, Andjelka B. Kovacevic, LIFE collaboration
A mid-infrared nulling space interferometer is a promising way to characterize thermal light from habitable planet candidates around Sun-like stars. However, one of the main challenges for achieving this ambitious goal is the high-precision stability of the optical path difference (OPD) and amplitude required over a few days for planet detection and up to a few weeks for in-depth characterization. Here we propose a new method called phase-space synthesis decomposition (PSSD) to shorten the stability requirement to minutes, significantly relaxing the technological challenges of the mission. Focusing on what exactly modulates the planet signal in the presence of the stellar leak and systematic error, PSSD prioritizes modulation of the signals along the wavelength domain rather than baseline rotation. Modulation along the wavelength domain allows us to extract source positions parallel to the baseline vector for each exposure. Summing the resulting one-dimensional data converts them into two-dimensional information. Based on the reconstructed image, we construct a continuous equation and extract the spectra through singular value decomposition (SVD) while efficiently separating them from long-term systematic stellar leakage. We performed numerical simulations to investigate the feasibility of PSSD for the LIFE mission concept. We confirm that multiple terrestrial planets in the habitable zone around a Sun-like star at 10 pc can be detected and characterized despite high levels and long durations of systematic noise. We also find that PSSD is more robust against sparse sampling of the array rotation compared to purely rotation-based signal extraction. Using PSSD as a signal extraction method significantly relaxes the technical requirements on signal stability and further increases the feasibility of the LIFE mission.
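The SVD step of the extraction can be sketched generically. The code below is not the PSSD pipeline itself: it only illustrates how a spectrum is recovered from a continuous linear equation `y = A @ x` by inverting through the singular value decomposition while truncating near-degenerate modes; the design matrix, spectrum, and noise level are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear model: A maps a stand-in planet spectrum x_true
# to the measured signal y, with a small noise term.
n_meas, n_spec = 200, 20
A = rng.standard_normal((n_meas, n_spec))
x_true = rng.uniform(0.5, 1.5, n_spec)           # stand-in spectrum
y = A @ x_true + 1e-6 * rng.standard_normal(n_meas)

# Invert via SVD, discarding near-null singular modes that would
# amplify noise (the analogue of separating poorly constrained terms).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-10 * s[0]
x_hat = Vt.T[:, keep] @ ((U[:, keep].T @ y) / s[keep])
```

Because this synthetic system is well conditioned, `x_hat` recovers `x_true` to roughly the noise level; in the PSSD context, the truncation threshold is what separates the planet spectra from the long-term systematic stellar leak.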
Yinzi Xin, Laurent Pueyo, Romain Laugier, Leonid Pogorelyuk, Ewan S. Douglas, Benjamin J. S. Pope, Kerri L. Cahoy
Directly observing exoplanets with coronagraphs is impeded by the presence of speckles from aberrations in the optical path, which can be mitigated in hardware with wavefront control as well as in post-processing. This work explores using an instrument model in post-processing to separate astrophysical signals from residual aberrations in coronagraphic data. The effect of wavefront error (WFE) on the coronagraphic intensity consists of a linear contribution and a quadratic contribution. When either of the terms is much larger than the other, the instrument response can be approximated by a transfer matrix mapping WFE to detector plane intensity. From this transfer matrix, a useful projection onto instrumental modes that removes the dominant error modes can be derived. We apply this projection to synthetically generated Roman Space Telescope hybrid Lyot coronagraph (HLC) data to extract "robust observables," which can be used instead of raw data for applications such as detection testing. The projection improves planet flux ratio detection limits by about 28% in the linear regime and by over a factor of 2 in the quadratic regime, illustrating that robust observables can increase sensitivity to astrophysical signals and improve the scientific yield from coronagraphic data. While this approach does not require additional information such as observations of reference stars or modulations of a deformable mirror, it can and should be combined with these other techniques, acting as a model-informed prior in an overall post-processing strategy.
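The projection onto robust observables can be sketched with synthetic matrices. In the sketch below, `G` stands in for the transfer matrix mapping wavefront-error modes to detector-plane intensity; projecting the data onto the orthogonal complement of `G`'s column space removes the dominant aberration-induced speckle term exactly while retaining most of a generic astrophysical signal. All dimensions and signals here are illustrative assumptions, not the Roman HLC model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic transfer matrix: 30 wavefront-error modes -> 500 detector pixels.
n_pix, n_modes = 500, 30
G = rng.standard_normal((n_pix, n_modes))

# Projector onto the complement of G's column space: data components
# explainable by wavefront error are removed.
U, _, _ = np.linalg.svd(G, full_matrices=False)
P = np.eye(n_pix) - U @ U.T

speckle = G @ rng.standard_normal(n_modes)   # aberration-induced intensity
planet = rng.standard_normal(n_pix)          # stand-in astrophysical signal
data = speckle + 1e-2 * planet

robust = P @ data  # speckle term removed; most of the planet signal survives
```

Since only 30 of 500 dimensions are projected out, the astrophysical signal loses only a small fraction of its norm, which is why the robust observables retain sensitivity while suppressing the instrumental error modes.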