
Book of abstracts


 

Uncertainties of chemical kinetic data: their effects on the chemistry in AGB outflows

Marie Van de Sande

During the asymptotic giant branch (AGB) phase near the end of their lives, solar-like stars lose their outer envelope via a stellar outflow to the interstellar medium (ISM). AGB outflows are rich astrochemical laboratories: more than 100 molecules have been detected in them, together with various types of newly formed dust. Chemistry and dynamics are closely coupled throughout the outflow, as it is the chemical process of dust formation that launches the outflow.

Chemical kinetics modelling of AGB outflows is therefore essential to interpret observations and understand the underlying dynamics. The thousands of reactions included in the chemical network, which lies at the core of the model, each come with an uncertainty on their reaction rate. Following the methods of Wakelam et al. (2005, 2010), we performed the first study of the effects of these uncertainties on the chemistry within AGB outflows.

Our results show how uncertainties of the kinetic data affect the predicted chemistry throughout the outflow, ranging from the sizes of molecular envelopes to the shapes of abundance profiles. We discuss their consequences for the interpretation of observations. The results open up questions as to how accurate the chemical model can and should be, for which species and why, and what could be done to improve the network specifically for AGB outflows.
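The Monte Carlo method of Wakelam et al. cited above can be caricatured on a toy one-reaction network: draw each rate coefficient from a log-normal distribution whose width encodes its uncertainty factor, and inspect the spread of the predicted abundances. Every number below (nominal rate, uncertainty factor, timescale) is an arbitrary placeholder for illustration, not a value from the actual study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy network: a single destruction reaction X -> products with nominal
# rate k0 and uncertainty factor F (one-sigma spread in log space).
k0 = 1.0e-9   # nominal rate coefficient (s^-1, illustrative)
F = 2.0       # uncertainty factor on k0
t = 1.0e9     # evolution time (s), so k0 * t = 1

# Sample log10(k) from a normal distribution of width log10(F).
n_samples = 10_000
k = 10.0 ** rng.normal(np.log10(k0), np.log10(F), n_samples)

# Abundance of X after time t for each sampled rate (analytic solution
# of dX/dt = -k X with X(0) = 1).
x = np.exp(-k * t)

lo, med, hi = np.percentile(x, [16, 50, 84])
print(f"median abundance: {med:.3f}, 1-sigma range: [{lo:.3f}, {hi:.3f}]")
```

A factor-of-two uncertainty on a single rate already translates into a broad, asymmetric abundance distribution; in a real network with thousands of coupled reactions, the propagated spread is what limits how strongly model predictions can constrain observations.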

 

Interdisciplinary approach to understanding the chemistry of Titan's atmosphere

Ludovic Biennier

During its thirteen years of observation of the Saturnian system, Cassini-Huygens has provided a new view of Titan and its atmosphere. Despite significant progress in our understanding of this moon, some questions remain and, as often in research, many new ones have appeared. The ubiquitous haze that surrounds Titan retains many mysteries regarding its composition and formation pathways. Interdisciplinary approaches, combining astronomical observations, space missions, laboratory experiments, theoretical calculations and photochemical models, are required to make substantial advances.

 

Extending the loop: model evaluation and discordances in astrochemistry

Marie Gueguen

I present different strategies for evaluating the adequacy of a model in a context of high uncertainties, all based on the idea of iterative coherence testing, i.e., a methodology that anchors the improvement of the model's accuracy in an iterative process of resolving the discrepancies between the model's predictions and observations. Based on Le Quéré (2006), I consider three different phases of the development of a model and how iterative coherence testing, or the "closing the loop" methodology in G. Smith's terms, can guide decisions for improving the model in each of these phases. I argue that iterative testing serves a different purpose and is connected to a different evaluation strategy at each phase of a model's development, from the adoption of an initial model known to be imperfect to a mature model ready to be confronted with observations.

I will illustrate the merits and pitfalls of such an account of iterative testing and model evaluation with the example of the photochemical models of Titan's atmosphere developed over the last two decades. Models of Titan's atmosphere have roughly followed two very different paths: either they started with a complex chemical model but simplified astrophysical constraints, or with a complex description of the astrophysical conditions but a simplified chemistry. Both, however, failed to reproduce observations, although the uncertainties attached to the observations made the discordance not so worrying, until the observations of the Cassini-Huygens mission were released in 2007. The latter showed that the discrepancies were not dampened but, on the contrary, worsened by improved observations: the uncertainties attached to the models were shown to be higher than those related to the observations, rendering the comparison between observations and models impossible to interpret. This case led to an impressive surge of creativity from astrochemists, who exploited this opportunity to develop innovative iterative tools designed to enrich and develop their photochemical networks to the point where a discordance between their chemical models and the Cassini observations could finally be meaningfully interpreted.

 

Non-LTE modelling of the HCCNC and HNCCC abundance in astrophysical environments

Cheikh T. Bop, Alexandre Faure, Ernesto Quintas-Sánchez, Richard Dawes, and François Lique

The isomers of HCCCN, namely HCCNC and HNCCC, are widely observed in the interstellar medium and in circumstellar envelopes. Their abundances have been determined under the assumption of local thermodynamic equilibrium (LTE) or with non-LTE radiative transfer models, but treating the collisional excitation of all isomers as identical to that of HCCCN. Chemical models for the prototypical cold cores TMC-1 and L1544 reproduce the abundance of HCCCN fairly well, but tend to overestimate the abundances of HCCNC and HNCCC with respect to the observations. It is therefore worth revisiting the interpretation of the observational spectra of these isomers using rigorous non-LTE modelling. In this work, the abundances of HCCNC and HNCCC were determined for the first time using non-LTE radiative transfer calculations based on their proper rate coefficients. Modelling the brightness temperatures of HCCNC and HNCCC with their proper collisional rate coefficients shows that models based on LTE, or on non-LTE with approximate collisional data, may deviate by up to a factor of 1.5. Reinterpreting the observational spectra led to significant differences relative to the previously determined abundances. Our findings suggest quite similar abundance ratios for the TMC-1 and L1544 cold cores as well as for the L483 protostar. This work will encourage further modelling with more robust non-LTE radiative transfer calculations and future studies revisiting the chemistry of HCCCN and its isomers in cold molecular clouds.

Astrophysics joins data science to capture the true nature of stellar nurseries

Jérôme Pety

Atoms and molecules have long been thought to be versatile tracers of the cold neutral gas in the universe, from high-redshift galaxies to star forming regions and proto-planetary disks, because their internal degrees of freedom bear the signature of the physical conditions in which these species reside. However, the promise that molecular emission has strong diagnostic power over the underlying physical and chemical state is still hampered by the difficulty of combining sophisticated chemical codes with an incomplete observational knowledge of the source. It is therefore important 1) to acquire self-consistent data sets that can be used as templates for this theoretical work, and 2) to accurately reveal the diagnostic capabilities of molecular lines.

I will present some efforts to address the associated challenges: lack of ground truth, censored information, existence of hidden explanatory variables, biases related to the use of overly simple models, etc. I will use two recent large observing projects (the Horsehead WHISPER and ORION-B surveys) to illustrate these efforts.

Is it worth investigating phenomena that are predictable with good accuracy?

Sándor Demes and François Lique

Astronomy and astrophysics is a rapidly evolving field of science, driven by technological progress. Modern telescopes allow observations to be carried out very efficiently, offering the possibility of looking into various regions of the Universe. In order to correctly interpret the observations, however, complementary research is required: laboratory measurements and/or theoretical modelling. For instance, when a molecular species is observed in space, precise collisional and spectroscopic data are needed in order to correctly describe the physical conditions of the astronomical environment where it was detected.

Very often the spectroscopic and especially the collisional data are challenging to obtain, from both experimental and theoretical perspectives. Collisional rate coefficients can be computed only with sophisticated quantum theories. Such calculations are computationally very demanding, so one should carefully choose the boundaries of the calculations (e.g. limitations on energy, temperature, etc.) and properly select the approximations that can be applied.

To estimate rate coefficients for molecular collisions involving H2 (the most abundant molecule in interstellar clouds), collisional data obtained with the helium atom are often used instead. Many earlier studies have shown that a multiplicative factor of 1.4 applied to rate coefficients calculated for He can be a good approximation for collisional rates with molecular hydrogen. This is based on their similar electronic structure and small effective size. Several studies have shown, however, the strong limitations of this crude approximation, concluding that collisional data for H2 are necessary for precise astrophysical applications.
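The factor of 1.4 quoted above is essentially a kinetic-theory estimate: the square root of the ratio of the reduced masses of the two collision systems. A minimal sketch (the target molecule and its mass are chosen arbitrarily for illustration):

```python
import math

def reduced_mass(m1, m2):
    """Reduced mass of a two-body collision system (amu)."""
    return m1 * m2 / (m1 + m2)

def he_to_h2_scaling(m_molecule):
    """Scaling factor sqrt(mu_He / mu_H2) applied to He rate coefficients
    to approximate those for collisions with H2."""
    mu_he = reduced_mass(m_molecule, 4.0026)  # He atomic mass (amu)
    mu_h2 = reduced_mass(m_molecule, 2.0159)  # H2 molecular mass (amu)
    return math.sqrt(mu_he / mu_h2)

# For a collider much heavier than He or H2 the factor tends to
# sqrt(4/2) = sqrt(2) ~ 1.41; for a 51 amu molecule it is already close.
print(f"{he_to_h2_scaling(51.0):.3f}")
```

The sketch also makes the limitation visible: the factor only corrects the mean relative velocity, whereas the He and H2 interaction potentials (and the rotational structure of H2) can differ far more than this.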

On the other hand, when analysing the rate coefficients for collisions of ionic species with nuclear-spin-selected para- or ortho-H2, we find that the collisional rates are very similar: the difference between them is usually less than 10–20%, which is well within the required "astronomical precision". Since the calculation of rate coefficients for collisions with both para- and ortho-H2 is very demanding, the following question arises: is it really worth investigating phenomena that are predictable with good accuracy? Is it valid, from a critical scientific perspective, to calculate or measure a value (e.g. a collisional rate coefficient for para-H2) and then "suppose" the same value for another quantity (e.g. the corresponding rate coefficient for ortho-H2) based on previous general experience?

 

Two or three things I'm sure about uncertainty 

Pascal Pernot

I will present a few lessons learned over the years about the probabilistic representation of uncertain properties, the difficulties of uncertainty quantification, and the logic of prediction uncertainty validation. Applications range from Titan's ionospheric chemistry modeling to computational chemistry predictions, through the analysis of inconsistent experimental data with inadequate models.

Main references:

* E. Hébrard et al. (2009)  J. Phys. Chem. A 113:11227–11237.

* S. Plessis et al. (2010) J. Chem. Phys. 133:34110.

* P. Pernot & F. Cailliez (2017) AIChE J. 63:4642–4665.

* P. Pernot (2022)  J. Chem. Phys. 156:114109.

 

Computing Good Pseudo-Random Numbers: a Social and Cognitive Problem

Cyrille Imbert

Philosophers of science have emphasized over the last decades how the epistemological analysis of science requires considering its social and cognitive dimensions. In this talk, I highlight how this stance is also appropriate concerning the completion of a central task of contemporary science, namely computing pseudo-random numbers that are good enough for some target inquiry. Indeed, it can require pointed expertise, both concerning one's scientific needs and concerning PRNGs, to determine whether a (P)RNG is sufficient for a particular scientific inquiry. Furthermore, evidence shows that the computational environment of scientists is not safe territory, since most available pseudo-random number generators suffer from various shortcomings and may not be trivial to use safely.

In this context, I provide evidence that heuristics, common availability biases, and the difficulty of becoming aware of errors in this setting may easily trip up scientists in their use of (P)RNGs. This suggests that the development of appropriate environments providing norms of good practice, warning signals, and well-targeted institutional assistance is crucial to guarantee that the quality of actually computed pseudo-random numbers does not spoil scientific results.
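The kind of shortcoming that is hard to notice by casual inspection can be made concrete with a classic, well-documented example (mine, not the speaker's): the RANDU generator, whose successive triples obey an exact linear relation, so that points built from them fall on only 15 planes in 3D.

```python
# RANDU is the infamous linear congruential generator
#   x_{k+1} = 65539 * x_k  (mod 2**31).
# Because 65539 = 2**16 + 3, every consecutive triple satisfies
#   x_{k+2} = 6*x_{k+1} - 9*x_k  (mod 2**31),
# so all 3D points (x_k, x_{k+1}, x_{k+2}) lie on just 15 planes.
M = 2 ** 31

def randu(seed, n):
    """Generate n RANDU values from an odd seed."""
    xs, x = [], seed
    for _ in range(n):
        x = (65539 * x) % M
        xs.append(x)
    return xs

xs = randu(1, 1000)
residues = [(9 * xs[i] - 6 * xs[i + 1] + xs[i + 2]) % M
            for i in range(len(xs) - 2)]
print(all(r == 0 for r in residues))  # True: a structural defect, not noise
```

The defect is invisible in 1D histograms and simple statistical tests, which is precisely the point: judging whether a generator is "good enough" requires expertise about both the generator and the inquiry it serves.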

Controlling uncertainty in classical molecular dynamics simulations for drug discovery

Maxime Vassaux

Molecular dynamics simulation is now a widespread approach for understanding complex systems on the atomistic scale. In the last decade, the approach has begun to advance from being a computer-based means of rationalizing experimental observations to producing apparently credible predictions for a number of real-world applications within industrial sectors such as advanced materials and drug discovery. However, key aspects concerning the reproducibility of the method have not kept pace with the speed of its uptake in the scientific community. Here, we present an approach to uncertainty quantification for molecular dynamics simulation designed to produce actionable results. The approach adopted to mitigate and control uncertainty is a standard one in the field of uncertainty quantification, namely ensemble methods, in which a sufficiently large number of replicas are run concurrently and reliable statistics are extracted from them. Indeed, because molecular dynamics is intrinsically chaotic, the need for ensemble methods is fundamental and holds regardless of the duration of the simulations performed. I will discuss the approach and illustrate it with a systematic uncertainty analysis of a widely used molecular dynamics code (NAMD), applied to estimating the binding free energy of a ligand bound to a protein, a typical use case in drug discovery.
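The ensemble idea described above can be sketched in a few lines: treat each replica as an independent estimate of the binding free energy, report the ensemble mean, and bootstrap it to quantify its uncertainty. The per-replica values below are synthetic stand-ins, not NAMD output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for per-replica binding free energy estimates (kcal/mol).
# In a real ensemble study each value would come from an independent
# MD replica started from perturbed initial conditions.
n_replicas = 25
replica_dG = rng.normal(loc=-8.5, scale=1.2, size=n_replicas)

# Ensemble estimate: mean over replicas, with a bootstrap confidence
# interval for the uncertainty of that mean.
n_boot = 5000
boot_means = np.array([
    rng.choice(replica_dG, size=n_replicas, replace=True).mean()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"dG = {replica_dG.mean():.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A single replica, however long, would hide the replica-to-replica scatter that the chaotic dynamics produces; the ensemble makes that scatter measurable and the reported error bar meaningful.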

Collisional excitation of reactive interstellar molecules: New statistical approach

Benjamin Desrousseaux 

Observational spectra are our main source of knowledge about the interstellar medium. In such low density environments, the frequency of collisions is not large enough to maintain local thermodynamic equilibrium (LTE), and it is necessary to take into account both radiative and collisional processes in order to properly interpret molecular spectra. It is then crucial to obtain state-to-state rate coefficients describing the collisional (de)excitation of interstellar species with the main collisional partners (H2, H, He).
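The interplay of radiative and collisional processes can be illustrated with the simplest possible case: a two-level molecule in statistical equilibrium with a single collider and no background radiation field. All numbers below are illustrative placeholders, not data for a real species:

```python
import math

# Two-level molecule in statistical equilibrium (background radiation
# neglected for simplicity; all values are illustrative).
A_ul = 7.2e-8        # Einstein A coefficient (s^-1)
k_ul = 3.0e-11       # collisional de-excitation rate coefficient (cm^3 s^-1)
g_u, g_l = 3.0, 1.0  # statistical weights
E_ul_K = 5.5         # transition energy expressed in Kelvin
T_kin = 20.0         # kinetic temperature (K)

# Detailed balance fixes the collisional excitation rate coefficient.
k_lu = k_ul * (g_u / g_l) * math.exp(-E_ul_K / T_kin)

def upper_to_lower(n_coll):
    """Steady-state n_u/n_l at collider density n_coll (cm^-3):
    collisional excitation balances radiative + collisional de-excitation."""
    return n_coll * k_lu / (A_ul + n_coll * k_ul)

n_crit = A_ul / k_ul  # critical density: collisions match radiative decay
lte_ratio = (g_u / g_l) * math.exp(-E_ul_K / T_kin)
for n in (n_crit / 100, n_crit, 100 * n_crit):
    print(f"n = {n:9.1e} cm^-3  ->  n_u/n_l = {upper_to_lower(n):.3f}"
          f"  (LTE limit: {lte_ratio:.3f})")
```

Below the critical density the level populations depend directly on the collisional rate coefficients, which is why accurate state-to-state data are indispensable for interpreting spectra from sub-thermal media.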

Quantum time-independent close-coupling calculations are the method of choice for obtaining sufficiently accurate collisional rate coefficients at the low temperatures typical of the interstellar medium (< 100 K). However, in the case of reactive systems, i.e. open-shell molecules and ions that can undergo a reaction with the dominant interstellar species H or H2, this method is impractical due to its memory and CPU requirements. As a result, the astrophysical community lacks appropriate collisional data for many detected reactive molecules of key importance in astrochemistry (NH, OH+, CH+, HCl+, H2O+, ...), preventing a proper determination of their abundances.

We present a new approach based on the statistical adiabatic channel model (SACM) to compute collisional rate coefficients for reactive molecules. This efficient approach allows the determination of rate coefficients with an accuracy meeting the needs of astrophysical applications, while overcoming the memory and CPU limitations of the close-coupling method. The new approach was successfully validated on light triatomic collisional systems for which close-coupling results were available. The present statistical approach should be considered a useful alternative to prohibitive close-coupling calculations for providing the astrophysical community with accurate collisional data.

Collisional excitation by H2O: the challenge of numerical limitations

Amélie Godard Palluet

With the Atacama Large Millimeter Array in operation and the recently launched James Webb Space Telescope, astronomy is entering a golden age. Physical conditions of the interstellar medium (ISM) are derived from molecular spectra. The interpretation of these spectra requires knowing the populations of the energy levels of the chemical species. However, most astrophysical media do not fulfil local thermodynamic equilibrium conditions. Populations are therefore determined by taking into account both radiative transitions, characterized by Einstein coefficients, and collisional transitions, characterized by rate coefficients. Nowadays, rate coefficients, which are system-specific, can only be computed for "small" molecules colliding with light partners. However, in media like cometary or planetary atmospheres, the dominant colliders are "big" molecules, such as H2O, CO, etc.

Our understanding of the chemical composition of the universe is limited by computer power. Therefore, numerical strategies need to be elaborated to compute accurate rate coefficients for "big" collisional systems.

In this work, the limitations and strategies that can be put in place to treat collisional systems involving H2O will be presented through the examples of CS-H2O and HCN-H2O. The study starts with quantum calculations in order to reach convergence for as many transitions as possible, using the "exact" close-coupling method, then supported by the coupled-states or infinite-order-sudden approximations. In a second part, collisional data are computed with statistical methods. The accuracy of the results obtained with the different methods will be evaluated by comparison with the quantum calculations.

The excitation of ethynyl in the interstellar medium: A key to understanding isotopic fractionation

Paul Pirlot, Paul Dagdigian, and François Lique

Since the discovery of the ethynyl radical CCH in the interstellar medium [1], it has been detected in a wide range of astrophysical environments. It is one of the most abundant hydrocarbons in space and, together with its detected isotopologues (CCD, 13CCH and C13CH), is a useful tracer of physical conditions. Indeed, as abundances strongly depend on formation pathways, the measurement of the [CCD]/[CCH] ratio may constrain the age of molecular clouds in astrophysical models [2]. However, hyperfine-resolved 13C-based spectra show higher intensities for C13CH lines than for 13CCH ones [3, 4], whereas their formation paths are supposed to be the same. It is therefore of high interest to investigate this apparent difference in abundance between the two isotopologues.

An accurate interpretation of these observations requires determining precise rate coefficients. Such quantities are necessary for non-local thermodynamic equilibrium modeling, which takes into account the competition between radiative and collisional processes.

Non-zero nuclear spins of the 13C, D and H atoms lead to a resolved hyperfine structure for these species. Such a complex energy structure is a real theoretical challenge, as exact scattering calculations are not feasible in a reasonable time. It is therefore necessary to develop numerical and methodological tools in order to determine accurate collisional data for astrophysical applications. In this talk, I will present calculations of new accurate collisional data for all the CCH isotopologues.

Scattering calculations have been performed using a recoupling technique in order to determine accurate hyperfine-resolved rate coefficients of CCH and its isotopologues in collision with H2. These data were derived for a large range of temperatures and are expected to improve abundance ratio calculations and lead to a better understanding of the evolution of isotopic fractionation in astrochemical models.

Collisional Energy Transfer in the CO-CO system

Michał Żółtowski, François Lique, Jérome Loreau

The physical conditions in comets can be accurately inferred from the modeling of molecular spectra. However, the full exploitation of molecular spectra requires going beyond the local thermodynamic equilibrium (LTE) approach and hence requires the radiative and collisional properties of the molecular species. Among cometary molecules, CO is of key importance and is even the dominant species in cometary comae at large heliocentric distances [1]. Here, we present new scattering calculations for the rotational excitation in the CO-CO collisional system using the non-reactive quantum scattering code MOLSCAT. Calculations were performed with coupled-channel methods within the coupled-states approximation, using the four-dimensional potential energy surface calculated by Vissers et al. (2003). Collisional rate coefficients are provided for rotational levels up to j ≤ 10 and for temperatures up to 150 K. The new results are compared to previous ones from the literature and significant differences are found, especially for the dominant transitions. The impact of these new data on astrophysical modeling is also discussed.

Calibration-free and high-precision absorption spectroscopy in the infrared spectral region

Lucile Rutkowski and Romain Dubroeucq

Precision spectroscopy of small molecules aims at providing accurate and precise absorption line parameters to support a wide variety of fields, ranging from climate science (monitoring greenhouse gas traces) to combustion processes (optimizing industrial yields). These parameters are also one of the keys needed to invert observational data from extraterrestrial environments.

In this context, absorption spectroscopy based on optical frequency combs and enhancement cavities offers an attractive solution. Optical frequency combs are the signature spectra of femtosecond pulsed lasers and combine broad spectral coverage with a high-resolution frequency structure. Using such sources, we recently demonstrated the first multiplexed cavity ring-down measurement, yielding a calibration-free absorption spectrum. The precision and accuracy of the technique and its practical limitations will be discussed, together with future directions.

 A case study in the photoionization of atomic anions

Mariko Terao Dunseath, Kevin Dunseath, Matthieu Génévriez, Raphaël Marion and Xavier Urbain

Good models require accurate data, and it is often believed that variational principles such as the Hylleraas-Undheim-MacDonald theorem provide a foolproof method for obtaining accurate energy levels. This, however, applies neither to energy differences nor to wave functions, and hence not to quantities relevant to astrophysics such as oscillator strengths and cross sections.

Negative ions, in particular those of carbon and oxygen, are of fundamental interest for the understanding of electron correlation and play an important role in astrophysics and atmospheric physics. Until recently, there was only one experimental absolute cross section available for the photodetachment of C−, which was also used to calibrate many data for other systems, such as oxygen, despite the fact that no calculation could reproduce it.

Using modern technologies based on lasers, animated crossed beams and velocity map imaging, it is now possible to measure absolute cross sections and angular distributions with fewer assumptions on experimental parameters such as beam profiles, opening the possibility of performing a thorough study of experimental uncertainties.

Advances in computing power and the development of more rigorous numerical methods allow ab initio calculations to be performed in an extensive and systematic way, with the possibility of establishing error bars on the theoretical results.

These advances have allowed the long-standing discrepancy between theory and experiment for the photodetachment of C− to be resolved.

Interdisciplinary Progress and the Constraints of Funding Schemes 

Jamie Shaw

Over 80% of interdisciplinary collaborations are focused on practical problems that do not obviously belong to any particular discipline (at least at first). This results in situations where the problem is reframed, reinterpreted, and reconstructed according to different disciplinary perspectives several times before a satisfactory method of study garners consensus. However, funding bodies often require interim progress reports, which forces scientists to report on the progress they have made thus far. A lack of progress could, in theory, lead to the withholding of additional funds and an early end to the project. Using qualitative studies of how these reports are filled in and reviewed, I show that funding bodies typically do not hold funded projects accountable for their activities, and that one benefit of this is that projects do not need to report accurately on the work they have engaged in so far. If they did, there are legitimate worries that interdisciplinary projects would frequently be abandoned prematurely. This will be illustrated with case studies of agricultural collaborations.

Epistemology for interdisciplinary research – facilitating interdisciplinary communication

Mieke Boon

In this lecture, (if time allows), I would like to address one of the topics of your workshop: "the interdisciplinary challenges faced by astrochemistry." Although I have no expertise in astrochemistry, several new insights about interdisciplinary research developed in the Philosophy of Science in Practice may be relevant for your practice. 

My starting point is to briefly explain an alternative paradigm of science that can facilitate interdisciplinary communication (1, 2). The alternative engineering paradigm of science may not sound relevant to astrochemistry. Nevertheless, I will try to convince you that this alternative can indeed help to get to grips with the challenge of interdisciplinary working in that field.

A crucial aspect of the alternative paradigm is to shift the focus from hypothesis-testing to scientific modeling practices. Through this turn, it can be understood that one of the challenges of interdisciplinary research is that different scientific disciplines have different scientific modeling practices (e.g., causal-mechanistic modeling, mathematical modeling, diagrammatic modeling, (3)).

The good news is that scientific models of phenomena (e.g., as published in scientific articles) can be systematically analyzed in the sense of reconstructing them (i.e., based on an epistemology of scientific models (4, 5, 6)). This meta-methodology of (re)constructing concrete scientific models is one way in which interdisciplinary communication can be facilitated, which will be illustrated by examples from physics.

In a similar way, the modeling practices in a discipline can be systematically analyzed (i.e., based on an epistemology of disciplinary perspective (4)), which also provides a meta-methodology to facilitate interdisciplinary research. This is briefly illustrated with an example from research in a medical-diagnostic practice.

Finally, I will address the question of how to establish relationships between scientific models from different disciplines (i.e. integration). Here I suggest that diagrammatic modeling (7) is crucial for establishing connections between different model types that represent 'the same' phenomenon, and that measurements (8) are the crucial link in this regard.

 

Using Structured Dialogue to Clarify and Push Interdisciplinary Ideas Forward: A Toolbox Dialogue Initiative Introduction

Edgar Cardenas
 
The approach to conducting research has increasingly moved towards cross-disciplinary collaboration. The challenges we now face require expertise spanning multiple disciplines if researchers are to make substantial advances in our scientific understanding. One of the main challenges for cross-disciplinary teams is knowledge integration: taking what we know and learn in each disciplinary space and connecting it to other disciplines working in the same problem space. How we communicate with one another is a central component of addressing the integration challenge and of developing new ideas.
Developed specifically to create clarity and mutual understanding across disciplines, the Toolbox Dialogue Initiative uses structured dialogue to surface differences in perspectives, assumptions, values, and methods. By revealing these differences, collaborators can better co-develop and coordinate their approaches so as to make progress on the problems they are collectively addressing. In this hybrid talk/workshop I will delve deeper into the Toolbox approach and provide some prompts (the primary mechanism for structured dialogue) to demonstrate how this approach may be used in your collaborations.

 
