SwePub
Search the SwePub database

  Advanced search

Results list for search "WFRF:(Setzer Christian N.)"

Search: WFRF:(Setzer Christian N.)

  • Results 1-11 of 11
1.
  • Blunden, Jessica, et al. (author)
  • State of the Climate in 2012
  • 2013
  • In: Bulletin of the American Meteorological Society (BAMS). - 0003-0007 .- 1520-0477. ; 94:8, pp. S1-S258
  • Journal article (peer-reviewed) abstract
    • For the first time in several years, the El Niño-Southern Oscillation did not dominate regional climate conditions around the globe. A weak La Niña dissipated to ENSO-neutral conditions by spring, and while El Niño appeared to be emerging during summer, this phase never fully developed, as sea surface temperatures in the eastern equatorial Pacific did not sustain El Niño conditions. Nevertheless, other large-scale climate patterns and extreme weather events impacted various regions during the year. A negative phase of the Arctic Oscillation from mid-January to early February contributed to frigid conditions in parts of northern Africa, eastern Europe, and western Asia. A lack of rain during the 2012 wet season led to the worst drought in at least the past three decades for northeastern Brazil. Central North America also experienced one of its most severe droughts on record. The Caribbean observed a very wet dry season, and it was the Sahel's wettest rainy season in 50 years. Overall, the 2012 average temperature across global land and ocean surfaces ranked among the 10 warmest years on record. The global land surface temperature alone was also among the 10 warmest on record. In the upper atmosphere, the average stratospheric temperature was record or near-record cold, depending on the dataset. After a 30-year warming trend from 1970 to 1999 for global sea surface temperatures, the period 2000-12 had little further trend; this may be linked to the prevalence of La Niña-like conditions during the 21st century. Heat content in the upper 700 m of the ocean remained near record high levels in 2012. Net increases from 2011 to 2012 were observed at 700-m to 2000-m depth and even in the abyssal ocean below. Following sharp decreases associated with the effects of La Niña, sea levels rebounded to reach record highs in 2012. The increased hydrological cycle seen in recent years continued, with more evaporation in drier locations and more precipitation in rainy areas.
In a pattern that has held since 2004, salty areas of the ocean surfaces and subsurfaces were anomalously salty on average, while fresher areas were anomalously fresh. Global tropical cyclone activity during 2012 was near average, with a total of 84 storms compared with the 1981-2010 average of 89. Similar to 2010 and 2011, the North Atlantic was the only hurricane basin that experienced above-normal activity. In this basin, Sandy brought devastation to Cuba and parts of the eastern North American seaboard. All other basins experienced either near- or below-normal tropical cyclone activity. Only three tropical cyclones reached Category 5 intensity, all in the western North Pacific; of these, Super Typhoon Bopha became the only storm in the historical record to produce winds greater than 130 kt south of 7°N. It was also the costliest storm to affect the Philippines and killed more than 1000 residents. Minimum Arctic sea ice extent in September and Northern Hemisphere snow cover extent in June both reached new record lows. June snow cover extent is now declining at a faster rate (-17.6% per decade) than September sea ice extent (-13.0% per decade). Permafrost temperatures reached record high values in northernmost Alaska. A new melt extent record occurred on 11-12 July on the Greenland ice sheet; 97% of the ice sheet showed some form of melt, four times greater than the average melt for this time of year. The climate in Antarctica was relatively stable overall. The largest maximum sea ice extent since records began in 1978 was observed in September 2012. In the stratosphere, warm air led to the second smallest ozone hole in the past two decades. Even so, the springtime ozone layer above Antarctica likely will not return to its early 1980s state until about 2060. Following a slight decline associated with the global economic downturn, CO2 emissions from fossil fuel combustion and cement production reached a record 9.5 +/- 0.5 Pg C in 2011, and a new record of 9.7 +/- 0.5 Pg C is estimated for 2012.
Atmospheric CO2 concentrations increased by 2.1 ppm in 2012, to 392.6 ppm. In spring 2012, the CO2 concentration exceeded 400 ppm at 7 of the 13 Arctic observation sites. Globally, other greenhouse gases, including methane and nitrous oxide, also continued to rise in concentration, and the combined effect now represents a 32% increase in radiative forcing over a 1990 baseline. Concentrations of most ozone-depleting substances continued to fall.
2.
  • Coogan, Adam, et al. (author)
  • Efficient gravitational wave template bank generation with differentiable waveforms
  • 2022
  • In: Physical Review D. - 2470-0010 .- 2470-0029. ; 106:12
  • Journal article (peer-reviewed) abstract
    • The most sensitive search pipelines for gravitational waves from compact binary mergers use matched filters to extract signals from the noisy data stream coming from gravitational wave detectors. Matched-filter searches require banks of template waveforms covering the physical parameter space of the binary system. Unfortunately, template bank construction can be a time-consuming task. Here we present a new method for efficiently generating template banks that utilizes automatic differentiation to calculate the parameter space metric. Principally, we demonstrate that automatic differentiation enables accurate computation of the metric for waveforms currently used in search pipelines, whilst being computationally cheap. Additionally, by combining random template placement and a Monte Carlo method for evaluating the fraction of the parameter space that is currently covered, we show that search-ready template banks for frequency-domain waveforms can be rapidly generated. Finally, we argue that differentiable waveforms offer a pathway to accelerating stochastic placement algorithms. We implement all our methods into an easy-to-use python package based on the jax framework, diffbank, to allow the community to easily take advantage of differentiable waveforms for future searches.
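The random-placement-plus-Monte-Carlo coverage step described in the abstract above can be sketched in a toy form. Everything here is a hypothetical illustration, not diffbank's actual implementation: the quadratic (metric) approximation to the match, the 2-D unit-square parameter space, and all function names are assumptions for the sake of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def match(theta, template, metric=np.array([1.0, 1.0])):
    # Toy quadratic (metric) approximation to the waveform match:
    # match ~ 1 - g_ij * dtheta_i * dtheta_j, floored at zero.
    d = theta - template
    return max(0.0, 1.0 - float(d @ (metric * d)))

def coverage_fraction(bank, n_mc=2000, min_match=0.95, bounds=(0.0, 1.0)):
    """Monte Carlo estimate of the fraction of a 2-D parameter space
    covered by `bank` at the given minimal match."""
    lo, hi = bounds
    points = rng.uniform(lo, hi, size=(n_mc, 2))
    covered = sum(
        1 for p in points if any(match(p, t) >= min_match for t in bank)
    )
    return covered / n_mc

# Random template placement: templates are drawn uniformly, and the
# Monte Carlo estimate tells us when the bank is effectively complete.
bank = rng.uniform(0.0, 1.0, size=(50, 2))
print(f"estimated coverage: {coverage_fraction(bank):.2f}")
```

In a real bank-generation loop one would keep proposing templates until the estimated coverage exceeds a target; the paper's contribution is computing the metric itself via automatic differentiation rather than hand-derived expressions.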
3.
  • Hlozek, R., et al. (author)
  • Results of the Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC)
  • 2023
  • In: Astrophysical Journal Supplement Series. - 0067-0049 .- 1538-4365. ; 267:2
  • Journal article (peer-reviewed) abstract
    • Next-generation surveys like the Legacy Survey of Space and Time (LSST) on the Vera C. Rubin Observatory (Rubin) will generate orders of magnitude more discoveries of transients and variable stars than previous surveys. To prepare for this data deluge, we developed the Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC), a competition that aimed to catalyze the development of robust classifiers under LSST-like conditions of a nonrepresentative training set for a large photometric test set of imbalanced classes. Over 1000 teams participated in PLAsTiCC, which was hosted on the Kaggle data science competition platform between 2018 September 28 and 2018 December 17, ultimately identifying three winners in 2019 February. Participants produced classifiers employing a diverse set of machine-learning techniques including hybrid combinations and ensemble averages of a range of approaches, among them boosted decision trees, neural networks, and multilayer perceptrons. The strong performance of the top three classifiers on Type Ia supernovae and kilonovae represents a major improvement over the current state of the art within astronomy. This paper summarizes the most promising methods and evaluates their results in detail, highlighting future directions both for classifier development and simulation needs for a next-generation PLAsTiCC data set.
4.
5.
  • Lochner, Michelle, et al. (author)
  • Optimizing the LSST Observing Strategy for Dark Energy Science : DESC Recommendations for the Wide-Fast-Deep Survey
  • 2018
  • Other publication (other academic/artistic) abstract
    • Cosmology is one of the four science pillars of LSST, which promises to be transformative for our understanding of dark energy and dark matter. The LSST Dark Energy Science Collaboration (DESC) has been tasked with deriving constraints on cosmological parameters from LSST data. Each of the cosmological probes for LSST is heavily impacted by the choice of observing strategy. This white paper is written by the LSST DESC Observing Strategy Task Force (OSTF), which represents the entire collaboration, and aims to make recommendations on observing strategy that will benefit all cosmological analyses with LSST. It is accompanied by the DESC DDF (Deep Drilling Fields) white paper (Scolnic et al.). We use a variety of metrics to understand the effects of the observing strategy on measurements of weak lensing, large-scale structure, clusters, photometric redshifts, supernovae, strong lensing and kilonovae. In order to reduce systematic uncertainties, we conclude that the current baseline observing strategy needs to be significantly modified to result in the best possible cosmological constraints. We provide some key recommendations: moving the WFD (Wide-Fast-Deep) footprint to avoid regions of high extinction, taking visit pairs in different filters, changing the 2x15s snaps to a single exposure to improve efficiency, focusing on strategies that reduce long gaps (>15 days) between observations, and prioritizing spatial uniformity at several intervals during the 10-year survey.
6.
  • Lochner, Michelle, et al. (author)
  • The Impact of Observing Strategy on Cosmological Constraints with LSST
  • 2022
  • In: Astrophysical Journal Supplement Series. - : American Astronomical Society. - 0067-0049 .- 1538-4365. ; 259:2
  • Journal article (peer-reviewed) abstract
    • The generation-defining Vera C. Rubin Observatory will make state-of-the-art measurements of both the static and transient universe through its Legacy Survey of Space and Time (LSST). With such capabilities, it is immensely challenging to optimize the LSST observing strategy across the survey's wide range of science drivers. Many aspects of the LSST observing strategy relevant to the LSST Dark Energy Science Collaboration, such as survey footprint definition, single-visit exposure time, and the cadence of repeat visits in different filters, are yet to be finalized. Here, we present metrics used to assess the impact of observing strategy on the cosmological probes considered most sensitive to survey design; these are large-scale structure, weak lensing, type Ia supernovae, kilonovae, and strong lens systems (as well as photometric redshifts, which enable many of these probes). We evaluate these metrics for over 100 different simulated potential survey designs. Our results show that multiple observing strategy decisions can profoundly impact cosmological constraints with LSST; these include adjusting the survey footprint, ensuring repeat nightly visits are taken in different filters, and enforcing regular cadence. We provide public code for our metrics, which makes them readily available for evaluating further modifications to the survey design. We conclude with a set of recommendations and highlight observing strategy factors that require further research.
7.
  • Malz, A., et al. (author)
  • The Photometric LSST Astronomical Time-series Classification Challenge PLAsTiCC : Selection of a Performance Metric for Classification Probabilities Balancing Diverse Science Goals
  • 2019
  • In: Astronomical Journal. - : American Astronomical Society. - 0004-6256 .- 1538-3881. ; 158:5
  • Journal article (peer-reviewed) abstract
    • Classification of transient and variable light curves is an essential step in using astronomical observations to develop an understanding of the underlying physical processes from which they arise. However, upcoming deep photometric surveys, including the Large Synoptic Survey Telescope (LSST), will produce a deluge of low signal-to-noise data for which traditional type estimation procedures are inappropriate. Probabilistic classification is more appropriate for such data but is incompatible with the traditional metrics used on deterministic classifications. Furthermore, large survey collaborations like LSST intend to use the resulting classification probabilities for diverse science objectives, indicating a need for a metric that balances a variety of goals. We describe the process used to develop an optimal performance metric for an open classification challenge that seeks to identify probabilistic classifiers that can serve many scientific interests. The Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC) aims to identify promising techniques for obtaining classification probabilities of transient and variable objects by engaging a broader community beyond astronomy. Using mock classification probability submissions emulating realistically complex archetypes of those anticipated of PLAsTiCC, we compare the sensitivity of two metrics of classification probabilities under various weighting schemes, finding that both yield results that are qualitatively consistent with intuitive notions of classification performance. We thus choose as a metric for PLAsTiCC a weighted modification of the cross-entropy because it can be meaningfully interpreted in terms of information content. Finally, we propose extensions of our methodology to ever more complex challenge goals and suggest some guiding principles for approaching the choice of a metric of probabilistic data products.
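The weighted cross-entropy chosen as the challenge metric can be illustrated with a minimal sketch. The per-class averaging and the weight normalization below are one plausible form of the weighting, hedged assumptions rather than the exact challenge definition:

```python
import numpy as np

def weighted_log_loss(probs, truth, weights=None, eps=1e-15):
    """Weighted cross-entropy: average the log-loss within each true
    class first, then combine classes with per-class weights, so rare
    classes are not swamped by common ones.

    probs   : (n_objects, n_classes) predicted class probabilities
    truth   : (n_objects,) integer true class labels
    weights : per-class weights (default: equal weights)
    """
    probs = np.clip(probs, eps, 1.0)  # guard against log(0)
    n_cls = probs.shape[1]
    weights = np.ones(n_cls) if weights is None else np.asarray(weights)
    loss = 0.0
    for c in range(n_cls):
        mask = truth == c
        if mask.any():
            loss += weights[c] * -np.mean(np.log(probs[mask, c]))
    return loss / weights.sum()

# A maximally uninformative two-class classifier scores log(2) ~ 0.693.
uniform = np.full((4, 2), 0.5)
labels = np.array([0, 0, 1, 1])
print(weighted_log_loss(uniform, labels))
```

The appeal noted in the abstract is that this quantity has an information-theoretic reading: it is the expected surprise of the submitted probabilities given the true labels.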
8.
  • Sarin, Nikhil, et al. (author)
  • Redback : a Bayesian inference software package for electromagnetic transients
  • 2024
  • In: Monthly Notices of the Royal Astronomical Society. - : Oxford University Press (OUP). - 0035-8711 .- 1365-2966. ; 531:1, pp. 1203-1227
  • Journal article (peer-reviewed) abstract
    • Fulfilling the rich promise of rapid advances in time-domain astronomy is only possible through confronting our observations with physical models and extracting the parameters that best describe what we see. Here, we introduce redback, a Bayesian inference software package for electromagnetic transients. redback provides an object-oriented python interface to over 12 different samplers and over 100 different models for kilonovae, supernovae, gamma-ray burst afterglows, tidal disruption events, and engine-driven transients, among other explosive transients. The models range in complexity from simple analytical and semi-analytical models to surrogates built upon numerical simulations accelerated via machine learning. redback also provides a simple interface for downloading and processing data from various catalogues such as Swift and FINK. The software can also serve as an engine to simulate transients for telescopes such as the Zwicky Transient Facility and Vera Rubin with realistic cadences, limiting magnitudes, and sky coverage, for a hypothetical user-constructed survey, or for a generic transient for target-of-opportunity observations with different telescopes. As a demonstration of its capabilities, we show how redback can be used to jointly fit the spectrum and photometry of a kilonova, enabling a more powerful, holistic probe into the properties of a transient. We also showcase general examples of how redback can be used as a tool to simulate transients for realistic surveys, fit models to real, simulated, or private data, perform multimessenger inference with gravitational waves, and serve as an end-to-end software toolkit for parameter estimation and interpreting the nature of electromagnetic transients.
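As a generic illustration of the workflow that a package like redback automates (this is not redback's actual interface), the core of Bayesian transient fitting can be sketched as a posterior over model parameters given noisy photometry. The exponential-decay flux model, the parameter ranges, and the noise level here are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def flux_model(t, amplitude, tau):
    # Hypothetical transient model: exponential decay in flux.
    return amplitude * np.exp(-t / tau)

# Simulated observations (truth: amplitude=5.0, tau=3.0, sigma=0.2).
t_obs = np.linspace(0.0, 10.0, 25)
sigma = 0.2
flux_obs = flux_model(t_obs, 5.0, 3.0) + rng.normal(0.0, sigma, t_obs.size)

# Brute-force grid posterior over (amplitude, tau) with flat priors;
# real packages would use MCMC or nested sampling instead.
amps = np.linspace(1.0, 10.0, 200)
taus = np.linspace(0.5, 8.0, 200)
log_post = np.empty((amps.size, taus.size))
for i, a in enumerate(amps):
    for j, tau in enumerate(taus):
        resid = flux_obs - flux_model(t_obs, a, tau)
        log_post[i, j] = -0.5 * np.sum((resid / sigma) ** 2)

# Maximum a posteriori estimates.
i_best, j_best = np.unravel_index(np.argmax(log_post), log_post.shape)
print(f"amplitude ~ {amps[i_best]:.2f}, tau ~ {taus[j_best]:.2f}")
```

The grid is tractable here only because the toy model has two parameters; the samplers redback wraps exist precisely because realistic transient models do not admit this brute-force approach.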
9.
  • Setzer, Christian N., 1990- (author)
  • Modelling and Detecting Kilonovae in the Rubin Observatory Era
  • 2024
  • Doctoral thesis (other academic/artistic) abstract
    • Survey astronomy is a powerful tool for discoveries in astrophysics and cosmology. In the coming years, this field will be revolutionised with the start of the ten-year Legacy Survey of Space and Time (LSST), to be conducted at the Vera C. Rubin Observatory. This survey, with its unique capabilities in temporal sampling, single-image depth and covered sky area, will explore a new discovery space for astrophysical transients in the Universe. The 2017 discovery of an electromagnetic and gravitational-wave transient presents a unique opportunity to influence the design of the LSST observing strategy for the detection of binary neutron star (BNS) mergers. This will be scientifically beneficial, not only for studies of the astrophysics of these sources, but also for developing new cosmological probes. Given the sensitivity of the Rubin Observatory, it is expected that this instrument will detect these binary neutron star mergers to greater distances than detectable by current and near-term gravitational-wave detectors. This presents further opportunities to study the characteristics of the BNS population that will be selected into these surveys. If we understand the underlying BNS merger population and associated electromagnetic emission, it may also be possible to recover the previously undetected counterpart gravitational-wave signals. In this thesis I discuss kilonovae (kNe) from BNS mergers, with a focus on the detection of kNe in the LSST survey. I will discuss the physics and modelling of kNe, including my work incorporating a viewing-angle dependence in the optical light curve modelling of BNS kNe. After setting the context for the Rubin Observatory and the LSST, I will describe work on optimising the observing strategy of the LSST to detect kNe from BNS mergers and the observing-strategy features that impact detection.
This work also indicates that a portion of the BNS mergers associated with kN detections in the LSST will be below the threshold for detection of their gravitational-wave emission. Furthermore, I will discuss modelling a population of kNe from BNS mergers that is consistent with each merger's associated gravitational-wave signal. This modelling includes a dependence of the kN on nuclear physics, calibrated with detailed emulation of radiation-transport simulations. I conclude by summarising the scientific impact of this research and discussing future directions, such as studying the BNS multi-messenger observational selection function for the LSST and concurrent gravitational-wave detectors, the detection of subthreshold signals, and the problem of classifying kN light curves.
10.
  • Setzer, Christian N., 1990-, et al. (author)
  • Modelling populations of kilonovae
  • 2023
  • In: Monthly Notices of the Royal Astronomical Society. - : Oxford University Press (OUP). - 0035-8711 .- 1365-2966. ; 520:2, pp. 2829-2842
  • Journal article (peer-reviewed) abstract
    • The 2017 detection of a kilonova coincident with gravitational-wave emission has identified neutron star mergers as the major source of the heaviest elements and dramatically constrained alternative theories of gravity. Observing a population of such sources has the potential to transform cosmology, nuclear physics, and astrophysics. However, with only one confident multi-messenger detection currently available, modelling the diversity of signals expected from such a population requires improved theoretical understanding. In particular, models that are quick to evaluate and are calibrated with more detailed multi-physics simulations are needed to design observational strategies for kilonovae detection and to obtain rapid-response interpretations of new observations. We use grey-opacity models to construct populations of kilonovae, spanning ejecta parameters predicted by numerical simulations. Our modelling focuses on wavelengths relevant for upcoming optical surveys, such as the Rubin Observatory Legacy Survey of Space and Time (LSST). In these simulations, we implement heating rates that are based on nuclear reaction network calculations. We create a Gaussian-process emulator for kilonova grey opacities, calibrated with detailed radiative transfer simulations. Using recent fits to numerical relativity simulations, we predict how the ejecta parameters from binary neutron star (BNS) mergers shape the population of kilonovae, accounting for the viewing-angle dependence. Our simulated population of BNS mergers produces peak i-band absolute magnitudes of −20 ≤ Mi ≤ −11. A comparison with detailed radiative transfer calculations indicates that further improvements are needed to accurately reproduce spectral shapes over the full light curve evolution.
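The Gaussian-process emulation step mentioned in the abstract can be sketched generically. The squared-exponential kernel, the toy opacity trend, and the electron-fraction grid below are all illustrative assumptions, not the paper's calibrated emulator:

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=0.1, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-6):
    """Gaussian-process regression mean prediction (zero prior mean)."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(x_train.size)
    K_s = rbf_kernel(x_test, x_train)
    return K_s @ np.linalg.solve(K, y_train)

# Hypothetical training data: grey opacity (cm^2/g) as a function of
# ejecta electron fraction Ye, standing in for the outputs of detailed
# radiative transfer simulations.
ye_train = np.linspace(0.1, 0.4, 12)
kappa_train = 10.0 * np.exp(-8.0 * (ye_train - 0.1))  # toy trend only
ye_test = np.array([0.15, 0.25, 0.35])
kappa_emulated = gp_predict(ye_train, kappa_train, ye_test)
```

The design motivation matches the abstract: once trained on a modest number of expensive simulations, the emulator returns opacities in microseconds, which is what makes population-scale light-curve synthesis tractable.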
11.
  • Setzer, Christian N., et al. (author)
  • Serendipitous discoveries of kilonovae in the LSST main survey : maximizing detections of sub-threshold gravitational wave events
  • 2019
  • In: Monthly Notices of the Royal Astronomical Society. - : Oxford University Press (OUP). - 0035-8711 .- 1365-2966. ; 485:3, pp. 4260-4273
  • Journal article (peer-reviewed) abstract
    • We investigate the ability of the Large Synoptic Survey Telescope (LSST) to discover kilonovae (kNe) from binary neutron star (BNS) and neutron star-black hole (NSBH) mergers, focusing on serendipitous detections in the Wide-Fast-Deep (WFD) survey. We simulate observations of kNe with proposed LSST survey strategies, focusing on cadence choices that are compatible with the broader LSST cosmology programme. If all kNe are identical to GW170817, we find the baseline survey strategy will yield 58 kNe over the survey lifetime. If we instead assume a representative population model of BNS kNe, we expect to detect only 27 kNe. However, we find the choice of survey strategy significantly impacts these numbers and can increase them to 254 and 82 kNe over the survey lifetime, respectively. This improvement arises from an increased cadence of observations between different filters with respect to the baseline. We then consider the detectability of these BNS mergers by the Advanced LIGO/Virgo (ALV) detector network. If the optimal survey strategy is adopted, 202 of the GW170817-like kNe and 56 of the BNS population model kNe are detected with LSST but are below the threshold for detection by the ALV network. This represents, for both models, an increase by a factor greater than 4.5 in the number of detected sub-threshold events over the baseline strategy. These subthreshold events would provide an opportunity to conduct electromagnetic-triggered searches for signals in gravitational-wave data and assess selection effects in measurements of the Hubble constant from standard sirens, e.g. viewing angle effects.