Description
The Vera C. Rubin Observatory’s ten-year Legacy Survey of Space and Time (LSST) will transform time-domain astronomy, increasing the number of observed transients a hundredfold. For tidal disruption events (TDEs), this means a leap from roughly 100 known events to an expected ~50,000, offering a unique opportunity to answer key open questions: Can we observe intermediate-mass black holes through TDEs? Why do TDEs prefer ‘green valley’ galaxies?
However, LSST’s immense data volume presents a challenge: follow-up resources are limited, so detections must be carefully prioritised. To maximise LSST’s scientific impact, we must prepare now to handle this data effectively.
Using Zwicky Transient Facility (ZTF) forced photometry of all sources spatially coincident with the centres of galaxies from 2017 to 2023, together with a combination of Gaussian processes, machine learning, and SNCosmo simulations, we have developed a simulated photometric data set for LSST. What sets our approach apart is its foundation in real observational data: we use ZTF photometry as a base to construct a realistic LSST training set with minimal assumptions about the underlying physics of the lightcurves.
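As an illustration of the resampling step, the sketch below fits a Gaussian process to a single ZTF forced-photometry lightcurve and evaluates it at LSST-like epochs. It is a minimal sketch only: the function name, Matern kernel, month-scale length scale, and the use of the GP predictive scatter as a stand-in for LSST photometric errors are illustrative assumptions, not the pipeline’s actual configuration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def resample_to_lsst_cadence(t, flux, flux_err, t_lsst, seed=0):
    # Fit a GP to one ZTF forced-photometry lightcurve (t in days, flux
    # and flux_err in linear flux units) and evaluate it at hypothetical
    # LSST observation times t_lsst.
    kernel = Matern(length_scale=30.0, nu=1.5)   # ~month-scale smoothness (assumed)
    gp = GaussianProcessRegressor(kernel=kernel,
                                  alpha=flux_err**2,  # per-point measurement variance
                                  normalize_y=True)
    gp.fit(t.reshape(-1, 1), flux)
    mean, std = gp.predict(t_lsst.reshape(-1, 1), return_std=True)
    # Scatter the resampled points by the GP predictive uncertainty as a
    # simplified stand-in for LSST photometric errors.
    rng = np.random.default_rng(seed)
    return mean + rng.normal(0.0, std), std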
We further apply this ZTF dataset to refine target selection for the Time-Domain Extragalactic Survey (TiDES), LSST’s spectroscopic follow-up programme on the 4-metre Multi-Object Spectroscopic Telescope (4MOST). Active galactic nucleus (AGN) variability may trigger many false alarms in LSST, especially in the galactic nuclei where we hope to find TDEs. To optimise the allocation of follow-up resources, we test different cuts on the variability amplitude of ZTF lightcurves, separating known TDEs from AGN using their photometry alone.
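The sketch below shows the shape such an amplitude cut could take, assuming a hypothetical array of ZTF magnitudes for one nuclear source; the 1 mag threshold is one illustrative value of the kind we vary, not a result of the study.

def passes_amplitude_cut(mag, mag_err, threshold=1.0):
    # Flag lightcurves whose brightening relative to the quiescent level
    # exceeds `threshold` magnitudes: a large single flare favours a TDE,
    # while low-level stochastic variability is more typical of AGN.
    baseline = np.median(mag)            # proxy for the quiescent level
    amplitude = baseline - np.min(mag)   # peak brightening in magnitudes
    noise_floor = np.median(mag_err)     # guard against noise-driven peaks
    return amplitude > threshold + noise_floor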