
DART-Vetter: A Deep LeARning Tool for automatic vetting of TESS candidates

Contact

Stefano Fiscale: [email protected]

Background

DART-Vetter is a Convolutional Neural Network trained on Kepler and TESS Threshold Crossing Events (TCEs). The model is designed to distinguish planetary candidates from false positives detected in any transiting survey. For further details, see Fiscale et al. (2023), Applications of Artificial Intelligence and Neural Systems to Data Science, pp. 127–135.

Citation (Fiscale et al., in preparation)

The paper detailing the model architecture and its most recent applications to TESS candidates is in preparation. We plan to submit the manuscript to The Astrophysical Journal.

Code

This section provides an overview of the contents of each directory:

  • TFRecord: methods for creating and visualizing the samples of TFRecord files;
  • cnn: several methods to build different network architectures;
  • preprocessing: a set of files developed to pre-process light curves. The main file, generate_input_records.py, produces the global view of each TCE. Since this pre-processing pipeline deals with a huge volume of data (i.e., TCEs), the workload has been distributed across different nodes. We provide the tess256core.slurm file so that the user can run generate_input_records.py in parallel;
  • tce_csv_catalogs: we provide all the TCE csv catalogs used in this work;
  • training: methods for model training and assessment;
    • trained_models/dartvetter: checkpoint files for our best model. If you do not want to run the training procedure, build the model and load the optimized weights from these files.
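To illustrate the pre-processing step, here is a minimal sketch of how a "global view" is typically produced: the light curve is phase-folded on the TCE period and median-binned to a fixed length. The function name, the bin count (201), and the normalization are illustrative assumptions; the repository's actual pipeline is implemented in preprocessing/generate_input_records.py.

```python
import numpy as np

def global_view(time, flux, period, t0, num_bins=201):
    """Sketch: phase-fold a light curve and median-bin it to a fixed length.

    num_bins=201 is an assumed value, not taken from the repository.
    """
    # Fold the time stamps so the transit (epoch t0) lands at phase 0.5.
    phase = ((time - t0 + 0.5 * period) % period) / period

    # Assign each point to one of num_bins equal-width phase bins.
    edges = np.linspace(0.0, 1.0, num_bins + 1)
    idx = np.clip(np.digitize(phase, edges) - 1, 0, num_bins - 1)

    # Median of the flux in each bin; empty bins stay NaN for now.
    view = np.full(num_bins, np.nan)
    for b in range(num_bins):
        in_bin = flux[idx == b]
        if in_bin.size:
            view[b] = np.median(in_bin)

    # Fill empty bins with the overall median, then center the continuum at zero.
    view = np.nan_to_num(view, nan=np.nanmedian(view))
    return view - np.median(view)
```

A flat (transit-free) light curve folds to an all-zero view under this normalization, which is a quick sanity check before feeding real TCEs through the pipeline.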

Installation

  1. Create the conda environment as detailed in the requirements.txt file

  2. Training set generation:

    You can generate the training samples with the tess256core.slurm and generate_input_records.py files. In this case, you need to download the Kepler and TESS light curves from the Mikulski Archive for Space Telescopes (MAST). Alternatively, the tfrecord_data directory provides the Kepler and TESS global views we used to train our model.

  3. Training the model:

    To train the model, you can use the training_job.slurm file. Alternatively, you can build a CNN architecture with the cnn_architecture.py file, save the model, and load the optimized weights available in the trained_models/dartvetter directory.
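The build-then-restore route in step 3 can be sketched as follows. The layer sizes, the 201-bin input length, and the checkpoint path are assumptions for illustration; the actual architecture is defined in cnn/cnn_architecture.py and the released weights live under training/trained_models/dartvetter.

```python
import tensorflow as tf

def build_model(view_len=201):
    """Illustrative 1-D CNN over a global view; not the repository's exact architecture."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(view_len, 1)),
        tf.keras.layers.Conv1D(16, 5, activation="relu"),
        tf.keras.layers.MaxPool1D(2),
        tf.keras.layers.Conv1D(32, 5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        # Single sigmoid unit: planetary candidate vs. false positive.
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

model = build_model()
model.compile(optimizer="adam", loss="binary_crossentropy")

# To skip training, restore the released checkpoint instead (path assumed):
# model.load_weights("training/trained_models/dartvetter/checkpoint")
```

After restoring the weights, `model.predict` on a batch of global views returns a score in [0, 1] per TCE, which can be thresholded to label candidates.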
