Deep-HiTS: Rotation Invariant Convolutional Neural Network for Transient Detection


Abstract

We introduce Deep-HiTS, a rotation-invariant convolutional neural network (CNN) model for classifying images of transient candidates as either artifacts or real sources for the High cadence Transient Survey (HiTS). CNNs have the advantage of learning the features automatically from the data while achieving high performance. We compare our CNN model against a feature-engineering approach using random forests (RFs). We show that our CNN significantly outperforms the RF model, reducing the error by almost half. Furthermore, for a fixed number of approximately 2000 allowed false transient candidates per night, we are able to reduce the misclassified real transients by approximately one-fifth. To the best of our knowledge, this is the first time CNNs have been used to detect astronomical transient events. Our approach will be very useful when processing images from next-generation instruments such as the Large Synoptic Survey Telescope. We have made all our code and data available to the community to allow further development and comparisons at https://github.com/guille-c/Deep-HiTS.
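The abstract describes a rotation-invariant CNN that classifies candidate image stamps as real transients or artifacts. As a rough illustration of the rotation-invariance idea (processing the 90-degree rotations of each stamp with shared convolutional weights and combining the results before classification), the sketch below uses PyTorch; the layer sizes, channel count, stamp size, and fusion scheme are assumptions for illustration only, not the published Deep-HiTS architecture, which is available in the linked repository.

```python
# Hypothetical sketch of a rotation-invariant CNN in PyTorch.
# The stamp size (21x21), channel count, layer sizes, and the
# channel-wise concatenation of rotated feature maps are
# illustrative assumptions, not the exact Deep-HiTS architecture.
import torch
import torch.nn as nn


class RotationInvariantCNN(nn.Module):
    def __init__(self, in_channels=4, n_classes=2):
        super().__init__()
        # Shared convolutional feature extractor applied to each rotation.
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=4, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):
        # Run the shared conv stack over the four 90-degree rotations of
        # the input and concatenate the feature maps along the channel axis,
        # so the classifier sees all orientations of each candidate.
        feats = [self.features(torch.rot90(x, k, dims=(2, 3))) for k in range(4)]
        return self.classifier(torch.cat(feats, dim=1))


model = RotationInvariantCNN()
stamps = torch.randn(8, 4, 21, 21)   # batch of hypothetical candidate stamps
logits = model(stamps)               # per-candidate artifact/real scores
```

Because the same convolutional weights are applied to every rotation, the combined representation changes little when a candidate stamp is rotated, which is the property the abstract refers to as rotation invariance.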

Authors

Cabrera-Vives, Guillermo; Reyes, Ignacio; Förster, Francisco; Estévez, Pablo A.; Maureira, Juan-Carlos

Journal

The Astrophysical Journal

Paper Publication Date

February 2017

Paper Type

Astroinformatics