Authors: David M. Reiman & Brett E. Göhre
MNRAS: https://doi.org/10.1093/mnras/stz575
arXiv: https://arxiv.org/abs/1810.10098
Abstract: Near-future large galaxy surveys will encounter blended galaxy images at a fraction of up to 50% in the densest regions of the universe. Current deblending techniques may segment the foreground galaxy while leaving missing pixel intensities in the background galaxy flux. The problem is compounded by the diffuse nature of galaxies in their outer regions, making segmentation significantly more difficult than in traditional object segmentation applications. We propose a novel branched generative adversarial network (GAN) to deblend overlapping galaxies, where the two branches produce images of the two deblended galaxies. We show that generative models are a powerful engine for deblending given their innate ability to infill missing pixel values occluded by the superposition. We maintain high peak signal-to-noise ratio and structural similarity scores with respect to ground truth images upon deblending. Our model also predicts near-instantaneously, making it a natural choice for the immense quantities of data soon to be created by large surveys such as LSST, Euclid and WFIRST.
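The abstract reports deblending quality via peak signal-to-noise ratio (PSNR) against ground-truth images. As a minimal illustration of that metric (not the paper's own evaluation code), the sketch below computes PSNR between a toy "galaxy" and a noisy recovery of it; the Gaussian-blob galaxies and the [0, 1] pixel normalization are assumptions for the example only.

```python
import numpy as np

def psnr(truth, pred, max_val=1.0):
    """Peak signal-to-noise ratio between a ground-truth image and a
    deblended prediction, assuming pixel values lie in [0, max_val]."""
    mse = np.mean((truth - pred) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: two hypothetical "galaxies" as Gaussian blobs on a 64x64 grid.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:64, 0:64]
galaxy_a = np.exp(-((xx - 24) ** 2 + (yy - 32) ** 2) / 50.0)
galaxy_b = np.exp(-((xx - 40) ** 2 + (yy - 32) ** 2) / 80.0)

# The blend is the superposition a deblender would receive as input.
blend = np.clip(galaxy_a + galaxy_b, 0.0, 1.0)

# A perfect deblender recovers galaxy_a exactly (infinite PSNR);
# a slightly noisy recovery scores a finite PSNR in decibels.
noisy_recovery = np.clip(
    galaxy_a + 0.01 * rng.normal(size=galaxy_a.shape), 0.0, 1.0
)
print(f"PSNR of noisy recovery: {psnr(galaxy_a, noisy_recovery):.1f} dB")
```

Higher PSNR means the recovered galaxy is closer to the ground truth; the structural similarity (SSIM) score mentioned in the abstract complements it by comparing local luminance, contrast, and structure rather than raw pixel error.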
davidreiman/deblender
Galaxy image deblending using convolutional neural networks with an adversarial regularizing loss