Parallelized Pipeline - danecross/SOM-photoz-BFD GitHub Wiki
As seen here, a basic pipeline looks like this:
- Generate `GaussianSims` using the full deep field catalog
- Train the wide SOM (`wide_SOM`)
- Train the deep SOM (`deep_SOM`)
- Classify the wide catalog with the `wide_SOM`
- Classify the deep catalog with the `deep_SOM`
- Classify the simulations using the `wide_SOM`
- Match the simulations to the deep catalog and get their deep classifications
- Make the redshift map with the `PZC` object
- Make the tomographic bins with the `TomographicBins` object
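The serial flow above can be sketched as follows. The `GaussianSims`, `wide_SOM`, `deep_SOM`, `PZC`, and `TomographicBins` names come from this wiki, but the function and method signatures here (`SOM.train`, `SOM.classify`, `generate_gaussian_sims`, ...) are illustrative stand-ins, not the actual SOM-photoz-BFD API:

```python
# Minimal stand-in for a self-organizing map; the real wide_SOM/deep_SOM
# objects in SOM-photoz-BFD will have a different interface.
class SOM:
    def __init__(self, name):
        self.name = name

    def train(self, catalog):
        self.trained_on = catalog
        return self

    def classify(self, catalog):
        # Returns a label standing in for per-galaxy cell assignments.
        return f"{catalog} cells from {self.name}"


def generate_gaussian_sims(deep_catalog):
    # Stand-in for GaussianSims generation from the full deep field catalog.
    return f"sims({deep_catalog})"


def run_basic_pipeline(wide_catalog, deep_catalog):
    sims = generate_gaussian_sims(deep_catalog)      # generate GaussianSims
    wide_som = SOM("wide_SOM").train(wide_catalog)   # train wide SOM
    deep_som = SOM("deep_SOM").train(deep_catalog)   # train deep SOM
    wide_cells = wide_som.classify(wide_catalog)     # classify wide catalog
    deep_cells = deep_som.classify(deep_catalog)     # classify deep catalog
    sim_cells = wide_som.classify(sims)              # classify simulations
    matches = (sim_cells, deep_cells)                # match sims to deep classifications
    pzc = ("PZC", wide_cells, matches)               # redshift map P(z|c)
    return ("TomographicBins", pzc)                  # tomographic bins
```

Written serially like this, each line waits for the previous one even when it does not need its output, which is what the phase decomposition below avoids.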
Many of these processes do not rely on each other and can be run in parallel. To see this, let's split the steps into phases such that the steps within each phase do not depend on one another.
## Phase 1

- 1.1 Generate `GaussianSims` using the full deep catalog
  - Cut the deep catalog based on which deep galaxies make the signal-to-noise cut
- 1.2 Train the `wide_SOM`
## Phase 2

- 2.1 Train the `deep_SOM` on the cut deep fields
- 2.2 Classify the wide data
- 2.3 Classify the generated wide fluxes from the simulations
## Phase 3

- 3.1 Classify the deep fields
## Phase 4

- 4.1 Match the classified deep galaxies to the simulations
## Phase 5

- 5.1 Make P(z|c) and the tomographic bins
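The phase structure above maps naturally onto a simple scheduler: run all steps of a phase concurrently, then wait for them all before starting the next phase. A minimal sketch, using placeholder functions for the real steps (the step bodies and their return labels are assumptions, not the actual SOM-photoz-BFD calls):

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder step functions standing in for the real pipeline calls;
# this sketch only demonstrates the scheduling, not the science.
def generate_sims():        return "1.1"
def train_wide_som():       return "1.2"
def train_deep_som():       return "2.1"
def classify_wide():        return "2.2"
def classify_sim_fluxes():  return "2.3"
def classify_deep():        return "3.1"
def match_sims_to_deep():   return "4.1"
def make_pzc_and_bins():    return "5.1"

# Steps inside a phase are independent of each other;
# phases themselves must run in order.
PHASES = [
    [generate_sims, train_wide_som],                       # Phase 1
    [train_deep_som, classify_wide, classify_sim_fluxes],  # Phase 2
    [classify_deep],                                       # Phase 3
    [match_sims_to_deep],                                  # Phase 4
    [make_pzc_and_bins],                                   # Phase 5
]

def run_pipeline():
    done = []
    for steps in PHASES:
        # Launch every step of the phase concurrently; leaving the
        # `with` block waits for all of them, acting as a phase barrier.
        with ThreadPoolExecutor(max_workers=len(steps)) as pool:
            done.extend(pool.map(lambda step: step(), steps))
    return done
```

Threads are used here only to show the barrier structure; since SOM training and classification are CPU-bound, the real pipeline would run the steps of a phase as separate processes or batch jobs instead.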