Prerequisites - s-ilic/ECLAIR GitHub Wiki
Main code
The ECLAIR suite is written in Python and thus requires a working Python 3 installation. The code also requires a small number of additional Python modules, namely:
- `numpy` (for general-purpose array manipulation)
- `scipy` (for some specific calculations)
- `matplotlib` and `corner` (for the plotting scripts)
- `getdist` (for converting ECLAIR chains to the getdist format)
- `emcee` and/or `zeus` (for MCMC sampling)
All those packages can be installed with a simple `pip` (the Python package installer) command:
pip install numpy scipy matplotlib corner getdist emcee zeus-mcmc
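To quickly check which of these modules are already present in your Python environment, an optional snippet like the following can help (it only inspects the environment and installs nothing):

```shell
# List any of the required modules that are still missing
python - <<'EOF'
import importlib.util
mods = ["numpy", "scipy", "matplotlib", "corner", "getdist", "emcee", "zeus"]
missing = [m for m in mods if importlib.util.find_spec(m) is None]
print("Missing modules:", ", ".join(missing) if missing else "none")
EOF
```

Note that the `zeus` sampler is distributed on PyPI as `zeus-mcmc` but imported as `zeus`.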
Optionally, if you wish to parallelize ECLAIR runs via MPI, you also need to install the `mpi4py` and `schwimmbad` modules, which can be done with:
pip install mpi4py schwimmbad
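You can likewise check that both MPI-related modules are importable; note that this does not test an actual MPI launch, which depends on your system's MPI installation:

```shell
# Report whether the MPI-related modules are importable
python - <<'EOF'
import importlib.util
for mod in ("mpi4py", "schwimmbad"):
    status = "found" if importlib.util.find_spec(mod) else "MISSING"
    print(f"{mod}: {status}")
EOF
```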
CLASS Boltzmann code
Currently, the Boltzmann code CLASS is the only "theoretical engine" interfaced with ECLAIR. As such, its installation is required to run the ECLAIR suite, and in particular the installation of its associated Python wrapper `classy`. Please refer to the CLASS code webpage and GitHub repository for detailed installation instructions, although the following lines should work for most users:
git clone https://github.com/lesgourg/class_public.git
cd class_public
make
Note that ECLAIR is compatible with any variant or modified version of CLASS, as long as the name of the corresponding Python wrapper is correctly passed to the `which_class` option in the ECLAIR `.ini` file. (Tip: you can modify the name of the CLASS Python wrapper by editing the `python/setup.py` file in the CLASS folder before compiling it.)
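Once CLASS is compiled, an optional way to verify that the wrapper works is to import it and run a minimal computation. The snippet below is only a sketch (the parameter values are arbitrary) and skips gracefully if `classy` is not found:

```shell
# Minimal sanity check of the classy wrapper
python - <<'EOF'
import importlib.util
if importlib.util.find_spec("classy") is None:
    print("classy not found: compile CLASS and its wrapper first")
else:
    from classy import Class
    cosmo = Class()
    cosmo.set({"output": "tCl", "l_max_scalars": 100})
    cosmo.compute()  # runs the Boltzmann solver
    print("classy OK: temperature C_l computed up to l=100")
    cosmo.struct_cleanup()  # free CLASS internal memory
EOF
```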
Likelihood-related prerequisites
1) Background measurements
"BG.SN.JLA"
Python modules:
`numexpr`, which can be installed via `pip install numexpr`
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget http://supernovae.in2p3.fr/sdss_snls_jla/jla_likelihood_v6.tgz # download compressed dataset
tar -xf jla_likelihood_v6.tgz # untar archive
cp jla_likelihood_v6/data/jla_*_covmatrix.dat likelihoods/BG/SN/JLA/. # copy relevant files
cp jla_likelihood_v6/data/jla_lcparams.txt likelihoods/BG/SN/JLA/.
rm -rf jla_likelihood_v6 jla_likelihood_v6.tgz # cleanup
"BG.SN.Pantheon"
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget https://raw.githubusercontent.com/dscolnic/Pantheon/master/lcparam_full_long.txt -O likelihoods/BG/SN/Pantheon/lcparam_full_long.txt
wget https://raw.githubusercontent.com/dscolnic/Pantheon/master/sys_full_long.txt -O likelihoods/BG/SN/Pantheon/sys_full_long.txt
sed -i -e 's/biascor//g' likelihoods/BG/SN/Pantheon/lcparam_full_long.txt
# Note that the last line corrects a slight mistake in the main Pantheon data file (it removes one unnecessary column title).
"BG.SN.PantheonPlus"
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget https://github.com/PantheonPlusSH0ES/DataRelease/raw/main/Pantheon+_Data/4_DISTANCES_AND_COVAR/Pantheon+SH0ES.dat -O likelihoods/BG/SN/PantheonPlus/Pantheon+SH0ES.dat
wget https://github.com/PantheonPlusSH0ES/DataRelease/raw/main/Pantheon+_Data/4_DISTANCES_AND_COVAR/Pantheon+SH0ES_STAT+SYS.cov -O likelihoods/BG/SN/PantheonPlus/Pantheon+SH0ES_STAT+SYS.cov
"BG.SN.DESY5"
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget https://github.com/des-science/DES-SN5YR/raw/main/4_DISTANCES_COVMAT/DES-SN5YR_HD.csv -O likelihoods/BG/SN/DESY5/DES-SN5YR_HD.csv
wget https://github.com/des-science/DES-SN5YR/raw/main/4_DISTANCES_COVMAT/STAT+SYS.txt.gz
gzip -d STAT+SYS.txt.gz
mv STAT+SYS.txt likelihoods/BG/SN/DESY5/covsys_000.txt
2) Cosmic microwave background measurements
"CMB.Planck.PR2" and "CMB.Planck.PR3"
Python modules:
Requires the installation of the latest official Planck likelihood code (direct download link). Please refer to the README file enclosed within that archive for detailed instructions. After a successful installation, the command `python -c "import clik"` should not yield any error.
Datasets:
Requires downloading the PR2 (direct link) and/or PR3 (direct link) Planck datasets, all available at the Planck Legacy Archive. You will also need to create two environment variables (adding them e.g. to your `.bashrc` file) named `PLANCK_PR2_DATA` and/or `PLANCK_PR3_DATA`, pointing respectively to the PR2 and PR3 data folders. Those folders should have the following structure:
$ tree -L 1 $PLANCK_PR2_DATA # or tree -L 1 $PLANCK_PR3_DATA
/path/to/some/folder
├── hi_l
├── lensing
└── low_l
(i.e. each data folder should contain the `low_l`, `hi_l`, etc., subfolders).
"CMB.Planck.PR4"
Python modules:
`astropy`, which can be installed via `pip install astropy`
Datasets:
Requires downloading the PR4 Planck datasets. The non-lensing ones are all available at the following link (get the files ending in `v4.2`). You can/should extract all lollipop datasets you wish to use into a single `lollipop` folder, and all hillipop datasets into a single `hillipop` folder. You will also need to create an environment variable (to be added e.g. to your `.bashrc` file) named `PLANCK_PR4_DATA`, pointing to the PR4 data folder. This folder should have the following structure:
$ tree -L 2 $PLANCK_PR4_DATA
/path/to/some/folder
├── hillipop
│ ├── data
│ └── foregrounds
└── lollipop
├── clcov_lolEB_NPIPE.fits
├── cl_lolEB_NPIPE.dat
└── fiducial_lolEB_planck2018_tensor_lensedCls.dat
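As an illustration, the expected layout can be prepared with commands like the following (the base path is a placeholder; the datasets themselves still need to be extracted into these folders):

```shell
# Placeholder base path for the PR4 data
export PLANCK_PR4_DATA="$HOME/planck_pr4_data"
# Create the two subfolders expected by the hillipop/lollipop likelihoods
mkdir -p "$PLANCK_PR4_DATA/hillipop" "$PLANCK_PR4_DATA/lollipop"
```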
For the lensing likelihoods, assuming you already created the `PLANCK_PR4_DATA` environment variable (and associated folder), type the following commands in a terminal anywhere on your machine:
git clone https://github.com/carronj/planck_PR4_lensing.git
mv planck_PR4_lensing/planckpr4lensing/data_pr4 $PLANCK_PR4_DATA/lensing
rm -rf planck_PR4_lensing # cleanup
"CMB.ACT.ACTPol_DR4"
Python modules:
`pyactlike`, which can be installed via:
# Commands to be typed in a terminal anywhere on your machine:
git clone https://github.com/ACTCollaboration/pyactlike.git
cd pyactlike
pip install . --user
cd ..
rm -rf pyactlike # cleanup
"CMB.ACT.DR6_lensing"
Python modules:
`act_dr6_lenslike`, which can be installed via `pip install act_dr6_lenslike`
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget https://lambda.gsfc.nasa.gov/data/suborbital/ACT/ACT_dr6/likelihood/data/ACT_dr6_likelihood_v1.2.tgz
tar -zxvf ACT_dr6_likelihood_v1.2.tgz
mv v1.2 likelihoods/CMB/ACT/DR6_lensing/data
rm ACT_dr6_likelihood_v1.2.tgz
"CMB.BK.BK15" and "CMB.BK.BK18"
Python modules:
`pandas`, which can be installed via `pip install pandas`
Datasets:
# Commands to be typed in the main ECLAIR folder:
# For BK15
wget http://bicepkeck.org/BK15_datarelease/BK15_cosmomc.tgz
tar -xf BK15_cosmomc.tgz
cp -r BK15_cosmomc/data/BK15 likelihoods/CMB/BK/BK15/data
rm -rf BK15_cosmomc BK15_cosmomc.tgz
# For BK18
wget http://bicepkeck.org/BK18_datarelease/BK18_cosmomc.tgz
tar -xf BK18_cosmomc.tgz
cp -r BK18_cosmomc/data/BK18lf_dust likelihoods/CMB/BK/BK18/data
rm -rf BK18_cosmomc BK18_cosmomc.tgz
"CMB.SPT.SPT3G_2020"
Datasets:
# Commands to be typed in the main ECLAIR folder:
wget https://pole.uchicago.edu/public/data/dutcher21/SPT3G_2018_EETE_likelihood.tar.gz
tar -xf SPT3G_2018_EETE_likelihood.tar.gz
mv SPT3G_2018_EETE_likelihood/data/SPT3G_Y1_EETE likelihoods/CMB/SPT/SPT3G_2020/data
rm -rf SPT3G_2018_EETE_likelihood SPT3G_2018_EETE_likelihood.tar.gz