Survey-All Function

This script provides a processing routine for surveying multiple radio sources. For each source position, we record not only the image results, i.e., the (u, v) plot, dirty beam, source model, dirty map, and clean map, but also quantitative information including the beam size, position angle, preferable observation time range, and dynamic range of the clean map. Multiprocessing acceleration is also applied, and each source is processed independently in its own subprocess. Let's see how it works.

First, specify the observation settings and the multiple sources in the file config_survey.ini.

[obs_time]
start = 2020/01/01/00/00/00
end = 2020/01/02/00/00/00
step = 00/00/05/00

[bs_type]
bs_flag_gg = 1
bs_flag_gs = 0
bs_flag_ss = 0

[obs_mode]
obs_freq = 1.63e9
bandwidth = 3.2e7
cutoff_angle = 10.0
precession_mode = 0
unit_flag = km

[station]
pos_source = 0316+413, 0202+319
pos_vlbi = ShangHai, Tianma, Urumqi, GIFU11, HITACHI, KASHIM34
pos_telemetry = 
pos_satellite = 

[imaging]
n_pix = 512
source_model = Point-source.model
clean_gain = 0.9
clean_threshold = 0.01
clean_niter = 20
color_map_name = hot
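
The snippet below is a minimal sketch of how these settings could be read with Python's standard configparser module. The section and option names follow config_survey.ini shown above; the helper function load_survey_config and the returned dictionary are illustrative and not part of VNSIM.

import configparser

def load_survey_config(path="config_survey.ini"):
    # Read the ini file with the standard-library parser
    cfg = configparser.ConfigParser()
    cfg.read(path)

    # Comma-separated lists in the [station] section
    sources = [s.strip() for s in cfg.get("station", "pos_source").split(",") if s.strip()]
    stations = [s.strip() for s in cfg.get("station", "pos_vlbi").split(",") if s.strip()]

    # Numeric observation and imaging settings
    obs_freq = cfg.getfloat("obs_mode", "obs_freq")
    n_pix = cfg.getint("imaging", "n_pix")

    return {"sources": sources, "stations": stations,
            "obs_freq": obs_freq, "n_pix": n_pix}

if __name__ == "__main__":
    settings = load_survey_config()
    print(settings["sources"])   # e.g. ['0316+413', '0202+319']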

Then run the script to obtain the help information:

$ python Func_survey_all.py -h

[Screenshot sa-1: help information printed by Func_survey_all.py]
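
For illustration only, the following is a hypothetical sketch of how such a command-line interface could be declared with argparse. Only -h and -s appear on this page; the actual option names, meanings, and help strings in Func_survey_all.py may differ.

import argparse

def parse_args():
    # Hypothetical CLI sketch; not the actual VNSIM argument definitions
    parser = argparse.ArgumentParser(
        description="Survey multiple radio sources defined in config_survey.ini")
    parser.add_argument("-s", action="store_true",
                        help="run the survey and save all outputs (assumed meaning)")
    return parser.parse_args()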

In this example, two sources are set in the ini file; the following results are obtained after running the script.

Under the directory "VNSIM/OUTPUT/survey_all/", the results are saved in a generated folder named after the timestamp at which the script was run.

$ python Func_survey_all.py -s

[Screenshot sa-2: output files generated under the timestamped folder]

Eight files are generated: the clean map, dirty beam, dirty map, az-el plot, source model, uv plot, uv data file, and source info. The xx-src-info.txt file contains the following:

optimal observation interval is : (6.333333327434957, 16.749999984400347)
Best Obs: from 2020/1/1 6:20:0 to 2020/1/1 16:45:0
e_bpa=11.051760485662491 degree
e_bmaj=12.880877766631736 mas
e_bmin=7.484146362156375 mas
e_range=546.0
rms_noise=0.0023037176579236984
dr=289.3873238092851

As you can see, quantitative information is provided. Note that this script does not provide a show_gui parameter; all outputs are saved automatically.
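
For reference, the dynamic range of a clean map is conventionally defined as the peak brightness divided by the rms of the residual noise. The sketch below illustrates that convention with synthetic data; it is not necessarily the exact computation used in VNSIM.

import numpy as np

def dynamic_range(clean_map, residual_map):
    # Conventional definition: map peak divided by residual rms
    rms_noise = np.sqrt(np.mean(residual_map ** 2))
    return np.max(clean_map) / rms_noise

# Purely illustrative example with random data
rng = np.random.default_rng(0)
residual = rng.normal(scale=1e-3, size=(512, 512))
clean = residual.copy()
clean[256, 256] = 0.7          # a single bright point source
print(dynamic_range(clean, residual))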

Obviously, such a run is very time-consuming. We therefore adopt Python multiprocessing to accelerate this compute-intensive task. Each sub-process handles a complete combination of parameter settings. To reduce the overhead of creating sub-processes, a process pool is pre-created, and the number of sub-processes can be user-defined (or defaults to the number of CPU cores). Since the surveys of these sources are assumed to be completely independent, both the calculation and the image generation are implemented inside each sub-process to further reduce the overhead of inter-process communication among sub-processes.
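
The following is a minimal sketch of this pattern, assuming a hypothetical per-source worker survey_one_source(); the names are illustrative and do not reflect VNSIM's actual API.

import multiprocessing as mp

def survey_one_source(source_name):
    # Placeholder for the per-source work: compute (u, v) coverage, dirty
    # beam/map, CLEAN map, beam size, preferable observation window and
    # dynamic range, then save the output files for this source.
    return source_name, "done"

def survey_all(sources, n_proc=None):
    # Pre-create a process pool; default to the number of CPU cores
    n_proc = n_proc or mp.cpu_count()
    with mp.Pool(processes=n_proc) as pool:
        # Each source is processed independently in its own subprocess
        results = pool.map(survey_one_source, sources)
    return results

if __name__ == "__main__":
    print(survey_all(["0316+413", "0202+319"]))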