DAQ Log and Trigger Summary - IAA-BURSTT/document GitHub Wiki

When the pulse search program (bonsai) issues a detection trigger to all stations (servers running bursttd), it is desirable to record the DAQ configuration of every station at that moment and to verify that all stations respond accordingly, e.g., that data is saved properly. Therefore, the DAQ configuration is logged regularly, and a trigger summary is generated for each event. This information is collected on a monitoring server.

Overview

Logs of different parts are kept separately, depending on how frequently they are expected to change.

On pulse-searching server

• ~/burstt-log/

  • trigger/ : event summary in JSON file: '''event-[Timestamp].json'''
  • DM_t0 : coarse-grained map of the searched parameter space, dispersion measure vs pulse arrival time.
  • SNR-beam : 2D map of SNR vs all beams
  • waterfall_dedisp : de-dispersed intensity waterfall plot
  • waterfall_incoh : original intensity waterfall plot of incoherent beam
  • waterfall_orig : original intensity waterfall plot of the triggered beam

On each beamforming server

• ~/burstt-log/

  • station/ : station configuration
  • fpga/ : FPGA configuration log, including bitcode and beamforming matrix, temperature.
  • bfm2/ : 2nd beamforming matrix written to ring buffers, as well as its log
  • daq/ : DAQ configuration summaries (DAQState JSON files)
  • trigger/: trigger summary for every event
  • spec/: 16-bit noise spectra regularly taken directly from FPGA
  • figures/: almost all plots generated by the monitoring scripts such as noise spectra

On monitor server

• ~/burstt-monitor/ : where summaries and logs are uploaded for monitoring

  • [3-letter station ID]/ : contains all the sub-folders listed in the previous sections.
  • trigger/: [timestamp]-[Stn ID].txt, named for easier sorting

Guidelines

Idea: dedicated logs accessible to all users, as every user will have their own account (currently all use ubuntu or frbobs…). The format should be human-readable for easy monitoring and analysis.

  • For the main station with multiple servers: DAQ configuration logs should be updated promptly so that the source direction can be estimated accurately.

  • Date and Time: UTC is preferred, as there are international stations. Use ISO 8601 format for date & time, e.g., “2024-05-28T01:00:00Z”

  • The number of columns in a log file should be fixed for easy parsing. (TBD)

  • Get the server name, e.g., burstt#, using ${HOSTNAME}
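As a minimal illustration of the timestamp guideline, a UTC ISO 8601 timestamp can be generated in Python as follows (a sketch; the function name is just for illustration):

```python
from datetime import datetime, timezone

def utc_timestamp() -> str:
    """Produce a UTC timestamp in ISO 8601 format, as recommended above."""
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

print(utc_timestamp())  # e.g. 2024-05-28T01:00:00Z
```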

Application

  • Monitoring the DAQ configuration of all stations: which DAQ mode (16-bit, bf16, bf64, etc.), which bitcode and BFM.
  • Estimating the source direction: use '''antenna coordinates''', '''trigger time''', and '''beamforming matrix''', given the SNR distribution vs beams from bonsai.
  • Real-time VLBI: collecting baseband data from all stations at the monitoring server for processing.
  • Looking up files for offline data analysis

Modifications and Configurations

(by SH)

  • zcu216_100g_config.py >> zcu216_100g_config_log.py : keep log at ~/burstt-log/fpga/config-${HOSTNAME}-fpga*.log whenever it is called

  • zcu216_bf16_writeBFM.py >> zcu216_bf16_writeBFM_log.py:

    • an additional string argument is used as the unique BFM ID (same as in copy2BFM.py)
    • keeps a log at ~/burstt-log/fpga/bfm-${HOSTNAME}-fpga*.log whenever it is called
  • reload_burstt5.py: wraps the two scripts above. An additional argument --bfmid passes the BFM ID to zcu216_bf16_writeBFM_log.py

  • /data/kylin/bin/copy2BFM.py >> /data/wsh/bin/copy2BFM-id.py: an additional string argument is used as the unique BFM ID. Copies the files to a sub-folder under BFM/ (instead of directly under BFM/).

  • rudp64/write_2nd_matrix_64.py >> write_2nd_matrix_64_log.py : keeps a log at ~/burstt-log/bfm2/bfm2-${HOSTNAME}-ring*.log whenever it is called, and saves the matrix to an npz file stored in ''~/burstt-log/bfm2''

  • ''resetFPGACounters.py'': replaces the manual interactive part; modified from ''config_arp/zcu_newcontrol.py''

    • station code: 3 uppercase letters, e.g. 'FUS', 'LTN'
    • cmd: 4 = reset counter, 6 = reset counter and network interface

New scripts

under folder ''/home/ubuntu/wsh/script''

  • createLogDir.sh: creates the burstt-log folder and its sub-folders under the folder specified in the argument (/data is used):
 createLogDir.sh /data
  • genTrigSummary.sh: to be called after '''sendsocket''' saves all data

  • uploadDAQLog.sh: uploads the station config and FPGA log, regularly via cron

  • getFPGAIPs.sh: generates a list of FPGA IPs and writes it to ''burstt-log/fpga/LastFPGAList.txt''. It first collects the IPs of clients registered with the dnsmasq server, then identifies FPGA IPs by excluding addresses below the minimum FPGA IP (< 192.168.40.100).
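The filtering logic of ''getFPGAIPs.sh'' can be sketched in Python as follows (a minimal sketch, assuming dnsmasq lease lines with the IP in the third field; the function and variable names are illustrative):

```python
import ipaddress

# Addresses below this minimum are not FPGAs and are excluded.
MIN_FPGA_IP = ipaddress.ip_address("192.168.40.100")

def fpga_ips(lease_lines):
    """Extract the 3rd field (IP) of each dnsmasq lease line and keep
    only addresses at or above the minimum FPGA IP."""
    ips = []
    for line in lease_lines:
        fields = line.split()
        if len(fields) >= 3:
            ip = ipaddress.ip_address(fields[2])
            if ip >= MIN_FPGA_IP:
                ips.append(str(ip))
    return ips

leases = [
    "1717502365 0a:4c:50:41:43:51 192.168.40.232 * ff:50:41:43:51",
    "1717502400 aa:bb:cc:dd:ee:ff 192.168.40.50 laptop *",  # below minimum, excluded
]
print(fpga_ips(leases))  # ['192.168.40.232']
```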

Schedule by cron

Synchronize server time with local NTP server

To make sure the timestamps are correct and consistent between stations, the servers should have their time synchronized with NTP time.

A Brandywine NFS-220+ GPS clock is used as the local NTP server.

Check whether the server has a designated NTP server and is synchronized with it:

$ timedatectl status
               Local time: Mon 2024-07-15 12:05:06 CST
           Universal time: Mon 2024-07-15 04:05:06 UTC
                 RTC time: Mon 2024-07-15 12:05:06
                Time zone: Asia/Taipei (CST, +0800)
System clock synchronized: yes
              NTP service: active
          RTC in local TZ: yes

If "System clock synchronized: no" or "NTP service: inactive", modify the chrony configuration file: $ sudo vi /etc/chrony.conf

Replace the default NTP server with the LAN IP of the GPS clock, similar to the following (in this example, 192.168.230.220 is the IP, which may differ from station to station):

# These servers were defined in the installation:
#server time.stdtime.gov.tw iburst
pool 192.168.230.220

Then restart the chrony daemon and check its status:

$ sudo systemctl restart chronyd
$ sudo systemctl status chronyd

Station configuration

(To be updated, now config/[Stn].JSON is used instead of text file)

  • New version: ~/shwang/script/config/StationConfig.json is a symbolic link to ~/shwang/script/config/Stn-[station code].json

  • '''Station ID''': 3 uppercase letters. Tentative IDs are listed in the table below.

    • defined in ''BURSTTConstant.py'' and ''BURSTTConstant.sh''

Important

The station code is used as the log folder name, so it MUST be correct to avoid overwriting logs from other stations!

station code full name
FUS Fushan, Ilan
LTN Longtien, Nantou
FUG Cape Fuguei
GRN Green Island
KMN Kinmen Island
OGW Ogasawara, Japan
GBD Gauribidanur, India
PHL Pahala, Hawaii
PYC Pyeongchang, Korea

For example: Stn-LTN.json contains the essential information:

{
	"station"	: 	"LTN",
	"name" 		: 	"Longtien",
	"NAnt" 		: 	64,
	"latitude"	:	"23:42:52.47582",
	"longitude"	:	"120:49:27.83946",
	"elevation"	:	880.411,
	"XYZ"		:	"/data/wsh/config/LONGTIEN_64_dgps.config",
	"server"	:	[ "burstt5" ],
	"FPGAIPPrefix" : "192.168.40.",
	"FPGA"	: [	249, 246, 245, 220 ]
}
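A station config JSON with this schema can be consumed directly; for instance, the full FPGA IP list follows from FPGAIPPrefix and the FPGA suffix list (a Python sketch over a truncated copy of the example above, not part of the actual scripts):

```python
import json

# Truncated copy of the Stn-LTN.json example above.
stn_json = """{
    "station": "LTN", "name": "Longtien", "NAnt": 64,
    "server": ["burstt5"],
    "FPGAIPPrefix": "192.168.40.",
    "FPGA": [249, 246, 245, 220]
}"""

cfg = json.loads(stn_json)
# Build full FPGA IPs from the prefix and the numeric suffixes.
fpga_ips = [cfg["FPGAIPPrefix"] + str(n) for n in cfg["FPGA"]]
print(fpga_ips)
# ['192.168.40.249', '192.168.40.246', '192.168.40.245', '192.168.40.220']
```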
  • WGS84 coordinates (LATitude, LONgitude, ELeVation)

    • read from the PM3T GNSS RTK module, NTOU monitoring network (海大監測網): [http://140.121.130.224/ http://140.121.130.224/], module name: ASIAAT*
  • (Old version) station/StationConfig.txt is a symbolic link to Stn-[station name].txt and is set to read-only (mode 444).

  • Antenna coordinates (XYZ): used for beamforming

    • Note: 16-ch beamforming only uses the median separation in the X coordinate (E-W direction).
  • '''server name (SRV)''': list of servers ($HOSTNAME) running bursttd, separated by ',' without spaces. One server has up to 4 FPGAs.

    • used as log folder name

For example, for the Fushan station (Stn-Fushan.txt):

STN	FUS
LAT	24:45:23.41411
LON	121:34:53.93382
ELV	642.9882
XYZ	/data/wsh/config/LONGTIEN_64_dgps.config
SRV	burstt1,burstt2,burstt3,burstt4

for Nantou station (Stn-Longtien.txt):

STN	LTN
LAT	23:42:52.49877
LON	120:49:27.77502
ELV	878.7997
XYZ	/data/wsh/config/LONGTIEN_64_dgps.config
SRV	burstt5

DAQ configuration

  • The latest DAQ configurations listed below are compiled in burstt-log/LastDAQ-${Stn}-${HOSTNAME}.txt
  • New version: a JSON summary, burstt-log/daq/DAQState-${Stn}.json
    • This is compiled from the individual servers of the station: ''burstt-log/daq/DAQState-${Stn}-${HOSTNAME}.json''

FPGA bitcode

When the FPGA bitcode is loaded (FPGA config summary) via ''zcu216_100g_config_log.py'', logging is naturally done inline: the script appends one entry to the log ''fpga/config-fpga[IP].log'', which is regularly uploaded to the monitor server.

Each entry records:

  • time
  • FPGA IP
  • bitcode (or its alias)
  • beamforming matrix file (if any)
  • loaded successfully? error code

Log format:

timestamp | IP | bitcode | error code (0=normal)

2024-06-03T00:53:59.652899 192.168.40.232 /home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg 0
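A log entry of this format can be parsed by splitting on whitespace (a minimal sketch; the function and field names are illustrative, not from the actual scripts):

```python
def parse_config_entry(line):
    """Parse one line of fpga/config-fpga[IP].log:
    timestamp | IP | bitcode | error code (0 = normal)."""
    ts, ip, bitcode, err = line.split()
    return {"time": ts, "ip": ip, "bitcode": bitcode, "error": int(err)}

# The example entry from above.
entry = parse_config_entry(
    "2024-06-03T00:53:59.652899 192.168.40.232 "
    "/home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg 0"
)
print(entry["ip"], entry["error"])  # 192.168.40.232 0
```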

1st Beamforming matrix (BFM) to FPGA

Note

To be updated: there are two versions of the 1st BFM generator script: the old one, which generates the BFM and requires moving it manually (LTN); and the new one, which generates the BFM on the fly with given parameters (FUS, GRN, etc.).

  • The beamforming matrix is logged only if one is used; e.g., the 16-bit bitcode does not use a BFM.

The BFM log is kept separately from the bitcode log because it changes more frequently.

  • fpga/bfm-fpga[IP].log

For example:

2024-06-03T14:59:48.355566	192.168.40.232	BFM/bf16-20240529/fpga3.pos.npy
2024-06-03T15:05:28.266667	192.168.40.232	BFM/bf16-20240529/fpga3.pos.npy

2nd Beamforming matrix to ring buffer

Note

To be updated: there are two versions of the 2nd BFM generator script: the old one used at LTN, and the new one used at FUS, GRN, etc.

When write_2nd_matrix_64_log.py writes the 2nd BFM to a ring buffer (#0 or #1), an entry is appended to a log at ''bfm2/bfm2-$HOSTNAME-ring[ring ID].log'', and an ''npz'' file is saved to ~/burstt-log/bfm2/bfm2-$HOSTNAME-ring[ring id]-[nRow]row-s[separation in m]-b[beam0 offset]-m[FPGA mask in {0,1}]-d[FPGA delays in ns separated by _].npz

For example, using 2 m separation and masking the 3rd row at the Nantou station:

(rfsoc) [ubuntu@burstt5 rudp64]$ ./write_2nd_matrix_64_log.py 0 -s 2. -t 2nd -m '3'
...
save BFM2 to [/home/ubuntu/burstt-log/bfm2/bfm2-burstt5-ring0-4row-s2.0-b-1.5-m1110.npz]

(rfsoc) [ubuntu@burstt5 rudp64]$ ./write_2nd_matrix_64_log.py 1 -s 2. -t 2nd -m '3'
...
save BFM2 to [/home/ubuntu/burstt-log/bfm2/bfm2-burstt5-ring1-4row-s2.0-b-1.5-m1110.npz]
(rfsoc) [ubuntu@burstt5 burstt-log]$ tail  bfm2/bfm2-burstt5-ring*.log
==> bfm2/bfm2-burstt5-ring0.log <==
2024-06-05T02:02:45.495240	/home/ubuntu/burstt-log/bfm2/bfm2-burstt5-ring0-4row-s2.0-b-1.5-m1110.npz

==> bfm2/bfm2-burstt5-ring1.log <==
2024-06-05T02:02:46.930476	/home/ubuntu/burstt-log/bfm2/bfm2-burstt5-ring1-4row-s2.0-b-1.5-m1110.npz

The bfm2-*.npz file can be read using:

$ bfm2_info.py [npz file name]
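What ''bfm2_info.py'' prints is not shown here, but inspecting an ''npz'' archive's contents is straightforward with NumPy (a generic sketch; the actual array names inside the bfm2 files are not specified on this page, so a synthetic file is used):

```python
import os
import tempfile

import numpy as np

def npz_info(path):
    """List the array names and shapes stored in an npz archive."""
    with np.load(path) as npz:
        return {name: npz[name].shape for name in npz.files}

# Demonstrate with a synthetic file; a real bfm2-*.npz would be used instead.
demo = os.path.join(tempfile.mkdtemp(), "demo_bfm2.npz")
np.savez(demo, matrix=np.zeros((4, 64), dtype=complex))
print(npz_info(demo))  # {'matrix': (4, 64)}
```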

Server

When a server starts bursttd, a server config summary is logged. Some information is already available in the data header and thus need not be duplicated: version | (header) | 2nd beamform config. Note that bursttd16 keeps its log at ''/var/log/messages'', where it may be mixed with other system messages.

Event Summary

At the pulse-searching server (currently burstt15, /data/yhtseng/sujin):

  • How it is called: double_test_cy_int8_12k_bindex_0410_2ring_opt2_8k.py >>> send_trigger.py >>> sample_genEventSummary.py >>> generateEventSummary()

An example of the event summary

{
    "EventID": "20250713_094930Z",
    "Timestamp": "2025-07-13T09:49:30.330245+00:00",
    "UnixTime": 1752400170.330245,
    "DM": 150.93414346199845,
    "BeamID": 37,
    "MaxSNR": 13.71875,
    "DM-t0": "trigger_20250713-094930.092677_beam37_DM-t0_dm150.9_sn13.7.png",
    "DM-t0_zoomin": "trigger_20250713-094930.092677_beam37_DM-t0_dm150.9_sn13.7_zoomin.png",
    "MainWaterfall": "trigger_20250713-094930.092677_beam37_intensity_dm150.9_sn13.7.png",
    "MainWaterfall_zoomin": "trigger_20250713-094930.092677_beam37_intensity_dm150.9_sn13.7_400-550mhz.png",
    "bonsaiNPZ": "trigger_20250713-094930.092677_beam37_dm150.9_sn13.7.npz",
    "DedispWaterfall": "trigger_20250713-094930.092677_beam37_dedisp_pulse_dm150.9_sn13.7.png",
    "IncohDedispWaterfall": "trigger_20250713-094930.092677_beam37_incoh_dedisp_dm150.9_sn13.7.png",
    "BeamSNRs": {
        "0": 3.138671875,
        "1": 3.125,
        "2": 3.044921875,
......
   },
    "BeamDMs": {
        "0": 150.6995915685066,
        "1": 151.05141940874438,
        "2": 150.64095359513365,
        "3": 150.75822954187956,
......
    },
    "TriggerSent": true,
    "SNR-beam": "BeamSNRs-event-20250713_094930Z.png",
    "Known": false,
    "Flag": "FRB",
    "Name_src": null,
    "DM_src": null,
    "RA_src": null,
    "Dec_src": null,
    "GalLon_src": null,
    "GalLat_src": null
}
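Given the schema above, downstream tools can, for example, recover the best beam from the per-beam SNRs (a Python sketch over a truncated copy of the summary; not part of the pipeline scripts):

```python
import json

# Truncated copy of the event summary example above.
event = json.loads("""{
    "EventID": "20250713_094930Z",
    "BeamID": 37,
    "MaxSNR": 13.71875,
    "BeamSNRs": {"0": 3.138671875, "1": 3.125, "37": 13.71875}
}""")

# The beam with the highest SNR should match BeamID / MaxSNR.
best_beam = max(event["BeamSNRs"], key=lambda b: event["BeamSNRs"][b])
print(best_beam, event["BeamSNRs"][best_beam])  # 37 13.71875
```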

Trigger Summary

Upon receiving a trigger from the pulse-searching server at the main station via the ''sendsocket'' command (TCP/IP), the beamforming server generates a summary for the event.

How it is generated:

  1. rudp*/sock2shmd.c receives a trigger and calls
  2. /opt/burstt/scripts/genTrigSummary.sh , which is a symbolic link to /home/ubuntu/shwang/script/genTrigSummary.sh
     • Output:
       • (New ver) ~/burstt-log/trigger/trig-[Timestamp]-[Station code]-[Server].json : records station, server, trigger timestamps, DAQ config log (text file below), data files (Not finished yet; to-do)
       • ~/burstt-log/trigger/trig-[Timestamp]-[Station code]-[Server].txt : file names of triggered baseband data (Not finished yet; to-do), current DAQ configuration
  3. The summary is uploaded to burstt12 regularly (via crontab) by /home/ubuntu/shwang/script/uploadDAQLog.sh (to ~/burstt-monitor/[station]/trigger)

An example of trigger summary JSON file from LTN station:

{
  "Station": "LTN",
  "Serv": "burstt5",
  "DM": 150.93,
  "TrigTime": "2025-07-13T09:49:30.330200",
  "TxTime": "2025-07-13T09:49:33.356800",
  "RxTime": "2025-07-13T09:49:33.372100",
  "DAQConfig": "/home/ubuntu/burstt-log/trigger/DAQConfig-20250713_094930Z-LTN-burstt5.txt"
}

Three timestamps are recorded:

  • bonsai trigger time (TrigTime): estimated arrival time of the dispersed pulse (at the lowest frequency) from bonsai
  • trigger issue time (TxTime): when the bonsai server at the main station sent out the trigger (sendsocket)
  • trigger receipt time (RxTime): when a server at the main or an outrigger station received the trigger; used for monitoring the network delay (from the sendsocket log?)
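These timestamps can be combined to monitor latencies, e.g., the network delay RxTime − TxTime (a sketch assuming ISO 8601 timestamps as in the LTN example above; the function name is illustrative):

```python
from datetime import datetime

def delay_s(t_earlier, t_later):
    """Seconds elapsed between two ISO 8601 timestamps."""
    fmt = "%Y-%m-%dT%H:%M:%S.%f"
    return (datetime.strptime(t_later, fmt)
            - datetime.strptime(t_earlier, fmt)).total_seconds()

# Values from the LTN trigger summary example above.
network_delay = delay_s("2025-07-13T09:49:33.356800", "2025-07-13T09:49:33.372100")
total_latency = delay_s("2025-07-13T09:49:30.330200", "2025-07-13T09:49:33.372100")
print(round(network_delay, 4), round(total_latency, 4))  # 0.0153 3.0419
```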

(To-do) Data information

  • current DAQ config: bitcode, BF matrix; grabbed from the FPGA log
  • was the data saved successfully?
  • if yes, the local file path (baseband / beam / intensity), e.g., /burstt#/disk#/data/fpga#.YYYYMMDDhhmmss.bin
  • are the FPGAs synchronized? (check the file header)
  • queue for uploading baseband data to the central server for VLBI

For example

[ubuntu@burstt9 burstt-monitor]$ cat  trigger/trig-20240603_091608-LTN-burstt5.txt 
STN	LTN
TrigTime	20240603_091608
SendTime	20240603_091608
RxTime	20240603_091608
Serv	burstt5
2024-06-03T16:20:43.336040	192.168.40.232	/home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg	0
2024-06-03T16:20:22.404848	192.168.40.245	/home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg	0
2024-06-03T16:20:01.214066	192.168.40.246	/home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg	0
2024-06-03T16:19:40.000936	192.168.40.249	/home/ubuntu/rfsoc/model_slx/frb_bf16_4a/outputs/frb_bf16_4a_2023-11-23_1821.fpg	0
2024-06-03T16:20:59.140902	192.168.40.232	BFM/bf16-20240518-b0/fpga3.pos.npy
2024-06-03T16:20:38.209516	192.168.40.245	BFM/bf16-20240518-b0/fpga2.pos.npy
2024-06-03T16:20:17.203900	192.168.40.246	BFM/bf16-20240518-b0/fpga1.pos.npy
2024-06-03T16:19:56.122845	192.168.40.249	BFM/bf16-20240518-b0/fpga0.pos.npy

Input Channel Monitoring

Noise spectra

The script ''takeSpec16.sh'' finds all FPGAs linked to the host server and calls ''zcu216_accum_spec_16inp-save.py'' to save the 16-channel spectra from every FPGA to both npz and png files. The npz file (~270 KB) is saved to ''burstt-log/spec/'', whereas the png file (~240 KB) goes to ''burstt-log/figures/''.

...
# get list of FPGA IPs
#e.g.  1717502365  0a:4c:50:41:43:51  192.168.40.232  *  ff:50:41:43:51:00:01:00:01:28:1b:5e:4a:0a:4c:50:41:43:41
FPGAIPs=( $(awk '{print $3}' /var/lib/dnsmasq/dnsmasq.leases) )
...

The script is routinely called by cron to monitor the input channel status. For example, at burstt5:

# upload DAQ logs to monitor server
*/10 * * * *  . /data/wsh/script/uploadDAQLog.sh > /data/wsh/log-uploadDAQLog.txt 2>&1
# regular noise spectra
0 * * * * . /data/wsh/script/takeSpec16.sh > /data/wsh/log/log-spec16.txt 2>&1

which generates output:

(rfsoc) [ubuntu@burstt5 ~]$ ls ~/burstt-log/spec/* 
...
/home/ubuntu/burstt-log/spec/accum_spec_20240606_160010_fpga232.npz
/home/ubuntu/burstt-log/spec/accum_spec_20240606_160022_fpga249.npz
/home/ubuntu/burstt-log/spec/accum_spec_20240606_160033_fpga245.npz
/home/ubuntu/burstt-log/spec/accum_spec_20240606_160044_fpga246.npz

(rfsoc) [ubuntu@burstt5 ~]$ ls ~/burstt-log/figures/* 
...
/home/ubuntu/burstt-log/figures/accum_spec_20240606_160010_fpga232.png
/home/ubuntu/burstt-log/figures/accum_spec_20240606_160022_fpga249.png
/home/ubuntu/burstt-log/figures/accum_spec_20240606_160033_fpga245.png
/home/ubuntu/burstt-log/figures/accum_spec_20240606_160044_fpga246.png

The script ''plotSpec16.py'' can be used to read and plot the npz file.

Beamformed spectra

To check whether the BFMs are working properly, intensity data is taken regularly.

(Currently only available for rudp64)

Monitoring server

The logs are regularly uploaded to a central monitor server, which collects the log files from all stations.

  • current choice: burstt12 at Fushan

  • DAQ summary per station

Web-Based Monitor

The files on the monitoring server are regularly uploaded to the web server, currently hosted on a NAS at the NTU lab.

DAQ summary per station

replace the 3-letter station code

Website:

Web-Based Event Display

BURSTT Event Display

Functions:

  • Trigger statistics plot: DM vs time, indicating beam IDs, SNR, and whether the trigger was sent or not; an interactive plot made with plotly

  • All-Event table:

  • event searching, clicking, filtering, and sorting (in both ascending and descending order) now work consistently.

  • navigation buttons at the bottom right: for browsing the next and previous events in the table

  • "export to CSV" button to export the event table: for further analysis, e.g., selecting pulsar events sorted by SNRs

    • "Trigger Sent" checkbox: shows only events with triggers sent out to stations (all events are shown by default)
    • it also calls up the trigger info from all stations, e.g., the delay in receiving the trigger.
  • Show latest trigger event: generate plots

    • waterfall (spectrogram)
    • DM vs arrival time: call back from coarse-grained result of bonsai
    • trigger summary from all BURSTT stations
  • show recent triggers and types

  • show noise spectra?

    • take 16-bit spectra through embedded Linux of FPGA (zcu216_accum_spec16inp.py), regularly or immediately after trigger?

How is it implemented

The web server does the following

  • All event summary files are indexed on the server side with Node.js, instead of being read on the client side at every page reload.
    • The indexer parses all event summary JSON files incrementally and generates a single JSON file for the index table.
  • To improve the initialization performance of the event table with >100k events, the table is divided into pages, each displaying only 200 events. The page navigation bar is at the bottom right of the webpage.
  • The bonsai trigger time in the event summary JSON is used to look up trigger summary JSON files from all BURSTT stations; the results are collected and displayed in the table on the page.
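The incremental server-side indexing step can be illustrated as follows (a conceptual Python sketch of the idea, not the actual Node.js server code; the index layout and chosen columns are illustrative):

```python
import json
import os
import tempfile

def update_index(summary_dir, index_path):
    """Incrementally fold event summary JSON files into one index file,
    parsing only files not yet in the index."""
    index = {}
    if os.path.exists(index_path):
        with open(index_path) as f:
            index = json.load(f)
    for name in sorted(os.listdir(summary_dir)):
        if (name.endswith(".json")
                and name != os.path.basename(index_path)  # skip the index itself
                and name not in index):
            with open(os.path.join(summary_dir, name)) as f:
                ev = json.load(f)
            # Keep just the columns the event table needs.
            index[name] = {"EventID": ev["EventID"], "DM": ev["DM"],
                           "BeamID": ev["BeamID"], "MaxSNR": ev["MaxSNR"]}
    with open(index_path, "w") as f:
        json.dump(index, f)
    return index

# Demo with one synthetic event summary file.
d = tempfile.mkdtemp()
with open(os.path.join(d, "event-20250713_094930Z.json"), "w") as f:
    json.dump({"EventID": "20250713_094930Z", "DM": 150.93,
               "BeamID": 37, "MaxSNR": 13.72}, f)
idx = update_index(d, os.path.join(d, "index.json"))
print(sorted(idx))  # ['event-20250713_094930Z.json']
```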

Utility Scripts

Please see [https://github.com/IAA-BURSTT/document/wiki/Utility-Scripts-for-Operation Utility Scripts for Operation]
