250123 ‐ linux - cywongg/2025 GitHub Wiki
To achieve your desired goal, you can use a combination of Linux commands like `grep`, `awk`, `sed`, and `sort` to parse, extract, and format the data. Here's how you can do it step by step:

**Step 1: Extract unique connectors**

Extract unique log lines containing the text `failed to create connector to` and format them into `hostname/ip:port`.
```bash
grep "failed to create connector to" log_file | awk '{print $6}' | sort -u > connectors.csv
```
Explanation:
- `grep` extracts all lines containing `failed to create connector to`.
- `awk '{print $6}'` extracts the 6th field (e.g., `SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212`).
- `sort -u` removes duplicates and sorts the output.
- The output is written to `connectors.csv`.
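As a quick sanity check, here is a minimal sketch of this step run against a fabricated log line. The line layout (and therefore the `$6` field position) is an assumption; adjust the field number to wherever the connector appears in your real logs.

```shell
# Fabricated sample log: in this layout the connector string is the 6th
# whitespace-separated field, and each line contains the grep pattern.
printf '%s\n' \
  '2025-01-23 10:00:01 ERROR ConnMgr - SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212 failed to create connector to' \
  '2025-01-23 10:00:02 ERROR ConnMgr - SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212 failed to create connector to' \
  > sample.log

# sort -u collapses the two duplicate connector entries into one.
grep "failed to create connector to" sample.log | awk '{print $6}' | sort -u > connectors.csv
cat connectors.csv
# -> SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212
```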
**Step 2: Reformat connectors**

Transform the extracted lines into `{host}/{ip}:{port}` format.
```bash
awk -F'[/|:]' '{print $1 "/" $2 ":" $3}' connectors.csv > formatted_connectors.csv
```
Explanation:
- `-F'[/|:]'` sets `/`, `|`, and `:` as field delimiters.
- `{print $1 "/" $2 ":" $3}` reformats the fields into the desired `{host}/{ip}:{port}` format.
- The result is saved in `formatted_connectors.csv`.
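For instance, a minimal sketch of this step on one sample connector line; the trailing `|extra` token is an invented example of noise that the delimiters would strip:

```shell
# Splitting on '/', '|', or ':' leaves host, ip, and port in $1..$3;
# any extra '|'-delimited noise lands in $4 and beyond and is dropped.
printf 'SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212|extra\n' \
  | awk -F'[/|:]' '{print $1 "/" $2 ":" $3}'
# -> SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212
```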
**Step 3: Extract subscription sources**

Search for lines containing `Subscription source=` and extract the corresponding Subscription source, host, and port.
```bash
grep "Subscription source=" log_file | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv '
BEGIN {
    while (getline < CONN_FILE) {
        hosts[$1] = 1;
    }
}
{
    for (host in hosts) {
        if ($0 ~ host) {
            print $2 "," host;
        }
    }
}' > subscription_sources.csv
```
Explanation:
- `grep "Subscription source="` extracts relevant lines with `Subscription source=`.
- `awk -F'[=|]'` splits fields by `=` or `|`.
- `BEGIN { while (getline < CONN_FILE) { hosts[$1] = 1; } }` loads hosts from `formatted_connectors.csv`.
- `if ($0 ~ host)` checks if the log line contains any of the extracted hosts.
- `print $2 "," host` extracts the subscription source (field 2) and the matching host.
Check the `subscription_sources.csv` file for the final output. Since the host keys loaded in `BEGIN` are the full lines of `formatted_connectors.csv`, each output line pairs the subscription source with the whole connector entry, for example:

```
HKasd3L,SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212
```
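To see the matching logic in isolation, here is a minimal sketch against fabricated inputs; the `Subscription source=...|...` layout is an assumption about your log format. Note that `$0 ~ host` treats each host entry as a regular expression, so the dots in it match any character.

```shell
# Fabricated connector list and log line (both formats are assumptions).
printf '%s\n' 'SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212' > formatted_connectors.csv
printf '%s\n' \
  'Subscription source=HKasd3L|connector=SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212' \
  > sub_sample.log

# Splitting on '=' or '|' puts the subscription source in $2; each host
# loaded in BEGIN is then matched against the whole line.
grep "Subscription source=" sub_sample.log | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv '
BEGIN { while (getline < CONN_FILE) { hosts[$1] = 1; } }
{ for (host in hosts) { if ($0 ~ host) { print $2 "," host; } } }'
# -> HKasd3L,SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212
```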
You can combine all the above steps into a single bash script for automation:
```bash
#!/bin/bash
LOG_FILE="log_file"
CONNECTORS_FILE="connectors.csv"
FORMATTED_FILE="formatted_connectors.csv"
OUTPUT_FILE="subscription_sources.csv"

# Step 1: Extract unique connectors
grep "failed to create connector to" "$LOG_FILE" | awk '{print $6}' | sort -u > "$CONNECTORS_FILE"

# Step 2: Reformat connectors
awk -F'[/|:]' '{print $1 "/" $2 ":" $3}' "$CONNECTORS_FILE" > "$FORMATTED_FILE"

# Step 3: Extract subscription sources
grep "Subscription source=" "$LOG_FILE" | awk -F'[=|]' -v CONN_FILE="$FORMATTED_FILE" '
BEGIN {
    while (getline < CONN_FILE) {
        hosts[$1] = 1;
    }
}
{
    for (host in hosts) {
        if ($0 ~ host) {
            print $2 "," host;
        }
    }
}' > "$OUTPUT_FILE"

echo "Output saved to $OUTPUT_FILE"
```
Notes:
- Replace `log_file` with the actual path to your log file.
- The final output (`subscription_sources.csv`) will contain the subscription source and the corresponding host.
- Ensure your log file has consistent formatting to avoid parsing errors.
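As a sanity check, the whole script can be exercised end to end against a generated two-line log. Everything about the line layout below is a fabricated assumption, so adapt it to your real format before trusting the field numbers:

```shell
#!/bin/bash
# End-to-end dry run of the three steps on a fabricated sample log.
set -eu
LOG_FILE="demo.log"
printf '%s\n' \
  '2025-01-23 10:00:01 ERROR ConnMgr - SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212 failed to create connector to' \
  'Subscription source=HKasd3L|connector=SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212' \
  > "$LOG_FILE"

# Step 1: unique connectors (field 6 in the fabricated layout)
grep "failed to create connector to" "$LOG_FILE" | awk '{print $6}' | sort -u > connectors.csv
# Step 2: reformat to {host}/{ip}:{port}
awk -F'[/|:]' '{print $1 "/" $2 ":" $3}' connectors.csv > formatted_connectors.csv
# Step 3: pair each subscription source with its matching connector
grep "Subscription source=" "$LOG_FILE" | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv '
BEGIN { while (getline < CONN_FILE) { hosts[$1] = 1; } }
{ for (host in hosts) { if ($0 ~ host) { print $2 "," host; } } }' > subscription_sources.csv

cat subscription_sources.csv
# -> HKasd3L,SRVTPEGdshjUT101.hostnme.com/176.18.240.161:14212
```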
Let me know if you need further clarification or adjustments!
To download files from a Linux server to your local PC, you can use tools like `scp` (Secure Copy Protocol), `rsync`, or even `sftp`. Here's how you can do it:

**Using `scp`**

`scp` is a command-line utility that allows you to securely copy files between a local and remote system.
```bash
scp username@remote_server:/path/to/remote/file /path/to/local/destination
```
Download a file named `example.txt` from the server to your local desktop:

```bash
scp [email protected]:/home/user/example.txt ~/Desktop
```
To copy a directory and its contents:

```bash
scp -r username@remote_server:/path/to/remote/directory /path/to/local/destination
```
Notes:
- Replace `username` with your Linux server username.
- Replace `remote_server` with the server's IP address or domain name.
- Use the `-P` flag if the server uses a non-standard SSH port, e.g.:

  ```bash
  scp -P 2222 [email protected]:/path/to/file ~/Desktop
  ```
**Using `rsync`**

`rsync` is another powerful tool for transferring files and directories between systems. It is especially useful for syncing and resuming interrupted downloads.
```bash
rsync -avz username@remote_server:/path/to/remote/file /path/to/local/destination
```
Download a file:

```bash
rsync -avz [email protected]:/home/user/example.txt ~/Desktop
```

Download a directory and its contents:

```bash
rsync -avz [email protected]:/home/user/directory ~/Desktop
```
Notes:
- The `-a` flag preserves file permissions and timestamps.
- The `-v` flag enables verbose output.
- The `-z` flag compresses the data during transfer, making it faster.
**Using `sftp`**

`sftp` (SSH File Transfer Protocol) is an interactive command-line program for transferring files.

- Connect to the server:

  ```bash
  sftp username@remote_server
  ```

- Navigate to the directory containing the file:

  ```bash
  cd /path/to/remote/directory
  ```

- Use the `get` command to download the file:

  ```bash
  get example.txt /path/to/local/destination
  ```

- To download an entire directory, use:

  ```bash
  get -r remote_directory /path/to/local/destination
  ```

- Exit the `sftp` session:

  ```bash
  exit
  ```
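If you want the same transfer without an interactive session, `sftp` can also read commands from a batch file via its `-b` flag. A minimal sketch; the server name and paths are placeholders, so the actual `sftp` call is left commented out:

```shell
# Write the interactive commands from the steps above into a batch file.
cat > fetch.batch <<'EOF'
cd /path/to/remote/directory
get example.txt /path/to/local/destination
EOF
# Then run it non-interactively (key-based auth avoids a password prompt);
# uncomment once you have a real server:
# sftp -b fetch.batch username@remote_server
```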
**Using `wget`**

If the file is hosted on an HTTP or FTP server, you can use `wget` to download it.

```bash
wget http://remote_server/path/to/file
```

Example:

```bash
wget http://your.server.com/files/example.txt
```

To save into a specific directory, use the `-P` flag:

```bash
wget -P /path/to/local/destination http://your.server.com/files/example.txt
```
**Using `curl`**

Similar to `wget`, you can use `curl` to download files.

```bash
curl -O http://remote_server/path/to/file
```

Example (`-O` keeps the remote filename):

```bash
curl -O http://your.server.com/files/example.txt
```

To choose the local filename yourself, use `-o`:

```bash
curl -o /path/to/local/destination/example.txt http://your.server.com/files/example.txt
```
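The difference between `-O` and `-o` can be tried locally with a `file://` URL, assuming your `curl` build enables the file protocol (most default builds do):

```shell
printf 'hello\n' > /tmp/curl_demo_src.txt
# -o picks the local filename explicitly; -O would reuse the remote name.
curl -s -o /tmp/curl_demo_copy.txt file:///tmp/curl_demo_src.txt
cat /tmp/curl_demo_copy.txt
# -> hello
```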
- Ensure you have SSH access to the server and the necessary permissions to read the files you want to download.
- If you face connection issues, check your network and whether the server’s SSH or FTP ports are open.
- Use `scp` or `rsync` for secure file transfers over SSH.
Let me know if you need further clarification!
To run the `grep` and `awk` command with multiple lines in an interactive command line, you need to properly handle quotes and newlines. Here's how you can execute the command interactively:

**Option 1: Run it as a single line**

If you want to run the whole command on a single line, collapse the `awk` program onto one line and keep the single quotes intact:
```bash
grep "Subscription source=" log_file | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv 'BEGIN { while (getline < CONN_FILE) { hosts[$1] = 1; } } { for (host in hosts) { if ($0 ~ host) { print $2 "," host; } } }' > subscription_sources.csv
```
**Option 2: Paste the multiline command directly**

If you're working in an interactive shell and want to type in the command line by line, you can use the following approach:
```bash
grep "Subscription source=" log_file | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv '
BEGIN {
    while (getline < CONN_FILE) {
        hosts[$1] = 1;
    }
}
{
    for (host in hosts) {
        if ($0 ~ host) {
            print $2 "," host;
        }
    }
}' > subscription_sources.csv
```
Steps:
- Copy the command and paste it directly into the terminal.
- After pasting each line, press Enter.
- Ensure that the syntax matches exactly as shown. Don't worry about the continuation prompt; the shell will wait for the closing single quote `'` before executing the full command.
**Option 3: Save the command as a script**

If the multiline syntax is hard to use interactively, save the command to a temporary script file and execute it. Here's how:
- Create a script file:

  ```bash
  nano extract_sources.sh
  ```

- Paste the following lines into the file:

  ```bash
  #!/bin/bash
  grep "Subscription source=" log_file | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv '
  BEGIN {
      while (getline < CONN_FILE) {
          hosts[$1] = 1;
      }
  }
  {
      for (host in hosts) {
          if ($0 ~ host) {
              print $2 "," host;
          }
      }
  }' > subscription_sources.csv
  ```

- Save and exit (press `Ctrl+O`, then `Enter`, then `Ctrl+X`).
- Make the script executable:

  ```bash
  chmod +x extract_sources.sh
  ```

- Run the script:

  ```bash
  ./extract_sources.sh
  ```
**Option 4: Use a here document**

You can use a "here document" in the shell to handle the multiline `awk` program cleanly. Note that attaching the here document directly to `awk` would redirect its standard input, replacing the piped output from `grep` and leaving `awk` with no program to run. Instead, capture the here document into a shell variable first and pass it to `awk` as the program text:

```bash
AWK_PROG=$(cat <<'EOF'
BEGIN {
    while (getline < CONN_FILE) {
        hosts[$1] = 1;
    }
}
{
    for (host in hosts) {
        if ($0 ~ host) {
            print $2 "," host;
        }
    }
}
EOF
)

grep "Subscription source=" log_file | awk -F'[=|]' -v CONN_FILE=formatted_connectors.csv "$AWK_PROG" > subscription_sources.csv
```

Explanation:
- `<<'EOF'` starts a here document, allowing you to input multiline text until `EOF` is reached; quoting the delimiter keeps the shell from expanding `$0`, `$1`, and `$2` inside the `awk` program.
- `$(cat <<'EOF' ... EOF)` captures the program into `AWK_PROG`, so the pipe from `grep` stays connected to `awk`'s standard input.
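The effect of quoting the delimiter can be seen with a tiny sketch: an unquoted `EOF` lets the shell expand variables inside the here document, which would mangle `$0`, `$1`, and `$2` in an `awk` program.

```shell
NAME=world
# Unquoted delimiter: the shell expands $NAME inside the here document.
cat <<EOF
hello $NAME
EOF
# -> hello world

# Quoted delimiter: the body is passed through literally.
cat <<'EOF'
hello $NAME
EOF
# -> hello $NAME
```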
- For quick one-time execution, use Option 1 (single line).
- For better readability or frequent use, save the command as a script (Option 3).
- For direct multiline execution, use Option 2 or Option 4.
Let me know if you face any issues!