commands_scripting - RicoJia/notes GitHub Wiki
========================================================================
========================================================================
- Intro
    - Text editors are not word processors like Word, because Word introduces additional formatting, e.g., straight quotes `""` turned into curly quotes.
    - Motivation: if there's a list of tasks that you need to do in the future but rarely do, put them in a script, like Matt's dockint.
    - Shell scripting is fast to develop, despite running a bit slower than a C++ project.
- Some commands are not shell built-ins, like `uptime` (they might be shell scripts or programs stored in `/usr/bin`, etc.).
- Run a script:
    ```bash
    # 1. Create a separate (child) shell, so shell variables live and die in that shell
    ./PATH_TO_SCRIPT
    # 2. "source" or "dot space script" runs it in the current shell
    . PATH_TO_SCRIPT
    source PATH_TO_SCRIPT
    ```
- `read -p "PROMPT" VARNAME1 VARNAME2` # read variables from stdin; `-p` prints a prompt first
- Read values into variables:
    ```bash
    echo "1 2" | while read VAR_A VAR_B
    do
        ...
    done
    ```
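The difference between running and sourcing can be seen with a throwaway script (a sketch; the script path and variable name here are made up, and the temp file comes from `mktemp`):

```bash
# A script run with "./" or "bash" executes in a child shell, so its
# variables vanish when it exits; sourcing with ". " runs it in the
# current shell, so the variables survive.
tmp_script=$(mktemp)
echo 'MY_DEMO_VAR="set_by_script"' > "$tmp_script"

bash "$tmp_script"                    # child shell
after_run="${MY_DEMO_VAR:-unset}"     # still unset here

. "$tmp_script"                       # current shell
after_source="${MY_DEMO_VAR:-unset}"  # now visible

rm -f "$tmp_script"
```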
- `whereis`: show the binary file of a command
- `less`: see what's in a file without a text editor
- `whoami`: prints the current user name
- `groups`: see all groups of a user on your computer
- `id`: UID, current group, other groups
- `su USER_NAME`: switch to a different user. (How to create a different user on Linux?)
- `echo THINGS_YOU_WANT_TO_WRITE >> file`
- `eval $STRING`: run a string as a command
- `ls -halt`
- More common:
    - `ls o*`: all files starting with o
    - `ls [op]*`: all files starting with o or p
- Create a soft link: `sudo ln -s EXISTING_FILE` (with one argument, the link gets the same basename in the current directory)
- `hostname`: name of the machine, as in `username@hostname`
- `uptime`
- `date +%F` to get the current date (`+%N` for nanoseconds)
- `basename`: strips the directory portion of a path, leaving the last component; `dirname`: strips the last component, leaving the directory portion
- `echo "$(dirname $(realpath ${0}))"`; `$()` is also called command substitution
- `$#` expands to the number of arguments
- `$(pwd)`: `pwd` is a command
- Networking
    - `netstat`: shows ports, like TCP and UDP ports. Use `netstat -nutl | grep -Ev "FIRST LINE"` to drop the header line.
- Reloading a bash file: `. ~/.bashrc`
    - If you have `.bash_aliases`, it needs to be sourced explicitly
- zip & unzip
    - tar.gz: `tar -xf archive.tar.gz` (tar means "tape archive")
        - Natively, a tar archive is not compressed.
        - Also, you may need sudo.
        - tar removes the absolute path and keeps only the relative parts within the archive, so you can extract files without overwriting the originals.
        - Old syntax: `tar xvzf ...` instead of `tar -xvzf`, but this old style is rigid; with hyphens you can reorder flags, e.g. `tar -f FILE_NAME -c`
        - Flags:
            - `-f` means file name (so it's a must)
            - `-c` is to create an archive
            - `-t` means to list, `-v` means verbose
            - `-x` means extract an archive. BE CAREFUL, THIS OVERWRITES FILES WITHOUT ASKING FOR PERMISSION
            - `-z` is to work with gz, including compressing or uncompressing files in tar
            - `-P` allows absolute paths in file names
        - gz means gzip-compressed: you can do `gzip TAR_FILE`; `gunzip` uncompresses it. Some people name the result `.tgz`.
    - xz: compress, decompress. `xz -z file` makes `file.xz` in place; `-d` is decompressing.
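A safe, self-contained sketch of the create/extract round trip (all paths are temporary, so it can be run anywhere):

```bash
# c=create, z=gzip, f=archive file; -C changes directory first so the
# archive stores relative paths.
workdir=$(mktemp -d)
mkdir -p "$workdir/src"
echo "hello" > "$workdir/src/a.txt"

tar -czf "$workdir/archive.tar.gz" -C "$workdir" src

# x=extract; BE CAREFUL: this overwrites existing files silently.
mkdir "$workdir/out"
tar -xzf "$workdir/archive.tar.gz" -C "$workdir/out"

extracted=$(cat "$workdir/out/src/a.txt")
rm -rf "$workdir"
```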
- `tree` command: generates a tree!
    - `sudo apt install tree`
    - `tree -a directory`
    - `tree -I "*.pyc"` # ignore pyc files
- rsync: copy files to an ssh computer's folders; you only have access to the folders your remote user can reach, not other users' folders.
- `ssh-agent bash` and `ssh-add` are specific to a shell; do this in the current shell
- `du FILE`: estimate a file's disk usage
- `curl`: `-s` means silent, hides the progress bar, etc.
- `sudo dmidecode -t system | grep Serial`: serial number
- Event Designator: a reference to a past command in history
    - `!` starts a history substitution
    - `!v` is the most recent command starting with a v
    - `!!` is the last executed command
    - `!$`: copies the last argument of the last command and puts it here. E.g.:
        ```bash
        cat FILE
        ls -l !$   # !$ is FILE here
        ```
- `cat` (displays a file on stdout, but can be used to redirect files too)
    - `cat file1 >> file2` (redirection) will append to file2
    - `cat file1 > file2` (redirection) will overwrite file2
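A quick tested sketch of the overwrite-vs-append difference (temp file only):

```bash
# > truncates the target first; >> appends to it.
f=$(mktemp)
echo "line1" > "$f"     # overwrite: file now has 1 line
echo "line2" >> "$f"    # append: file now has 2 lines
num_lines=$(( $(wc -l < "$f") ))
first_line=$(head -n1 "$f")
rm -f "$f"
```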
- It's possible to run sudo in one command:
    ```bash
    IP=$(sudo arp-scan -l | grep Raspberry | cut -f 1)
    ssh -YC "pi@${IP}"
    ```
- xclip: copy to the clipboard (for ctrl-v): `xclip -sel clip < FILE`
========================================================================
========================================================================
- Example
    - `chmod +x file.sh`: add execute permission for everyone: owner, group users, all other users. Without this, you will get a permission error.
    - Example:
        ```bash
        #!/bin/bash
        # 1. "#" is sharp in music, "!" is "bang". **The shebang determines which
        #    program runs this file. Important because other ppl may not have the
        #    same login shell!**
        # 2. In other cases, # adds high-level comments.
        echo "something"   # echo is a **"shell builtin"**; to see if a command is
                           # a shell built-in, do: type echo
        COUNTRY="Columbia"
        echo "I want to live in ${COUNTRY}"
        # 1. A variable is just a key-value pair, with a name and memory allocated to it.
        # 2. **No space around =**
        # 3. When referencing a variable, use ${}, or just $ if nothing comes
        #    before/after the variable name.
        # 4. **The old style is to use backticks `...` instead of $()**
        ```
- Naming rules for bash variables
    - only letters, numbers, `_`
    - do not start with numbers
    - ppl like UPPERCASE, since case matters here
- Read a manpage:
    - `...` means you can have multiple of these
    - `[]` is optional; no `[]` means not optional
    - Always read the help or man page and see if there's any dependency on configuration files. Some of these settings might be different.
    - `[+FORMAT]` means you should include the `"+"`
    - `exit` returns the return status of the last command
- `help`
    - Every shell built-in has help, like `help COMMAND`; non-built-in commands may not have help
    - Check if a command is a shell built-in using `type -a COMMAND`
    - For non-built-ins you need `man COMMAND`
- Search for a command
    - The most common one is `type -a`
    - If `type -a` cannot find a command, do `which COMMAND`; this searches `$PATH`. People always do `export PATH=${PATH}:new_path`
    - If `$PATH` does not have the command, do `locate COMMAND`
        - locate searches an index generated by updatedb, which runs periodically every day
        - locate doesn't necessarily give you the most updated info, but it's fast
        - locate respects sudo privileges, so you need sudo to see the hidden ones
    - You might need `find`, which searches the sub-directories under a given path in real time; slower:
        - `find PATH -name userdel`
- Foreground, background:
    - Motivation: you can run processes in parallel
    - `prog &`: puts the program in the background (e.g., uncompress something large). A background job will still print to the screen, but it is a child process.
        - Other child processes: things enclosed in `()`, redirection using pipes
        - A child process cannot communicate through the parent's I/O, so communication can be achieved by reading/writing different files
    - Check jobs with `jobs` (or `ps`, which sees all processes), or `top` (you can sort by CPU using top!)
    - `fg %JOB_ID` foregrounds a job; `bg %JOB_ID` resumes it in the background
- `\` is to continue a command on the next line
- `" "` vs `' '`: `" "` expands variables; `' '` prints exactly what's inside, and everything inside is treated as one word
    - bash will just match the nearest `' '` and `" "`
    - `' '` keeps whole strings intact, whereas `" "` lets bash evaluate them first. So if you do `"{print $2}"`, bash will evaluate `$2` before the string gets passed into awk!
- `{}` vs `()` are keywords
    - `{}` makes a code block run in the current shell (the shell you're using), while `()` runs it in a sub-shell
- Pipes
    - Programs after pipes run in their own sub-shell, so changes to variables are local to that sub-shell. E.g., in `echo "1 2" | read A B; echo "${A} ${B}"`, the pipe ends at `;`, so A and B do not get returned to the parent shell.
    - In `echo "1 2" | { read A B; echo "${A} ${B}"; }`, the `{}` groups the commands so read and echo run in the same shell segment, and A and B are visible inside the braces
- `` echo `ls` ``: execute the command inside the backticks
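To see the sub-shell behavior concretely (bash semantics; zsh behaves differently here):

```bash
# In bash, the "read" after a pipe runs in a sub-shell, so A is lost:
echo "1 2" | read A B
lost="${A:-empty}"

# Grouping with { } keeps read and echo in the SAME sub-shell, so the
# variables are visible to the echo inside the braces:
joined=$(echo "1 2" | { read A B; echo "${A}-${B}"; })
```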
========================================================================
========================================================================
- stdin and stdout: the keyboard and pipelines are stdin; standard output is the screen. If there's an error, the error comes out on stderr.
    - echo gives an output, so it can be used as stdin through a pipeline, and preserves new lines, etc.: `echo "machine pypi login password"`
    - If you want to pass arguments to an interactive command in a non-interactive way, use a pipeline, and echo the answers as many times as required by the prompts
- Keyboard and screen are all files. Each file has a file descriptor that tells the way to interact with the file:
    - FD0: stdin
    - FD1: stdout
    - FD2: stderr
    - pipe: from one program to another. A program that reads from stdin can take a pipe as input, and if it sends output to stdout, it can send into a pipe.
- Positional arguments
    - `${0}` is the path to the script (first thing on the command line)
    - `${1}` is the first argument
    - `${#}` is the number of arguments
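A tested sketch of the positional parameters; `set --` replaces the current positional parameters, which lets us demo `${1}`, `${#}`, and `shift` without a separate script file:

```bash
# Pretend the script was called with: apple banana cherry
set -- apple banana cherry
num_args=${#}       # number of arguments
first=${1}          # first argument
shift               # drop $1; everything moves down by one
new_first=${1}
```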
- `$PATH`: lists directories to search for commands. Not the same on every computer.
    - The way to check which file is executed for a command is `which COMMAND`; this is the first file found on `$PATH`
    - You can add a custom command by adding it to /usr/local/bin. This takes precedence over /usr/bin.
    - Check all matching files for a command: `which -a` or `type -a`; they will check `$PATH`
    - If you delete the file, bash might still have it in a hashed location, which means you need to manually forget all hashed locations (`hash -r`)
- `"A B C"`: pass in multiple words as one input
    - `$@` vs `$*`: with double quotation marks, the first gives you a list of inputs (starting from the first argument), the latter gives you all inputs as one input. Without `""` they are the same.
- shift: shifts all positional arguments down by 1, and takes out the first positional argument. `shift x` shifts by x arguments.
- Redirection works with stdin, stdout, and files:
    - `>` works with all commands and overwrites the file, like `echo SOMETHING > FILE`. A pipe redirects output from & to a command, not to a file.
    - `>>` appends to the file
    - `read LINE < FILE` # a new line will stop it, so it reads only one line
    - `2>&1`: makes stderr go to wherever stdout points (see file descriptors)
        - `2>` means redirecting stderr; `&` means using a file descriptor instead of a file name. `2>&1` sends FD2 to FD1.
    - `COMMAND_YIELDS_ERROR | cat` # this will not pipe the stderr message to cat, so you need to convert it into stdout: `COMMAND_YIELDS_ERROR 2>&1 | cat` or `COMMAND_YIELDS_ERROR |& cat`. **cat does not take in stderr, so `&>` or `|&` redirects both.**
    - Splitting stdout and stderr into separate files: `COMMAND 2>>FILE1 1>>FILE2`
    - If you're bugged by errors like "Permission denied", use `2>/dev/null` to make them disappear
    - `echo "SOMETHING" >/dev/stderr` prints something as stderr, or `echo "SOMETHING" >&2`
    - Redirect to the "black hole": /dev/null. Reading from it returns nothing (end-of-file).
    - `>>` doesn't work with sudo:
        ```bash
        sudo echo "SOMETHING" >> /etc/hosts        # sudo applies to echo, not to /etc/hosts
        echo "SOMETHING" | sudo tee -a /etc/hosts  # use tee instead
        ```
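A tested sketch of splitting and merging the two output streams (temp files only):

```bash
# Send stdout and stderr to different files.
out_f=$(mktemp); err_f=$(mktemp)
{ echo "to stdout"; echo "to stderr" >&2; } 1>"$out_f" 2>"$err_f"
out_line=$(cat "$out_f"); err_line=$(cat "$err_f")
rm -f "$out_f" "$err_f"

# 2>&1 makes FD2 go wherever FD1 currently points, so both lines survive a pipe:
merged_lines=$(( $( { echo "to stdout"; echo "to stderr" >&2; } 2>&1 | wc -l ) ))
```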
- getopts
    - Example in dream byobu:
        ```bash
        # 1. getopts is used more often than getopt. 'vl:s' is the option string;
        #    "l:" means -l expects an argument. OPTION stores the current option,
        #    and the built-in OPTARG stores its argument.
        # 2. Use a while loop to parse all options. If an option is not in the
        #    option string, OPTION is set to "?".
        while getopts 'vl:s' OPTION
        do
            case ${OPTION} in
                v) echo "VERBOSE" ;;
                l)
                    PASSWORD_LEN=${OPTARG}
                    echo "now length is ${PASSWORD_LEN}"
                    ;;
                s) echo "SPECIAL CHAR" ;;
                ?)
                    echo "Please supply valid options" >&2   # send this to stderr
                    exit 1
                    ;;
            esac
        done
        ```
    - `${OPTIND}`: index of the next argument to be read, starting at 1
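A runnable, tested sketch with hypothetical option names (`-v` as a flag, `-l` taking an argument; the function and variable names are made up):

```bash
# 'vl:' means -v is a bare flag and -l expects an argument (found in OPTARG).
parse_opts() {
    PARSED_VERBOSE="false"; PARSED_LEN="8"   # hypothetical defaults
    local OPTION OPTIND=1                    # reset OPTIND so repeated calls work
    while getopts 'vl:' OPTION; do
        case ${OPTION} in
            v) PARSED_VERBOSE="true" ;;
            l) PARSED_LEN=${OPTARG} ;;
            ?) echo "invalid option" >&2; return 1 ;;
        esac
    done
}
parse_opts -v -l 16
```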
- `df`: disk free. Check disk space usage.
- Here document
    - exists in many shell scripting languages
    - a section of code, treated like text
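A minimal here-document sketch: the lines between `<<EOF` and `EOF` are fed to the command's stdin, as if they came from a file:

```bash
# Capture a two-line here-document through cat.
doc=$(cat <<EOF
hello
world
EOF
)
word_count=$(( $(echo "$doc" | wc -w) ))
```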
========================================================================
========================================================================
- Floating point: bash arithmetic is integer-only, so delegate to another tool, e.g. `python -c "print(4.0/5.0)"` or bc:
    ```bash
    function float_eval(){
        local FLOAT="$(echo "scale=2; $*" | bc -q 2>/dev/null)"
        echo ${FLOAT}
    }
    res=$(float_eval $NUM1)
    echo "the result is ${res}"
    ```
    - `++` only works with integers, not with floating point
    - In some other languages, you may use `let` or `expr`
- grep and vim can use regex. When you search in vim, DO NOT enter a space if there's no spacing. When using grep, do `grep -E`, so we're making sure to use extended regex.
    - `grep 'asdf dsfa'`: if you have a space, there must be `' '`
    - `grep '134 asd'` will still give you a match like '134 asd asdfas'
    - If you want to find an exact (whole-line) match, use `grep '^134 asd$'`
    - `grep -v 'SOMETHING YOU DO NOT WANT TO SEE'` # -v is to print everything but the match
    - The line is the basic unit here: lines matching the pattern will be displayed
    - `grep KEYWORD file`: searches KEYWORD in file; `grep -A NUM_LINES_TO_SHOW something_to_search file_name`
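A tested sketch of the containing-match vs whole-line-match distinction (`-c` counts matching lines):

```bash
# Without anchors, grep matches any line CONTAINING the pattern;
# ^...$ restricts it to whole-line matches.
f=$(mktemp)
printf '134 asd\n134 asd asdfas\n' > "$f"
loose=$(grep -c '134 asd' "$f")      # both lines contain the pattern
exact=$(grep -c '^134 asd$' "$f")    # only the first line matches exactly
rm -f "$f"
```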
- Regex commands
    - `^a` means the line begins with a
    - `a{3}` means 3 consecutive a's
    - `t$` means the line ends with t
    - Extended regex: `grep -E 'SOMETHING|Or_Anotherthing'`
    - `grep -A3 some_word`: 3 lines after the matching line of grep
    - `grep -i`: case insensitive. By default grep is case sensitive.
    - For docs, pipe them into less, a.k.a. a pager: can do searches, etc.
========================================================================
========================================================================
- heredoc is a multi-line construct; it's treated as a separate file. See the dream byobu example.
- Stream: text from one process to another through a pipe; one file to another through a redirect; one device to another.
    - Standard input is the standard input stream
    - sed is used to edit streams, such as removing lines or making substitutions, like macros in Vim. However, it does not require someone to start vim!
- Basic uses: `/` is the delimiter. Between the delimiters is regex.
    ```bash
    # Replace: s is the substitute command
    sed 's/SEARCH/REPLACE/g'     # flag g means global, all occurrences
    sed 's/SEARCH/REPLACE/2'     # replace the 2nd occurrence
    sed 's/SEARCH/REPLACE/ig'    # i means case insensitive, g means global
    sed 's/SEARCH/REPLACE/igw NEW_FILE_NAME'  # w writes only the modified lines to a new file
    sed -i.bak 's/SEARCH/REPLACE/ig' FILE
    # In-place editing, since sed itself does not modify the original file.
    # Also supply a suffix to make a backup copy: .bak is the suffix, no spacing here.

    # Delete
    sed '/SEARCH/d' FILE
    # You can also use this to remove comments and empty lines:
    sed '/^#/d' FILE    # remove comments
    sed '/^$/d' FILE    # remove empty lines

    # Execute multiple sed commands at a time
    sed '/^#/d ; /^$/d' FILE    # or use -e for each command
    # Or use a script file with all the commands:
    echo "/^#/d" >> commands.sed
    echo "/^$/d" >> commands.sed
    sed -f commands.sed FILE

    # Address range
    sed 'START,END s/dwight/rico/' FILE
    # START, END can be a line number, or a regex that the start line contains
    ```
    - Escape chars: if you want to replace `/home/etc` with `/lol/looll`, since `/` is already the delimiter, you need to escape `/`. There are 2 options:
        - Use `\` to escape: `echo "/home/etc" | sed 's/\/home\/etc/\/lol\/loool/'`
        - Use a different delimiter, right after s (`/` is just the default delimiter): `echo "/home/etc" | sed 's#/home/etc#/lol/loool#'`
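Both escaping styles, tested (the `/lol/loool` replacement string is just an example value):

```bash
# Replacing a path that contains the default / delimiter:
with_escapes=$(echo "/home/etc" | sed 's/\/home\/etc/\/lol\/loool/')
# ...or pick another delimiter right after s, so no escaping is needed:
with_hash=$(echo "/home/etc" | sed 's#/home/etc#/lol/loool#')
```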
- Replace using `tr`: `tr -s 'CHAR' \\012`
    - `\\012` (octal 012) means `'\n'`
- If a string contains a substring: `if [[ "$STR" == *"something"* ]]`
- Quoting: `' '` is the strong quote, where everything is treated literally, whereas `" "` is the weak quote; some chars are treated in special ways
    - `$` is variable expansion. So `$STR` expands STR without putting that into a string.
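A tested sketch of both tricks (the sample strings are made up; `[[ ]]` is bash-only):

```bash
# tr translates characters; -s squeezes repeats first, so each run of
# spaces becomes a single newline.
num_lines=$(( $(echo "a  b c" | tr -s ' ' '\n' | wc -l) ))

# Substring membership with [[ ... == *pattern* ]]:
STR="abc_something_def"
has_sub="no"
if [[ "$STR" == *"something"* ]]; then has_sub="yes"; fi
```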
- Basic concept: cut and awk both separate a file line by line, according to delimiters: "one,two,three" becomes field_1 field_2 field_3 ...
    - cut: gets a range of chars of each line
        ```bash
        printf "12345\n67890" | cut -c 2      # prints 2; 7 on two lines
        printf "12345\n67890" | cut -c 1,2    # prints 12; 67 on two lines
        printf "12345\n67890" | cut -c 2-4    # prints 234; 789 on two lines
        printf "12345\n67890" | cut -c 2-     # prints 2345; 7890 on two lines
        cut -c 1 FILE_NAME
        ```
        - echo emits a trailing newline, `printf` doesn't; echo doesn't allow formatting, `printf` can do formatting like `%d`
        - UTF-8 chars take up multiple bytes
        - Delimiters (`\` is by default a line continuation char):
            ```bash
            echo -e "1\t2\t3" | cut -f 2        # prints 2; \t is tab with echo -e
            echo -e "1,2,3" | cut -d ',' -f 2   # delimiter is ','; it can work with csv files as well
            cut -d ',' -f 1 --output-delimiter='' test.csv
            cut -d ',' -f 1- --output-delimiter=' ' test.csv
            ```
        - csv is separated by commas:
            ```bash
            echo "1,2,3" >> test.csv
            echo "4,5,6" >> test.csv
            cut -d ',' -f 1,2 test.csv     # prints 1,2; 4,5 - note the original delimiter is kept
            cut -d ',' -f 1,2 --output-delimiter='*' test.csv   # prints 1*2; 4*5
            ```
        - One drawback is that you can only have a single-char delimiter, which leads you to awk
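The character-range and field modes, tested (`--output-delimiter` is GNU cut; BSD cut lacks it):

```bash
# Character positions vs delimiter-separated fields:
chars=$(printf "12345" | cut -c 2-4)       # characters 2..4
field=$(echo "1,2,3" | cut -d ',' -f 2)    # 2nd comma-separated field
two=$(echo "1,2,3" | cut -d ',' -f 1,2 --output-delimiter='*')
```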
- awk: cut cannot handle delimiters with multiple chars
    - Basic usage:
        - `awk '{print $1}'` prints the first "column" of each line; the natural delimiter is whitespace
            ```bash
            echo "1,2,3" | awk -F ',' '{print $2" ,"$3}'   # prints: 2 ,3
            ```
            - `','` is the delimiter, `{}` is the action block, `" ,"` is what to print instead of the input delimiter
            - Natively, cut keeps the delimiters, but awk prints spacing
        - `echo "1,2,3" | awk -F ',' '{print $NF}'` # NF gives the last column
        - `echo "1,2,3" | awk -F ',' '{print $(NF-1)}'` # you can do simple math here as well
        - awk is really good with irregular whitespace; it treats any run of spaces as one separator:
            ```bash
            # input:
            # 1  2   3
            # 1 2  3
            awk '{print $1, $2}'   # prints "1 2" for both lines
            ```
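The field-selection idioms, tested (`NF` is the number of fields on the current line):

```bash
# -F sets the input delimiter; $NF is the last field, $(NF-1) the one before it.
second=$(echo "1,2,3" | awk -F ',' '{print $2}')
last=$(echo "1,2,3" | awk -F ',' '{print $NF}')
second_last=$(echo "1,2,3" | awk -F ',' '{print $(NF-1)}')
```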
- Sort
    - `-n` for numbers; by default, it sorts by the first field before a space
    - `-r` for reverse
    - `-h` for sorting human readable inputs, i.e., reads K, M, G ...
        ```bash
        du -h DIRECTORY | sort -h   # if you use -n it's not gonna work
        ```
    - `-u` unique entries, no repetition
    - `-k` and `-t`:
        ```bash
        echo -e "4,3,2,1\n9,8,7,6" | sort -t ',' -k 1   # sort knows the delimiter is ',' and we sort by the 1st field
        ```
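A tested sketch of `-t`/`-k` with numeric sorting (the sample rows are arbitrary):

```bash
# -t sets the field delimiter, -k picks the sort key, -n sorts numerically.
top=$(printf '9,8,7,6\n4,3,2,1\n' | sort -t ',' -k 1 -n | head -n1)
```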
- Uniq: how many input items are unique?
    - uniq alone needs input to be sorted (it only collapses adjacent duplicates)
    - `-c`: count how many times each unique line occurs
- wc: word count
- `du -ch *08-Mar*` is to find the total size of certain files
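Why uniq needs sorted input, tested (the sample lines are arbitrary):

```bash
# uniq only collapses ADJACENT duplicates, so sort first; -c prefixes counts.
unique_count=$(( $(printf 'b\na\nb\n' | sort | uniq | wc -l) ))
b_count=$(( $(printf 'b\na\nb\n' | sort | uniq -c | grep 'b' | awk '{print $1}') ))
```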
========================================================================
========================================================================
- case
    ```bash
    case ${1} in
        start)            # this is the pattern "start"; we try to find **an exact match** here,
                          # so for partial matches you want *.
            echo "hehe"
            ;;            # ;; is to end a case branch
        *)                # all other cases
            echo "HEE"
            ;;
    esac
    ```
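A tested sketch wrapped in a function (the function name and outputs are made up):

```bash
# Patterns must match the whole word; the first matching branch wins,
# and * is the catch-all.
classify() {
    case ${1} in
        start) echo "starting" ;;
        st*)   echo "prefix match" ;;   # never reached for "start": "start)" matched first
        *)     echo "unknown" ;;
    esac
}
res_start=$(classify start)
res_other=$(classify other)
```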
- functions
    - Basics:
        ```bash
        function log(){    # the keyword "function" is optional
            if [[ "${GLOBAL_VERBOSE}" ]]   # note: any non-empty string, even 'false', is truthy here
            then
                echo "${GLOBAL_VERBOSE}"
                echo "${@}"
            else
                local DUMMY_VAR="${1}"
            fi
        }
        readonly GLOBAL_VERBOSE='false'   # many ppl do it this way
        log 'heh'
        ```
    - You can create a function on the command line, and check it with `type -a FUNC`
    - Variables don't have to be local, and a lot of ppl leave them global. Global variables might cause issues when functions try to modify them.
    - Return: `echo` to actually "return something"; `return` only works with int exit codes
    - One-liner: `if ps aux | grep some_proces[s] > /tmp/test.txt; then echo 1; else echo 0; fi`
        - Each clause must be terminated by a newline or semi-colon
        - `then`, `else` do not need `;`
    - `elif` also needs `[[ ]]`
    - Inside a function we can't read args from the command line directly, because `${1}`, `${2}` are reserved for function args:
        ```bash
        function print_param_value(){
            value1="${1}"   # $1 is the function's first argument
            value2="${2}"   # $2 is the function's second argument
        }
        print_param_value "6" "4"     # space-separated values
        # You can also forward the script's parameters when calling:
        print_param_value "$1" "$2"   # the script's $1 and $2
        ```
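The echo-vs-return distinction, tested (the function name is made up):

```bash
# A function "returns" data via stdout (captured with $()); the numeric
# return statement only sets the exit status seen in $?.
add_one() {
    echo $(( ${1} + 1 ))   # the "return value" goes to stdout
    return 0               # the exit status (0-255)
}
result=$(add_one 41)
status=$?
```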
- Conditional, if. See code:
    - Basics
        - Spacing matters; `[[ ]]` plays the role of `test`; a new line is equivalent to `;`
        - `""` keeps one string with spaces intact
        - `-eq`, `-lt` for numbers; `=` for comparing strings; `==` is bash only
        - `$?` is a special parameter that returns the exit status of the most recent command. It may return values > 1.
    - These are in `help test`:
        - `if [[ -f FILENAME ]]`: `-f` checks if a file exists (regular type)
        - `if [[ -e "/home/ricojia" ]]`: `-e` means if a file exists, regardless of type (socket, node, directory, etc.)
        - `-d` means if a directory exists
    - `if [[ A ]] || [[ B ]]` is or; `&&` is and
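The file tests above, in a tested sketch (temp paths only; `[[ ]]` is bash-only):

```bash
# -d / -f / -e file tests inside [[ ]]; spacing around the brackets matters.
tmpdir=$(mktemp -d)
dir_check="no"; file_check="no"
if [[ -d "$tmpdir" ]]; then dir_check="yes"; fi
touch "$tmpdir/f"
if [[ -f "$tmpdir/f" && -e "$tmpdir/f" ]]; then file_check="yes"; fi
rm -rf "$tmpdir"
```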
- for loop
    ```bash
    for VARIABLE in 1 2 3    # or: for VARIABLE in {a..b}
    do
        COMMAND
    done
    ```
    - for loop in one line: `for i in 1; do echo "$i"; done` - do stuff with all names: no need for `ls`
    - If you can just use `mkdir, echo, rm`, you can use `xargs`, no for loop needed:
        ```bash
        ls | grep pkg | xargs -i echo "{}/production"
        ls | xargs rm -rf   # if rm -rf complains that the input arg list is too long (like 10000)
        ```
- while loop
    ```bash
    while CONTROL_COMMAND   # CONTROL_COMMAND can be anything that returns true/false
    do
        do_something
    done
    ```
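A tested while-loop sketch (the counter logic is arbitrary):

```bash
# The control command is anything whose exit status is 0 (true);
# the body runs until it becomes non-zero.
i=0; total=0
while [[ $i -lt 3 ]]; do
    total=$(( total + i ))
    i=$(( i + 1 ))
done
```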
========================================================================
========================================================================
- Random number in [0, 32767]: `${RANDOM}` (15 bits)
- You can also use the *sum commands. These sums represent the hex digest of a huge chunk of data, so the integrity of a download can be checked: sha1sum, sha256sum, etc. Usage: `sha1sum file`. The man page says it can read a file or stdin, so it's compatible with pipes. Most commands are like this too!
    - You can "chain" a pipeline: `$(date +%s | sha1sum | sha256sum)`
    - Say you have defined S="asdfaqwer":
        - `echo "$S" | fold -w1 | shuf | head -c1`: fold wraps the input to one character per line, shuf shuffles the lines, head -c1 outputs the first character
- System log: logs hardware events. You can add to it by `logger -t tag_name "msg"`.
- Read every line of a file:
    ```bash
    for F in $(cat FILE_PATH); do
        something
    done
    ```
- Go through all file names in a directory:
    ```bash
    for i in $(ls $(pwd)/boost); do
        something
    done
    ```
- Add things to an array and read the array:
    ```bash
    arr=("a" "b")
    arr+=("x")
    for t in "${arr[@]}"; do   # [@] means all array elements
        echo "$t"              # t is an element of the array
    done
    for i in "${!arr[@]}"; do  # ! gives the indices
        echo "$i"              # i is an index of the array
    done
    ```
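The array operations above, in a tested sketch that concatenates what each loop sees:

```bash
# [@] expands to all elements, ${!arr[@]} to all indices; += appends.
arr=("a" "b")
arr+=("x")
joined=""
for t in "${arr[@]}"; do joined="${joined}${t}"; done
indices=""
for i in "${!arr[@]}"; do indices="${indices}${i}"; done
```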
- Export lines matching a certain pattern to a file: `grep -o "PATTERN" FILE_NAME > another_file`
- bash:
    ```bash
    for i in {10..45}
    do
        # keep bytes 50-70 of the decode_time lines after each match, take the
        # top 500, strip the last char, then average consecutive differences
        grep -A 1 127.0.0.$i:554 wh_batches.log | grep decode_time | cut -b 50-70 | head -500 | sed 's/.$//' | perl -e '$last = 0; $s = 0; while (<>) { if ($last>0) {$s += $_ - $last;} $last = $_ } print $s/500'
    done
    ```
- `echo "lolsdf" | sed 's/.$/9/'`: in `sed 's/REGEX/string/'`, `.` is any char but newline, `$` means at the end of the line
========================================================================
========================================================================
- Pip
    - `pip show PACKAGE` (e.g. `pip show pydantic`) gives you version, required-by, metadata, etc.
========================================================================
========================================================================
- Avoid using pwd, since pwd might be different as you navigate. Use `$(dirname $(realpath ${0}))` as much as possible; realpath is used here to get the absolute path.
- Map out all the directories; avoid using relative paths.
- Avoid hard-coding the version numbers: major, minor, patch.
========================================================================
========================================================================
- Condition: `if [ -n "$NULL_STRING" ]` returns true if the string is NOT null; `if [ -z "$NULL_STRING" ]` returns true if the string is null
    - `[[ -d FILE_PATH ]]`: true if the directory exists
- What the condition truly is: `if return_value`. So `if groups $USER | grep -q docker; then ...` is totally valid. `grep -q docker` returns 0 (success) if the keyword is found; `-q` means quiet.
- `tail -n1` shows the last line (`head -n1` shows the first)
- `set -euo pipefail`, see ref
    - This is set because by default, a bash script doesn't halt at errors. With it, any error stops the script, and it fails loudly.