Linux General Commands

Survival kit :)

apt install -y less procps vim net-tools wget

Timezone configuration

The /etc/localtime file configures the system-wide timezone of the local system that is used by applications for presentation to the user.

It is a symlink into /usr/share/zoneinfo/, whose target is named by a timezone identifier such as "Europe/Berlin" or "Etc/UTC".

Example :
ls -lrt /etc/localtime
Output :

/etc/localtime -> ../usr/share/zoneinfo/Europe/Paris

zdump /etc/localtime
Output :

/etc/localtime  Fri Feb 20 12:12:27 2019 CET

Date commands

Flags :
--utc : print the date as UTC

The date command mainly allows to :
– print the current date, or any other date passed as a string, in a specific format or in the system default format.

Display the current date :
date

Convert a string date in ISO-8601 format to our current Linux date format :
date -d STRING_DATE

Examples :
Convert a timestamp :
date -d @1617537859
Output (with a French locale conf):
dimanche 4 avril 2021, 14:04:19 (UTC+0200)

Display the date with a specific format:
date +%CODE_VALUE

Examples :
date +%D : MM/DD/YY
date +%T : HH:MM:SS
date +%s : epoch time in seconds, that is seconds since 01/01/1970…UTC
date +%s%3N : epoch time in ms
date +%Z : timezone (ex: CET)
Or with a custom format :
date +%Y-%m-%d : yyyy-MM-dd
If the format contains space characters, we should enclose the whole format in double quotes :
date "+%Y-%m-%d -- %H:%M" : yyyy-MM-dd -- hh:mm

Formatting a specified date (instead of the current) as epoch time sec :
date -d "2001-12-24T21:34:56" +%s
output :
1009226096
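A round-trip sketch combining both directions (using -u so the result does not depend on the local timezone) :

```shell
# Formatted date -> epoch seconds (GNU date), interpreted as UTC.
epoch=$(date -u -d "2001-12-24T21:34:56" +%s)
echo "$epoch"

# Epoch seconds -> formatted date, displayed in UTC.
date -u -d "@$epoch" +%Y-%m-%dT%H:%M:%S
```

Note : without -u, the string is interpreted in the local timezone, which is why the +%s output above differs by one hour under CET.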

Time commands

To measure execution time of a command.
Basic syntax : time myCommand foo bar

To capture the time output separately from the timed command's output :
{ time myCommand foo bar 1>somewhere; } &>somewhereElse
For example, to capture the duration of a curl command in a file (here we also discard the curl output):
{ time curl -s "http://foobar" 1>/dev/null; } &>time.txt
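In bash, the TIMEFORMAT variable controls what the time keyword prints; a minimal sketch (sleep stands in for the command to measure) :

```shell
# '%R' keeps only the elapsed (real) time, in seconds.
TIMEFORMAT='%R'
{ time sleep 0.2; } 2>duration.txt
cat duration.txt
```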

LS COMMAND

Display behavior

On the terminal a plain ls command outputs files/folders in columns, sorted vertically.
For example :
A folder with few elements may be rendered on single line such as :

foo-file01.txt  foo-file02.txt  foo-file03.txt  foo-file04.txt  foo-file05.txt  foo-file06.txt  foo-file07.txt  foo-file08.txt

While a folder with many elements is rendered on multiple lines such as :

foo-file01.txt  foo-file03.txt  foo-file05.txt  foo-file07.txt  foo-file09.txt  foo-file11.txt
foo-file02.txt  foo-file04.txt  foo-file06.txt  foo-file08.txt  foo-file10.txt  foo-file12.txt

But that display by columns is just a special terminal layout.
In fact, elements are separated by newlines, not spaces.
If we redirect the ls output to a file or pipe it to another command, we can observe that.
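This can be checked with a quick sketch (scratch directory, hypothetical file names) :

```shell
# Scratch directory with three files.
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt" "$dir/c.txt"

# Piped or redirected, ls emits one name per line :
ls "$dir" | wc -l    # counts 3 entries
ls -1 "$dir"         # -1 forces the one-per-line layout on the terminal too
```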

Flags :
-l : use a long listing format
-t : sort by modification time
-S : sort by file size
-R : list subdirectories recursively
-r : reverse order while sorting
-a : do not ignore entries starting with .
-A : do not list implied . and ..
-d : list directories themselves, not their contents
--full-time : list the full modification time of files (seconds, ms, timezone) instead of limiting it to minutes.

Remote accesses

ssh command

OpenSSH client.

login with a user on a host :
ssh [user@]host [-p port]
Default port : 22.
Default user used on the remote machine : the current user of the client machine that executes the ssh command.
Helpful flags :
-v : verbose mode. Helpful to debug connection, authentication, and configuration problems.
Use -vvv to get maximum verbosity.

ssh key exchange for password-less connection

Two actors : a client machine that wants to log in on a remote host.
The principle illustrated by an example:
david@machine-foo wants to connect to ansible@machine-bar.

1) The client (david) generates a key pair : a public key and a private key.
Pre-requirement :
– the client machine has a /home/david/.ssh/ folder with rwx only for the user (david).
The process :
Here we generate rsa keys :
david@machine-foo:$ ssh-keygen -t rsa
As a result we could see something like that:

Generating public/private rsa key pair.
Enter file in which to save the key (/home/david/.ssh/id_rsa): 
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/david/.ssh/id_rsa.
Your public key has been saved in /home/david/.ssh/id_rsa.pub.

Note : the generated public key file ends with a comment : david@machine-foo.
That comment is only a label identifying where the key was generated; what actually matters is holding the matching private key.

2) We add the public key in the list of authorized keys of the machine-bar for the user ansible.
Pre-requirement :
– the remote machine has a /home/ansible/.ssh/ folder and an authorized_keys file inside it, both accessible only to the user (ansible).
The process :
We remotely append the public key to the authorized_keys file of the ansible user :
david@machine-foo:~/.ssh$ cat id_rsa.pub | ssh ansible@machine-bar 'cat >> .ssh/authorized_keys'
Note : we need to enter the ansible password of the remote machine here.

2 bis) If the david user of machine-foo needs to ssh to another user account of machine-bar, or even to another machine, we just append the same public key to the relevant user@machine:.ssh/authorized_keys file.

3) Now we can login with SSH without password :

david@machine-foo:~$ ssh ansible@machine-bar 
Last login: Wed Oct ....
[ansible@machine-bar ~]$ here we go

scp command

Secure Copy.
– to copy a file from local to a remote machine:
scp foo.txt user@host:/remoteDir/

– to copy a file from a remote machine to local:
scp user@host:/remoteDir/foo.txt /localDir

Helpful flags :
-r : recursive
Note : to copy a directory tree (/a/b/c) to the same path on the target, the parent directories (/a/b/) have to exist on the target host.
-p : preserves modification times, access times, and modes from the original file

HTTP tools

wget tool : non-interactive downloader. It downloads the resource located by the url parameter into the working directory.

Useful flags :
-q : quiet
--show-progress : show progress bar
-O fooFile : concatenate the downloaded documents and write them to an (O)utput file
-O - : write them to the standard output

curl command

Non-interactive tool to transfer data from or to a server, using one of the
supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP,
IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS,
SMTP, SMTPS, TELNET and TFTP).

Useful flags :
-o, --output <file> : write output to <file> instead of stdout.
-L : follow redirects (the Location header)
-s or --silent : silent mode (don't show progress meter or error messages)
-v : verbose mode
--fail : return a non-zero exit code if the server replies with an HTTP error (code >= 400).
--data "@myFile.json" : pass a file as posted data
--form "foo_file_parameter=@path_to_a_local_file" : upload a local file in the form
-u user : provide the user/token
or -u user:pass : provide the user and pass
Beware : don't enclose user or user:pass between quotes.
-I : perform a HEAD request and output only the HTTP headers
-w (or --write-out) <format> : define what to display on stdout after a completed and successful operation.
-X POST|GET|PUT|DELETE... : force the method name to use for the current request and any subsequent one (if -L is used).
WARN : generally only needed for PUT or DELETE, because in the other cases curl guesses the method from the passed parameters.
-d or --data : send the specified data in a POST request (application/x-www-form-urlencoded content-type by default). We can pass the data literally or specify a local file as data
-c (or --cookie-jar) fooFile : write the in-memory cookies to fooFile after the operation is completed
-b (or --cookie) data or file : add a cookie header to the request, with data in the form "NAME1=VALUE1; NAME2=VALUE2", or read it from a file
-H "key:value" : add the header key-value.
-H "key1:value1" -H "key2:value2", ... : add several header keys-values
--noproxy "localhost,foobar" : disable the system proxy for the listed hosts (comma-separated).
-k : insecure (skip SSL certificate verification).

curl examples :

1) Spring Security Login – Send a POST request with some data and store the cookies after the initial response and reuse them in the second request performed upon the redirection response :

curl -c cookies -b cookies -v -L \
-H "Content-Type: application/x-www-form-urlencoded" \
-d "username=john&password=123" \
http://localhost:9000/spring-security-oauth-server/login

2) Get only status code with a HEAD request:
curl -Is -w %{http_code} -o /dev/null http://my-website.com

2 bis) Get only status code with a specific (GET/POST…) request:
curl -s -w %{http_code} -o /dev/null http://my-website.com

3) Set Authorization Basic :
curl -v -H "Authorization: Basic encodedBase64ofuser:pass=" "myUrl"

4) POST a json message without variables:
With a file :
curl -v -d "@myFile.json" -H "Content-Type: application/json" "http://foo.com/bar"
With inline json (enclose it with ' to avoid collision with the inner " chars) :
curl -v -d '{"username":"xyz","password":"xyz"}' -H "Content-Type: application/json" "http://foo.com/bar"

5) POST a json message with variables :
With a file :
The idea is to do it in two steps : generate the json from a function, then pass its result as the data parameter.

function generateJson(){
 cat << EOF
   {
    "version": "$version",
    "comment": "$comment"
   }
EOF
}
version="1.0"
comment="it is an important version (note the whitespaces)"
 
curl -v -X POST -H "Content-Type: application/json" -d "$(generateJson)"  "http://foo.com/action"

As an alternative, we can skip the function and use only a heredoc (EOF):

statements_json=$(cat << EOF
{"statements":
[
    {
     "statement_type": "Assignment",
     "assigned_variable": "foo",
     "call":
          {
           "function": {"name": "foo_compute"}
          }
    }
]
}
EOF
)
curl -H "Content-Type: application/json" \
-d "${statements_json}" "localhost:5001/..."

The inline version is trickier.
Variable values that never contain spaces may be passed as : "'$foo'"
Variable values that may contain spaces have to be passed as : "'"$foo"'"
To avoid any whitespace issue, use the second form for every variable.
Example :

version="1.0"
comment="it is an important version (note the whitespaces)"
curl -v -X POST -H "Content-Type: application/json" -d '{"version": "'"$version"'","comment": "'"$comment"'"}' "http://foo.com/action"

6) test whether a URL returns a successful status code (2XX) or not :

url="http://foo.com"
if curl -o /dev/null --silent --head --fail "$url"; then
  echo "URL exists: $url"
else
  echo "URL does not exist: $url"
fi

7) (variant of 6) test explicitly the return code of response :

codeResponse=$(curl -w %{http_code} --silent -L -o /dev/null  "http://foo.com/bar")
if (( codeResponse < 200 || codeResponse >= 400 )); then
     echo "Bad answer ! Server response=${codeResponse}"     
fi

ENCODING

base64 : program that encodes/decodes data and prints to standard output.

Example :

* encode the stdin value to base64 and print it :
printf user:pass | base64

output -> dXNlcjpwYXNz

* decode the base64 value from the stdin and print it :
printf dXNlcjpwYXNz | base64 -d

output -> user:pass

Note : with echo instead of printf, the trailing newline is encoded too, giving dXNlcjpwYXNzCg==.
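A typical use is building an HTTP Basic Authorization header (user:pass is a placeholder credential) :

```shell
# printf avoids the trailing newline that echo would add to the encoded value.
token=$(printf '%s' "user:pass" | base64)
echo "Authorization: Basic $token"
```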

OUTPUT

*print number of lines from the output:
fooCommand | wc -l

*print newline, word, and byte counts from the output:
fooCommand | wc

*print newline, word, and byte counts from file(s) (or all files with *):
wc file

* print characters to stdout
echo tutu

* print characters to stdout without a trailing newline char
printf tutu

FIND/FILTER COMMANDS


FIND COMMAND

-name option :
search by file name using a shell glob pattern.

-iname option :
case-insensitive search by file name.

*find files and folders matching the name "foo" pattern anywhere under / :
find / -name foo 2>/dev/null

* To match the full path :
find / -wholename "/a/b/c/foo" 2>/dev/null

* To match a part of the path :
find / -path "/a/*/foo" 2>/dev/null

* To use a regex (the regex must match the whole path) :
find / -regex ".*foo" 2>/dev/null

* To make a case-insensitive search (works with name, wholename and regex):
find / -iname foo 2>/dev/null

* To do a reverse search (works with -name and -regex at least):
find ! -name "value" : include only names that don't match "value".
Ex: -type f ! -name "*.log" ! -name "*.csv" excludes .log and .csv files.
Or with -not :
find . -name "*.js" -not -path "./directory/*"

* Express multiple conditions with find and without exec : 
By default, the AND operator is implied between TESTS arguments,
so specifying the AND is often unnecessary.
OR syntax : -or othercondition or -o othercondition
AND syntax : nothing, -and othercondition or -a othercondition

Examples :
Matches to fooRegex regex or barRegex regex :
-regex fooRegex -or -regex barRegex

Matches to fooRegex regex and last accessed 1 day ago :
-regex fooRegex -atime 1
or
-regex fooRegex -and -atime 1

Matches to fooRegex regex or last accessed 1 day ago :
-regex fooRegex -or -atime 1

Ex: find files of type xml or properties (parentheses so that -type f applies to both conditions):
find . -type f \( -regex ".*\.xml" -or -regex ".*\.properties" \)

* Express multiple conditions with find and exec : 
The use of exec with AND conditions (the default) doesn’t cause any problem.
The above syntax is fine.
With OR conditions, we may get unexpected behavior because the last TEST condition implies an AND with -exec, which means an earlier OR condition may cause the exec part to be skipped.
Ex: TEST_ONE -or TEST_TWO -exec fooCmd {} \;
With operator precedence, it is understood as : TEST_ONE OR (TEST_TWO AND -exec)
So if TEST_TWO is false, the -exec part is never executed.

To prevent that issue, we need to surround the OR TEST conditions with ( ).
Ex: find files ending with .log or accessed 1 day ago, and echo each one :
find /var/log/ \( -name "*.log" -or -atime 1 \) -exec echo {} \;

Useful flags :
-L : follow symbolic links (must be placed at the beginning of the command!)
-type (f/d/l/...) : includes only files/directories/symbolic links/…
-maxdepth DEPTH : the max depth of the search
-size -100(k/M/G) : includes only files < 100kb/100mb/100gb
-size +100(k/M/G) : includes only files > 100kb/100mb/100gb
-newerXY 'date time' : filter files with a timestamp more recent than the passed date.
Example : -newermt '10/01/2020 14:00:00' : filter files modified after '10/01/2020 14:00:00'.

Measure times TESTS
These are :
-amin n : file was last accessed n minutes ago
-atime n : file was last accessed n*24 hours ago
-cmin n : file's status was last changed n minutes ago
-ctime n : file's status was last changed n*24 hours ago
-mmin +n or -n or n : file's data was last modified (more than, less than, or exactly) n minutes ago.
-mtime +n or -n or n : file's data was last modified (more than, less than, or exactly) n*24 hours ago.

The measure time value (n) can accept a sign + or - :
+n : strictly more than n
-n : strictly less than n

Measure times can be used with -daystart.
-daystart : measure times (for -amin, -atime, -cmin, -ctime, -mmin, and -mtime) from the beginning of today rather than from 24 hours ago.

Example with -mtime :
-mtime -nbDays : filter files modified strictly less than nbDays days ago
-mtime +nbDays : filter files modified strictly more than nbDays days ago
-mtime nbDays : filter files modified exactly nbDays days ago
Beware : -mtime reasons in terms of whole days.
So -mtime +0 : files modified more than 24 hours ago
So -mtime +1 : files modified more than 48 hours ago
So -mtime 0 : files modified today, less than 24 hours ago
So -daystart -mtime 0 : files modified today since 0h
So -daystart -mtime 1 : files modified yesterday between 0h and 23h59
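The whole-day behavior can be checked with a scratch directory (GNU touch -d sets an arbitrary mtime) :

```shell
dir=$(mktemp -d)
touch "$dir/fresh.txt"                  # modified now
touch -d "3 days ago" "$dir/old.txt"    # modified 72h ago

find "$dir" -type f -mtime 0     # -> fresh.txt (modified less than 24h ago)
find "$dir" -type f -mtime +1    # -> old.txt (modified more than 48h ago)
```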

-print0 : print the full file name on the standard output, followed by a
null character instead of the newline character that the default option (-print) uses.

-printf FORMAT : print format on the standard output, interpreting \ escapes and % directives. Field widths and precisions can be specified as with the printf C function.
May be seen as an alternative to -exec in some cases.
For example :
To output the filenames of the found paths (instead of the absolute path) :
find -name "foo*" -printf "%f\n"

-exec foocmd {} \; : executes the foocmd for each file retrieved. {} represents the current file matched by find.

Command to execute per file
The command is executed for each input file, and only one command may be specified.
To execute multiple commands (similarly to xargs), we may use sh -c "foo command" as the command, such as :
find -regex ".*foo.*" -exec sh -c "echo {} ; echo 'current file done'" \;
Sometimes we want to output the second command only if the first one returned something. We could achieve it such as :
find -name pom.xml -exec sh -c "grep distributionManagement {} && echo '{}' " \;

Example with -printf :
find . -regex ".*java" -printf "%T+ %p\n"
find files with the java extension and output the modification date (%T+) and the file name (%p), one per line (\n). Pipe to sort to order by date.

Example with -type and -exec :
* find . -type f -exec sha1sum {} \; | sha1sum
computes the sha1 of each file, then the sha1 of these sha1s.

Other example with -exec:
* find . -name "Dockerfile*" -exec grep -l maven {} \;
find all files whose name starts with Dockerfile and execute grep -l maven <match> for each match

Less command

– search : /fooWord
– toggle case-insensitive/sensitive search : -i
– toggle line numbers display : -N

sed command

Command to replace text in std input or files.
By default, sed doesn’t modify the input files. Indeed that writes in the std output the result of the files/input after transformation.
About regex syntax, sed (as grep) uses basic regex by default.
Advise : favor ‘ ‘ for sed expression over  »  » because ‘ ‘ is not interpreted by shell while ‘ » « is.

Helpful flags and syntax :
Basic syntax : s/old/new
For a global replace the syntax is : s/old/new/g
s means "substitute"
g means global, that is : don't limit to one match transformation per line.
-i flag : means "in-place", that is : don't write to a temporary output but do the changes directly in the files
-E : use the extended regex instead of the basic regex (the default)

Some examples :
* To replace in-place all matching chars in file(s), whatever the match count per line (g : global):
sed -i 's@old@new@g' fooFile barFile ...
(@ is the delimiter here; any other valid delimiter works, e.g. s/old/new/g)

* To replace in-place all matching chars in file(s) with a single replacement per line (no g option):
sed -i 's@old@new@' fooFile barFile

* To replace in multiple files without specifying each filename (works also with find ... -exec sed ...):
grep -rl "old" . | xargs sed -i 's@old@new@g'

* Example with extended regex :
echo http://foobar.com | sed -E 's@http[s]?://@@'
Output : foobar.com

* to specify > or < char in the regex, we need to prevent shell to interpret them, so we enclose the whole expression between the double quotes :
sed -i "s/<h2>/<h3>/g" fooFile

* to specify a special character as a literal in the pattern, we escape the char (double quotes, slashes…) with \ :
Example to transform chien to "chien" :
echo chien | sed "s/chien/\"chien\"/g"
Or clearer with ' ' :
echo chien | sed 's/chien/"chien"/g'

* To specify a range of chars or even special chars (space, tabulation…), we use [...].
Ex: [abc] means any a, b or c char, and [ z] means any space or z char.
Example to remove all whitespaces :
echo -e "hello you\n how are you ?" | sed "s/[ ]//g"
output :
helloyou
howareyou?

* To replace \n char by newline  :
sed 's/\\n/\n/g' anyFile.txt

* To insert text at a specific line :

Hardcoded example :
Insert "dog" at line 25 of the fooFile :
sed -i "25i dog" fooFile

Dynamic example:
Insert value of $fooValueInserted at line {lineNumberToInsert} of the fooFile :
sed -i "${lineNumberToInsert}i ${fooValueInserted}" fooFile

* To delete the whole line matching to a pattern :
sed '/regex/d' foo
Note : unlike the "s" (substitute) command, the "d" goes after the pattern.
Example to delete line containing the foo chars :
sed '/.*foo.*/d' foofile
Example to delete blank lines :
sed '/^$/d' foofile

* To output specific lines in a file:
Ex : output lines 10 to 15 :
sed -n '10,15p' foofile
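The main sed verbs above (s, d, p) on a small sample file :

```shell
# Sample input with a blank line in the middle.
printf 'cat\n\ndog\n' > animals.txt

sed 's/dog/"dog"/' animals.txt   # substitute : quote dog
sed '/^$/d' animals.txt          # delete : drop blank lines
sed -n '1p' animals.txt          # print : output only line 1
```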

grep Command

Pattern and match flavors
* Default is basic regular expression (note the escape):
fooCommand | grep "david\|dog"

* specify the pattern explicitly with -e :
fooCommand | grep -e "anyText"
This is handy to specify multiple search patterns :
fooCommand | grep -e "anyText" -e "otherText"

* filter with a text pattern :
fooCommand | grep -F "anyText" (F like Fixed-strings)

* Filter with an extended regex (note no escape)
grep -E 'foo[0-9]+'
grep -E 'david|dog'

* Filter with a Perl regex
The perl mode is very powerful. It is the only way to benefit from :
– non-greedy modifiers
– the short character class syntax (\d, \w…).

Ex with non-greedy:
We search the match that starts with "foo-patterns": followed by any chars up to the first ] char, thanks to the non-greedy .*?]
grep -P -o "\"foo-patterns\":.*?]"

Ex with class characters:
We search match that have the form X[X].X[X].X[X] where X is any digit.
grep -P "\d{1,2}\.\d{1,2}\.\d{1,2}"
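For example, -P combined with -o extracts just the dotted version number from a line :

```shell
echo "release v1.12.3 is out" | grep -Po '\d+\.\d+\.\d+'
# -> 1.12.3
```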

* Variant with (w)hole word search:
grep -rnw 'pattern' 'path'

Specify scopes

* find files (r)ecursively, with their line (n)umber, matching the regular expression in the path(s):
grep -rn 'pattern' 'path(s)'

* To search in the current directory :
grep -ns 'pattern' *.*

Chars to escape with \ : []

Useful grep flags :
-r : read all files/directories recursively, following symbolic links only if they are on the command line
-R : read all files/directories recursively, following all symbolic links
-o, --only-matching : print only the matched (non-empty) parts of a matching line, with each such part on a separate output line.
-B numberOfLine : Print numberOfLine lines of leading context (B)efore matching lines.
-A numberOfLine : Print numberOfLine lines of trailing context (A)fter matching lines.
-a : process binary files as text (bypass the binary file problem).
-n : print the line number along the matching part
-m maxCount : stop reading after max match reached
-q, --quiet, --silent : do not write to standard output.
-i : ignore case
-v : invert/reverse the matching
-s : suppress error messages about nonexistent or unreadable files.
--exclude-dir=dir : exclude a directory (available since grep 2.5.2)
--include=baseNameMatchWith : include only files whose basename match with the param
--exclude=baseNameMatchWith : exclude files whose basename match with the param
To include/exclude more than one kind of extension use *.{ext1,ext2,extn} (no spaces) such as :
--exclude=*.{var,so,h,svg}
We can use both : include only some files and exclude some of them, such as :
grep -rl --include=*.java --exclude=*Test.java assertIsDirectoryContaining : include only .java files except those ending with Test.java

-l : replace normal output by (l)ist of filenames matching
-L : replace normal output by (list) of filenames not matching


pgrep command

* list the pids of processes whose command name starts with a string : pgrep
ex: pgrep jav

Helpful flags :
-f : match against the full command line instead of just the process name

uniq command

By default it removes only successive repeated lines.
To remove any repeated lines, successive or not, we need to sort them first :
With a file input : sort foo-file | uniq
With std input : anyCommandThatWritesToStdout | sort | uniq
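A quick sketch; uniq -c additionally prefixes each distinct line with its count :

```shell
printf 'dog\ncat\ndog\n' | uniq            # nothing removed : the dog lines are not adjacent
printf 'dog\ncat\ndog\n' | sort | uniq     # duplicates removed after sort
printf 'dog\ncat\ndog\n' | sort | uniq -c  # count per distinct value
```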

xargs command

Build and execute a command from standard input.
The syntax is : xargs commandToExecute initialArgs.
It executes commandToExecute with initialArgs followed by items received from the std input.

Command to execute per token :
The command is executed for each received input item, and only one command may be specified.
To execute multiple commands (similarly to find … -exec), we may use bash -c "foo command" as the command, such as :
grep foo anyFile | xargs -I% bash -c "echo % && echo 'end current token'"

Ways of referencing the items received from the std input :
with the default placeholder {} or a specific one.
For example with the default ({} is enabled by -i, deprecated in favor of -I):
echo 10 | xargs -i expr {} / 2
Or with a specific placeholder :
echo 10 | xargs -I% expr % / 2
In both cases, the output is 5.

Useful flags :
-0, --null : input items are terminated by a null character instead of by whitespace, and quotes and backslashes are not special.
It disables the end-of-file string. Useful when the input may contain spaces, quotes or backslashes (generally filenames).

Example :
echo /etc/ | xargs ls
produces the listing of the /etc/ folder

Example :
ls /etc/ | xargs echo "file in etc :"
produces the listing of the /etc/ folder, prefixed (1 command executed)

Example :
find . -type f -print0 | xargs -0 sha1sum | sha1sum
computes the sha1 of each file, then the sha1 of these sha1s.
Beware : the first sha1sum is invoked a single time, with all the filenames.
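The -print0/-0 pairing matters as soon as file names contain spaces; a sketch with a scratch directory :

```shell
dir=$(mktemp -d)
touch "$dir/my file.txt"

# Without -0, "my" and "file.txt" would be passed as two separate arguments.
find "$dir" -type f -print0 | xargs -0 -n1 basename
# -> my file.txt
```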

AWK command

Pattern scanning and text processing language.
It processes the input line by line and splits each line into fields. The default field separator is a space char.
Flags : 
-F "anyChar" : specify the field delimiter char.

Typical uses :

Numerical computation :
* for each line, process some columns such as summing and display the result at the end :
awk '{sum+=$1} END {print sum}'

* for each line, filter on a condition and process some columns such as summing only the positive numbers and display the sum for each line:
awk '$1>=0 {sum+=$1; print "sum="sum}'

Insert:
* Take inputFile and insert at line 4 the value defined in $shellArg, and store the result into a new file outputFile :
awk -v awkArg="$shellArg" 'NR==4 {print awkArg}1' inputFile > outputFile

String concatenation :
* take each line in a file and separate them by a white space: 
cat fooFile | awk '{files=files " " $1} END {print files}'

Predefined useful variables :
$0 : the whole line
$1 : the current first field
$2 : the current second field
$NF : the current last field
and so on…
NR : the number of the current row
NF : the number of fields of the current row
BEGIN {action} : before iterations and processing, execute that action
END {action} : after all iterations and processing, execute that action

* default action for "print" : print the (filtered) row

* To execute a command with awk (note the trailing space so the argument is separated) :
cat fooFile | awk '{system("myCommand " $1)}'
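The typical uses above, runnable on inline input :

```shell
printf '1 a\n2 b\n3 c\n' | awk '{sum+=$1} END {print sum}'   # -> 6
printf '1 a\n2 b\n3 c\n' | awk 'END {print NR}'              # -> 3 (number of rows)
printf 'x y z\n' | awk '{print $NF}'                         # -> z (last field)
```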

SCRIPT

* prolog sh :
#!/bin/sh

* prolog bash :
#!/bin/bash

* prolog bash with echo on :
#!/bin/bash -x

* test files or types
* test if a variable string value is (n)ot empty :
test -n "$MY_VAR"
echo $? (== 0 -> true, else false)
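A runnable sketch of the exit-code mechanism; [ ] is the usual synonym for test :

```shell
MY_VAR="hello"
test -n "$MY_VAR"; echo $?    # -> 0 (true)
test -n "";        echo $?    # -> 1 (false)

# The exit code is what if consumes :
if [ -d /etc ]; then echo "directory"; fi
```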

HISTORY COMMANDS

Clear history of the current session :
cat /dev/null > ~/.bash_history && history -c && exit
or
history -c (clear the history by deleting all the entries)
history -w (write the current history to the history file)


SHELL/TERMINAL SHORTCUT

ctrl+SHIFT+W : close the current tab
ctrl+K : cut the text from the cursor to the end of the line into the shell clipboard (kill ring)
ctrl+U : cut the text from the beginning of the line to the cursor into the shell clipboard

ctrl+Y : paste it
ctrl+R : search in the command history (most recent to oldest)

A standard PS1 value in case of issue (set it after executing bash):
PS1="\n\[\e[0;97m\][\D{%d%m%y-%H%M}][\u@\h \W] \n\[\e[0m\]\$"
Or:
export PS1="\e[0;32m\u@\h \w $ \e[m\n"

+Also, instead of "linux", use "xterm" :
export TERM='xterm'

PS1='$(whoami)@$(hostname):$(pwd) '

Color codes

txtblk='\e[0;30m' # Black - Regular
txtred='\e[0;31m' # Red
txtgrn='\e[0;32m' # Green
txtylw='\e[0;33m' # Yellow
txtblu='\e[0;34m' # Blue
txtpur='\e[0;35m' # Purple
txtcyn='\e[0;36m' # Cyan
txtwht='\e[0;37m' # White
 
bldblk='\e[1;30m' # Black - Bold
bldred='\e[1;31m' # Red
bldgrn='\e[1;32m' # Green
bldylw='\e[1;33m' # Yellow
bldblu='\e[1;34m' # Blue
bldpur='\e[1;35m' # Purple
bldcyn='\e[1;36m' # Cyan
bldwht='\e[1;37m' # White
 
unkblk='\e[4;30m' # Black - Underline
undred='\e[4;31m' # Red
undgrn='\e[4;32m' # Green
undylw='\e[4;33m' # Yellow
undblu='\e[4;34m' # Blue
undpur='\e[4;35m' # Purple
undcyn='\e[4;36m' # Cyan
undwht='\e[4;37m' # White
 
bakblk='\e[40m'   # Black - Background
bakred='\e[41m'   # Red
badgrn='\e[42m'   # Green
bakylw='\e[43m'   # Yellow
bakblu='\e[44m'   # Blue
bakpur='\e[45m'   # Purple
bakcyn='\e[46m'   # Cyan
bakwht='\e[47m'   # White
 
txtrst='\e[0m'    # Text Reset

SUDO COMMON COMMANDS

* execute a command as sudoer with the current user :
sudo cmd

* open a shell as sudoer for a user (by default root if not user specified):
sudo su fooUser

* open a shell as (root) sudoer by keeping the env of the current user:
sudo su --preserve-environment (or -p)

useful flags :
-E : indicates to the security policy that we want to preserve the existing (E)nvironment variables.

* log as the root user (not working if the root account is disabled : for example on Ubuntu)
su

* log as root with the current user super rights :
sudo su

* execute a command as a user (redirection pipes keep rights of the current shell):
su foouser -c cmd

* execute a command in a new shell with sudoer rights for the current user :
sudo sh -c "cmd… > /etc/ works now"

* open a shell as a user (non-system user) :
su - foouser
The - is an alias for --login, meaning "with the environment of the user as if logged in"

* open a shell as a user by specifying the shell (useful for system users ) :
su - foouser -s /bin/sh

* - : in many commands, it represents the standard input.

SHELL SETTINGS

Display the current shell used :
echo $SHELL

Set the current shell used :
SHELL=anyShell
ex: SHELL=/bin/bash

Variable (it is not set by default) to define the profile to load for non-interactive, non-login shell executions :
BASH_ENV="anyBashrc"
Ex: BASH_ENV="/root/.bashrc" or BASH_ENV="~/.bashrc"

SHELL COMMAND BINDING

bind -l : list all readline functions
bind -p : list the keybindings and the corresponding function names (even the commented ones), ordered by keybinding
bind -P : same thing but ordered and grouped by function name
bind -q anyFn : list the keybindings for a function
bind TAB:anyfn : bind TAB to the function "anyfn"

Tricks

* Keep the complete function on TAB (the default) but also bind the "menu-complete" function to SHIFT+TAB :
bind '"\e[Z":menu-complete'

* Know the character sequence for a key or a key sequence :
In the terminal : Ctrl+V, then enter the key or the key sequence.
Ex:
SHIFT+TAB -> ^[[Z -> binding to enter: \e[Z
F12 -> ^[[24~ -> binding to enter: \e[24~

COMPRESS/UNCOMPRESS

* TAR archive and compress the fooDirectory folder :
tar -cvzf file.tar.gz fooDirectory

* TAR extract the compressed archive (into the current directory):
tar -xvzf file.tar.gz

x: extract
c: create the archive
t: list files
v: verbose
z: with gzip
f: archive file name

other useful flags :
--strip-components=NUMBER
strip NUMBER leading components from file names on extraction

-C, --directory=DIR
change to DIR before performing any operation.

Examples
Download a tar (git) from a http server and extract its content into the /opts/git folder :
cd /opts && mkdir git && cd git && curl -o git.tar.gz "http://do.com/git-2.31.1.tar.gz" && tar -xvf git.tar.gz --strip-components=1
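A self-contained sketch of --strip-components and -C (scratch directories, hypothetical names) :

```shell
src=$(mktemp -d); dest=$(mktemp -d)
mkdir -p "$src/project"
echo "hello" > "$src/project/readme.txt"

# Create an archive whose entries are prefixed with "project/".
tar -czf "$src/project.tar.gz" -C "$src" project

# Extract into $dest, dropping the leading "project/" component.
tar -xzf "$src/project.tar.gz" -C "$dest" --strip-components=1
ls "$dest"    # -> readme.txt
```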

Zip command

* ZIP files/directories non-recursively (foo/ will be empty in the zip) :
zip archive.zip foo/ a.txt b.txt

* ZIP recursively files/directories :
zip -r archive.zip foo/ a.txt b.txt

* ZIP list the content of the archive :
unzip -l file.zip
OR :
less file.zip

* ZIP uncompress an archive in the current directory :
unzip file.zip

* ZIP uncompress an archive in a specific directory :
unzip file.zip -d destination_folder
* GZIP : compress
Compress each file (not recursive) in its own gz file (uncompressed original files are deleted).
gzip *
Compress each file (not recursive) in its own gz file without deleting uncompressed original files.
gzip --keep *

Helpful flags :
-r : recursive

CUSTOMIZING UBUNTU/LINUX

* display the current shell
echo $0

* display the env variables
env

* echo the current desktop graphical layer user (XDG-> X Desktop Group) :
echo $XDG_CURRENT_DESKTOP

* list schemas for gnome :
gsettings list-schemas

* list all key-values for a schema :
gsettings list-recursively org.gnome.desktop.wm.keybindings

* set a key binding to minimize a window with « Windows » + « End » keys :
gsettings set org.gnome.desktop.wm.keybindings minimize "['<Super>End']"

* add a shortcut on the Desktop

Create a file with the x attribute and with a .desktop extension such as :

[Desktop Entry]
Version=1.0
Type=Application
Terminal=false
Icon=idea.svg
Exec=sh /opt/idea-IC-192.6817.14/bin/idea.sh
Name=IntelliJ

– To add the shortcut only in the Desktop : create the file in ~/Desktop
– To allow the shortcut to be visible by any users and to be visible too in activities : create the file in /usr/share/applications/

GEDIT

* Next match : Ctrl+G
* Previous match : Ctrl+Shift+G

NAUTILUS

* Ctrl+L : show address bar

VIM commands

Issues
Problem :
– End and Home keys don’t move the cursor but insert characters instead.
– no syntax color
Solution :
export TERM='xterm'

Problem :
preferences set in vim are not preserved after vim restarts
Solution :
Update ~/.vimrc with the settings.

Problem :
– The comments in the file are not displayed in a visible color : they are too dark.
Solution :
:set background=dark instead of the default :set background=light. The colors are then automatically set correctly.
To make it permanent, add the setting to ~/.vimrc : set background=dark

Command history : type the beginning of a previously entered command (after :) and press the up arrow to cycle through all matching commands.

Enable/Disable color/syntax :
:syntax on
:syntax off

Modes:
: command mode
i insertion mode
Ctrl+v column mode


Set a vim property :
:set property
Examples :
– display line numbers :
:set number

Delete all content :
:%d
% means all the lines
d : delete

Undo/Redo :

undo : u
redo : ctrl+r or :redo command

Visual mode (V)
* y to copy/yank the selected text
* d to delete/cut the selected text
* p to paste it

Default mode
* delete from the cursor to the end of the line : d$
* delete from the cursor to the beginning of the line : d0

* delete into the black hole register (no yank) : use "_d instead of d

* copy current line : yy or Y
* delete/cut current line : dd
* paste it after the current line : p
* paste it before the current line : P
* find : /wordToSearch
default : case-sensitive
In ~/.vimrc, set ignorecase to change the default

* to do an insensitive case search (\c) : /mySearch\c
* to do a sensitive case search (\C) : /mySearch\C
* find and replace (or substitute) foo by bar in the current line: :s/foo/bar/g
* find and replace (or substitute) foo by bar in all lines: :%s/foo/bar/g

* next occurrence : n
* previous occurrence : N

* The find/replace syntax may use / or any other char as delimiter, like the sed command.
Ex: replace all foo by // :
:%s@foo@//@g

Sort command

– sort by column : -k FOO_COLUMN.
ex: -k3 sorts by the third column (ASC)
– reverse the order : -r
– compare human readable numbers (e.g., 2K 1G) : -h
– compare according to string numerical value : -n

Cp command

cp -a fromPath/. destPath/ : copy all the content (files and directories) of a directory to another one while preserving all file/directory attributes.

It is a shortcut for cp -dR --preserve=all fromPath destPath : copy resources (files/directories) while preserving links along with file/directory attributes : mode, ownership, timestamps, etc.
We could also specify the attributes to preserve, such as : --preserve=mode,timestamps

Copy the full path of source files to a specific directory (so directories are also created and copied) :
--parents
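A short sketch of --parents (hypothetical /tmp paths) : the relative source path is recreated under the destination :

```shell
# --parents rebuilds the source path under the destination directory.
rm -rf /tmp/cpdemo && mkdir -p /tmp/cpdemo/src/a /tmp/cpdemo/dest
echo x > /tmp/cpdemo/src/a/f.txt
cd /tmp/cpdemo
cp --parents src/a/f.txt dest/
# The full relative path is rebuilt: dest/src/a/f.txt
ls dest/src/a
```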

SHELL BASIC commands

cd – : go back to the previous directory
$FOO or ${FOO} : refer the FOO variable value
"$FOO something" : resolve the FOO variable
'$FOO something' : doesn’t resolve the FOO variable
$(anyCommand) : execute the anyCommand command in a subshell and return it as a value
(...) : group commands and execute them inside a subshell
{ ... } : group commands and execute them inside of the current shell.
The program x=2; { x=4; }; echo $x prints 4,
while x=2; (x=4); echo $x prints 2.
braces require spaces around them and a semicolon before closing, while parentheses don’t.
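The difference between the two groupings can be checked directly (a minimal sketch) :

```shell
x=2; { x=4; }; echo "$x"   # 4 : the braces run in the current shell
x=2; (x=4); echo "$x"      # 2 : the subshell's assignment is lost on exit
```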

ls -a : list all files (that is even these starting with the . char)

rm -rf * : rm all content of the current directory
rm -rf foo/* : rm all content of the foo directory
rm -rf */ : rm only directories of the current directory
which fooCmd : locate a command

EOF : useful for multi-line input :

For example :

cat <<EOF > print.sh
#!/bin/bash
echo \$PWD
echo $PWD
EOF

produces print.sh such as :

#!/bin/bash
echo $PWD
echo /home/user

REDIRECTIONS COMMAND

Redirect the std output to … : 1>
Redirect the error output to … : 2>
Redirect both to (Bash 4 only)… : &>
Redirect error output to std output : 2>&1
Redirect std output to a file and error output to std ouput (that is the same file) :
1>foo.txt 2>&1
Warn : the order matters! First we need to redirect one of them to a file and then to redirect the other one in the same file.
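A minimal sketch of why the order matters, using a hypothetical emit function that writes on both streams :

```shell
# Hypothetical helper writing one line on stdout and one on stderr.
emit() { echo out; echo err >&2; }
# Wrong order: 2>&1 duplicates stderr onto the CURRENT stdout (the terminal)
# before stdout is redirected, so "err" still reaches the screen.
emit 2>&1 1>/tmp/wrong.txt
# Right order: stdout goes to the file first, then stderr follows it.
emit 1>/tmp/right.txt 2>&1
```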

Copy std output to the out.txt FILE while still printing it on the std output :
grep -r foo . | tee out.txt
Copy both std and error output to the out.txt FILE while still printing them on the std output :
grep -r foo . 2>&1 | tee out.txt

FILE MANIPULATION

* truncate OPTION file : shrink or extend the size of a file to the specified size. If the file doesn’t exist, it is created.

useful flags :
-s foosize : set the size (in bytes) of the file
-c : do not create any files
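A short sketch of truncate with the flags above (hypothetical /tmp paths) :

```shell
rm -f /tmp/sparse.dat /tmp/missing.dat
truncate -s 1024 /tmp/sparse.dat     # created if missing, extended to 1024 bytes
stat -c %s /tmp/sparse.dat           # 1024
truncate -c -s 2048 /tmp/missing.dat # -c : the missing file is NOT created
```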


dd command
Ex : Create a file of 1000MB :
dd if=/dev/zero of=output.dat bs=1000M count=1

Units for bs :
M : MB
G : GB

Helpful params:
status=none : disable information about writing

SUBSTRING AND REPLACE

* shell substring on constants/variables : ${VAR:offset:length} (zero-based offset)
example : input-> fooVar=12345 and want->34
echo ${fooVar:2:2}

* shell string replace on constants/variables :
${var/pattern/replacement}

Basic example :
dog=Max
echo ${dog/Ma/Pa} // echo Pax

Other example (we see that by default only the first match is replaced) :
dog=Matata
echo ${dog/ta/xa} // echo Maxata

To go over that limitation, we double the first slash, i.e. ${var//pattern/replacement}, such as :
dog=Matata
echo ${dog//ta/xa} // echo Maxaxa

* Example with a loop to rename all extensions (mov to wav) of files in the current directory :
for f in *.mov; do mv -- "$f" "${f%.mov}.wav"; done

* shell string matching with groups on constants/variables

Example :

reg='(.*)([0-9]+.[0-9]+.[0-9].*)(.jar)' && [[ "spring-boot-docker-kubernetes-example-sboot-1.0.0-SNAPSHOT.jar" =~ $reg ]]
echo "${BASH_REMATCH[0]}"
spring-boot-docker-kubernetes-example-sboot-1.0.0-SNAPSHOT.jar
echo "${BASH_REMATCH[1]}"
spring-boot-docker-kubernetes-example-sboot-
echo "${BASH_REMATCH[2]}"
1.0.0-SNAPSHOT
echo "${BASH_REMATCH[3]}"
.jar

cut command

* cut – remove sections from each line of files

syntax : cut OPTION... {-f LIST | -c LIST | -b LIST} [FILE]...
useful flags :
-f, --fields=LIST : select only these fields
-c, --characters=LIST : select only these characters
-b, --bytes=LIST : select only these bytes
-d, --delimiter=DELIM : use DELIM instead of TAB for the field delimiter
To specify an inline special char as delimiter, for example a tabulation :
use the verbatim mode of the shell (Ctrl+V) after -d, then the Tab key :
cut -f2 -d' ' infile
example : input : aa_bb_cc and want : bb
echo aa_bb_cc | cut -d'_' -f2

Enclosing args with double quotes is often safer

In many bash commands it is generally a good practice to enclose args with " " when these args may be misinterpreted because of the rest of the command (the case for args that are patterns or that contain spaces).
Example :
echo "the foo -bar -rab" | grep foo -bar -rab
grep fails to interpret foo -bar -rab as the regex.
Instead, it considers foo as the regex and -bar and -rab as flags for the grep command.
Here these flags happen to be valid for grep, so it executes without error but doesn’t produce the expected result.
Which is misleading.
The correct way is :
echo "the foo -bar -rab" | grep "foo -bar -rab"

Escaping with \ or sometimes ' '

In some commands or contexts, some chars have a special meaning for the command.
But sometimes we need to specify the literal character, not its special meaning.
Very common case : in many bash commands it is generally a good practice to enclose args with " " (see the point above).
As a rule of thumb, there are two ways to fix that :
– escape the problematic character with \ such as \" for ".
– replace " " by ' ' on either side of the conflict.
Beware : ' ' also prevents shell interpolation, so use it with caution.

Some examples :
– we use grep to search a pattern containing " chars.
Suppose that we want to find only the matches that literally contain "hello David" (with the quotes) in the current hello.txt file :

"hello David" yes :)
hello David no :(

grep "hello David" hello.txt
will match the two lines.
To match only the first line, we need to tell grep to interpret " as a character to match.
But doubling the double quotes produces the same result :
grep ""hello David"" hello.txt
To get the expected result, we need to escape " with \ to mean « don’t interpret it in another way than text » :
grep "\"hello David\"" hello.txt

– we want to specify an exclamation mark in the url or in one of the parameters of the curl command.
Without escaping, it will not work because the shell interprets ! (history expansion).
To avoid that, we escape the ! character with \ such as \!

– we want to post JSON inline data with the curl command.
Adding the JSON inside -d "my json…" without any care will produce an undesirable result because the JSON fields are also enclosed with double quotes.
Example :

curl -X POST -H "Content-Type: application/json" -d "{"firstname":"david", "lastname":"bepson"}" "localhost:30000/person"

Here the shell strips the inner double quotes and curl sends the data : {firstname:david, lastname:bepson}
To send {"firstname":"david", "lastname":"bepson"}, we need to prevent the shell from interpreting the " that enclose the fields as the end delimiter of the -d arg.

Solution with \ escape :
curl -X POST -H "Content-Type: application/json" -d "{\"firstname\":\"david\", \"lastname\":\"bepson\"}" "localhost:30000/person"
While it works, it looks verbose.

Solution with replacement of the outer " " specified by -d by ' '.
In that way, there is no longer any possible conflict between the -d arg end delimiter and the double quotes defined in the JSON.
curl -X POST -H "Content-Type: application/json" -d '{"firstname":"david", "lastname":"bepson"}' "localhost:30000/person"

Expanding

ANSI-C Quoting
In some commands, we may need backslash-escaped characters to be expanded and interpreted, such as a new line or a tabulation.
To achieve that we could use the quoting form $'string'.
The word expands to string, with backslash-escaped characters replaced as specified by the ANSI C standard.
Example without and with :

$ echo 'foo\n'
foo\n
$ echo $'foo\n'
foo
 
$

echo -e flag
An alternative way is using echo with the -e flag that means « enable interpretation of backslash escapes ».
By default the reverse : -E is used with echo.
Example :

echo -e "foo\nbar\nfoobar"
foo
bar
foobar

Translating or deleting chars

tr command.
It translates, squeezes, and/or deletes characters from standard input, writing to standard output.
Syntax :
tr [OPTION]... SET1 [SET2]

Flags :
-d, --delete
delete characters in SET1, do not translate

-s, --squeeze-repeats
replace each sequence of a repeated character that is listed in the last specified SET, with a single occurrence of that character
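Small examples of the three tr modes (translate, delete, squeeze) :

```shell
echo 'abc' | tr 'a-z' 'A-Z'       # translate: ABC
echo 'hello world' | tr -d 'lo'   # delete every l and o: "he wrd"
echo 'aabbbcc' | tr -s 'b'        # squeeze repeated b: "aabcc"
```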

Variables

* Define a variable (with a value or not) in the current process :
myVar=toto

* Define a variable (with a value or not) in the current process and all its children :
export myVar=toto
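A minimal sketch of the difference : a plain variable is invisible to child processes, while an exported one is inherited :

```shell
myVar=toto
bash -c 'echo "child sees: [$myVar]"'   # [] : not inherited
export myVar
bash -c 'echo "child sees: [$myVar]"'   # [toto] : inherited after export
```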

* Sort a tabular output on a specific column : fooInput | sort
To numerical sort on the 4th column : df -h | sort -n -k 4

Regex syntax

Shell pattern matching

find and locate commands can compare file names, or parts of file names, to shell patterns.
A shell pattern is a string that may be raw, such as « foobar », which matches exactly « foobar ».
But it may also contain wildcards.
Patterns with  wildcards have to be enclosed with quotes to prevent the shell from expanding them itself.

Wildcards :
*
Matches any zero or more characters.

?
Matches any one character.

[string]
Matches exactly one character that is a member of the string. This is called a character class.
It may also :
– contain ranges such as [a-z0-9_] that matches a lowercase letter, a number, or an underscore.
– negate the class by placing a ‘!’ or ‘^’ immediately after the opening bracket, such as [^A-Z@] which matches any character except an uppercase letter or an @ sign.

\
Removes the special meaning of the character that follows it. This works even in character classes.
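A short sketch with find (hypothetical /tmp paths); note the quotes around each pattern so the shell doesn’t expand it first :

```shell
rm -rf /tmp/patdemo && mkdir -p /tmp/patdemo && cd /tmp/patdemo
touch note1.txt note2.txt notes.md
find . -name '*.txt'        # note1.txt and note2.txt
find . -name 'note?.txt'    # same: ? matches exactly one character
find . -name 'note[0-9].*'  # same: the class matches one digit
```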

Basic vs Extended Regular Expressions

Basic syntax is the default in sed and in grep.
To use the Extended one, we specify the -E flag.

In basic regular expressions the meta-characters :
?, +, {, |, (, and ) lose their special meaning.
We need to use the backslashed versions to get it : \?, \+, \{, \|, \(, and \)
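A quick check of the two syntaxes with grep :

```shell
echo 'color colour' | grep -o 'colou\?r'   # basic: \? makes ? special -> both words match
echo 'color colour' | grep -oE 'colou?r'   # extended (-E): a plain ? works
echo 'colou?r' | grep -o 'colou?r'         # basic: here ? is a literal character
```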
