Notes on Python

Installing a different python 2.7
sudo yum install python27
rpm -ql python2.7-stack-2.7.12-1.9.g680e0d1.x86_64

sudo yum -y install gcc gcc-c++
sudo yum -y install python-devel libffi-devel openssl-devel
/opt/python-2.7/bin/python -m ensurepip --default-pip
/opt/python-2.7/bin/python  -m pip install  requests[security] --user
pip install --upgrade pip

python2.7 -m pip install pudb
python2.7 -m

print "web_response=", web_response    # Python 2 print statement

Notes on Shell Programming

Special Parameters


$#  The number of arguments passed to the program, or the number of parameters set by executing the set statement


$*  Collectively references all the positional parameters as $1, $2, ...


$@  Same as $*, except when double-quoted ("$@") it collectively references all the positional parameters as "$1", "$2", ...


$0  The name of the program being executed


$$  The process id number of the program being executed

grep -v "$name" phonebook > /tmp/phonebook$$


$!  The process id number of the last program sent to the background for execution


$?  The exit status of the last command not executed in the background


$-  The current option flags in effect
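A quick sketch to make these parameters concrete (the function name show is made up for illustration):

```shell
#!/bin/sh
# Inside a function, $#, $*, and $1... refer to the function's own arguments.
show() {
    echo "count=$#"
    echo "all=$*"
}
show one two three          # count=3 / all=one two three

echo "pid=$$"               # process id of this shell
sleep 1 &                   # send a command to the background
echo "background pid=$!"    # process id of the sleep, via $!
wait
```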

Other Variables Used by the Shell


CDPATH  The directories to be searched whenever cd is executed without a full path as argument.


ENV  The name of a file that the shell executes in the current environment when started interactively.


FCEDIT  The editor used by fc. If not set, ed is used.


HISTFILE  If set, it specifies a file to be used to store the command history. If not set or if the file isn't writable, $HOME/.sh_history is used.


HISTSIZE  If set, specifies the number of previously entered commands accessible for editing.



IFS  The Internal Field Separator characters; used by the shell to delimit words when parsing the command line, for the read and set commands, when substituting the output from a back-quoted command, and when performing parameter substitution. Normally, it contains the three characters space, horizontal tab, and newline.



PPID  The process id number of the program that invoked this shell (that is, the parent process).

PS1: The primary command prompt, normally "$ ".

PS2: The secondary command prompt, normally "> ".


Parameter Substitution

$parameter or ${parameter}

To access the tenth and greater arguments, you must write ${10}.


The shift command: whatever was previously stored inside $2 will be assigned to $1, and the old value of $1 will be irretrievably lost. When this command is executed, $# (the number-of-arguments variable) is also automatically decremented by one.

You can shift more than one "place" at once by writing a count immediately after shift, as in

shift 3
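A minimal sketch of shift at work (the argument values are made up; set -- is used here just to give the shell some positional parameters):

```shell
#!/bin/sh
# Give this shell three positional parameters, then shift them away.
set -- alpha beta gamma
echo "before: \$1=$1 \$#=$#"    # before: $1=alpha $#=3
shift
echo "after:  \$1=$1 \$#=$#"    # after:  $1=beta $#=2
shift 2                         # shift two places at once
echo "now:    \$#=$#"           # now:    $#=0
```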


${parameter:-value}  Substitute the value of parameter if it's set and non-null; otherwise, substitute value.


${parameter-value}  Substitute the value of parameter if it's set; otherwise, substitute value.


${parameter:=value}  Substitute the value of parameter if it's set and non-null; otherwise, substitute value and also assign it to parameter.


${parameter=value}  Substitute the value of parameter if it's set; otherwise, substitute value and also assign it to parameter.


${parameter:?value}  Substitute the value of parameter if it's set and non-null; otherwise, write value to standard error and exit. If value is omitted, write "parameter: parameter null or not set" instead.


${parameter?value}  Substitute the value of parameter if it's set; otherwise, write value to standard error and exit. If value is omitted, write "parameter: parameter null or not set" instead.


${parameter:+value}  Substitute value if parameter is set and non-null; otherwise, substitute null.


${parameter+value}  Substitute value if parameter is set; otherwise, substitute null.


${#parameter}  Substitute the length of parameter. If parameter is * or @, the result is not specified.
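The substitution forms above can be exercised with an unset and a set variable (the variable names editor and name are made up):

```shell
#!/bin/sh
# Exercise the main substitution forms.
unset editor
echo "${editor:-vi}"        # prints vi; editor stays unset
echo "${editor:=vi}"        # prints vi AND assigns it to editor
echo "$editor"              # vi
name=julio
echo "${name:+set}"         # name is non-null, so prints set
echo "${#name}"             # length of name: 5
```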


${parameter#pattern}  Substitute the value of parameter with pattern removed from the left side. The smallest portion of the contents of parameter matching pattern is removed. Shell filename substitution characters (*, ?, [...], !, and @) may be used in pattern.


${parameter##pattern}  Same as #pattern except the largest matching pattern is removed.


${parameter%pattern}  Same as #pattern except pattern is removed from the right side.


${parameter%%pattern}  Same as ##pattern except the largest matching pattern is removed from the right side.

Quoting

'…' Removes special meaning of all enclosed characters

"…" Removes special meaning of all enclosed characters except $, `, and \


\c  Removes special meaning of character c that follows; inside double quotes removes special meaning of $, `, ", newline, and \ that follows, but is otherwise not interpreted; used for line continuation if it appears as the last character on a line (the newline is removed)

Using the Backslash for Continuing Lines

The Backslash Inside Double Quotes

The backslash inside these quotes removes the meaning of characters that otherwise would be interpreted inside double quotes (that is, other backslashes, dollar signs, back quotes, newlines, and other double quotes). If the backslash precedes any other character inside double quotes, the backslash is ignored by the shell and passed on to the program:

Command Substitution

The Back Quote `command`

The $(...) Construct $(command)


The expr Command (for older Unix/Linux shells)

expr 10 + 20 / 2

Each operator and operand given to expr must be a separate argument.

expr 17 * 6

The shell saw the * and substituted the names of all the files in your directory!

expr 17 \* 6

only use the command substitution mechanism to assign the output from expr back to the variable:

$ i=$(expr $i + 1) Add 1 to i

like the shell's built-in integer arithmetic, expr only evaluates integer arithmetic expressions. You can use awk or bc if you need to do floating point calculations.
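The points above side by side (expr with the escaped *, the modern $((...)) alternative, and awk for floating point):

```shell
#!/bin/sh
# Old style: expr, with the * escaped and spaces around every operator.
i=$(expr 17 \* 6)
echo "$i"                       # 102
# Modern POSIX shells do integer arithmetic natively with $((...)).
i=$((i + 1))
echo "$i"                       # 103
# Floating point needs an external tool such as awk or bc.
awk 'BEGIN { printf "%.2f\n", 10 / 4 }'    # 2.50
```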



if who | grep "^$user " > /dev/null
then
    echo "$user is logged on"
fi


if command
then
    command
    ...
fi

if command1
then
    ...
elif command2
then
    ...
elif commandn
then
    ...
else
    ...
fi



The test Command

test String Operators

Operator Returns TRUE (exit status of 0) if

string1 = string2 (strings are identical), string1 != string2 (strings are not identical), string (string is not null)

-n string

string is not null (and string must be seen by test).

-z string

string is null (and string must be seen by test).

An Alternative Format for test

[ expression ]

spaces must appear after the [ and before the ].

[ "$name" = julio ]

Integer Operators

int1 -eq int2

int1 -ge int2

int1 -gt int2

int1 -le int2

int1 -lt int2

int1 -ne int2

File Operators

-d file file is a directory.

-e file file exists.

-f file file is an ordinary file.

-r file file is readable by the process.

-s file file has nonzero length.

-w file file is writable by the process.

-x file file is executable.

-L file file is a symbolic link.

The Logical Negation Operator !

The Logical AND Operator –a

The Logical OR Operator –o

[ -f "$mailfile" -a -r "$mailfile" ]


Use parentheses in a test expression to alter the order of evaluation; and make sure that the parentheses are quoted because they have a special meaning to the shell.

[ \( "$count" -ge 0 \) -a \( "$count" -lt 10 \) ]
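The string, integer, and file operators above combined in one small sketch (the variable values are made up):

```shell
#!/bin/sh
# String, integer, and file operators combined; quote variables
# so empty values don't break the test syntax.
count=5
name=""
if [ \( "$count" -ge 0 \) -a \( "$count" -lt 10 \) ]
then
    echo "count is a single digit"
fi
[ -z "$name" ] && echo "name is null"
[ -d /tmp ] && echo "/tmp is a directory"
```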

The exit Command

The case Command

hour=$(date +%H)

case "$hour" in
    0? | 1[01] ) echo "Good morning";;
    1[2-7] )     echo "Good afternoon";;
    * )          echo "Good evening";;
esac


The -x Option for Debugging Programs

You can trace the execution of any program by typing sh -x followed by the name of the program.

The Null Command :

The && and || constructs implement shortcut logic

enable you to execute a command based on whether the preceding command succeeds or fails.

sort bigdata > /tmp/sortout && mv /tmp/sortout bigdata

[ -z "$EDITOR" ] && EDITOR=/bin/ed

grep "$name" phonebook || echo "Couldn't find $name"

The for Command

for i in 1 2 3
do
    echo $i
done

The $@ Variable

Whereas the shell replaces the value of $* with $1, $2, ..., if you instead use the special shell variable "$@" it will be replaced with "$1", "$2", ... .

The for Without the List

for var
do
    command
    ...
done

Shell automatically sequences through all the arguments typed on the command line, just as if you had written

for var in "$@"
do
    command
    ...
done

The while Command

The while loop is often used in conjunction with the shift command to process a variable number of arguments typed on the command line.

while [ "$#" -ne 0 ]
do
    echo "$1"
    shift
done

The until Command

until who | grep "^$user " > /dev/null
do
    sleep 60
done

When the break is executed, control is sent immediately out of the loop, where execution then continues as normal with the command that follows the done.

break n the n innermost loops are immediately exited.

continue n causes the remaining commands in the innermost n loops to be skipped, but execution of the loops then continues as normal.
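break and continue side by side (the loop values are made up):

```shell
#!/bin/sh
# continue skips the rest of the current iteration; break leaves the loop.
for i in 1 2 3 4 5
do
    if [ "$i" -eq 2 ]
    then
        continue            # skip printing 2
    fi
    if [ "$i" -eq 4 ]
    then
        break               # leave the loop before printing 4
    fi
    echo "$i"
done                        # prints 1 and 3
```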

Executing a Loop in the Background

An entire loop can be sent to the background for execution simply by placing an ampersand after the done.

for file in memo[1-4]
do
    run $file
done &

I/O Redirection on a Loop

Input/output redirected into the loop applies to all commands in the loop that read their data from standard input or write to standard output. You can also redirect the standard error output from a loop, simply by tacking on a 2> file after the done.

You can override redirection of the entire loop's input or output by explicitly redirecting the input and/or output of commands inside the loop. To force input or output of a command to come from or go to the terminal, use the fact that /dev/tty always refers to your terminal.

for file
do
    echo "Processing file $file" > /dev/tty
    ...
done > output

Piping Data Into and Out of a Loop

A command's output can be piped into a loop, and the entire output from a loop can be piped into another command in the expected manner.

for i in 1 2 3 4
do
    echo $i
done | wc -l

Typing a Loop on One Line

for i in 1 2 3 4; do echo $i; done

if commands can also be typed on the same line using a similar format:

if [ 1 = 1 ]; then echo yes; fi

The getopts Command

getopts options variable

The getopts command is designed to be executed inside a loop. Each time through the loop, getopts examines the next command line argument and determines whether it is a valid option. This determination is made by checking to see whether the argument begins with a minus sign and is followed by any single letter contained inside options. If it does, getopts stores the matching option letter inside the specified variable and returns a zero exit status.

If the letter that follows the minus sign is not listed in options, getopts stores a question mark inside variable before returning with a zero exit status. It also writes an error message to standard error.

If no more arguments are left on the command line or if the next argument doesn't begin with a minus sign, getopts returns a nonzero exit status.

To indicate to getopts that an option takes a following argument, you write a colon character after the option letter on the getopts command line.

getopts mt: option

If getopts doesn't find an argument after an option that requires one, it stores a question mark inside the specified variable and writes an error message to standard error. Otherwise, it stores the actual argument inside a special variable called OPTARG.

Another special variable called OPTIND is initially set to one and is updated each time getopts returns to reflect the number of the next command-line argument to be processed.

while getopts mt: option
do
    case "$option" in
        m)  mailopt=TRUE;;
        t)  interval=$OPTARG;;
        \?) echo "Usage: mon [-m] [-t n] user"
            exit 1;;
    esac
done

shiftcount=$((OPTIND - 1))
shift $shiftcount


Reading and Printing Data

The read Command read variables

You can use the -r option to read to prevent the shell from interpreting the backslash character.

while read -r line

Special echo Escape Characters \c

\c tells echo to leave the cursor right where it is after displaying the last argument and not to go to the next line.

The printf Command

printf "format" arg1 arg2 ...

printf Conversion Specification Modifiers

- Left justify value.

+ Precede integer with + or -.

(space) Precede positive integer with space character.

# Precede octal integer with 0, hexadecimal integer with 0x or 0X.

Width Minimum width of field; * means use next argument as width.


Precision  Minimum number of digits to display for integers; maximum number of characters to display for strings; * means use next argument as precision.
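The modifiers above in action:

```shell
#!/bin/sh
# Width, justification, sign, base prefix, and precision modifiers.
printf "%-10s|%5d|\n" hello 42      # hello     |   42|
printf "%+d %+d\n" 5 -5             # +5 -5
printf "%#o %#x\n" 8 255            # 010 0xff
printf "%.3d\n" 7                   # 007
```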

More on Subshells

A subshell can't change the value of a variable in a parent shell, nor can it change its current directory.

The . Command

Its purpose is to execute the contents of file in the current shell; a subshell is not spawned to execute the program.

The (...) and { ...; } Constructs

They are used to group a set of commands together.

The first form causes the commands to be executed by a subshell, the latter form by the current shell.

For { }, if the commands enclosed in the braces are all to be typed on the same line, a space must follow the left brace, and a semicolon must appear after the last command.
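The difference between the two forms (the variable x is made up for illustration):

```shell
#!/bin/sh
# (...) runs in a subshell: variable and directory changes are lost.
x=outer
( x=inner; cd /tmp )
echo "$x"                   # outer
# { ...; } runs in the current shell: the change persists.
{ x=inner; }
echo "$x"                   # inner
```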

.profile File

/etc/profile and $HOME/.profile

I/O Redirection

< file, > file

>| file

Redirect standard output to file; file is created if it doesn't exist and zeroed if it does; the noclobber (-C) option to set is ignored.

>> file

Like >, only output is appended to file if it already exists.

Redirect the standard output for a command to standard error by writing

command >& 2

Collect the standard output and the standard error output from a program into the same file.

command >foo 2>>foo or command >foo 2>&1

Because the shell evaluates redirection from left to right on the command line, the last example cannot be written: command 2>&1 > foo

Inline Input Redirection

command <<word


the shell uses the lines that follow as the standard input for command, until a line that contains just word is found.

The shell performs parameter substitution for the redirected input data, executes back-quoted commands, and recognizes the backslash character. However, any other special characters, such as *, |, and ", are ignored. If you have dollar signs, back quotes, or backslashes in these lines that you don't want interpreted by the shell, you can precede them with a backslash character. Alternatively, if you want the shell to leave the input lines completely untouched, you can precede the word that follows the << with a backslash.

If the first character that follows the << is a dash (-), leading tab characters in the input will be removed by the shell.
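A sketch of the substitution rules above (the variable name is made up):

```shell
#!/bin/sh
# Parameter substitution happens inside an unquoted heredoc...
name=julio
cat <<END
hello $name
END
# ...but not when the word after << is escaped (or quoted).
cat <<\END
hello $name
END
```

The first cat prints "hello julio"; the second prints the line literally.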

<& digit Standard input is redirected from the file associated with file descriptor digit.

>& digit Standard output is redirected to the file associated with file descriptor digit.

<&- Standard input is closed.

>&- Standard output is closed.

<> file Open file for both reading and writing.

Shell Archives

One or more related shell programs can be put into a single file and then shipped to someone.

cat > program_name <<\THE-END-OF-DATA


The exec Command

Because the exec'ed program replaces the current one, there's one less process hanging around; also, startup time of an exec'ed program is quicker, due to the way the Unix system executes processes.

exec can be used to close standard input and reopen it with any file that you want to read. To change standard input to file:

exec < file

Redirection of standard output is done similarly: exec > report

If you use exec to reassign standard input and later want to reassign it someplace else, you can simply execute another exec. To reassign standard input back to the terminal, you would write

exec < /dev/tty

The set Command

General Format: set options args

This command is used to turn on or off options as specified by options. It is also used to set positional parameters, as specified by args.

Each single letter option in options is enabled if the option is preceded by a minus sign (-), or disabled if preceded by a plus sign (+).

set Options


-- option tells set not to interpret any subsequent arguments on the command line as options. It also prevents set from displaying all your variables if no other arguments follow.


-a  Automatically export all variables that are subsequently defined or modified.


-v  Print each shell command line as it is read.


-x  Print each command and its arguments as it is executed.

set with No Arguments

Using set to Reassign Positional Parameters

There is no way to directly assign a value to a positional parameter;

These parameters are initially set on execution of the shell program. The only way they may be changed is with the shift or the set commands. If words are given as arguments to set on the command line, those words will be assigned to the positional parameters $1, $2, and so forth. The previous values stored in the positional parameters will be lost forever. So

set a b c

assigns a to $1, b to $2, and c to $3. $# also gets set to 3.

The IFS Variable stands for Internal Field Separator.

echo "$IFS" | od -b
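Setting IFS for a single read makes it split on something other than whitespace (the sample line is made up):

```shell
#!/bin/sh
# Set IFS for the read command so it splits on colons, not whitespace.
echo "root:x:0:0" | while IFS=: read -r user pass uid gid
do
    echo "user=$user uid=$uid"      # user=root uid=0
done
```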

The readonly Command

readonly -p gets a list of your read-only variables

The unset Command

unset x Remove x from the environment

This causes the shell to erase definitions of the variables or functions listed in names. Read-only variables cannot be unset. The -v option to unset specifies that a variable name follows, whereas the -f option specifies a function name. If neither option is used, it is assumed that variable name(s) follow.

The eval Command

eval command-line

the net effect is that the shell scans the command line twice before executing it.


pipe="|"
ls $pipe wc -l

The shell takes care of pipes and I/O redirection before variable substitution, so it never recognizes the pipe symbol inside pipe.

eval ls $pipe wc -l

The first time the shell scans the command line, it substitutes | as the value of pipe. Then eval causes it to rescan the line, at which point the | is recognized by the shell as the pipe symbol.

Get the last parameter: eval echo \$$#
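Both tricks above in one runnable sketch (/tmp and the parameter values are illustrative):

```shell
#!/bin/sh
# Without eval, the | stored in $pipe reaches ls as a literal argument.
pipe="|"
eval ls /tmp $pipe wc -l    # eval rescans the line, so | becomes a pipe
# The same double scan fetches the last positional parameter:
set -- a b c
eval echo \$$#              # c
```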

The wait Command

wait process-id

where process-id is the process id number of the process you want to wait for. If omitted, the shell waits for all child processes to complete execution.

The trap Command

trap commands signals

Commonly Used Signal Numbers

0 Exit from the shell

1 Hangup

2 Interrupt (for example, Delete, Ctrl+c key)

15 Software termination signal (sent by kill by default)

trap "rm $WORKDIR/work1$$ $WORKDIR/dataout$$; exit" 1 2

The value of WORKDIR and $$ will be substituted at the time that the trap command is executed. If you wanted this substitution to occur at the time that either signal 1 or 2 was received (for example, WORKDIR may not have been defined yet), you can put the commands inside single quotes:

trap 'echo logged off at $(date) >>$HOME/logoffs' 0

trap 'rm $WORKDIR/work1$$ $WORKDIR/dataout$$; exit' 1 2

trap with No Arguments

Executing trap with no arguments results in the display of any traps that you have changed.

Ignoring Signals

If the command listed for trap is null, the specified signal will be ignored when received.

trap "" 2

If you ignore a signal, all subshells also ignore that signal. However, if you specify an action to be taken on receipt of a signal, all subshells will still take the default action on receipt of that signal.

If you execute trap : 2

and then execute your subshells, then on receiving the interrupt signal the current shell will do nothing (it will execute the null command), but all active subshells will be terminated (they will take the default action—termination).

Resetting Traps

After you've changed the default action to be taken on receipt of a signal, you can change it back again with trap if you simply omit the first argument;

trap 1 2


name () { command; ... command; }

Removing a Function Definition

To remove the definition of a function from the shell, you use the unset command with the -f option.

unset -f nu
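The full cycle of defining, calling, and removing a function (the name greet is made up):

```shell
#!/bin/sh
# Define, call, and remove a shell function.
greet() { echo "hello, $1"; }
greet world                 # hello, world
unset -f greet              # greet is no longer defined
```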


Unix Shell Programming, Third Edition

Small Tips on Notepad++

Notepad++ is a great free source code editor and Notepad replacement, which supports several programming languages.

Useful Notepad++ keyboard shortcuts
Ctrl-O      Open
Ctrl-Z      Undo
Ctrl-Y      Redo
Ctrl-D      Duplicate Current Line
Ctrl-L      Delete Current Line
F3          Find Next
Shift-F3    Find Previous
Ctrl-G      Launch GoToLine Dialog
Ctrl-W      Close Current Document
F11         Toggle Full Screen Mode
Ctrl-Tab    Next Document
Ctrl-Shift-Tab        Previous Document
Ctrl-Shift-Up         Move Current Line Up
Ctrl-Shift-Down       Move Current Line Down
Ctrl-Shift-BackSpace  Delete to start of line
Ctrl-Shift-Delete     Delete to end of line
Ctrl-U      Convert to lower case
Ctrl-Shift-U          Convert to UPPER CASE
Shift-Tab (selection of several lines)  Remove Tabulation or Space (outdent)

Formatting xml
1. Select the xml code
2. TextFX -> HTML Tidy -> Tidy: Reindent XML
Delete the first word/space/line numbers
TextFX > TextFX Tools > Delete Line Numbers or First word

Remove the trailing spaces
TextFX > TextFX Edit > Trim Trailing Spaces

Remove Blank Lines
TextFX > TextFX Edit > Delete Blank Lines
TextFX > TextFX Edit > Delete Surplus Blank Lines
Install a Notepad++ plugin
    Download the plugin archive and extract it; you will find .dll files. Copy them to the notepad++/plugins/ directory and restart Notepad++.

Useful notepad++ plugins
Notepad++ already includes:
Mime Tools   Base64 encode/decode,
TextFX          provides many practical functions such as upper/lower case, escape/unescape, auto-close XML,
nppExport     Export file to RTF/HTML format.

Explorer Plugin – a file browser
Windows Manager - helps switch from one open file to another.

Multi Clipboard - add up to ten text phrases to the clipboard.
Compare Plugin - show the difference between 2 files.
XML Tools - XML Checker. Can auto-close tags and provides other useful features.

Customize Notepad++
    In this article, the author introduces a way to port ANY Textmate theme over to Notepad++.


Useful Linux/Unix Commands

find [pathnames] [conditions]
Conditions may be grouped by enclosing them in \( \) (escaped parentheses), negated with ! (use \! in the C shell), given as alternatives by separating them with -o, or repeated (adding restrictions to the match; usually only for -name, -type, -perm).
Conditions and actions
-atime(ctime,mtime) +n | -n | n, -amin(cmin,mmin)+n | -n | n
Find files last accessed (atime), changed (ctime), or modified (mtime) more than n (+n), less than n (-n), or exactly n days ago; the -amin/-cmin/-mmin forms count minutes instead.
-anewer file, -cnewer file
Find files that were accessed (-anewer) or changed (-cnewer) after file was last modified.
-exec command { } \;
Run the Linux command, from the starting directory, on each file matched by find. When command runs, the argument { } substitutes the current file. Follow the entire sequence with an escaped semicolon (\;).
-maxdepth num Do not descend more than num levels of directories.
-mindepth num Begin applying tests and actions only at levels deeper than num levels.
-user user,-group gname
-name pattern
Find files whose names match pattern. Filename metacharacters may be used but should be escaped or quoted.
-newer file
-ok command { }\;
Same as -exec but prompts user to respond with y before command is executed.
-perm nnn
-print Print the matching files and directories, using their full pathnames. Return true.
-size n[c] Find files containing n blocks, or if c is specified, n characters long.
-type c
c can be b (block special file), c (character special file), d (directory), p (fifo or named pipe), l (symbolic link), s (socket), or f (plain file).
-false Return false value for each file encountered.
-ilname(iname,ipath,iregex) pattern: A case-insensitive version of -lname,-name,-path,-regex.
-lname pattern
Search for files that are symbolic links, pointing to files named pattern.
-nouser The file's user ID does not correspond to any user.
-nogroup The file's group ID does not correspond to any group.
-path pattern
find files whose names match pattern. Expect full pathnames relative to the starting pathname.
find / -type d -name 'man*' -user ann -print
Find and compress files whose names don't end with .gz:
gzip `find . \! -name '*.gz' -print`
Search the system for files that were modified within the last two days
find / -mtime -2 -print
Recursively grep for a pattern down a directory tree:
find /book -print | xargs grep '[Nn]utshell'
Remove all empty files under current directory (prompting first):
find . -size 0 -ok rm {} \;
find . -name "*.txt" -ok mv {} junkdir \;
find / -name core | xargs /bin/rm -f
find / -name core -exec /bin/rm -f '{}' \; # same thing
find / -name core -delete # same if using Gnu find
#find all files newer than FILE, and delete
find /dir -type f -newer /path/to/FILE -exec rm \{\} \;
#find all files newer than 1 day, and tar them to file.tar
find /dir -type f -mtime -1 | tar -c -T - -f file.tar
Find all old string and replace them with new value in all files
find . -name '*.java' -exec sed -i 's/net.jcip.examples/concurrency/g' '{}' \;
find . -type f \( -name "*.sh" -o -name "*.java" \)
find / -type f -mtime -7 | xargs tar -rf weekly_incremental.tar
show only directory:
ls -al /root | grep '^d'
find /root -type d
find ./ -name "f[Oo][Oo]" -print
find . -name '*.log' -exec grep -Hn 'ERROR' {} \;
Find all the hidden files
find . -type f -name ".*"
Finding the Top 5 Big Files
find . -type f -exec ls -s {} \; | sort -n -r | head -5
find . -name \*.log -exec tail {} \;

Secure Copy with scp
scp name-of-source name-of-destination
Format of name-of-source and name-of-destination:
Field                        Default for local host          Default for remote host
Username (followed by @)     invoking user's username        same
Hostname (followed by :)     localhost                       localhost
Port number (preceded by #)  22                              22
Directory path               current (invoking) directory    username's remote home directory
Handling of Wildcards
scp for SSH1 and OpenSSH has no special support for wildcards in filenames. It simply lets the shell expand them:
scp *.txt
Wildcards in remote file specifications are evaluated on the local machine, not the remote.
scp1 *.txt .    Bad idea!
Always escape your wildcards so they are explicitly ignored by the shell and passed to scp1:
scp1\*.txt .
scp2 does its own regular expression matching after shell-wildcard expansion is complete.
-r Recursive Copy of Directories
scp -r /usr/local/bin
A clever way to recursively copy directories:
tar cf - /usr/local/bin | ssh tar xf -
-p Preserving Permissions
-u Automatic Removal of Original File
Safety Features(Only available on scp2)
-d option ensures that name-of-destination is a directory.
-n option instructs the program to describe its actions but not perform any copying.

ftp [options] [hostname]
-d Enable debugging.
-g Disable filename globbing.
-i Turn off interactive prompting.
-v Verbose. Show all responses from remote server.
![command [args]]
Invoke an interactive shell on the local machine.
append local-file [remote-file]
Append a local file to a file on the remote machine.
ascii, binary
cd remote-directory
cdup Change working directory of remote machine to its parent directory.
chmod [mode] [remote-file]
delete remote-file
get remote-file [local-file]
glob  Toggle filename expansion for mdelete, mget, and mput. If globbing is turned off, the filename arguments are taken literally and not expanded.
lcd [directory] Change working directory on local machine.
ls [remote-directory]
mkdir remote-directory-name
mget remote-files
mput [local-files]
mdelete remote-files
newer remote-file [local-file] Get file if remote file is newer than local file.
open host [port]
put local-file [remote-file]
pwd Print name of the current working directory on the remote machine.
reget remote-file [local-file]
Retrieve a file (like get), but restart at the end of local-file. Useful for restarting a dropped transfer.
rename [from] [to]
Rename file from on remote machine to to.
rmdir remote-directory-name
Delete a directory on the remote machine.
send local-file [remote-file] Synonym for put.
system Show type of operating system running on remote machine.

Compress and decompress
tar [options] [tarfile] [other-files]
-c, --create Create a new archive.
-d, --compare Compare the files stored in tarfile with other-files.
-r, --append Append other-files to the end of an existing archive.
-t, --list Print the names of other-files if they are stored on the archive.
-u, --update Add files if not in the archive or if modified.
-x, --extract, --get Extract other-files from an archive.
-A, --catenate, --concatenate Concatenate a second tar file on to the end of the first.
--atime-preserve Preserve original access time on extracted files.
--exclude=file Remove file from any list of files.
-k, --keep-old-files
When extracting files, do not overwrite files with similar names. Instead, print an error message.
-m, --modification-time
Do not restore file modification times; update them to the time of extraction.
-p, --same-permissions, --preserve-permissions
Keep permissions of extracted files the same as the originals.

--remove-files Remove originals after inclusion in archive.
-K file, --starting-file file Begin tar operation at file file in archive.
-N date, --after-date date Ignore files older than date.
-O, --to-stdout Print extracted files on standard out.
tar cvf /dev/rmt0 /bin /usr/bin, tar cvf home.tar ~ *.txt, tar tvf home.tar, tar xvf home.tar ~/tmpdir
Create an archive of the current directory and store it in a file backup.tar:
tar cvf - `find . -print` > backup.tar
gzip [options] [files]
-c, --stdout, --to-stdout Print output to standard output, and do not change input files.
-d, --decompress, --uncompress
-f, --force
-r, --recursive
-n, --no-name
-S suffix, --suffix suffix
-t, --test Test compressed file integrity.
gzip *, gunzip *, gzip -r dir
gzip -c file1 > foo.gz
gzip -c file2 >> foo.gz
cat file1 file2 | gzip > foo.gz
gzip -c file1 file2 > foo.gz
gunzip [options] [files]
zcat [options] [files]
Read one or more files that have been compressed with gzip or compress and write them to standard output. zcat is identical to gunzip -c and takes the options -fhLV.

grep [options] pattern [files]
Operator type            Examples          Description
Literal characters       a A y 6 % @       Letters, digits, and many special characters match exactly
(match a character       \$ \^ \+ \\ \?    Precede other special characters with a \ to cancel their regex special meaning
exactly)                 \n \t \r          Literal newline, tab, return
Anchors and assertions   ^                 Starts with
                         $                 Ends with
Character groups         [aAeEiou]         any character listed from [ to ]
(any 1 character         [^aAeEiou]        any character except aAeEio or u
from the group)          [a-fA-F0-9]       any hex character (0 to 9 or a to f)
                         .                 any character at all
Counts (apply to         +                 1 or more ("some") (egrep only)
previous element)        *                 0 or more ("perhaps some")
                         ?                 0 or 1 ("perhaps a")
Alternation              |                 either, or (egrep only)
Grouping                 ( )               group for count and save to variable (egrep only)
-c, --count
Print only a count of matched lines. With -v or --revert-match option, count nonmatching lines.
-e pattern, --regexp=pattern
-f file, --file=file Take a list of patterns from file, one per line.
-h, --no-filename Print matched lines but not filenames (inverse of -l).
-i, --ignore-case
-l, --files-with-matches
List the names of files with matches but not individual matched lines;
-n, --line-number Print lines and their line numbers.
-r, --recursive Recursively read all files under each directory. Same as -d recurse.
-v, --revert-match Print all lines that don't match pattern.
-w, --word-regexp Match on whole words only.
-x, --line-regexp Print lines only if pattern matches the entire line.
-L, --files-without-match List files that contain no matching lines.
List the number of users who use bash:
grep -c /bin/bash /etc/passwd
List header files that have at least one #include directive:
grep -l '^#include' /usr/include/*
List files that don't contain pattern:
grep -c pattern files | grep :0
egrep [options] [regexp] [files]
egrep -n '(old|new)\.doc?' files
find . | grep freeze.*\.log
xargs [options] [command]
Execute command (with any initial arguments), but read remaining arguments from standard input instead of specifying them directly. xargs allows command to process more arguments than it could normally handle at once.
xargs' power lies in the fact that it can take the output of one command (such as ls or find) and use that output as arguments to another command.
-a Read from file instead of stdin
-d Custom delimiters
-i[string], --replace[=string]
Replace all occurrences of {} (or string, if given) in the command with the names read in on standard input. Unquoted blanks are not considered argument terminators.
-l[lines], --max-lines[=lines], -L lines
Allow no more than 1 (or lines) nonblank input lines per command line. Using this we can concatenate n lines into one.
-n args, --max-args=args
Allow no more than args arguments on the command line. May be overridden by -s.
-p, --interactive Prompt for confirmation before running each command line. Implies -t.
-P max, --max-procs=max
Allow no more than max processes to run at once. The default is 1. A maximum of 0 allows as many as possible to run at once.
-r, --no-run-if-empty
Tell xargs when to quit: do not run command if standard input contains only blanks.
-s max, --max-chars=max Allow no more than max characters per command line.
-t, --verbose Verbose mode. Print command line on standard error before executing.
-x, --exit If the maximum size (as specified by -s) is exceeded, exit.
find / -print | xargs grep pattern > out &
Run diff on file pairs (e.g., f1.a and f1.b, f2.a and f2.b ...):
echo * | xargs -t -n2 diff
Display file, one word per line (same as deroff -w):
cat file | xargs -n1
ls olddir | xargs -i -t mv olddir/{} newdir/{}
find ~ -type f -mtime +1825 | xargs -r ls -l
Lists all directories
find . -maxdepth 1 -type d -print | xargs echo
find . -maxdepth 1 -type d -print | xargs -t -I {} echo {}
xargs -a foo -d, -L 1 echo
echo "foo,bar,baz" | xargs -d, -L 1 echo
ls | xargs -L 4 echo, ls | xargs -l4 echo
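A quick sketch of the difference between -n (max arguments) and -L (max input lines), using echo so the grouping is visible:

```shell
# -n 2: at most two ARGUMENTS per echo invocation
printf 'a b\nc d\ne f\n' | xargs -n 2 echo   # prints "a b", "c d", "e f" on separate lines

# -L 2: at most two INPUT LINES per echo invocation
printf 'a b\nc d\ne f\n' | xargs -L 2 echo   # prints "a b c d", then "e f"
```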
wget [options] [URL...]
-b, --background Go to background immediately after startup. If no log file is specified via -o, output is redirected to wget-log.
Logging and Input File Options
-a logfile, --append-output=logfile
-d, --debug
-i file, --input-file=file
Read URLs from file, in which case no URLs need to be on the command line.
-F, --force-html
When input is read from a file, force it to be treated as an HTML file. This enables you to retrieve relative links from existing HTML files on your local disk, by adding <base href="url"> to the HTML, or using the --base command-line option.
-B URL, --base=URL
When used in conjunction with -F, prepends URL to relative links in the file specified by -i.
Download Options
-t number, --tries=number
-O file,--output-document=file
-c, --continue Continue getting a partially-downloaded file.
-T seconds, --timeout=seconds
Directory Options
-nd, --no-directories
Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering.
-x, --force-directories
The opposite of -nd: create a hierarchy of directories.
-nH, --no-host-directories Disable generation of host-prefixed directories.
-P prefix, --directory-prefix=prefix
HTTP Options
--http-user=user, --http-passwd=password
Recursive Retrieval Options
-r,--recursive Turn on recursive retrieving.
-l depth, --level=depth Specify recursion maximum depth level depth.
-k, --convert-links
After the download is complete, convert the links in the document to make them suitable for local viewing.
-K, --backup-converted
When converting a file, back up the original version with a .orig suffix.
-p, --page-requisites
This option causes Wget to download all the files that are necessary to properly display a given HTML page.
Recursive Accept/Reject Options
-A acclist --accept acclist
-R rejlist --reject rejlist
Specify comma-separated lists of file name suffixes or patterns to accept or reject.
wget -c
wget --convert-links -r -p -o log
wget -p --convert-links
Download multiple files on command line using wget
wget -cb -o /tmp/download.log -i /tmp/download.txt
nohup command [arguments]
The nohup command is used to start a command that will ignore hangup signals and will append stdout and stderr to a file.
TTY output is appended to either nohup.out or $HOME/nohup.out
The nohup command will not execute a pipeline or a command list. You can save a pipeline or list in a file and then run it using the sh (default shell) or the bash command.
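Since nohup takes a single command, a pipeline can also be wrapped in sh -c instead of a script file (the paths here are illustrative):

```shell
# nohup can't run "sort | head" directly, but it can run a shell that does:
nohup sh -c 'sort /etc/passwd | head -5 > /tmp/first5.txt' 2>/dev/null &
wait    # wait for the background job so the output file is complete
cat /tmp/first5.txt
```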
tr [options] [string1 [string2]]
Translate characters—copy standard input to standard output, substituting characters from string1 to string2 or deleting characters in string1.
-c, --complement
Complement characters in string1 with respect to the full octal range 001-377.
-d, --delete
Delete characters in string1 from output.
-s, --squeeze-repeats
Squeeze out repeated output characters in string2.
-t, --truncate-set1
Truncate string1 to the length of string2 before translating.
echo "12345678 9247" | tr 123456789 computerh
cat file | tr '[A-Z]' '[a-z]'
tr "set1" "set2" < input.txt
tr "set1" "set2" < input.txt > output.txt
Turn spaces into newlines (ASCII code 012):
tr ' ' '\012' < file
Strip blank lines from file (by squeezing runs of newlines into one) and save in new.file (or use \011 to change successive tabs into one tab):
cat file | tr -s "\012" > new.file
tr -d : < file > new.file
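The -c and -s flags above can also be combined with -d; for instance, -cd keeps only the characters in set1 by deleting their complement (the phone number is made up):

```shell
# -cd '0-9': delete every character that is NOT a digit (including the newline)
echo "call 555-0123 now" | tr -cd '0-9'   # prints 5550123
echo                                       # restore the newline tr removed

# -s ' ': squeeze runs of spaces down to a single space
echo "too   many    spaces" | tr -s ' '    # prints "too many spaces"
```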
tee [options] files
Accept output from another command and send it both to the standard output and to files.
-a, --append Append to files; do not overwrite.
ls -l | tee savefile
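A small sketch of the -a flag (the log path is made up):

```shell
echo "first"  | tee /tmp/tee_demo.log     # overwrites the file (and echoes to stdout)
echo "second" | tee -a /tmp/tee_demo.log  # -a appends instead of overwriting
cat /tmp/tee_demo.log                     # prints both lines: first, second
```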
su [option] [user] [shell_args]
Create a shell with the effective user-ID user. If no user is specified, create a shell for a privileged user (that is, become a superuser). Enter EOF to terminate.
-, -l, --login
Go through the entire login sequence (i.e., change to user's environment).
-c command, --command=command
Execute command in the new shell and then exit immediately. If command is more than one word, it should be enclosed in quotes
su -c 'find / -name \*.c -print' nobody
-m, -p, --preserve-environment
Do not reset environment variables.
-s shell, --shell=shell
Execute shell, not the shell specified in /etc/passwd, unless shell is restricted.
sudo - execute a command as another user
Spawn a shell for the specified user. The exit command (or CONTROL-D) terminates the spawned shell, returning the user to his former shell and prompt.
-u user Cause sudo to run the specified command as a user other than root. To specify a uid instead of a username, use #uid.
The -- flag indicates that sudo should stop processing command line arguments. It is most useful in conjunction with the -s flag.
sudo shutdown -r +15 "quick reboot"

nl: number lines of files
tac [options] [file] prints files in reverse
Monitoring Programs
ps -efH
The -H parameter organizes the processes in a hierarchical format, showing which processes started which other processes.
Real-time process monitoring: top
Stopping processes:kill, killall
kill `ps -aef | grep dscli | awk '{print $2}'`
Linux Process Signals
Signal Name Description
1 HUP Hang up.
2 INT Interrupt.
3 QUIT Stop running.
9 KILL Unconditionally terminate.
11 SEGV Segment violation.
15 TERM Terminate if possible.
18 CONT Resume execution after STOP or TSTP.
19 STOP Stop unconditionally, but don't terminate.
20 TSTP Stop or pause (terminal stop), but continue to run in background.
The generally accepted procedure is to first try the TERM signal. If the process ignores that, try the INT or HUP signals. If the program recognizes these signals, it’ll try to gracefully stop doing what it was doing before shutting down. The most forceful signal is the KILL signal. When a process receives this signal, it immediately stops running. This can lead to corrupt files.
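The escalation procedure above can be sketched against a throwaway sleep process:

```shell
# Start a disposable target process
sleep 300 &
pid=$!

kill -TERM "$pid"                 # polite request; sleep honors TERM
wait "$pid" 2>/dev/null || true   # reap it so the PID is fully gone

# Had the process ignored TERM, this would force it:
kill -0 "$pid" 2>/dev/null && kill -KILL "$pid"
```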
The killall command
Kill processes by command name. If more than one process is running the specified command, kill all of them. Treat command names that contain a / as files; kill all processes that are executing that file.
-i Prompt for confirmation before killing processes.
Monitoring Disk Space
Mounting media:mount -t type device directory
By default, the mount command displays a list of media devices currently mounted on the system: mount
the types you’re most likely to run into are:
vfat: Windows long filesystem.
ntfs: Windows advanced filesystem used in Windows NT, XP, and Vista.
iso9660: The standard CD-ROM filesystem.
To manually mount the USB memory stick at device /dev/sdb1 at location /media/disk: mount -t vfat /dev/sdb1 /media/disk

The -o option allows you to mount the filesystem with a comma-separated list of additional
options. The popular options to use are:
ro: Mount as read-only.
rw: Mount as read-write.
user: Allow an ordinary user to mount the filesystem.
check=none: Mount the filesystem without performing an integrity check.
loop: Mount a file.
Mount iso file:
mkdir mnt; mount -t iso9660 -o loop 'MEPIS-KDE4-LIVE-DVD 32.iso' mnt
/etc/fstab: List of filesystems to be mounted and options to use when mounting them.
/etc/mtab: List of filesystems that are currently mounted and the options with which they were mounted.
The umount command:umount [directory | device ]
umount /home/rich/mnt
Using the df command
The df command shows each mounted filesystem that contains data.
-h, --human-readable
Print sizes in a format friendly to human readers, with a suffix such as M for megabytes or G for gigabytes.
-m, --megabytes; -k, --kilobytes
Using the du command
The du command shows the disk usage for a specific directory. It displays all of the files, directories, and subdirectories under the current directory, and it shows how many disk blocks each file or directory takes.
-s, --summarize Print only the grand total for each named directory.
-c, --total In addition to normal output, print grand total of all arguments.
-b, -m
-h, --human-readable(same as df)
-x, --one-file-system Skip directories on different file systems.
-X file, --exclude-from=file Exclude files that match any pattern in file.
--exclude=pattern Exclude files that match pattern.
--max-depth=N Print the total for a directory (or file, with --all) only if it is N or fewer levels below the command line argument; --max-depth=0 is the same as --summarize.
--time Show time of the last modification of any file in the directory, or any of its subdirectories.
--time=WORD Show time as WORD instead of modification time: atime, access, use, ctime or status.
--time-style=STYLE Show times using style STYLE: full-iso, long-iso, iso, +FORMAT (FORMAT is interpreted like `date').
'du' - Finding the size of a directory
du -h /var/log
du -s /var/log
This displays a summary of the directory size. It is the simplest way to know the total size of directory.
$ du -S
Display the size of the current directory excluding the size of the subdirectories that exist within that directory.
$ df -h | grep /dev/hda1 | cut -c 41-43
du -ch | grep total
This would have only one line in its output that displays the total size of the current directory including all the subdirectories.
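A common pairing is du -sh with GNU sort -h, which understands the K/M/G suffixes; the demo tree below is made up:

```shell
# Build a tiny demo tree with one clearly larger subdirectory
mkdir -p /tmp/du_demo/big /tmp/du_demo/small
dd if=/dev/zero of=/tmp/du_demo/big/file bs=1024 count=200 2>/dev/null
echo hi > /tmp/du_demo/small/file

# Largest subdirectories first
du -sh /tmp/du_demo/* | sort -rh
```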
strings [options] files
Search each file specified and print any printable character strings found that are at least four characters long.
strings /bin/ls
cpio flags [options]
Copy file archives in from or out to tape or disk, or to another location on the local machine. Each of the three flags -i, -o, or -p accepts different options.
-i, --extract [options] [patterns]
Copy in (extract) from an archive files whose names match selected patterns. Patterns should be quoted or escaped so they are interpreted by cpio, not by the shell. If pattern is omitted, all files are copied in. Existing files are not overwritten by older versions from the archive unless -u is specified.
-o, --create [options]
Copy out to an archive a list of files whose names are given on the standard input.
-p, --pass-through [options] directory
Copy (pass) files to another directory on the same system. Destination pathnames are interpreted relative to the named directory.
Comparison of valid options
Options available to the -i, -o, and -p flags are shown here.
i: bcdf mnrtsuv B SVCEHMR IF
o: 0a c vABL VC HM O F
p: 0a d lm uv L V R
-0, --null
Expect list of filenames to be terminated with null, not newline. This allows files with a newline in their names to be included.
-d, --make-directories Create directories as needed.
-f, --nonmatching
Reverse the sense of copying; copy all files except those that match patterns.
-O file Archive the output to file, which may be a file on another machine.
-t, --list
Print a table of contents of the input (create no files). When used with the -v option, resembles output of ls -l.
-u, --unconditional Unconditional copy; old files can overwrite new ones.
-v, --verbose Print a list of filenames processed.
ls | cpio -ov > directory.cpio
The `-o' option creates the archive, and the `-v' option prints the names of the files archived as they are added. Notice that the options can be put together after a single `-' or can be placed separately on the command line. The `>' redirects the cpio output to the file `directory.cpio'.
cpio -idv < tree.cpio
find . -print -depth | cpio -ov > tree.cpio
find . -depth -print0 | cpio --null -pvd new-dir
Some new options are the `-print0' available with GNU find, combined with the `--null' option of cpio. These two options act together to send file names between find and cpio, even if special characters are embedded in the file names. Another is `-p', which tells cpio to pass the files it finds to the directory `new-dir'.
find . -name "*.old" -print | cpio -ocBv > /dev/rst8
Generate a list of files whose names end in .old using find; use list as input to cpio.
cpio -icdv "*save*" < /dev/rst8
Restore from a tape drive all files whose names contain save (subdirectories are created if needed).
Move a directory tree: find . -depth -print | cpio -padm /mydir
In copy-pass mode, cpio copies files from one directory tree to another, combining the copy-out and copy-in steps without actually using an archive. It reads the list of files to copy from the standard input; the directory into which it will copy them is given as a non-option argument.
Open/Extract ISO files
mkdir /mnt/iso; mount -o loop *.iso /mnt/iso
top updates its display regularly, every three seconds by default.
You can terminate processes by command name or by PID. When you terminate a process
by name, you save yourself the hassle of looking up the PID, but if there is more than one
process of the same name running, you will kill them all.

To kill by command name: $ killall xclock
If the process doesn't terminate within a few seconds, add the -KILL argument: $ killall -KILL xclock
kill -l shows the signal names and numbers.
starting a process with a lower (or higher) priority than normal?
The nice command starts a process with a lower-than-normal priority. The priority can be reduced by any value from 1 to 19 using the -n argument; without -n, the priority is reduced by a value of 10.
$ nice -n 15 xboard
To raise the priority of a process, you must be root; supply a negative priority adjustment
between -1 (slight boost in priority over normal) and -20 (highest priority):
# nice -n -12 xboard
changing the priority of an existing process
renice is the tool for this:
# renice 2 27365; # renice -5 27365
Adding and managing users from the command line
For users, there are useradd, usermod, and userdel; for groups, there are groupadd, groupmod, and groupdel.
The express way to add a user is to use useradd and then set the new user's password using passwd: # useradd jane; # passwd jane
usermod is used to adjust the parameters of existing accounts.
usermod -c "Jane Lee" jane
the userdel command deletes a user. The -r option specifies that the user's
home directory and mail spool (/var/spool/mail/user) should also be removed: # userdel -r jane
groupadd groupname
The only option commonly used is -g, which lets you manually select the group ID (useful
if converting data from an old system):
# groupadd -g 781 groupname
# groupmod -g 947 groupname
# groupmod -n newname groupname
# groupdel groupname
Check All groups: cat /etc/group
We can directly modify this file to add user to group.
Add an existing user to an existing group
Add existing user tony to the ftp supplementary/secondary group with the usermod command. The -a option adds the user to the supplemental group(s); use it only with the -G option:
# usermod -a -G ftp tony
Change existing user tony primary group to www: # usermod -g www tony
Add a new user to primary group
To add a user tony to group developers use following command:
# useradd -g developers tony; id tony
Add a new user to secondary group
Use useradd command to add new users to existing group (or create a new group and then add user). If group does not exist, create it. Syntax:
useradd -G {group-name} username
Managing user passwords with passwd
passwd; passwd jane
The root user can also delete a password from an account (so a user can log in with just a
username): # passwd -d jane
To find out the password status of an account, use -S: # passwd -S jane
Managing groups and delegating group maintenance from the command line
The gpasswd command can be used to set a group password. This is rarely done. However,
it is also used to manage groups and, better yet, to delegate group administration to any user.
To specify the members of a group, use the -M option: # gpasswd -M jane,richard,frank audit
You can also add or delete individual group users using the -a and -d options:
# gpasswd -a audrey audit
# gpasswd -d frank audit
Delegation is performed with the -A (administrator) option: # gpasswd -A jane audit
Control Access to Files
Using group permissions
The group identity can be changed at any time using the newgrp command, and verified with the id command:newgrp audit;
The current group identity (also called real group ID) affects only the creation of files and directories; existing files and directories keep their existing group, and a user can access files accessible to any group to which she belongs.
chgrp modifies the group ownership of an existing file: chgrp audit report.txt
Using chgrp and newgrp is cumbersome. A much better solution is to use the SGID permission on directories, which automatically sets the group ownership
chgrp soccer game_scores
chmod u=rwx,g=rwxs,o= game_scores
ls -ld game_scores
drwxrws--- 2 richard soccer 4096 Oct 12 19:46 game_scores
Because the SGID permission is set, any file created in that directory is automatically owned
by the group soccer and can be accessed by other group members, exactly what is needed for collaboration within a group. The SGID permission is automatically applied to any directory created within game_scores, too.
Default permissions
This requested permission is limited by the current umask, which is an octal value representing the permissions that should not be granted to new files.
You can set the umask with the shell command of the same name: umask 037
umask by itself displays the current mask value.
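A worked example with the umask above: new files start from mode 666 and new directories from 777, minus the masked bits, so 666 & ~037 = 640 and 777 & ~037 = 740 (the paths are illustrative):

```shell
rm -rf /tmp/umask_file /tmp/umask_dir
(
  umask 037                 # mask out group wx and all of other
  touch /tmp/umask_file     # 666 & ~037 -> 640 (rw-r-----)
  mkdir /tmp/umask_dir      # 777 & ~037 -> 740 (rwxr-----)
)
ls -l  /tmp/umask_file
ls -ld /tmp/umask_dir
```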
changing a file's owner and group at the same time
The chown command permits you to specify a group after the username, separated by a colon.
chown barbara:smilies /tmp/input
Clean file content: : > file (note that echo "" > file leaves a single newline in the file)
System Admin
ulimit - get and set user limits
Syntax: ulimit [-SHacdfilmnpqstuvx] [limit](on ubuntu)
The ulimit utility allows you to limit almost everything in Linux except disk storage (for which purpose quota is used).
Seeing What Is Currently Set: ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 20
file size (blocks, -f) unlimited
pending signals (-i) 16382
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) unlimited
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
ulimit -H -a; ulimit -S -a
Setting a Value:
Setting a process limit:ulimit -u 30
Since neither -H nor -S were specified, the value set is both hard and soft.
Setting Soft Limits: [ulimit -S] ulimit -S -u 100
Setting Hard Limits: ulimit -H
A hard limit really is a rule. Once set, it cannot be exceeded.
The two ways to set the hard limit are to not specify anything (ulimit -u 100), which effectively sets both the hard and soft limits, or to use the -H parameter: ulimit -H -u 100.
A bash forkbomb; any user with shell access to your box could take it down:
$ :(){ :|:& };:
Miscellaneous Commands
* To set a value to unlimited, use the word itself: ulimit -u unlimited .
* To see only one value, specify that parameter. For example, to see the soft value of user processes, enter: ulimit -Su
* Default values are set in /etc/profile but can -- in some implementations -- also be derivatives of values set in /etc/initscript or /etc/security/limits.conf.
Changing the User Max File Descriptor Limit:
* add the following lines to /etc/security/limits.conf
* soft nofile 8192
* hard nofile 8192
* soft limit is the default max value you get when you log in
* hard limit is the max value to which you can raise it using the ulimit command.
sysctl - configure kernel parameters at runtime
sysctl is used to modify kernel parameters at runtime. The parameters available are those listed under /proc/sys/. If you wish to keep settings persistent across reboots you should edit /etc/sysctl.conf
Exploring sysctl variables: sysctl -a
Reducing swappiness: sysctl vm.swappiness=0
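Each sysctl variable maps to a file under /proc/sys (dots in the name become slashes), so the same value can be read either way; writing requires root:

```shell
# Read vm.swappiness two equivalent ways
sysctl vm.swappiness 2>/dev/null   # e.g. "vm.swappiness = 60"
cat /proc/sys/vm/swappiness        # same number, via the /proc interface

# As root, these would be equivalent writes (shown here, not run):
#   sysctl -w vm.swappiness=10
#   echo 10 > /proc/sys/vm/swappiness
```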

Resources: Directory of Linux Commands

