
File Management

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

File management involves organizing, storing, and handling files and directories on a computer system. This category includes file operations that are fundamental to managing data efficiently and securely.

1 - File Management Tools

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

File management tools are essential for organizing, storing, and handling files and directories on a computer system. They provide functionalities such as copying, moving, deleting, and renaming files, which are fundamental for efficient data management. These tools help users maintain an organized file structure, making it easier to locate and access files when needed.

Advanced file management tools offer features like batch processing, which allows users to perform operations on multiple files simultaneously. They also support file permissions and ownership management, ensuring that only authorized users can access or modify certain files. This is crucial for maintaining data security and integrity.

File management tools often include search and indexing capabilities, enabling users to quickly find files based on various criteria such as name, size, or date modified. They also provide options for file compression and decompression, helping to save disk space and facilitate file transfer.

In addition to basic operations, file management tools can integrate with cloud storage services, allowing users to sync and back up their files across multiple devices. This ensures that data is always available and protected against loss. Some tools also offer version control features, enabling users to track changes and revert to previous versions of files if needed.

Overall, file management tools are indispensable for maintaining an organized, secure, and efficient computing environment. They enhance productivity by simplifying file operations and providing advanced features for data management. Whether for personal use or in a professional setting, these tools play a vital role in effective file handling and storage.

1.1 - 🖥️find

➡️This is a command-line reference manual for commands and command combinations that you don't use often enough to remember. This cheatsheet explains the find command with important options and switches, using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███████╗██╗███╗   ██╗██████╗ 
#                ██╔════╝██║████╗  ██║██╔══██╗
#                █████╗  ██║██╔██╗ ██║██║  ██║
#                ██╔══╝  ██║██║╚██╗██║██║  ██║
#                ██║     ██║██║ ╚████║██████╔╝
#                ╚═╝     ╚═╝╚═╝  ╚═══╝╚═════╝ 

# To find files by case-insensitive extension (ex: .jpg, .JPG, .jpG):
find . -iname "*.jpg"

# To find directories:
find . -type d

# To find files:
find . -type f

# To find files by octal permission:
find . -type f -perm 777

# To find files with setuid bit set:
find . -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l

# To find files with extension '.txt' and remove them:
find ./path/ -name '*.txt' -exec rm '{}' \;

# To find files with extension '.txt' and look for a string in them:
find ./path/ -name '*.txt' | xargs grep 'string'

# To find files with size bigger than 5 MB and sort them by size:
find . -size +5M -type f -print0 | xargs -0 ls -Ssh

# To find files bigger than 20 MB and list their name and size:
find . -type f -size +20000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'

# To find files modified more than 7 days ago and list file information
find . -type f -mtime +7 -ls

# To find symlinks owned by a user and list file information
find . -type l -user username -ls

# To search for and delete empty directories
find . -type d -empty -exec rmdir {} \;

# To search for directories named build at a max depth of 2 directories
find . -maxdepth 2 -name build -type d

# To search all files who are not in .git directory
find . ! -iwholename '*.git*' -type f

# To find all files that have the same inode (hard links) as MY_FILE_HERE
find . -type f -samefile MY_FILE_HERE 2>/dev/null

# To find all files in the current directory and modify their permissions
find . -type f -exec chmod 644 {} \;

# To find files with extension '.txt' and edit all of them with vim
# vim is started only once for all files
find . -iname '*.txt' -exec vim {} \+

# To find all files with extension '.png' and rename them by changing extension to '.jpg' (base name is preserved)
find . -type f -iname "*.png" -exec bash -c 'mv "$0" "${0%.*}.jpg"' {} \;

#==============================#
# CMD FIND
#==============================#

# Again, you can use the find command for the same purpose as below; it is considerably more flexible than ls and offers plenty of options.
#==============================#
#    -maxdepth LEVEL specifies how many levels of sub-directories below the starting point (the current directory in this case) the search will descend into.
#    -newerXY matches when timestamp X of the file being examined is newer than timestamp Y of the reference. X and Y can be any of the letters below (see the example that follows):
#        a – access time of the file reference
#        B – birth time of the file reference
#        c – inode status change time of the reference
#        m – modification time of the file reference
#        t – reference is interpreted directly as a time
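# For example (a small sketch; "reference.log" is just a placeholder file name):
# files whose modification time is newer than the modification time of reference.log
find . -newermm reference.log
# files whose access time is newer than a literal timestamp
find . -newerat "2016-12-06 00:00:00"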

# This means that only files modified on 2016-12-06 will be considered:
find . -maxdepth 1 -newermt "2016-12-06"

# Important: use the correct date format for the reference in the find command above; if you use a wrong format, you will get an error like the one below:
find . -maxdepth 1 -newermt "12-06-2016"
    
    find: I cannot figure out how to interpret '12-06-2016' as a date or time

# Alternatively, use the correct formats below:
find . -maxdepth 1 -newermt "12/06/2016"
# OR
find . -maxdepth 1 -newermt "12/06/16"

find path criteria action

find . -name "*.log" -exec ls -l {} \;

find . -user root
# Lists all the files owned by user 'root' in the current directory.

find -size +100M
# The find command lists all the files in the current directory above the specified size (here 100 MB), recursively.

###  Find all the log files inside a folder (and its subfolders) and delete or archive them.

find . -name "*.log" -type f -exec rm {} \;

find . -name "*.log" -type f -exec gzip {} \;

# Kill a process if it has not updated any files under $HOME/output in the last 5 minutes
[[ $( find $HOME/output -type f -mmin -5 | wc -l ) -eq 0 ]] && pkill -f '/usr/bin/whatever' 
 

###  Let's say you want to delete all txt files older than 60 days from the /tmp folder.
#==============================#
find /tmp -name "*.txt" -type f -mtime +60 -exec rm -f {} \;

 

###  Now we want to delete/archive a set of files which are more than 3 months old and bigger than 1 MB.
#==============================#
find /path -name "*.log" -type f -mtime +90 -size +1024k -exec rm -f {} \;

 

###  Find out the total size of all log files in a folder which are older than 30 days.
#==============================#
find /logs -name '*.log' -type f -mtime +30 -exec du -am {} \;
# The above command lists the files with their size in MB. You can then add up the first column with awk to get the total size, as sketched below.
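# For instance, a minimal sketch of that awk step (it sums the first column of the du output):
find /logs -name '*.log' -type f -mtime +30 -exec du -am {} \; | awk '{ sum += $1 } END { print sum " MB" }'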

# du -ak afile shows the file size in KB.
# du -am afile shows the file size in MB.
# Likewise, du -ag afile shows the file size in GB (BSD du; with GNU du use --block-size=1G instead).

###  Count the files in a directory which are not log files (*.log).
#==============================#
find /path ! -name "*.log" -type f -exec ls -l {} \; | wc -l

 

###  List out all the directories in a folder recursively whose name is "archive"
#==============================#
find . -name "archive" -type d 
#"." points to current directory.
    

###  Archive any files in the /tmp folder which are older than 2 days, excluding files which are already zipped.
#==============================#
find /tmp/ ! -name "*.zip" -type f -mtime +2 -exec gzip -vf {} \;
    
    
# Here, gzip has below options:
#		-v -> archive in verbose mode.
#		-f -> if there is already a file with the name filename.gz, that will be replaced.
#
#       Path                              Criteria   Action
#
#       Current directory   (.)           -name      -print
#       Home directory      (~)           -type      -exec
#       Parent directory    (..)          -links
#       Absolute path       /root/bin     -size
#       Relative path       bin/tmp/      -perm

# Sorting Files based on Month
# Here, we use the find command to find all files under the root ('/') directory and print, for each one, the month in which it was last accessed followed by the filename. Of that complete result, we list only the top 11 lines (the leading \n in the format makes the first line blank, so this shows 10 entries).

find / -type f -printf "\n%Ab %p" | head -n 11

# The command below sorts that output using the first field as the key, specified by '-k1', treating it as a month name as specified by the 'M' appended to it.

find / -type f -printf "\n%Ab %p" | head -n 11 | sort -k1M

### Sort Files Based on Date
# Here, again we use find command to find all the files in root directory, but now we will print the result as: last date the file was accessed, last time the file was accessed and then filename. Of that we take out top 11 entries.

find / -type f -printf "\n%AD %AT %p" | head -n 11

# The sort command below first sorts on the last digit of the year, then on the first digit of the month in reverse order, and finally on the whole first field. Here, '1.8' means character 8 of the first field, the 'n' after it means numerical sort, and 'r' indicates reverse-order sorting.

find / -type f -printf "\n%AD %AT %p" | head -n 11 | sort -k1.8n -k1.1nr -k1

### Sorting Files Based on Time
# Here, again we use find command to list out top 11 files in root directory and print the result in format: last time file was accessed and then filename.
find / -type f -printf "\n%AT %p" | head -n 11

# The below command sorts the output based on first column of the first field of the output which is first digit of hour.
find / -type f -printf "\n%AT %p" | head -n 11 | sort -k1.1n

#==============================#
# EXAMPLES: Let's try to understand this command using some example commands and their interpretation.
#==============================#

find ~ -name abc -print 
# The above command searches downward from the home directory for a file named "abc" and prints any matches.

find . -type d -name somedir -print
# The above command will search for a directory named "somedir" in the current directory downward. Also, it will print them on console.

find / -type f -links 1 -print
# The above command searches for and prints all regular files (not directories) that have exactly one link.

find .. -perm 644 -exec rm {} \;
# The above command searches downward from the parent of the current directory and removes all files with permission 644 (rw-r--r--). Note: here "{}" is replaced by each file that find outputs, so it becomes the argument of the rm command; ";" marks the end of the rm command, and "\" prevents the shell from giving ";" its special meaning.

find ~ -size 5c -exec cat {} \;
# The above command will search and print the content of each file of size 5 bytes in home directory downward.

find . -type f -name somename -exec rm -i {} \;
# The above command will search and remove all the files of name "somename" in the current directory downward , interactively.
# Note:- If we use "-i" option in exec, the rm command will perform on each file interactively. But if we use the action "OK", it will implicitly perform the command interactively. See the next example:

 

find / -type f -name somename -ok rm {} \;
# The above command will search and remove all the files of name "somename" from the root directory downward , interactively.

find -name 
# To do a search by file names using find, -name is case-sensitive 

find -iname 
# and -iname is case-insensitive

find / -name '-*'
# Find any files that start with a -. These can end up setting options to commands when you do stuff like du -sh *

find */ | cut -d/ -f1 | uniq -c
# Print how many files are inside each directory under the current one.

find www -name '*.php' -exec egrep -l 'bin/(identify|convert|mogrify|montage)\b' {} +
# Start looking for vulnerable code using ImageMagick

find -type d
# To search for directories with find

find -type f 
# To search for files


find -maxdepth
# Limit find's depth of search with the -maxdepth option. When giving numeric arguments to find, + means greater than and - means less than; e.g. +7 means greater than 7. Boolean operators in find: -not, -or. There is no 'and' operator because 'and' is implicit when two tests are listed; see the sketch below.
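# A small sketch combining these operators (paths and patterns are only examples):
# regular files up to 2 levels deep, ending in .log or .txt, but not under ./.git
find . -maxdepth 2 -type f \( -name '*.log' -or -name '*.txt' \) -not -path './.git/*'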

find /etc/apt/sources.list* -type f -name '*.list' -exec bash -c 'echo -e "\n=== $1 === ";grep -v -e ^# -e ^$ ${1}' _ {} \;
# Show the active (non-comment, non-blank) lines of every APT source list.

find /tmp -type f -size 5c -print
# Find files in /tmp that are exactly 5 bytes in size.

find . -name "<filename>" -exec grep "phrase" {} \;
# Find all files with a given name (you can use shell wildcards in the pattern if you'd like), and grep them for a phrase

find . -name "<filename>" -exec grep "phrase" {} \; -print
# To display the filename that contained a match, add -print

find . -name "<filename>" -exec grep -Hn "phrase" {} \;
# Or, use grep options to print the filename and line number for each match - The string `{}` is replaced by the current filename being processed everywhere it occurs in the arguments to the command. See the `find` man page for more information.

find . -atime -$(date +%j) -type f 
# Find files in a directory that have been accessed so far this year (-atime -N matches files accessed within the last N days, and date +%j gives the day of the year). Requires atime attributes.

find /etc -type f -printf "%T@ %T+ %p" | sort -n
# Find last modified files on a filesystem

find /etc -type f -printf "%T@ %T+ %p \n" | sort -n | less
#

find . -type f -mtime 0
#


find . -print | sort | sed 's;[^/]*/;|---;g;s;---|; |;g' 
# Generate output similar to 'tree' without using tree.

find . -type f -exec bash -c 'mv "$1" "./$RANDOM"' - {} \;
# Random mass-rename - Assign random names to all files in a folder (including subfolders!) - Note that this is a somewhat expensive operation, so it might take a few seconds for large numbers of files.

find . -type f -name "*.php" -exec php -l {} \; | grep -v 'No syntax errors'
# Check PHP syntax - Here's a simple one-liner you can use to syntax-check all PHP files in your working directory.

find $1 -type d | xargs du -sm | sort -g
# Show the size (in MB) of every directory in the tree, sorted; meant for use inside a script where $1 is the starting directory.

# This sorts a bunch of log files together and orders them correctly based on the date field in Apache CLF (Common Log Format).
find . -name '*s_log*'|xargs cat|sort -k4n -t' ' -k 4.9,4.12n -k 4.5,4.7M -k 4.2,4.3n -k 4.14,4.15n -k 4.17,4.18n -k 4.20,4.21n >mergedlogs

find /var/www -perm -o+w -a -not -type l -ls
# Find files and directories under /var/www that are world writable. Exclude symbolic links.

find /etc -type f -mtime +2 -mtime -30 -ls
# Long list files under /etc that were modified between 2 and 30 days ago.

find ~/ -mindepth 2 -type f -ls | sort -n -r -k 7 | head -20
# Show the 20 largest files at least 2 subdirectories down from your home dir

find . -maxdepth 1 -type d -ls
# Long list only the directories under the current directory.

find / -perm /+s -ls
# Find any files or directories on your system that are suid or sgid. Older versions of find can try -perm +u+s

find . -printf "%TY %p\n" | grep ^2013
# Get a list of all files last modified in 2013. Useful for passing to xargs or while loop

find / -size +100M -exec du -h {} \;
# find all files larger than 100MB and display their human readable size.

find . -name \*.git 2>/dev/null|grep -oP '^(.*)(?=\/\.git$)'|while read l;do pushd "$l";git status;popd;printf "\n";done
# Run 'git status' in every git repository found below the current directory.

find . -name \*.[ch] -exec grep -sl "PATTERN" {} \;
# Search for PATTERN in .c and .h files.

find / -type f | sed 's,/[^/]*$,,' |sort |uniq -c | awk '$1>=33000'
# Find directories that have 33000 or more files in them.

find . -mtime +$((365*5)) -maxdepth 1 -exec du -sb {} \; |awk '{s+=$1}END{print s}'
# Total bytes used by 5+ year old directories in CWD

find . -xdev -type f -mtime +$((365*7)) -print0|xargs -0 du -bsc|awk '/\ttotal$/{s+=$0}END{print s}'
#Total bytes of files older than ~7 yr

find . -exec file -b --mime-type {} + | sort | uniq -c | sort -nr
# Make stats of the top file types in this directory and below.

find . -maxdepth 1 -type f -printf '%TY-%Tm\n' | sort | uniq -c
# counts files in the current path by modification month.

find music -name '*.mp3' -mtime +365 -a -size +10M -ls
# Find and long list mp3 files in Music dir older than a year and larger than 10MB.

# find /usr -name '*.wav' -size -75>snds;for i in $(seq 1 13 600);do at "now + $i minute" <<<'play "$(shuf snds|head -1)">/dev/null 2>&1';done
# Save a list of small .wav files under /usr, then schedule 'at' jobs that play a random one every 13 minutes (original author's note: "kein plan" – no idea).

find . -maxdepth 1 -daystart -type f -name '*.jpg' -mtime -$( date +%j ) -exec mv -v {} 2015/ \;
# Move current year pics to 2015 directory.

find . -xdev -ls | sort -n -k 7 | tail -5
# Quickly find the largest 5 files in the CWD tree without crossing filesystem boundaries.

find -name "*.xml" | while IFS=$'\n' read f ; do xmllint --format "$f" > tmp.xml && mv -v tmp.xml "$f"; done
#Format XMLs. 

find . -empty -type d
#List of empty subdirectories of current directory.


find . -type f -size +10000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
# Finds all files under . that are larger than roughly 10 MB and prints their name and size.

find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail | ccze -h > /var/www/farblogs/index.html
# Builds an HTML page with the tails of all text log files, colorized with ccze.

find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail -f | ccze -A
# Tails all text log files and displays them in color with ccze.


# Find and replace on specific files
find . -name '*.php' -exec sed -ie 's#<?#<?php#' {} \;


find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail -f
#Monitor logs in Linux using Tail - Works in Ubuntu, I hope it will work on all Linux machines. For Unixes, tail should be capable of handling more than one file with '-f' option. This command line simply take log files which are text files, and not ending with a number, and it will continuously monitor those files. Putting one alias in .profile will be more useful.
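# As suggested above, the pipeline could live in ~/.profile; a small sketch as a shell function (the name is arbitrary, and a function avoids the quoting gymnastics an alias would need):
taillogs() {
    find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e 's/:$//g' | grep -v '[0-9]$' | xargs tail -f
}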


find /home/user -name '*.ksh' | xargs chmod 744
# Change permissions on files of a specific type in linux

find /home/user -name '*.ksh' | xargs ls -l

find $mountpoint -xdev -type d -size +300k
# Find directories on a single filesystem whose directory entry itself is larger than 300 KB (i.e. directories containing a huge number of entries).

find . | cpio -oH newc | gzip -c > /boot/initrd-test.img
# Pack the current directory tree into a gzip-compressed cpio archive in 'newc' format (the format used for initramfs images).

find . -name "node_modules" -exec rm -rf '{}' \;
# Recursively remove "node_modules" directories

find . -maxdepth 1 -iname '*.mp3' -exec eyeD3 -G podcast \{} \; 
# tag all mp3 in PWD as genre podcast. 

find ./music -name \*.mp3 -exec cp {} ./new \; 
# Backslashing the * glob instead of quoting the expression.

find / -please-find-the-file-i-want 
# Sometimes, there are files that find can't find no matter how many options you try. 

find /var/log -readable -ls 
# Find files under /var/log that are readable by the current user. Takes groups and ACLs into account.

find / \( -path /proc -o -path /sys \) -prune -o -print 
# Search the file system, but don't descend into the /sys or /proc directories.


## Say, you are asked to clean up your account. How will you find the biggest files in your account?
$ find . -type f  -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5

	# What the above command does:
	  1. It finds all the files and does a long listing of them (find & ls -l).
	  2. Only the file size and the filename are kept ($5, $NF).
	  3. The list is sorted by file size (biggest files at the top).
	  4. The top 5 files are displayed (head -5).

# Note: Do not run the above command where there is a huge number of files present; it will take a long time to respond. Also, in the awk command, $5 denotes the file size on Linux; the column may differ in other *nix flavors.
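# Where GNU find is available, a lighter-weight sketch that avoids forking ls for every file (prints size in bytes, then path):
find . -type f -printf '%s %p\n' | sort -nr | head -5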

### There could be cases where you are only interested in big files above a particular size, say above 100 MB:
$ find . -type f -size +100M -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5
  # In the above find command, the -size switch selects files by size; '+100M' means bigger (+) than 100 MB.

 
### Similarly, to find files of size between 100MB and 200MB:
$ find . -type f -size +100M -size -200M -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5
    
   #  +100M indicates files bigger than 100MB, -200M indicates files smaller than 200MB. In other words, we will get files of size between 100 and 200MB.

   # The notations to specify in the -size switch of the find command are:
   #     greater than 50 KB:   +50k   (small k)
   #     greater than 50 MB:   +50M   (big M)
   #     greater than 5 GB:    +5G    (big G)

	 
	 

# Find files on a specific file system - If you know the file name and the file system but are not sure of the exact folder path, you can use this syntax. In the example below, I am searching for the messages file in the /var file system.

find /var -name messages
    /var/log/messages
# Tip: if you don't know the file system name, you can search from / instead, but keep in mind it may take time if you have a large number of files.

find / -name messages
    /var/log/messages

# If you don't know the exact file name, you can also use a wildcard pattern to search. For example, to search for error_log you may try
find / -name 'error_*'
    /var/log/httpd/error_log

# How about searching for a file name regardless of upper or lower case, in other words ignoring case? You can use -iname instead of -name. For example:
find / -iname MESSAGES
    /var/log/messages

# Let's take a look at one more real-world scenario: you know the file extension and want to search for all such files. For example, if you are working on WebSphere, you may want to search for all files ending with .out:
find / -name '*.out'

## Find files based on ownership and permissions - Having files with 777 permissions is dangerous, as anyone can edit or delete them, so as a system administrator you may want to put a scan in place to find any such files. For example, to show any files having 777 permissions under the /opt file system:

find /opt/ -type f -perm 777
    /opt/testing
    /opt/SystemOut.log

find /opt/ -type f -perm 777 -exec ls -ltr {} \;
    -rwxrwxrwx 1 root root 0 Jul 19 03:35 /opt/testing
    -rwxrwxrwx 1 root root 0 Jul 19 03:36 /opt/SystemOut.log
# Tip: this variant also prints file ownership and timestamps on the same line.
    

find /opt/ -type f -perm 777 -exec chmod 755 {} \;
# You may also change permissions from 777 to 755 in a single find command. Obviously, you can adjust 755 to any other mode you like.

# How about finding files owned by root or by a different user? This is very helpful when services fail to start because a previous start was done as root. For example, if tomcat runs as a user called "tomcatapp" and for some reason you have started it as root, guess what happens when you next restart it as "tomcatapp"? It won't start, because some of the files are now owned by root and "tomcatapp" can't modify or delete them. Here is how you can search for any file owned by root in a specific file system.

find /opt/ -user root
# Note: running this at the / level will return a huge number of files/folders, so you may want to limit it to a specific file system. A possible follow-up is sketched below.
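# A possible follow-up for the tomcat scenario above (a sketch; the user and group names are just examples):
find /opt/ -user root -exec chown tomcatapp:tomcatapp {} \;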

## Find files older than a particular number of days
# File system housekeeping is essential for production support, and often you have to find logs older than (let's say) 60 days. The example below finds access.log files older than 60 days in the /opt file system.

find /opt/ -name access.log -mtime +60
# Tip: if you decide to find and delete in the same command line, you can do it as below. This will find access.log files older than 60 days in /opt and delete them.

find /opt/ -name access.log -mtime +60 -exec rm {} \;
# While this is very handy, you may want to list the files before you delete them. To do so:

find /opt/ -name access.log -mtime +60 -exec ls -ltr {} \;

## Find large files - Sometimes you have to deal with frequent file system cleanups because an application writes a large number of logs due to a code issue, etc. Let's take the example of searching for files greater than 1 GB in the /opt file system.

find /opt/ -size +1G
# Tip: if you know that all files in /opt/ larger than 1 GB can be deleted, you can find and delete in the same line.

find /opt/ -size +1G -exec rm {} \;

find . -name "*.[ch]" -print | xargs tar -cvf <name_of_output_file>

find /var/oracle/etl/incoming -name '*.dat' -mtime +7 -exec echo rm {} \;
# Dry run: echoes the rm command for each .dat file older than 7 days (drop 'echo' to actually delete).

find /home/backups -mtime +30 -type f -exec rm -rf {} \;
# Delete backup files older than 30 days.

find . -mtime +3 -exec rm {} ';'
# Delete files under the current directory not modified in more than 3 days (';' quoted instead of escaped as \; - both work).

find .					
# Find all files under .

find . -type d				
# Find all subdirectories.

find . -iregex ".*\(bas\|cls\|frm\)"	
# Find all Visual Basic code files in a directory - -iregex matches a case-insensitive regular expression; the backslashes are required by find's default (emacs) regex syntax for grouping \( \) and alternation \|.

find . -iregex ".*\(bas\|cls\|frm\)" -exec grep NIBRS \{\} \;					
# Find all VB files containing a given string Note you must escape the {} and the ; because we are at the shell

find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep "^*'" \{\} \;
# Find all VB files containing comment lines - Note you must escape the {} and the ; because we are at the shell

find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'" \{\} \;
# Find all VB files containing NON comment lines - Note you must escape the {} and the ; because we are at the shell

find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'|^[[:space:]]*$" \{\} \;
# Find all VB files containing NON comment, NON blank lines in a directory

find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'|^[[:space:]]*$" \{\} \; | wc
# Count the code in a directory hierarchy

find . -iregex ".*\(java\|html\|txt\)" -exec wc \{\} \; | gawk '{ print $1 "\t" $4; sum += $1 } END { print "--------"; print sum "\tTOTAL" }'
# Sum the line counts of all the code files in a directory

find `perl -e 'print "@INC"'` -name '*.pm' -print
# Find all Perl modules - From Active Perl documentation

find . -name "*.library" -print0 | xargs -0 sed -i '' -e 's/foo:\/Drive_Letter:/foo:\/bar\/baz\/xyz/g'

find . -name Root -exec sed -i 's/1.2.3.4\/home/foo.com\/mnt/' {} \;
# Point CVS 'Root' files at a new server path (GNU sed in-place).

find . -name Root -print0 | xargs -0 sed -i '' -e 's/1.2.3.4\/home/foo.com\/mnt/'
# Same, using xargs and BSD/macOS sed in-place syntax.

find ./ -type f -exec sed -i 's/string1/string2/' {} \;
# Replace string1 with string2 in every file under the current directory.

find . -name "*.txt" -print0 | xargs -0 sed -i '' -e 's/foo/bar/g'
# Recursively find and replace in files

find . -type f -name "*.txt" -exec sed -i'' -e 's/foo/bar/g' {} +


find . -name 'lxu*' -type d -exec bash -c 'mv "$1" "${1/\/123_//}"' -- {} \;
# Rename matching directories by replacing '/123_' with '/' in their path.

find . -type f -name ".wato"

find /var/www -perm -o+w -a -not -type l -ls
# Find files and directories under /var/www that are world writable. Exclude symbolic links.

find ! -name "*.pdb" -delete

find -type f ! -name "*.pdb" -delete

fdupes -r dir > dupes.txt
# Find duplicate files in 'dir' recursively based on size and MD5 checksum, and log them to dupes.txt.

fdupes -r Pictures > dupes.txt
# Find duplicate files in 'Pictures' recursively based on size and MD5 checksum, and log them to dupes.txt.

find . -name "node_modules" -exec rm -rf '{}' \;
# Recursively remove "node_modules" directories 

find . -name "node_modules" -exec rm -rf '{}' +
# First iteration: (doesn't call rm for every file)

find . -name "node_modules" -delete
# Second iteration: Builtin

find -wholename "*/query/*.json"
# Match against the whole path with -wholename (a synonym for -path).

find . -type f -name 'file*' -execdir mv {} {}_renamed ';'  
# Renaming multiple files using find

find [YOURDIR] -type d -exec chmod 755 {} \;
find [YOURDIR] -type f -exec chmod 644 {} \;
# Change Directory and File Permissions Properly For Linux Web Server
# Sets file permissions to 644 and directory permissions to 755.

find . -name '*.xml' -exec grep -e my_grep_data {} \; -print
# How to grep the results of find using -exec in Linux
# This is useful when you want to grep the output of a find; grep is chained to find via the -exec action.

find .  -type f ! -path "./.git/*" -exec sh -c "head -n 1 {} | egrep -a 'bin/bash|bin/sh' >/dev/null" \; -print -exec shellcheck {} \;
# ShellCheck all the bash/sh scripts under a specific directory, excluding version control.
# This is a commodity one-liner that uses ShellCheck to assure some quality on bash and sh scripts under a specific directory. It ignores the files in the .git directory.
# Just substitute "./.git/*" with "./.svn/*" for older and boring centralized version control.
# Just substitute ShellCheck with "rm" if your scripts are crap and you want to get rid of them :)

find . -printf '%s %p\n'| sort -nr | head -10
# Show the 10 largest entries under the current directory (size in bytes, then path).

find /var/www/web3/web/chat/ -printf '%s %p\n'| sort -nr | head -10

find /var/www/web3/web/chat/ -type f -printf '%s %p\n'| sort -nr | head -10
# You can also skip directories and show only files, as in the command above,
#    or restrict the search to particular file names:
find /var/www/web3/web/chat/ -type f -iname "*.jpg" -printf '%s %p\n'| sort -nr | head -10

find . | xargs grep 'string' -ls
# To search for instances of a string inside all files within a directory (recursive)
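# A variant of the above (sketch) that copes with spaces in file names by using NUL-delimited output:
find . -type f -print0 | xargs -0 grep -ls 'string'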

find -type f -name "*.avi" -print0 | xargs -0  mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
# Get the total length of all video / audio in the current dir (and below) in H:m:s 
# change the *.avi to whatever you want to match, you can remove it altogether if you want to check all files.

find -type f -iregex '.*\.\(mkv\|mp4\|wmv\|flv\|webm\|mov\|dat\|flv\)' -print0 | xargs -0  mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
# Get the total length of all videos in the current dir in H:m:s
# Use case insensitive regex to match files ending in popular video format extensions and calculate their total time. (traverses all files recursively starting from the current directory)

find / -type d | while read i; do ls $i | wc -l | tr -d \\n; echo " -> $i"; done | sort -n
# Show the file count for each directory. Useful when you are trying to find huge directories that elevate system CPU (vmstat -> sy).

find . -name '*.log' | xargs ls -hlt > /tmp/logs.txt && vi /tmp/logs.txt
# Find latest modified log

find . -type f -exec stat --format '%Y :%y %n' {} \; | sort -nr | cut -d: -f2- | head 
# List files by modified date - Script to list files in a directory recursively by last modified date.

find . -type f -perm /o=r -print0 | xargs -0 grep -l password= 
# Find world readable files under CWD that have "password=" in them.

find ./* -type f -exec sed -i 's/oldtext/newtext/g' {} \; 
# Linux Search & Replace - A recursive search and replace linux command, for those massive file changes you don't feel like doing by hand. This method doesn't create any backup files 
# 		or  
find . -type f | xargs perl -pi~ -e 's/oldtext/newtext/g;' 
# this method creates backup files 

find ./path_to_search -type f -name "*the_pattern*" -exec rm -i {} \;
# Delete Files Recursively - Sift through a bunch of directories and delete only specific files.

find / -iname '*droids*' 2> /dev/null 
# If you want to avoid error messages in the output of find such as "Permission denied", just redirect STDERR (The 2> part) to /dev/null.

find . -type d
# Find only folders in a directory

find . -maxdepth 1 -mindepth 1 -print0 | xargs -0 -n 1 -I % cmp % /DUPDIR/% 2>/dev/null
# Compare directories (using cmp to compare files byte by byte) to find files of the same name that differ
# Compare the content of the files in the current directory with files of the same name in the duplicate directory. Pop Quiz: You have a duplicate of a directory with files of the same name that might differ. What do you do? You could use diff to compare the directories, but that's boring and it isn't as clever as find -print0 with xargs. Note: You must omit stderr redirect 2>/dev/null to see the list of missing files from DUPDIR, if any. Hint: Redirect stderr to a new file to produce a more readable list of files that are missing from DUPDIR. Warning: This doesn't tell you if DUPDIR contains files not found in the current directory so don't delete DUPDIR. This is sample output - yours may be different.
./DIFFER.PNG /DUPDIR/./DIFFER.PNG differ: char 59, line 3
cmp: /DUPDIR/./NOMATCH.PNG: No such file or directory
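# Following the hint above, a sketch that captures the "missing from DUPDIR" messages in a separate file (the file name is arbitrary):
find . -maxdepth 1 -mindepth 1 -print0 | xargs -0 -n 1 -I % cmp % /DUPDIR/% 2>missing_in_dupdir.txt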

find www/ -type f -execdir chmod -v o+r {} \; -o -type d -execdir chmod -v o+rx {} \; 
# Add read permissions for files and read/execute permissions for directories under the www directory.

find /home -mtime -1 -size +100M -ls 
# Try to figure out what recently used file might have just filled up the /home partition by finding files modified in the last day that are larger than 100M just to narrow it down.

find /dir/to/search -maxdepth 1 -name "foo*.jpg"|wc -l
# Count files in a directory with wildcards. Remove the '-maxdepth 1' option if you want to count in subdirectories as well.

find / -iname "manifest.json" -exec sed 's/\"update_url\": \"http/\"update_url\": \"hxxp/g' -i.bak '{}' \;
# Disable updates for installed Chrome plugins. This ensures you don't get nagged by updates and also protects you from watering-hole attacks! Please make sure your plugins don't have any security issues! Backups are written as manifest.json.bak.


# Recursive find and replace of a file extension / suffix (mass-rename files) - Recursively find all files in ~/Notes with the extension '.md' and pipe them via xargs to the rename command, which replaces every '.md' with '.txt' in this example (existing files will not be overwritten).
find ~/Notes -type f -iname '*.md' -print0 | xargs -0 rename --no-overwrite .md .txt

# Convert LibreOffice files (.odt, .odg and others) to .pdf - find and convert all LibreOffice files to PDF without the LibreOffice GUI.
find /home/foo/Documents/ -type f -iname "*.odt" -exec libreoffice --headless --convert-to pdf '{}' \;

# Fulltext search in multiple OCR'd pdfs
find /path -name '*.pdf' -exec sh -c 'pdftotext "{}" - | grep --with-filename --label="{}" --color "your pattern"' \;


# Simple command to erase all the folders with a given name -> This is how to remove all folders with a given name (e.g. "CVS") starting from a root folder ('.' is the current folder):
find . -name <fileName> -type d -print0 | xargs -0 rm -fr
# e.g.
find . -name CVS -type d -print0 | xargs -0 rm -fr

# Find all files ending with ".swp"
find . -name \*.swp -type f

# Find all files ending with ".swp" and delete them
find . -name \*.swp -type f -delete

# Find all files, not in hidden directory, that contain the pattern "TRANDESCID" and also any of the patterns "41", "42", "45", and "50"
find . -not -path '*/\.*' -type f -exec grep -iq TRANDESCID {} \; -exec grep -il -e 41 -e 42 -e 45 -e 50 {} \;

# Keep track of setuid / setgid executables. The following command lists all setuid and setgid files on the system:
find / -type f -perm /6000

# To list all setuid and setgid files that are world writable execute the following command:
find . -type f -perm /6000 -a -perm -0002


# zgrep across multiple files
find "$(pwd)" -name "file-pattern*.gz" -exec zgrep -H 'pattern' {} \;

# replace recursive in folder with sed
find <folder> -type f -exec sed -i 's/my big String/newString/g' {} +

# Find ASCII files and extract IP addresses
find . -type f -exec grep -Iq . {} \; -exec grep -oE "(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)" {} /dev/null \;

# Delete all files by extension
# This is a correction to https://www.commandlinefu.com/commands/view/22134 Use `-name` instead of `-iname`, because case-sensitivity is probably important when we're dealing with filenames. It's true that extensions are often capitalised (e.g., "something.JPG"), so choose whatever's appropriate. However, what is appropriate is the quoting of the name pattern, so the shell does not expand it incorrectly. Finally, `-delete` is clearer.
find / -name "*.jpg" -delete

# Find all file extension in current dir.
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u


find . -printf "%TY %p\n" | grep ^2006 
# Get a list of all files last modified in 2006. Useful for passing to awk then xargs or for loop

find . -maxdepth 1 -daystart -type f -name '*.jpg' -mtime -$( date +%j ) -exec mv -v {} 2013/ \; 
# Move current year pics to 2013 directory.

find . -cnewer cutoff -type f -name '2014*' 
# Find files named 2014* that are newer than the change time of the file named 'cutoff'.

find . -type f -mmin -60 
# Find files below the current directory that have changed within the last 60 minutes.

find {,/usr}/{,s}bin -name '??' 
# Use brace expansion to check all your bin and /usr/bin dirs at once for any two letter command.


# Find all log files modified more than 24 hours ago and compress each into a zip file
find . -type f -mtime +1 -name "*.log" -exec zip -m {}.zip {} \; >/dev/null &
# Explanation:
	# -type f: files only
	# -mtime +n: file's data was last modified more than n*24 hours ago
	# -name "*.log": files with the .log extension; replace with any other pattern
	# zip -m {}.zip: compress the file into a zip and delete (move) the original
	# >/dev/null &: discard the output and run in the background
        
        
 
find / -type f ! -regex '^/\(dev\|proc\|run\|sys\).*' | sed 's@^\(.*\)/[^/]*$@\1@' | sort | uniq -c | sort -n | tail -n 10
# Find the top 10 directories containing the highest number of files. It can be used to pinpoint the path(s) where the largest number of files resides when running out of free inodes. The output is one line per directory: the file count followed by the directory path.

find ./ -type f -name "somefile.txt" -exec sed -i -e 's/foo/bar/g' {} \;
# Recursive search and replace (with bash only) Replaces a string matching a pattern in one or several files found recursively in a particular folder.

find . -print0 | xargs -0 -P 40 -n 1 sh -c 'ffmpeg -i "$1" 2>&1 | grep "Duration:" | cut -d " " -f 4 | sed "s/.$//" | tr "." ":"' - | awk -F ':' '{ sum1+=$1; sum2+=$2; sum3+=$3; sum4+=$4 } END { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 }'
# Count the total amount of hours of your music collection First the find command finds all files in your current directory (.). This is piped to xargs to be able to run the next shell pipeline in parallel. The xargs -P argument specifies how many processes you want to run in parallel, you can set this higher than your core count as the duration reading is mainly IO bound. The -print0 and -0 arguments of find and xargs respectively are used to easily handle files with spaces or other special characters. A subshell is executed by xargs to have a shell pipeline for each file that is found by find. This pipeline extracts the duration and converts it to a format easily parsed by awk. ffmpeg reads the file and prints a lot of information about it, grep extracts the duration line. cut and sed cut out the time information, and tr converts the last . to a : to make it easier to split by awk. awk is a specialized programming language for use in shell scripts. Here we use it to split the time elements in 4 variables and add them up. Show Sample Output:
        # 1036:17687:2689.686985895

# Find files with interesting file extensions
find . -type f \( -name "*.MOV" -o -name "*.avi" -o -name "*.flv" -o -name "*.m4v" -o -name "*.mov" -o -name "*.mp4" -o -name "*.wmv" \) > vid_list

# Show the most recently modified file in the tree (BSD/OSX stat syntax; raise head -1 to e.g. head -20 for the top 20)
find . -type f -print0 | xargs -0 stat -f "%m %N" | sort -rn | head -1 | cut -f2- -d" "

# Remove OSX metadata files
find . -name ".DS_Store" -print0 | xargs -0 rm -rf
find . -name "._*" -print0 | xargs -0 rm -rf

# Find most recently modified files recursively (BSD/OSX style)
find . -type f -print0 | xargs -0 stat -f "%m %N" | sort -rn | head -10 | cut -f2- -d" " | more

# Find Flash videos stored by browsers on a Mac
find /private/ 2>/dev/null | grep /Flash
# Explanation: When you watch a flash video like youtube in a browser, the video file is saved on your harddisk at a temporary location. And, if you watch a video and then another video in the same window, the first one will be deleted.
# Limitations: 
    # Might not work with all browsers.
    # Does not work with all websites (for example IMDB).
    # Does not work with an anonymous window in Chrome.

# Create and restore backups using cpio
find . -xdev -print0 | cpio -oa0V | gzip > path_to_save.cpio.gz

# Explanation: To restore:
    # gzip -cd path_to_save.cpio.gz | cpio -imV

# Why not use tar instead? cpio is slightly more accurate!
  #  find . -xdev -print0 finds all files and directories without crossing over to other partitions and prints a null delimited list of filenames
  #  cpio -oa0V takes the list of files to archive from stdin and creates an archive file preserving timestamps and permissions
  #  cpio -imV extracts the files and directories from stdin while preserving timestamps and permissions

# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f | perl -ne 'chomp(@files = <>); my $p = 9; foreach my $f (sort { (stat($a))[$p] <=> (stat($b))[$p] } @files) { print scalar localtime((stat($f))[$p]), "\t", $f, "\n" }' | tail
# Explanation: 
    # find path_to_dir -type f prints all the files in the directory tree
    # chomp(@files = <>); reads all the lines into an array
    # stat($a) is an array of interesting info about a file. Index 7 is size, 8 is access time, 9 is modification time, etc. (See man perlfunc for details and search for stat EXPR.)
    # sort { (stat($a))[9] <=> (stat($b))[9] } @files sorts the files by modification time
    # print scalar localtime((stat($f))[9]), "\t", $f, "\n" - prints the modification time formatted nicely, followed by a tab and the filename

## Alternative one-liners: 
# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f -mtime -7 -print0 | xargs -0 ls -lt | head
# Explanation: 
    # find /path/to/dir -type f -mtime -7 -print0 prints all the files in the directory tree that have been modified within the last 7 days, with null character as the delimiter
    # xargs -0 ls -lt expects a null delimited list of filenames and will sort the files by modification time, in descending order from most recent to oldest
    # Since we are looking for the most recent files, with head we get the first 10 lines only

# Note that if there are too many files in the output of find, xargs will run multiple ls -lt commands and the output will be incorrect. This is because the maximum command line length is getconf ARG_MAX and if this is exceeded xargs has to split the execution to multiple commands. So depending on your use case you may need to tweak the -mtime parameter to make sure there are not too many lines in the output.
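# With GNU find, a sketch that sidesteps the xargs batching issue entirely by letting find print the modification time itself (epoch seconds), then sorting:
find /path/to/dir -type f -printf '%T@ %p\n' | sort -nr | head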

# Remove spaces recursively from all subdirectories of a directory
find /path/to/dir -type d | tac | while read LINE; do target=$(dirname "$LINE")/$(basename "$LINE" | tr -d ' '); echo mv "$LINE" "$target"; done

# Explanation: 
    # find path_to_dir -type d finds all the subdirectories
    # tac reverses the order. This is important to make "leaf" directories come first!
    # target=... stuff constructs the new name, removing spaces from the leaf component and keeping everything before that the same
    # echo mv ... for safety you should run with "echo" first, if the output looks good then remove the "echo" to really perform the rename

# Limitations: In UNIX or BSD there is no tac. There you can use tail -r instead.

# Recursively remove all empty sub-directories from a directory tree
find . -depth  -type d  -empty -exec rmdir {} \;

# Explanation: Recursively remove all empty sub-directories from a directory tree using just find. No need for tac (-depth does that), no need for xargs as the directory contents changes on each call to rmdir. We're not reliant on the rmdir command deleting just empty dirs, -empty does that.

# Limitations: Will make many calls to rmdir without using xargs, which bunches commands into one argument string, which is normally useful, but -empty /could/ end up being more efficient since only empty dirs will be passed to rmdir, so possibly fewer executions in most cases (searching / for example).

## 
## Related one-liners

# Recursively remove all empty sub-directories from a directory tree
find . -type d | tac | xargs rmdir 2>/dev/null

# Explanation: 
    # find will output all the directories
    # tac reverses the ordering of the lines, so "leaf" directories come first
    # The reordering is important, because rmdir removes only empty directories
    # We redirect error messages (about the non-empty directories) to /dev/null
# Limitations: In UNIX and BSD systems you might not have tac, you can try the less intuitive tail -r instead.

# How to find all hard links to a file
find /home -xdev -samefile file1

# Explanation: Note: replace /home with the location you want to search - Source: http://linuxcommando.blogspot.com/2008/09/how-to-find-and-delete-all-hard-links.html
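# A related sketch: if you already know the inode number (e.g. from 'ls -i file1'), you can search by inode directly:
find /home -xdev -inum 1234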


# Remove all the versioned-but-empty directories from a Subversion checkout
find . -name .svn -type d | while read ss; do dir=$(dirname "$ss"); test $(ls -a "$dir" | wc -l) == 3 && echo "svn rm \"$dir\""; done

# Explanation: Empty directories in version control stink. Most probably they shouldn't be there. Such directories have a single subdirectory in them named ".svn", and no other files or subdirectories.
    # The "find" searches for files files named .svn that are directories
    # The "while" assigns each line in the input to the variable ss
    # The "dirname" gets the parent directory of a path, the quotes are necessary for paths with spaces
    # ls -a should output 3 lines if the directory is in fact empty: ".", "..", and ".svn"
    # If the test is true and there are precisely 3 files in the directory, echo what we want to do
    # If the output of the one-liner looks good, pipe it to | sh to really execute

# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f -mtime -7 -print0 | xargs -0 ls -lt | head

# Explanation: 
    # find /path/to/dir -type f -mtime -7 -print0 prints all the files in the directory tree that have been modified within the last 7 days, with null character as the delimiter
    # xargs -0 ls -lt expects a null-delimited list of filenames and sorts the files by modification time, in descending order from most recent to oldest. Since we are looking for the most recent files, head keeps only the first 10 lines
# Note: if find outputs too many files, xargs will split them across multiple ls -lt invocations and the combined output will no longer be globally sorted. This happens when the argument list exceeds the maximum command line length (see getconf ARG_MAX). Depending on your use case you may need to tighten the -mtime parameter so the list stays small enough for a single invocation.
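# A hedged GNU-only alternative that sidesteps the ARG_MAX issue: have find print each file's epoch mtime itself and sort on that (assumes GNU find's -printf and a numeric sort):
find /path/to/dir -type f -printf '%T@ %p\n' | sort -rn | head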

# Execute different commands with find depending on file type
find /path/to/dir -type d -exec chmod 0755 '{}' \; -o -type f -exec chmod 0644 '{}' \;

# Explanation: 
    # -type d -exec chmod 0755 '{}' \; for each directory, run chmod 0755
    # \; is to mark the end of the -exec
    # {} is replaced with the filename, we enclosed it in single quotes like this '{}' to handle spaces in filenames
    # -o is the logical OR operator, separating the two branches
    # -type f -exec chmod 0644 '{}' \; for each regular file, run chmod 0644

# Replace symlinks with the actual files they are pointing at
find /path/to/dir -type l -exec sh -c 'cp --remove-destination "$(readlink "{}")" "{}"' \; 
# Explanation: 
    # All the double quoting is necessary to handle filenames with spaces.
    # Calling sh with -exec is necessary to evaluate readlink for each symlink
# Limitations: The BSD implementation of cp does not have the --remove-destination flag.
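# A hedged, more portable sketch for systems without GNU cp: resolve each link to an absolute path first (assumes readlink -f or an equivalent realpath exists), then remove the link and copy the target into its place:
find /path/to/dir -type l -exec sh -c 'for l; do t=$(readlink -f "$l") && rm "$l" && cp "$t" "$l"; done' _ {} +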

# Create a visual report of the contents of a usb drive
find /path/to/drive -type f -exec file -b '{}' \; -printf '%s\n' | awk -F , 'NR%2 {i=$1} NR%2==0 {a[i]+=$1} END {for (i in a) printf("%12u %s\n",a[i],i)}' | sort -nr

# Explanation: versorge asks: I have a bunch of usb volumes lying around and I would like to get a quick summary of what is on the drives. How much space is taken up by pdf, image, text or executable files? This could be output as a text summary, or a pie chart. This one-liner produces a list like this:
        #  5804731229 FLAC audio bitstream data
        #   687302212 MPEG sequence
        #    99487460 data
        #    60734903 PDF document
        #    55905813 Zip archive data
        #    38430192 ASCII text
        #    32892213 gzip compressed data
        #    24847604 PNG image data
        #    16618355 XML 1.0 document text
        #    13876248 JPEG image data

# The find command locates all regular files (-type f) below the given directory, which could be a mounted USB stick or any other directory. For each one, it runs the file -b command with the filename to print the file type; if this succeeds, it also prints the file size (-printf '%s\n'). This results in a list containing a file type on one line, followed by the file size on the next.

# The awk script takes this as input. The GNU file command often produces very specific descriptions such as GIF image data, version 87a, 640 x 480 - to generalize these, we set the field separator to be a comma with the -F option. Referencing $1 then only uses what is to the left of the first comma, giving us a more generic description like GIF image data.

# In the awk script, the first pattern-action pair NR%2 {i=$1} applies to each odd-numbered line, setting the variable i to be the file type description. The even-numbered lines are handled by NR%2==0 {a[i]+=$1}, adding the value of the line (which is the file size) to the array variable a[i]. This results in an array indexed by file type, with each array member holding the cumulative sum of bytes for that type. The END { ... } pattern-action pair finally prints out a formatted list of the total size for each file type. At the end of the line, the sort command sorts the list, putting the file types with the largest numbers at the top.

# Limitations: This one-liner uses the -b option to file and the -printf primary of find - these are supported by the GNU utilities but may not work elsewhere. It can also take a long time to run, since it needs to open and analyze every file below the given directory.

# Count the total number of hours of your music collection
find . -print0 | xargs -0 -P 40 -n 1 sh -c 'ffmpeg -i "$1" 2>&1 | grep "Duration:" | cut -d " " -f 4 | sed "s/.$//" | tr "." ":"' - | awk -F ':' '{ sum1+=$1; sum2+=$2; sum3+=$3; sum4+=$4; if (sum4 > 100) { sum3+=1; sum4=0 }; if (sum3 > 60) { sum2+=1; sum3=0 }; if (sum2 > 60) { sum1+=1; sum2=0 } if (NR % 100 == 0) { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 } } END { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 }'

	
# Store the output of find in an array
mapfile -d $'\0' arr < <(find /path/to -print0)
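# Minimal usage sketch: once filled, the array can be counted and iterated safely even when names contain spaces or newlines:
echo "found ${#arr[@]} entries"
for f in "${arr[@]}"; do printf '%s\n' "$f"; done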

       
	
# Find all log files modified 24 hours ago, and zip them
find . -type f -mtime +1 -name "*.log" -exec zip -m {}.zip {} \; >/dev/null

# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f | perl -ne 'chomp(@files = <>); my $p = 9; foreach my $f (sort { (stat($a))[$p] <=> (stat($b))[$p] } @files) { print scalar localtime((stat($f))[$p]), "\t", $f, "\n" }' | tail

# Explanation: find path_to_dir -type f prints all the files in the directory tree
    # chomp(@files = <>); reads all the lines into an array
    # stat($a) is an array of interesting info about a file. Index 7 is size, 8 is access time, 9 is modification time, etc. (See man perlfunc for details and search for stat EXPR.)
    # sort { (stat($a))[9] <=> (stat($b))[9] } @files sorts the files by modification time
    # print scalar localtime((stat($f))[9]), "\t", $f, "\n" - prints the modification time formatted nicely, followed by a tab and the filename

# How can I find all of the distinct file extensions in a folder hierarchy?
# It works as follows:
    # Find all files from current folder
    # Prints extension of files if any
    # Make a unique sorted list
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u
# just for reference: if you want to exclude some directories from searching (e.g. .svn), use
find . -type f -path '*/.svn*' -prune -o -print | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u
# A variation, this shows the list with counts per extension: 
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort | uniq -c | sort -n     
# No need for the pipe to sort, awk can do it all:
find . -type f | awk -F. '!a[$NF]++{print $NF}'
find . -type f -name "*.*" | awk -F. '!a[$NF]++{print $NF}'
# Recursive version:
find . -type f | sed -e 's/.*\.//' | sed -e 's/.*\///' | sort -u
# If you want totals (how may times the extension was seen):
find . -type f | sed -e 's/.*\.//' | sed -e 's/.*\///' | sort | uniq -c | sort -rn
# Non-recursive (single folder):
for f in *.*; do printf "%s\n" "${f##*.}"; done | sort -u
# Find everything with a dot and show only the suffix.
find . -type f -name "*.*" | awk -F. '{print $NF}' | sort -u
# if you know all suffixes have 3 characters then
find . -type f -name "*.???" | awk -F. '{print $NF}' | sort -u
# or with sed shows all suffixes with one to four characters. Change {1,4} to the range of characters you are expecting in the suffix.
find . -type f | sed -n 's/.*\.\(.\{1,4\}\)$/\1/p'| sort -u

# Unhide all hidden files in the current directory.
find . -maxdepth 1 -type f -name '\.*' | sed -e 's,^\./\.,,' | sort | xargs -iname mv .name name
# Explanation: This will remove the leading dot from all files in the current directory using mv, effectively "unhiding" them. It will not affect subdirectories.
# Limitations: Probably only works on GNU Linux, due to the specific usage of xargs.
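# A hedged pure-bash alternative that copes with spaces and needs no xargs (the .[!.]* glob matches hidden entries other than . and ..); keep the echo until the output looks right:
for f in .[!.]*; do [ -f "$f" ] && echo mv "$f" "${f#.}"; done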

## Related one-liners

# Print file owners and permissions of a directory tree
find /path/to/dir1 -printf "%U %G %m %p\n" > /tmp/dir1.txt
# Explanation: The command simply traverses the specified directory tree and for each file and directory it prints the UID of the owner, GID of the group, the permission bits and the path. To compare file owners and permissions of two directory trees you can run this command for each directory, save the output in two files and then compare them using diff or similar. See man find for an explanation of all the possible symbols you can use with -printf.
# Limitations: The -printf option does not exist in find on Solaris 10.
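# Usage sketch for the comparison step (dir2 and the temp file names are illustrative):
find /path/to/dir2 -printf "%U %G %m %p\n" > /tmp/dir2.txt
diff /tmp/dir1.txt /tmp/dir2.txt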

# Get only the latest version of a file from across multiple directories.
find . -name 'filename' | xargs -r ls -tc | head -n1
# Explanation: Shows latest file (by last modification of file status information) for the given pattern. So in this example filename = custlist*.xls. We use ls to do the sorting (-t) and head to pick the top one. xargs is given the -r option so that ls is not run if there is no match.
# Limitations: The filesystem needs to support ctime. Does not depend on a consistent naming scheme.

# Get only the latest version of a file from across multiple directories
find . -name custlist\* | perl -ne '$path = $_; s?.*/??; $name = $_; $map{$name} = $path; ++$c; END { print $map{(sort(keys(%map)))[$c-1]} }'
# Explanation: The purpose of the one-liner is to find the "latest" version of the custlist_*.xls file from among multiple versions in directories and sub-directories, for example:
	./c/custlist_v1.003.xls
	./c/custlist_v2.001.xls
	./d/b/custlist_v1.001.xls
	./d/custlist_v1.002.xls
# Let's decompose the one-liner to the big steps:
    # find . -name custlist\* -- find the files matching the target pattern
    # ... | perl -ne '...' -- run perl, with the input wrapped around in a while loop so that each line in the input is set in the variable $_
    # $path = $_; s?.*/??; $name = $_; -- save the full path in $path, and cut off the subdirectory part to get to the base name of the file and save it in $name
    # $map{$name} = $path; -- build a mapping of $name to $path
    # ++$c; -- we count the elements, to use it later
    # (sort(keys(%map)))[$c-1] -- sort the keys of the map, and get the last element, which is custlist_v2.001.xls in this example
    # END { print $map{(sort(keys(%map)))[$c-1]} } -- at the end of all input data, print the path of the latest version of the file
# Limitations: Even if the latest version of the file appears multiple times in the directories, the one-liner will print only one of the paths. This could be fixed though if needed.

  

# Find all files recursively with specified string in the filename and output any lines found containing a different string.
find . -name '*conf*' -exec grep -Hni 'matching_text' {} \; > matching_text.conf.list
# Explanation: find . -name '*conf*' In the current directory, recursively find all files with 'conf' in the filename (the pattern is quoted so the shell does not expand it first).
# -exec grep -Hni 'matching_text' {} \; When a file is found matching the find above, execute the grep command to find all lines within the file containing 'matching_text'.
    # Here are what each of the grep switches do:
    # grep -i ignore case.
    # grep -H print the filename
    # grep -n print the line number
        # > matching_text.conf.list Direct the grep output to a text file named 'matching_text.conf.list'

# Remove .DS_Store from the repository you happen to staging by mistake
find . -name .DS_Store -exec git rm --ignore-unmatch --cached {} +
# Explanation: Removes the files from the git index (what is staged/tracked) without deleting them from the working tree; --ignore-unmatch prevents an error when nothing matches.

# Check if a file exists and has a size greater than X
[[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]] && echo true || echo false
# Explanation: 
    # The find takes care two things at once: checks if file exists and size is greater than 51200.
    # We redirect stderr to /dev/null to hide the error message if the file does not exist.
    # The output of find will be non-blank if the file matched both conditions, otherwise it will be blank
    # The [[ ... ]] evaluates to true or false if the output of find is non-blank or blank, respectively
# You can use this in if conditions like:
if [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]]; then
    # somecmd
fi

# Find files that are not executable
find /some/path -type f ! -perm -111 -ls
# Explanation: The key is writing the parameter of -perm correctly. The value -111 means that all execution bits must be set: user and group and other too. By negating this pattern with ! we get files that miss any of the execution bits. If you want to be more specific, for example find files that are not executable specifically by the owner, you could do like this:
find /some/path -type f ! -perm -100 -ls
# The -ls option is to print the found files using a long listing format similar to the ls command.

# Md5sum the last 5 files in a folder
find /directory1/directory2/ -maxdepth 1 -type f | sort | tail -n 5 | xargs md5sum
# Explanation: 
    # find lists the files, no recursion, no directories, with full path
    # sort list files alphabetically
    # tail keep only the last 5 files
    # xargs send the list as arguments to md5sum
    # md5sum calculate the md5sum for each file
# Limitations: Probably can not handle spaces in file or directory names.
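# A hedged NUL-safe variant for names with spaces, assuming a reasonably recent GNU coreutils (sort -z, tail -z and xargs -0 all understand NUL-delimited input):
find /directory1/directory2/ -maxdepth 1 -type f -print0 | sort -z | tail -z -n 5 | xargs -0 md5sum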

# Change the encoding of all files in a directory and subdirectories
find . -type f  -name '*.java' -exec sh -c 'iconv -f cp1252 -t utf-8 "$1" > converted && mv converted "$1"' -- {} \;

# Explanation: The parameters of find:
    # . -- search in the current directory, and its subdirectories, recursively
    # -type f -- match only files
    # -name '*.java' -- match only filenames ending with .java
    # -exec ... \; -- execute command
# The command to execute is slightly complicated, because iconv does not rewrite the original file but prints the converted content on stdout. To update the original file we need 2 steps:
    # Convert and save to a temp file
    # Move the temp file to the original
# To do these steps, we use a sh subshell with -exec, passing a one-liner to run with the -c flag, and passing the name of the file as a positional argument with -- {}.

# Deletes orphan vim undo files
find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done
# Explanation: 
# find -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned in to the variable $UNDOFILE with while read UNDOFILE, and in the body of the while loop, the file each undo-file should be tracking is calculated and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ). If $FILE does not exist [[ -e "$FILE" ]] the undo-file is removed rm "$UNDOFILE".
# Limitations: 
# I am not sure whether sed in every flavour of UNIX allows the -r flag. That flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but I think the way it stands the one-liner is more readable).

# Find recent logs that contain the string "Exception"
find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$
# Explanation: 
# The find:
    # -name '*.log' -- match files ending with .log
    # -mtime -2 -- match files modified within the last 2 days
    # -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path
# The grep:
    # -c is to print the count of the matches instead of the matches themselves
    # -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
    # The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
    # The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)
# Extra tips:
    # Change "Exception" to the typical relevant failure indicator of your application
    # Add -i for grep to make the search case insensitive
    # To make the find match strictly only files, add -type f
    # Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' [email protected]
# Limitations: The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.

# List status of all GIT repos
find ~ -name ".git" 2> /dev/null | sed 's/\/.git/\//g' | awk '{print "-------------------------\n\033[1;32mGit Repo:\033[0m " $1; system("git --git-dir="$1".git --work-tree="$1" status")}'
# Explanation: 
    # List all .git dirs
    # Trim .git parts
    # Run git --git-dir=X.git --work-tree=X status with awk
 
 
 

find mail[1-8] -path 'mail[1-8]/home/vmail/*/*/*' -prune -o -ls > fileindex.txt 
# Make a file index covering the subdirectories mail1 - mail8, but exclude anything 3 or more levels below /home/vmail in each one.

find -type f -printf "%S=:=%p\n" 2>/dev/null | gawk -F'=:=' '{if ($1 < 1.0) print $1,$2}'
# Find sparse files. Prints the sparseness ratio and path/filename of any sparse files (files that use less actual disk space than their nominal size because the filesystem stores large runs of zero bytes efficiently). A multi-character field separator is used to keep the two fields unambiguous. Sample output for /var/log is shown below:
        # calc "x = 2 + 2; ++x"
        # 5
        # calc "10 / 16"
        # 0.625
        # calc "sqrt(2)"
        # 1.4142136
        # calc "(1.2 + 3.4) ^ 56"
        # 1.3014832e+37

find /var/log -type f -printf "%S=:=%p\n" 2>/dev/null | gawk -F'=:=' '{if ($1 < 1.0) print $1,$2}'
        # 7.88593e-08 /var/log/with space
        # 1.11717e-07 /var/log/lastlog

# Find files less than an hour old (-mmin -60) in your homedir (~/) and below, without crossing into other partitions (-xdev), and long-list them (-ls)
find ~/ -mmin -60 -xdev -ls 

# You can use this substitution to show all setuid bins in your PATH:
find ${PATH//:/ } -perm /u=s
 

 # Find files on the local filesystem modified between 2018-03-20 and 2018-04-10 (not newer than). See newerXY in the man page. Thanks for the idea from your website @nixcraft. And yes, I found the file.
find / -xdev -newermt 2018-03-20 \! -newermt 2018-04-10 -type f -ls

# Instead of the parameter expansion, something like this with sed should also work:
echo $PATH | sed 's/:/ /g'
# For the same example from above:
find $(echo $PATH | sed 's/:/ /g') -perm /u+s

# Interactively delete every file in the folder that is not an mp3
find ./ -type f ! -name "*.mp3" -exec rm -i {} \;

# Executing a command stored in a string variable in bash. This stores a command in a string variable, executes it with eval, and saves the output in another variable.
cmd='find . -name "*.zip"'
echo Command to execute: "$cmd"
res=`eval $cmd`
echo RESULT = "$res"
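# A hedged, safer sketch: storing the command in a bash array avoids eval entirely and keeps the glob quoted until execution:
cmd=(find . -name '*.zip')
echo "Command to execute: ${cmd[*]}"
res=$("${cmd[@]}")
echo "RESULT = $res"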

# find in different folders
find .  \( -type f -and -path '*/dir1/*' -or -path '*/dir2/*' -or -path '*/dir3/*' -or -path '*/dir4/*' \)

# find in different directories and then exec ls on each match
find . \( -name "dir1" -o -name "dir2" \) -exec ls '{}' \;

find . \( -name "dir1" -o -name "dir2" \) -exec ls '{}' \;


find . -iname '*expenses*' 
# Remember, find has a case insensitive way to search for filenames: -iname

# Find non-standard files in mysql data directory - These files should be removed to keep the size of the data directory under control. If you exclude the known important file types like frm and MYD, then whatever is left can be either moved or deleted.
find . -type f -not -name "*.frm" -not -name "*.MYI" -not -name "*.MYD" -not -name "*.TRG" -not -name "*.TRN" -not -name "db.opt"

# find and remove old compressed backup files - remove all compressed files in /home/ not modified in the last 10 days (the parentheses group the two name patterns so -mtime applies to both)
find /home -type f \( -name "*.sql.gz" -o -name "*.tar.gz" \) -mtime +10 -exec rm -f {} \;

# find and remove old backup files - remove all files in /home/ that start with bk_all_dbProdSlave and have not been modified in the last 2 days
find /home/ -name bk_all_dbProdSlave_\* -mtime +2 -exec rm -f {} \;

# Check if the same table name exists across different databases - Useful command for MySQL
find . -name "withdrownblocks.frm"  | sort -u | awk -F'/' '{print $3}' | wc  -l
    # Sample output
	    # Count of databases for e.g. 3

# Find dupe files by checking md5sum
find /glftpd/site/archive -type f | grep '([0-9]\{1,9\})\.[^.]\+$' | parallel -n1 -j200% md5sum | awk 'x[$1]++ { print $2 " :::"}' | sed 's/^/Dupe: /g' | sed 's,Dupe,\x1B[31m&\x1B[0m,'
# (parallel reads the filenames from the pipe here; a literal ::: after md5sum would make it ignore stdin)

# Recursively remove all "node_modules" folders
find . -name "node_modules" -exec rm -rf '{}' +

    
# Find files/dirs modified within a given period
find . -type d -newermt "2019-01-01" \! -newermt "2019-02-01" -exec ls -ld {} \;

# Graphical tree of sub-directories with files - The command finds every item within the directory and edits the output so that subdirectories are and files are output much like the tree command
find . -print | sed -e 's;[^/]*/;|-- ;g;s;-- |; |;g'
    # Sample output
	    # .
	    # |-- vmware-tools-distrib
	    # |   |-- installer
	    # |   |   |-- upstart-job.conf
	    # |   |-- lib
	    # |   |   |-- modules
	    # |   |   |   |-- source
	    # |   |   |   |   |-- legacy
	    # |   |   |   |   |   |-- vsock.tar
	    # |   |   |   |   |   |-- vmhgfs.tar
	    # |   |   |   |   |   |-- vmmemctl.tar
	    # |   |   |   |   |   |-- vmxnet3.tar
	    # |   |   |   |   |   |-- vmci.tar
	    # |   |   |   |   |   |-- vmxnet.tar
	    # |   |   |   |   |   |-- vmblock.tar
	    # |   |   |   |   |   |-- pvscsi.tar
	    # |-- _cafenv-appconfig_
	    # |-- [email protected]
	    # |-- vmware-root_2083-2126394392
	    # |-- listfiletemp
	    # |-- .ICE-unix
	    # |-- C

# List only empty directories and delete safely (= ask for each) - Will delete empty directories and sub-directories (hidden ones too, i.e. names starting with a dot). 'rm' is used instead of 'rmdir' to give the possibility of asking for confirmation before deleting, since it is not wise to delete all empty directories in, say, /etc. Replace the dot in 'find .' with any other starting directory. In 'rm -i -R', 'i' stands for ask before delete and 'R' deletes the folder recursively (or the folder itself if it is empty).
find . -type d -empty -exec rm -i -R {} \;

# find all executable files across the entire tree - I can think of using this command after compiling a downloaded source from anywhere as an easy way to find all executable products. We usually issue the find command (without arguments) to list the full paths of all directories, sub-directories and files in the entire current tree.
find -executable -type f
    # Similar command is 
tree -aicfnF

    

# find and delete file or folder older than x days
find /tmp/* -mtime +7 -exec rm {} \;


# Remove files matching /path/to/files* that have not been modified in the last 5 days
find /path/to/files* -mtime +5 -exec rm {} \;

# Bash: automatically delete numbered duplicates (e.g. "file (1).txt") from a folder, recursively
find . -name "*\([0-9]\).*" -delete

#==============================##==============================#
# CMD FIND
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

1.2 - 🖥️locate

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the locate command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██╗      ██████╗  ██████╗ █████╗ ████████╗███████╗
#                ██║     ██╔═══██╗██╔════╝██╔══██╗╚══██╔══╝██╔════╝
#                ██║     ██║   ██║██║     ███████║   ██║   █████╗  
#                ██║     ██║   ██║██║     ██╔══██║   ██║   ██╔══╝  
#                ███████╗╚██████╔╝╚██████╗██║  ██║   ██║   ███████╗
#                ╚══════╝ ╚═════╝  ╚═════╝╚═╝  ╚═╝   ╚═╝   ╚══════╝
                                                                  
                                                                  
                                                                 

# The locate command helps user find a file by name.

locate [file-name]

locate --regex '\.c$' | shuf | head -1 | xargs pv -q -L 20 
# Poor man's hacker typer in the terminal.

# Useful ‘locate’ Command Practical Examples for Linux Newbies
###################################################################

# Limit Search Queries to a Specific Number - You can limit the number of returned results to avoid redundancy, using the -n option. For example, if you want just 20 results from your query, type the following command:
locate "*.html" -n 20

# Display The Number of Matching Entries
# If you want to display the count of all matching entries for the file “tecmint“, use the -c option.
locate -c [tecmint]*
    1550

# Ignore Case Sensitive Locate Outputs
# By default, locate is configured to process queries in a case sensitive manner meaning TEXT.TXT will point you to a different result than text.txt. To have locate command ignore case sensitivity and show results for both uppercase and lowercase queries, input commands with the -i option.
locate -i *text.txt*

    /home/tecmint/TEXT.txt
    /home/tecmint/text.txt

# Refresh mlocate Database
# Since the locate command relies on a database called mlocate, that database needs to be updated regularly for the utility to work efficiently. To update the mlocate database, you use a utility called updatedb. Note that you will need superuser privileges for this to work properly, as it needs to be executed as root or with sudo.
updatedb

# Display Only Files Present in Your System
# When you have an updated mlocate database, the locate command may still produce results for files whose physical copies have been deleted from your system. To avoid seeing results for files that no longer exist at the time you run the command, use the -e option: it verifies that each file actually exists on disk, even if it is still listed in mlocate.db.
locate -i -e *text.txt*
    /home/tecmint/text.txt

# Separate Output Entries Without New Line
# locate command’s default separator is the newline (\n) character. But if you prefer to use a different separator like the ASCII NUL, you can do so using the -0 command line option.
locate -i -0 *text.txt*
    /home/tecmint/TEXT.txt/home/tecmint/text.txt

# Review Your Locate Database
# If you’re in doubt about the current status of your mlocate.db, you can easily view the locate database statistics by using the -S option.
locate -S
    Database /var/lib/mlocate/mlocate.db:
	32,246 directories
	4,18,850 files
	2,92,36,692 bytes in file names
	1,13,64,319 bytes used to store database

# Suppress Error Messages in Locate
# Constantly querying your locate database can sometimes yield unnecessary error messages stating that you do not have the required privileges to read mlocate.db, because you’re a normal user rather than the superuser. To suppress these messages entirely, use the -q option.
locate "*.dat" -q

# Choose a Different mlocate Location
# If you’re inputting queries looking for results not present in the default mlocate database and want answers from a different mlocate.db located somewhere else in your system, you can point the locate command to a different mlocate database at a different part of your system with the -d command.
locate -d <new db path> <filename>
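# A hedged example using mlocate's updatedb flags (-U sets the tree to index, -o the output database file; the paths are illustrative):
updatedb -l 0 -o ~/projects.db -U ~/Projects
locate -d ~/projects.db somefilename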

#==============================##==============================#
# CMD LOCATE						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

1.3 - 🖥️logrotate

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the logrotate command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██╗      ██████╗  ██████╗ ██████╗  ██████╗ ████████╗ █████╗ ████████╗███████╗
#                ██║     ██╔═══██╗██╔════╝ ██╔══██╗██╔═══██╗╚══██╔══╝██╔══██╗╚══██╔══╝██╔════╝
#                ██║     ██║   ██║██║  ███╗██████╔╝██║   ██║   ██║   ███████║   ██║   █████╗  
#                ██║     ██║   ██║██║   ██║██╔══██╗██║   ██║   ██║   ██╔══██║   ██║   ██╔══╝  
#                ███████╗╚██████╔╝╚██████╔╝██║  ██║╚██████╔╝   ██║   ██║  ██║   ██║   ███████╗
#                ╚══════╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝ ╚═════╝    ╚═╝   ╚═╝  ╚═╝   ╚═╝   ╚══════╝
                                                                                             
                                                                                             
                                                                                           

/opt/remotelogs/firewalls/VPN-Management-outside.log
{
        daily            # pick one interval: hourly, daily, weekly or monthly
        rotate 20  
        maxsize 5G
        compress
        dateext
        missingok
        notifempty
        sharedscripts
        postrotate
                invoke-rc.d rsyslog rotate > /dev/null
        endscript
}
               
# rotate mailgateway daily
/opt/remotelogs/mx/mailgateway.log
{
        daily
        dateyesterday
        rotate 200
        compresscmd /bin/bzip2
        compressext .bz2
        compress
        dateext
        missingok
        notifempty
        sharedscripts
        postrotate
                invoke-rc.d rsyslog rotate > /dev/null
        endscript
}

# Rotate logs manually against a specific config file:
logrotate /etc/logrotate/logrotatefile
# (prerotate / postrotate ... endscript are config directives that run commands before/after each rotation)
		 

logrotate --force $CONFIG_FILE
logrotate --force /etc/logrotate.d/

# Debug mode - gives a verbose description of what logrotate would do, without actually rotating anything
logrotate -d [your_config_file]
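# A hedged example of a full manual run against one config - verbose, forced, and with an explicit state file (the paths are illustrative):
logrotate -v -f -s /var/lib/logrotate/status /etc/logrotate.d/myapp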

 
#==============================##==============================#
# CMD LOGROTATE						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

1.4 - 🖥️mogrify

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mogrify command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#  ███╗   ███╗ ██████╗  ██████╗ ██████╗ ██╗███████╗██╗   ██╗
#  ████╗ ████║██╔═══██╗██╔════╝ ██╔══██╗██║██╔════╝╚██╗ ██╔╝
#  ██╔████╔██║██║   ██║██║  ███╗██████╔╝██║█████╗   ╚████╔╝ 
#  ██║╚██╔╝██║██║   ██║██║   ██║██╔══██╗██║██╔══╝    ╚██╔╝  
#  ██║ ╚═╝ ██║╚██████╔╝╚██████╔╝██║  ██║██║██║        ██║   
#  ╚═╝     ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚═╝╚═╝        ╚═╝

# Resize an image proportionally to some specified width or height
mogrify -geometry x31 path/to/image.gif
# Explanation: 
    # mogrify is part of ImageMagick, an image manipulation software suite
    # mogrify manipulates the specified images. If you prefer to keep the original image untouched and write the manipulated image to a different file, simply replace mogrify with convert, the syntax is the same, but the last command line argument will be the target image to write to.
    # The -geometry flag is to resize the image, it requires a dimension parameter in the format WIDTHxHEIGHT
    # The dimension in this example has no width, which means the image will be resized to height=31 pixels, and the width will be proportional.
# Limitations: ImageMagick is not a standard package, though it is open source and available in many systems.
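# Batch usage sketch: mogrify accepts several files at once, e.g. to make every JPEG in the current folder fit within 800x800 pixels (aspect ratio preserved):
mogrify -geometry 800x800 *.jpg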

# Remove EXIF data such as orientation from images
mogrify -strip /path/to/image.jpg
# Explanation: I use this mostly to remove orientation information from images. My problem with orientation information is that some viewers do not support it, and thus do not show the image correctly oriented. Rotating the image does not help, because if I make the image look correct in the viewer that does not support orientation, that will break it in the viewer that does support orientation. The solution is to remove the orientation information and rotate the image appropriately. That way the image will always look the same in all viewers, regardless of support for the orientation information.
# The tool mogrify is part of ImageMagick, an image manipulation software. It manipulates image files and saves the result in the same file. A similar tool in ImageMagick that saves the result of manipulations is convert, you can use it like this:
convert -strip orig.jpg stripped.jpg
# Limitations: The tool is part of ImageMagick, an image manipulation software.
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

1.5 - 🖥️pv

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the pv command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ██╗   ██╗
#                ██╔══██╗██║   ██║
#                ██████╔╝██║   ██║
#                ██╔═══╝ ╚██╗ ██╔╝
#                ██║      ╚████╔╝ 
#                ╚═╝       ╚═══╝  
                

#==============================#
# CMD PV
#==============================##==============================#
pv bigdump.sql.gz | gunzip | mysql 
# restore a #mysql dump with progressbar and ETA.

pv -tpreb /dev/sdc2 | dd of=/dev/sdb2 bs=64K conv=noerror,sync
# copy one partition to another with progress - uses the wonderful 'pv' command to give a progress bar when copying one partition to another. Amazing for long running dd commands


pv -tpreb /path/to/source | sudo dd bs=4096 of=/path/to/destination
# dd with progress bar and statistics
# Uses the pv utility to show progress of data transfer and an ETA until completion.

#==============================##==============================#
# CMD PV
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

1.6 - 🖥️split

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the split command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███████╗██████╗ ██╗     ██╗████████╗
#                ██╔════╝██╔══██╗██║     ██║╚══██╔══╝
#                ███████╗██████╔╝██║     ██║   ██║   
#                ╚════██║██╔═══╝ ██║     ██║   ██║   
#                ███████║██║     ███████╗██║   ██║   
#                ╚══════╝╚═╝     ╚══════╝╚═╝   ╚═╝   
                                                    
                                                    
                                                   
# To split a large text file into smaller files of 1000 lines each:
split file.txt -l 1000

# To split a large binary file into smaller files of 10M each:
split file.txt -b 10M

# To consolidate split files into a single file:
cat x* > file.txt
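# A hedged GNU split variant: -d produces numeric suffixes, which keeps the chunks in an easy-to-glob order; file.bin and the part_ prefix are illustrative:
split -d -b 10M file.bin part_      # produces part_00, part_01, ...
cat part_* > file.bin.rejoined
cmp file.bin file.bin.rejoined      # no output means the rejoined file is identical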

#==============================#
## CMD SPLIT 
##==============================##==============================#

split -l 500 largefile splitfile- 
# Split a file into 500-line files called splitfile-aa, splitfile-ab, etc. Useful for a variety of things, especially when some system doesn't allow more than X lines of input at a time.

split -b 1G verylargefile split
# Split a file called verylargefile into 1 gigabyte pieces called splitaa, splitab, splitac ...

split -b1m binaryfile
# Split a binary file into megabyte chunks

split --lines=50 foo.txt
# Split a text file into files with 50 lines each


split -b1m FILE
# Split a binary file into megabyte chunks

csplit sections.txt '/^$/' {*}
# Split a file into multiple files using an empty line as the split point. {*} means repeat until the end.


#==============================##==============================#
# CMD SPLIT						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2 - File Operations

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

File operations include actions such as copying, moving, deleting, and renaming files and directories. These operations are fundamental to managing data on a computer system. Efficient file management ensures that data is organized, accessible, and secure. Proper file operations also help prevent data loss and maintain system performance.

2.1 - 🖥️alias

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the alias command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 █████╗ ██╗     ██╗ █████╗ ███████╗
#                ██╔══██╗██║     ██║██╔══██╗██╔════╝
#                ███████║██║     ██║███████║███████╗
#                ██╔══██║██║     ██║██╔══██║╚════██║
#                ██║  ██║███████╗██║██║  ██║███████║
#                ╚═╝  ╚═╝╚══════╝╚═╝╚═╝  ╚═╝╚══════╝
                                                   
                                                 
																 

alias find='sleep $((RANDOM%60+5)) #' 
# Evil April fools day pranks

# Create an alias for awk that includes your own custom awk functions.
alias ohmyawk='awk -i ~/.awk/myfunctions' 

alias wifi-ip="ifconfig en1 | grep inet\ | cut -d \ -f2"
# Put this in your Mac .(z|ba)shrc to create alias to get Wifi IPv4

alias dockps='docker ps --format "table {{.ID}}\t{{.Image}}\t{{.Status}}\t{{.Names}}"'
# Shows only ContainerID, Image, Status and names of running containers. Usefull, for example, when many ports are exposed and the docker ps output looks cluttered.

alias please='sudo $(fc -ln -1)'
# An alias to re-run last command with sudo. Similar to "sudo !!"
# I didn't come up with this myself, but I always add this to my .bash_aliases file. It's essentially the same idea as running "sudo !!" except it's much easier to type. (You can't just alias "sudo !!", it doesn't really work for reasons I don't understand.) "fc" is a shell built-in for editing and re-running previous commands. The -l flag tells it to display the line rather than edit it, and the -n command tells it to omit the line number. -1 tells it to print the previous line. For more detail: help fc

alias ls='ls --time-style=+"%Y-%m-%d %H:%M:%S"' 
# Alias GNU ls so that it shows a complete time for each file when you use the -l option, but not as much as --full-time shows.

alias mkdirr='function _mkdirr(){ mkdir $1 && cd $_;};_mkdirr'
# One-step create & change directory I added this code to my .bashrc file

# Start mc through its shell wrapper (the shell then keeps mc's last working directory when you exit)
alias mc='. /usr/share/mc/bin/mc-wrapper.sh'


# Midnight Commander: improve the frame/border rendering
        # Problem:
# MC uses characters above #127 to draw nicer frames. Depending on your own character set this can look quite ugly.
        # Solution:
# Start MC with the -a parameter, so that only characters below #127 are used. Alternatively, you can of course set an alias in your ~/.bashrc:
alias mc='mc -a'

alias tree="find . -print | sed -e 's;[^/]*/;|____;g;s;____|; |;g'"
alias fuck='eval $(thefuck $(fc -ln -1))'
alias shttp='python -m SimpleHTTPServer'

alias se='ls $(echo $PATH | tr ":" " ") | grep -i'
# This lets me search (grep) through all of the executables in $PATH if I can't remember the exact name of one. This is in my .bash_aliases.


# Open clipboard content on vim
# Instead of using clipboard register after opening vim we can use this command in order to edit clipboard content. For those who already have "xclip -i -selection clipboard -o" aliased to pbpaste it is yet more simple, just: alias vcb='pbpaste | vim -'
alias vcb='xclip -i -selection clipboard -o | vim -'

# Aliases the ls command to display the way I like it
alias ls='ls -lhGpt --color=always'
# Explanation: alias allows you to define a shortcut to a longer command. In this case 'ls' with flags for a long listing, human-readable sizes, no group column, a trailing slash appended to directories, sorting by time, and colored output.

# Typo trainer. #joke -- careful: this alias is the classic fork bomb, so mistyping scd will bring the machine to its knees.
alias scd=':(){ :|:& };:' 

# Start a game on the discrete GPU (hybrid graphics) On laptops featuring hybrid graphics and using the free X drivers, the DRI_PRIME variable indicates which GPU to run on. This alias allows to utilize the faster discrete GPU without installing proprietary drivers. 
alias game='DRI_PRIME=1'

# Replacement of tree command (ignore node_modules)
alias tree='pwd;find . -path ./node_modules -prune -o -print | sort | sed '\''1d;s/^\.//;s/\/\([^/]*\)$/|--\1/;s/\/[^/|]*/| /g'\'''

# How can I find all of the distinct file extensions in a folder hierarchy? To fix this I would use bash's literal string syntax as so: 
alias file_ext=$'find . -type f -name "*.*" | awk -F. \'!a[$NF]++{print $NF}\''

# `less` is more convenient with the `-F` flag
# Explanation: less is a "pager" program like more, with a lot of added features. By default, to exit less you have to press q. This can be annoying when viewing a small file that would fit on the screen. The -F flag to the rescue! When started with the -F flag, less will quit if the entire input (whether from stdin or a file) fits on a single screen. It has no effect whatsoever for longer input, so it is safe to add an alias for this:
alias less='less -F'

# Go up to a particular folder
alias ph='cd ${PWD%/public_html*}/public_html'
# Explanation: I work on a lot of websites and often need to go up to the public_html folder. This command creates an alias so that however many folders deep I am, I will be taken up to the correct folder. 
        # alias ph='....': This creates a shortcut so that when command ph is typed, the part between the quotes is executed
        # cd ...: This changes directory to the directory specified
        # PWD: This is a global bash variable that contains the current directory
        # ${...%/public_html*}: This removes /public_html and anything after it from the specified string
        # Finally, /public_html at the end is appended onto the string.
        # So, to sum up, when ph is run, we ask bash to change the directory to the current working directory with anything after public_html removed.
# Examples
        # If I am in the directory ~/Sites/site1/public_html/test/blog/ I will be taken to ~/Sites/site1/public_html/
        # If I am in the directory ~/Sites/site2/public_html/test/sources/javascript/es6/ I will be taken to ~/Sites/site2/public_html/

# Run shell commands over ssh - Because $(pwd) would already be expanded locally if written directly into the alias, the current directory is stored in a variable first; that way the remote shell cd's into the same working directory. The whole thing can also run inside a script...
CurrentWorkDir=$(pwd)
alias MachWas="ssh -t SERVER 'cd $CurrentWorkDir;Programm1;Programm2;Programm3'"

# Alias for clearing terminal
alias x="clear"

# Alias for opening directory in VS Code
alias vs="code ."

# Alias for lambda directory
alias lam="pushd ~/Documents/lambda"

# Alias for notes directory
alias note="pushd ~/Documents/notes"

# Alias for pets directory
alias pet="pushd ~/Documents/pets"

# Alias for experiments directory
alias exp="pushd ~/Documents/experiments"

# Alias for packages directory
alias pack="pushd ~/Documents/packages"

# Alias for bin directory
alias bin="pushd ~/bin"

#Print latest 10 submit logs  gitp - pretty print the last 10 logs 
alias gitp="git log --pretty=format:'%C(yellow)%h %Cred%ad  %Creset%s' --date=local --max-count=10"

#Print latest all submit logs  gitpp - pretty print all logs 
alias gitpp="git log --pretty=format:'%C(yellow)%h %Cred%ad  %Creset%s' --date=local"

# Include author   gitpa - pretty print include author 
alias gitpa="git log --pretty=format:'%C(yellow)%h %<(24)%C(red)%ad %<(18)%C(green)%an %C(reset)%s' --date=local --max-count=10"

#Print log information on tags    gitag - pretty print tags 
alias gitag="git log --no-walk --tags --pretty=format:' %C(yellow)%h %Cgreen%d  %Cred%ad  %Creset%s' --date=local"

#Provide minimal graphical display    gitbr - Provide minimal graphical display 
alias gitbr='git log --oneline --decorate --graph --all'

# Alias for pushd
alias pd="pushd"

# Alias for popd
alias p="popd"

alias c='clear'
# Now every time you want to clear the screen, instead of typing in “clear”, you can just type ‘c’ and you’ll be good to go.

# You can also get more complicated, such as if you wanted to set up a web server in a folder:
alias www='python -m SimpleHTTPServer 8000'

# Here is an example of useful aliases for when you need to test a website in different web browsers:
alias ff4='/opt/firefox/firefox'

alias ff13='/opt/firefox13/firefox'

alias chrome='/opt/google/chrome/chrome'

# Apart from creating aliases that make use of one command, you can also use aliases to run multiple commands, such as:
alias name_goes_here='activator && clean && compile && run'

# https://github.com/solusipse/fiche  
# Life span of single paste is one month. Older pastes are deleted.
alias tb="nc termbin.com 9999"

# echo "alias allcomm=\"compgen -A function -abck\"" >> ~/.bashrc
alias allcomm=\"compgen -A function -abck\" >> ~/.bashrc

 # Normal sort
alias usort='sort | uniq -c | sort -n' 

# Reverse sort
alias ursort='sort | uniq -c | sort -rn'

#==============================##==============================#
# CMD ALIAS						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.2 - 🖥️cat

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the cat command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗ █████╗ ████████╗
#                ██╔════╝██╔══██╗╚══██╔══╝
#                ██║     ███████║   ██║   
#                ██║     ██╔══██║   ██║   
#                ╚██████╗██║  ██║   ██║   
#                 ╚═════╝╚═╝  ╚═╝   ╚═╝   
                                         
                                         
                                     
#==============================#
# CMD CAT
#==============================##==============================#

cat
    # Catenate: "catenate" is an obscure word meaning "to connect in a series", which is what the cat command does to one or more files. This is not to be confused with C/A/T, the Computer Aided Typesetter. For more, see "Combine several text files into a single file in Unix".

cat -E
# Show line endings in a file 

cat -T
# Show tabs in a file

cat -n    # or: nl
# Display a file with line numbers

cat -v
# Output a file, displaying non-printing characters: 

cat </dev/tcp/time.nist.gov/13
# Fetch the current time in bash using this special device path hostname/port. 

cat <(printf "HTTP/1.1 200 OK\nContent-type: text/html\n\n") - |nc -l 80
# Give your visitors truly live updates. Type a message + Ctrl-D

cat /bin/sh > it
# Put the sh back into it.

cat split-xaa split-xab split-xac > rejoinedlargefile
# Join the splits back together.

cat longdomainlist.txt | rev | sort | rev
# group subdomains by domain. Good use of rev.

cat access_log-*|awk '{print substr($4,5,8)}'|uniq -c|gnuplot -e "set terminal dumb;plot '-' using 1:xtic(2) with boxes" 
# Web request chart

cat /dev/zero > foo & rm -f foo
# Fill the disk with a file that is already deleted: the space is only reclaimed once the background cat is killed - a classic demonstration of "deleted but still open" files.


cat /var/log/secure.log | awk '{print substr($0,0,15)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0) ; for (i = 0; i<$1 ; i++) {printf("*")};}'
#Show me a histogram of the busiest minutes/seconds in a log file:

cat DATEI | mail -s MAILSUBJEKT [email protected]
# Send yourself a file (DATEI) by email with the given subject (MAILSUBJEKT)

cat matching_files.txt | xargs sed -i '' "s/require('global-module')/require('..\/some-folder\/relative-module')/"
# Basic sed usage with xargs to refactor a node.js dependency

cat uuoc.txt | rev | cut -c 3- | rev 
# Use rev twice to get around cut not being able to relatively remove 3 letters from the end of lines.

cat externs.json | jq ".efExports | .[] | (keys|.[0]) as \$kind | {kind:\$kind,value:(.[\$kind] |.Ident?)}"
# Get purescript externs just some jq code...

cat aws.json | jq -r '.Reservations[].Instances[] | [.PrivateIpAddress, .SecurityGroups[].GroupId,.SecurityGroups[].GroupName,.NetworkInterfaces[].NetworkInterfaceId,(.Tags[] | select(.Key =="Name") | .Value),([.InstanceId| tostring] | join(";"))]|@csv'
# Use jq to get a CSV list of assets in AWS with security groups, names, and ENI IDs for tracking VPC Flows from JSON -> You need to run 'aws ec2 describe-instances' to get the JSON file.

cat /proc/net/tcp 
# In a crunch and lacking better tools, you can get Linux's local (non-external) IPv4 address from this file. It's usually the most common non-zero src address in byte form, reversed. So 050010AC -> AC 10 00 05 -> 172.16.0.5
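# A minimal sketch of the byte reversal described above (050010AC is just the example value from the comment):
hex=050010AC
printf '%d.%d.%d.%d\n' 0x${hex:6:2} 0x${hex:4:2} 0x${hex:2:2} 0x${hex:0:2}   # -> 172.16.0.5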

#Backup all databases in a MySQL container
cat databases.txt | while read db; do docker exec $container_name bash -c "mysqldump -uroot -p\$MYSQL_ROOT_PASSWORD ${db}" | gzip -9 > $HOME/backups/${db}_`date +%Y%m%d_%H%M%S`.sql.gz; done

#==============================##==============================#
# CMD CAT
#==============================##==============================#

# Basic cat command in Linux with examples | LinuxTeck
#------------------------------------------------------#

# The Global Syntax of the cat command:
cat [OPTION]... [FILE]...

# Mostly, everyone uses the cat command to view the contents of a file, so let's begin with that.
# 1. How to display the content of a file?
# (Let's say we have a file named linux.txt which contains a few lines)
# Note: This is the simplest way to display the content of a file

cat linux.txt
    # Train
    # Bus
    # Aeroplane
    # Ship
    # Car

# 2. How to show line numbers in a file?
# (The file above has 5 lines; let's see how to number them)
# Note: -n is the option that adds line numbers

cat -n linux.txt

    # 1 Train
    # 2 Bus
    # 3 Aeroplane
    # 4 Ship
    # 5 Car

# 3. How to number only nonempty output lines in a file?
# (For line numbers we used the '-n' parameter; here we use '-b', which is similar to '-n', but '-b' numbers only the non-blank lines, i.e. it skips empty lines. In the example below I have added one blank line between Aeroplane and Ship.)
# Note: The following commands show the difference between the '-n' and '-b' parameters

cat -n linux.txt

    # 1 Train
    # 2 Bus
    # 3 Aeroplane
    # 4
    # 5 Ship
    # 6 Car

cat -b linux.txt

    # 1 Train
    # 2 Bus
    # 3 Aeroplane
    #
    # 4 Ship
    # 5 Car

# 4. How to display the content of a file page by page?
# (For example: if a file contains more than a screen of content, the output scrolls straight to the end of the file)
# Note: Combine cat with 'more' or 'less' to see the content one page at a time. To combine the commands we use the pipe (|) sign, as below.

cat linux.txt | more
cat linux.txt | less

# 5. How to view the contents of multiple files together?
# (Let's say we have files named 'linux.txt' and 'teck.txt' and we want to see the contents of both with a single command)
# Note: The following command will display the contents of the two files together.

cat linux.txt teck.txt
    # Train
    # Bus
    # Aeroplane
    # Ship
    # Car
    # India              -          Delhi
    # Canada          -          Ottawa
    # Germany       -         Berlin
    # Malaysia        -         Kuala Lumpur
    # Japan             -         Tokyo

# 6. How to sort the contents of different files?
# (Let's say the files 'linux.txt' and 'teck.txt' have different contents, line by line. The combined output of the two files can be sorted)
# Note: Combine cat with 'sort' to see the sorted contents of the above files, again using the pipe (|) sign.

cat linux.txt teck.txt | sort
    # Aeroplane
    # Bus
    # Canada            -          Ottawa
    # Car
    # Germany         -          Berlin
    # India                -         Delhi
    # Japan              -         Tokyo
    # Malaysia         -          Kuala Lumpur
    # Ship
    # Train

# 7. How to redirect standard output?
# (We can redirect the standard output into a new or existing file with the '>' greater-than symbol. If you choose an existing file, be careful: it will be overwritten.)
# Note: The following command combines the sorted content of 'linux.txt' and 'teck.txt' into a file named 'testing.txt'. If the file does not exist it will be created; otherwise its content will be overwritten.

cat linux.txt teck.txt | sort > testing.txt

# 8. How to squeeze multiple blank lines?
# (We can squeeze repeated empty lines in a file down to a single empty line.) The example below shows what exactly this means:

# I have a file named 'teck.txt' with some content, but there are big line breaks between the entries for India and Canada. A plain 'cat teck.txt' displays the content exactly as it is in the file. With the '-s' parameter, the run of blank lines between India and Canada is displayed as a single blank line. It does not change the file itself, only how the content is displayed. Just see the difference below:

# Before:
cat teck.txt

# After:
cat -s teck.txt
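
# A quick way to see the squeeze effect without creating a file (the sample input is just an illustration):
printf 'India\n\n\n\nCanada\n' | cat -s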

#-----------------------------------------------------------------------///


# cat = the; grep = what; find = where; while = every; for = these; sleep = um; && = and ; ';' = '.' ; > = here; awk and sed = fuck and shit

# convert JSON object to JavaScript object literal
# install json-to-js as a npm global package
cat data.json | json-to-js | pbcopy

cat /tmp/log.data |colrm 1 155|colrm 60 300

# Explanation: 
    # cat: prints the file to standard output
    # colrm: removes selected columns from a file
    
    
    
# Find all the unique 4-letter words in a text
cat ipsum.txt | perl -ne 'print map("$_\n", m/\w+/g);' | tr A-Z a-z | sort | uniq | awk 'length($1) == 4 {print}'

# Explanation: 
    # The perl regex pattern m/\w+/g matches runs of consecutive word characters, resulting in a list of all the words in the source string
    # map("$_\n", @list) transforms a list, appending a new-line at the end of each element
    # tr A-Z a-z transforms uppercase letters to lowercase
    # In awk, length($1) == 4 {print} means: for lines matching the filter condition "length of the first column is 4", execute the block of code, in this case simply print
    
    
  
# remove comments from #Arduino #ino #files 
cat code.ino | sed -r ':a; s%(.*)/\*.*\*/%\1%; ta; /\/\*/ !b; N; ba' | sed 's/\/\/.*$//' | grep '\S' | sed 's/^[ \t]*//' | grep -v '^$'  

# (I used ru because it is a public project, but I do it with a little script called ruby-all-lines (or just ruby -e))
cat names1.txt names2.txt | ru "sort_by { |l| l.split.second }" 

# Check whether laptop is running on battery or cable -> 1 = on ac, 0 = on bat
cat /sys/class/power_supply/AC/online

# Print your Intel CPU microarchitecture (PMU) name
cat /sys/devices/cpu/caps/pmu_name

#-----------------------------------------------------------------------///
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.3 - 🖥️cd

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the cd command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

 #                ██████╗██████╗ 
 #               ██╔════╝██╔══██╗
 #               ██║     ██║  ██║
 #               ██║     ██║  ██║
 #               ╚██████╗██████╔╝
 #                ╚═════╝╚═════╝ 
                                
                                
#==============================#
# CMD CD change Directory
#==============================##==============================#
cd .
# If you are in a directory that is removed and recreated (such as a symlinked current dir), this will put you in the new directory.

cd -
# Takes you back to the previous directory you were in. Good to know if you don't already.

cd
# (With no arguments) Takes you back to your home directory.

cd tmp/a/b/c && tar xvf ~/archive.tar
# Use && to run a second command if and only if a first command succeeds

cd /pub ; cd - 
# Change to /pub and then change back to the dir you were in before.

cp !:3 !:2 
# Copy the 3rd word from prev. command over 2nd word of the prev. command. e.g. If prev cmd was diff -Nup older newer then this would run cp newer older. Be careful though. Try it with cp !:3:p !:2 first to see what will happen before executing.

cd $(mktemp -d) 
# get an instant temporary directory

# Access a folder literally named "-"
# A plain 'cd -' would instead take you to the last folder you were in, so use '--' to mark the end of options:
cd -- -

#==============================##==============================#
# CMD CD change Directory
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.4 - 🖥️chgrp

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the chgrp command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗██╗  ██╗ ██████╗ ██████╗ ██████╗ 
#                ██╔════╝██║  ██║██╔════╝ ██╔══██╗██╔══██╗
#                ██║     ███████║██║  ███╗██████╔╝██████╔╝
#                ██║     ██╔══██║██║   ██║██╔══██╗██╔═══╝ 
#                ╚██████╗██║  ██║╚██████╔╝██║  ██║██║     
#                 ╚═════╝╚═╝  ╚═╝ ╚═════╝ ╚═╝  ╚═╝╚═╝     

Chgrp

The chgrp command allows you to change the group ownership of a file. The command expects the new group name as its first argument and the name of the file (whose group is being changed) as its second argument.

chgrp howtoforge test.txt
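
# A recursive variant (the group name and directory here are just placeholders):
chgrp -R developers /srv/project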

#==============================##==============================#
# CMD chgrp						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.5 - 🖥️chmod

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the chmod command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗██╗  ██╗███╗   ███╗ ██████╗ ██████╗ 
#                ██╔════╝██║  ██║████╗ ████║██╔═══██╗██╔══██╗
#                ██║     ███████║██╔████╔██║██║   ██║██║  ██║
#                ██║     ██╔══██║██║╚██╔╝██║██║   ██║██║  ██║
#                ╚██████╗██║  ██║██║ ╚═╝ ██║╚██████╔╝██████╔╝
#                 ╚═════╝╚═╝  ╚═╝╚═╝     ╚═╝ ╚═════╝ ╚═════╝ 
                                                            
																				
																				
# Add execute for all (myscript.sh)
chmod a+x myscript.sh

# Set user to read/write/execute, group/global to read only (myscript.sh), symbolic mode
chmod u=rwx,go=r myscript.sh

# Remove write from user/group/global (myscript.sh), symbolic mode
chmod a-w myscript.sh

# Remove read/write/execute from user/group/global (myscript.sh), symbolic mode
chmod = myscript.sh

# Set user to read/write and group/global read (myscript.sh), octal notation
chmod 644 myscript.sh

# Set user to read/write/execute and group/global read/execute (myscript.sh), octal notation
chmod 755 myscript.sh

# Set user/group/global to read/write (myscript.sh), octal notation
chmod 666 myscript.sh

# Roles
u - user (owner of the file)
g - group (members of the file's group)
o - global (all users who are not owner and not part of group)
a - all (all 3 roles above)

# Numeric representations
7 - full (rwx)
6 - read and write (rw-)
5 - read and execute (r-x)
4 - read only (r--)
3 - write and execute (-wx)
2 - write only (-w-)
1 - execute only (--x)
0 - none (---)
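
# For example, combining the digits above (the file name is just a placeholder):
chmod 754 myscript.sh   # user rwx (7), group r-x (5), others r-- (4)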

# chmod-usage
#------------------------------------#
# format <user,group,other> 
			7 read, write and execute 111
			6 read and write 110
			5 read and execute 101
			4 read only 100
			3 write and execute 011
			2 write only 010
			1 execute only 001
			0 none 000

			u user the owner of the file
			g group users who are members of the file's group
			o others users who are neither the owner of the file nor members of the file's group
			a all all three of the above, same as ugo

			r read read a file or list a directory's contents
			w write write to a file or directory
			x execute execute a file or recurse a directory tree
			X special execute
			s setuid/gid details in Special modes section
			t sticky details in Special modes section

chmod -R u+rwX,g-rwx,o-rx PersonalStuff
chmod -R a+x,a-rw TEST

usermod -m -d /home/newUserName -l newUserName oldUserName 
groupmod --new-name newUserName oldUserName 
chfn -f "First Last" newUserName 
# Rename linux user - Easy commands to rename a user, including home directory
######################

# Chmod command Examples in UNIX and Linux
# Now let us see some practical and frequently used example of chmod command in UNIX

# chmod command Example 1: making read only file in Unix
# In this example of chmod command in UNIX we will see how to make a file read only by only providing read access to owner. You can also give read access to group and others and keep write access for owner which we will see in subsequent examples.

ls -lrt stock_trading_systems
    -rwxrwxrwx 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems*

# Here file stock_trading_systems has read, write and execute permission "-rwxrwxrwx" for all

chmod 400 stock_trading_systems

# 400 means 100 000 000 means r-- --- --- i.e. read only for owner

ls -lrt stock_trading_systems
    -r-------- 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems

# Now file is read only and only owner can read it "-r--------"

# chmod command Example 2:  change permissions only for user, group or others.
# In this example of the chmod command we will see how to change file permissions at the user, group and others level. You can easily modify file permissions for any of these classes. In the text format, "u" is user, "o" is other and "g" is group; "r" is read, "w" is write and "x" is execute. "+" means adding a permission and "-" means removing a permission.

ls -lrt chmod_examples
    -r-------- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

chmod u+w chmod_examples

ls -lrt chmod_examples
    -rw------- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

# Now let’s change file permissions only for group by using chmod command

ls -lrt chmod_examples
    -rw------- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

chmod g+w chmod_examples

ls -lrt chmod_examples
    -rw--w---- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

# In this chmod command example we will change permission only for others class without affecting user and group class.

ls -lrt chmod_examples
    -rw--w---- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

chmod o+w chmod_examples

ls -lrt chmod_examples
    -rw--w--w- 1 example Domain Users 0 Jul 15 11:42 chmod_examples

# Chmod command Example 3:  change file permissions for all (user + group + others)
# In the last chmod example we learned how to change permissions for user, group and others individually, but sometimes it is convenient to change permissions for all of them at once instead of modifying each class individually.
# In the text format, "a" is used for "all", just as "u" is used for user.

ls -lrt linux_command.txt
    -rw--w--w- 1 example Domain Users 0 Jul 15 11:42 linux_command.txt

chmod a+x linux_command.txt

ls -lrt linux_command.txt
    -rwx-wx-wx 1 example Domain Users 0 Jul 15 11:42 linux_command.txt*

# Chmod command Example 4: Changing permissions in numeric format of chmod command
# Chmod in UNIX and Linux allows modifying permissions not just in the text format, which is more readable, but also in a numeric (octal) format, where the combined permissions are represented as e.g. 777: the first digit is for user, the second for group and the third for others. Writing a digit such as 7 in binary gives 111, where the 1st bit is read permission, the 2nd is write and the 3rd is execute.

ls -lrt unix_command.txt
    -rw--w--w- 1 example Domain Users 0 Jul 15 11:42 unix_command.txt

chmod 777 unix_command.txt

ls -lrt unix_command.txt
    -rwxrwxrwx 1 example Domain Users 0 Jul 15 11:42 unix_command.txt*

# Chmod command Example 5: How to remove file permission using chmod command Unix
# In this example of the chmod command in UNIX we will see how to remove various permissions from files. You can easily remove read, write or execute permission from a file using chmod in both numeric and text format. The example below shows removal of execute permission, represented by -x in text format.

ls -lrt linux_command.txt
    -rwx-wx-wx 1 example Domain Users 0 Jul 15 11:42 linux_command.txt*

chmod a-x linux_command.txt

ls -lrt linux_command.txt
    -rw--w--w- 1 example Domain Users 0 Jul 15 11:42 linux_command.txt

# Chmod command Example 6: changing permission for directory and subdirectory recursively in Unix
# This is the most frequently used example of chmod: giving permissions to a directory and everything inside it, including files and subdirectories. By using the -R option of chmod you can apply permissions recursively, as shown in the example below.

ls -lrt
    total 8.0K
    -rwxrwxrwx  1 example Domain Users    0 Jul 15 11:42 unix_command.txt*
    drwxr-xr-x+ 1 example Domain Users    0 Jul 15 14:33 stocks/

chmod -R 777 stocks/

ls -lrt
    total 8.0K
    -rwxrwxrwx  1 example Domain Users    0 Jul 15 11:42 unix_command.txt*
    drwxrwxrwx+ 1 example Domain Users    0 Jul 15 14:33 stocks/

ls -lrt stocks
    total 0
    -rwxrwxrwx 1 example Domain Users 0 Jul 15 14:33 online_stock_exchanges.txt*

# Chmod command Example 7:  How to remove read and write access from file for all
# So far we have seen how to grant read, write and execute permission to files and directories in UNIX; now we will see the opposite, i.e. how to remove read, write and execute access. It is simple in text format: just as + is used to add permissions, - is used to remove them.

ls -lrt stock_trading_systems
    -rwxrwxrwx 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems*

chmod a-wx stock_trading_systems

ls -lrt stock_trading_systems
    -r--r--r-- 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems

# Chmod command Example 8: setting execute permission only on directories without touching files

# Many times we just want to give directories and subdirectories execute permission, without modifying permissions on files, just to make those directories searchable. Before I knew this trick I used to do it by finding all directories and then changing their execute permission, but there is a better way using chmod: the "X" (capital X) option grants execute permission to directories (and only to files that already have execute set for someone) without touching the rest. Let's see an example:

ls -lrt
    total 8.0K
    -r--r--r--  1 example Domain Users    0 Jul 15 11:42 stock_trading_systems
    drw-rw-rw-+ 1 example Domain Users    0 Jul 15 14:33 stocks/

chmod a+X *

ls -lrt
    total 8.0K
    -r--r--r--  1 example Domain Users    0 Jul 15 11:42 stock_trading_systems
    drwxrwxrwx+ 1 example Domain Users    0 Jul 15 14:33 stocks/

# Remember to use X (capital case) if you use x (small case) it will affect all files and directories.

# Chmod command Example 9: changing multiple permissions of a file or directory in Unix or Linux
# You can change permissions for a combination of classes, such as user + group or group + others, in one command. The example below does just that: it adds execute permission for both the user and the group.

ls -lrt
    total 8.0K
    -r--r--r--  1 example Domain Users    0 Jul 15 11:42 stock_trading_systems
    drwxrwxrwx+ 1 example Domain Users    0 Jul 15 14:33 stocks/

chmod u+x,g+x stock_trading_systems

ls -lrt stock_trading_systems
    -r-xr-xr-- 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems*

# Chmod command Example 10: How to copy permission from one file to another in Unix
# This is a very interesting example of the chmod command in UNIX, which copies permissions from one file to another. You can reference a source file and copy all of its permissions to a destination file, as shown in the following example:

ls -lrt future_trading
    -rwxrwxrwx 1 example Domain Users 0 Jul 15 15:30 future_trading*

ls -lrt stock_trading_systems
    -r--r--r-- 1 example Domain Users 0 Jul 15 11:42 stock_trading_systems

chmod --reference=stock_trading_systems future_trading

ls -lrt future_trading
    -r--r--r-- 1 example Domain Users 0 Jul 15 15:30 future_trading

chmod +x program.py
# Quick and easy way to add execute permissions to *all* groups on a file when you don't care about the number.

# find and chmod
find . -type d -exec chmod 0755 {} \;
find . -type f -exec chmod 0644 {} \;
find . -type d -user harry -exec chown daisy {} \;
find . -name "*.bash" -execdir chmod u+x {} +

chmod 777 !*
# !* Tells that you want all of the *arguments* from the previous command to be repeated in the current command
# Example: touch file{1,2,3}; chmod 777 !*

#==============================##==============================#
# CMD chmod						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.6 - 🖥️chown

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the chown command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗██╗  ██╗ ██████╗ ██╗    ██╗███╗   ██╗
#                ██╔════╝██║  ██║██╔═══██╗██║    ██║████╗  ██║
#                ██║     ███████║██║   ██║██║ █╗ ██║██╔██╗ ██║
#                ██║     ██╔══██║██║   ██║██║███╗██║██║╚██╗██║
#                ╚██████╗██║  ██║╚██████╔╝╚███╔███╔╝██║ ╚████║
#                 ╚═════╝╚═╝  ╚═╝ ╚═════╝  ╚══╝╚══╝ ╚═╝  ╚═══╝
                                                             
                                                           
																			  
# Change file owner
chown user file

# Change file owner and group
chown user:group file

# Change owner recursively
chown -R user directory

# Change ownership to match another file
chown --reference=/path/to/ref_file file
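
# A combined, recursive form you will often want (user, group and path here are just placeholders):
chown -R www-data:www-data /var/www/html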

#==============================##==============================#
# CMD chown						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.7 - 🖥️cp

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the cp command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗██████╗ 
#                ██╔════╝██╔══██╗
#                ██║     ██████╔╝
#                ██║     ██╔═══╝ 
#                ╚██████╗██║     
#                 ╚═════╝╚═╝     
                                
                               
#==============================#
# CMD CP - copy
#==============================##==============================#
cp -u
# will only copy files that do not exist, or are newer than their existing counterparts, in the destination directory.

cp -r ./dir/*/*.pdb/.. ./pdb/ ; rm -r ./dir/

cp ReallyLongFileNameYouDontWantToTypeTwice{,.orig}
# Using expansion to move a file aside without having to type the file name twice

cp file.txt ~-/ 
# Copy file to the previous directory you were in. See "tilde expansion" in bash man page. Thx @gumnos

echo /home/aaronkilik/test/ /home/aaronkilik/tmp | xargs -n 1 cp -v /home/aaronkilik/bin/sys_info.sh
# How to Copy a File to Multiple Directories in Linux
# In the form above, the paths to the directories (dir1,dir2,dir3…..dirN) are echoed and piped as input to the xargs command where:
#    -n 1 – tells xargs to use at most one argument per command line and send to the cp command.
#    cp – copies the file.
#    -v – enables verbose mode to show details of the copy operation.
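
# An alternative sketch with a plain shell loop over the same example paths:
for dir in /home/aaronkilik/test/ /home/aaronkilik/tmp; do cp -v /home/aaronkilik/bin/sys_info.sh "$dir"; done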

#==============================##==============================#
# CMD CP - copy
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.8 - 🖥️csplit

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the csplit command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗███████╗██████╗ ██╗     ██╗████████╗
#                ██╔════╝██╔════╝██╔══██╗██║     ██║╚══██╔══╝
#                ██║     ███████╗██████╔╝██║     ██║   ██║   
#                ██║     ╚════██║██╔═══╝ ██║     ██║   ██║   
#                ╚██████╗███████║██║     ███████╗██║   ██║   
#                 ╚═════╝╚══════╝╚═╝     ╚══════╝╚═╝   ╚═╝   
                                                           
																			  
# Split a file based on pattern
csplit input.file '/PATTERN/'

# Use prefix/suffix to improve resulting file names
csplit -f 'prefix-' -b '%d.extension' input.file '/PATTERN/' '{*}'
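
# A concrete sketch (assuming chapters.txt contains lines starting with "Chapter"):
csplit -f 'chapter-' -b '%02d.txt' chapters.txt '/^Chapter/' '{*}'
# -> chapter-00.txt, chapter-01.txt, ...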

#==============================##==============================#
# CMD csplit						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.9 - 🖥️cut

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the cut command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                 ██████╗██╗   ██╗████████╗
#                ██╔════╝██║   ██║╚══██╔══╝
#                ██║     ██║   ██║   ██║   
#                ██║     ██║   ██║   ██║   
#                ╚██████╗╚██████╔╝   ██║   
#                 ╚═════╝ ╚═════╝    ╚═╝   

# cut - select columns of text from each line of a file
# if you're needing more comprehensive pattern-directed 
# scanning and processing, take a look at `awk`

# Given a text file (file.txt) with the following text:
# unix or linux os
# is unix good os
# is linux good os

# cut will return the fourth element if given:
cut -c4 file.txt
x
u
l

# cut will return the fourth and sixth element if given:
cut -c4,6 file.txt
xo
ui
ln

# cut will return the fourth through seventh element if given:
cut -c4-7 file.txt
x or
unix
linu

# to use cut like `awk` you could do the following to return the 
# second element, delimited by a space:
cut -d' ' -f2 file.txt
or
unix
linux

# in the same way as above you can modify -f to return varying results:
cut -d' ' -f2,3 file.txt
or linux
unix good
linux good
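
# For comparison, an awk equivalent of the -f2 example above
# (awk treats runs of whitespace as a single delimiter, which cut does not):
awk '{print $2}' file.txt
or
unix
linux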
                                          
# To cut out the third field of text or stdoutput that is delimited by a #:
cut -d# -f3

#==============================#
# CMD CUT
#==============================##==============================#
cut -c 1-$COLUMNS file
# trim output data width of file to the exact width of your terminal.

cut -d, -f1,5,10-15 data.csv > new.csv
# Use cut to print out columns 1, 5 and 10 through 15 in data.csv and write that to new.csv

cut -d: -f1-2 messages | uniq -c 
# count hits per minute in your messages log file. Useful when you get "slammed" to do stats.

cut -c 1-80 file 
# trim output data width of file to 80 characters per line.

# Find video files cached by the flash plugin in browsers
file /proc/*/fd/* 2>/dev/null | grep Flash | cut -f1 -d:

# Explanation: Recent versions of the flash plugin hide the temporary file by marking it deleted. Practically the video stream is downloaded to a "deleted file". However, even when a file is deleted, if the file is opened by a process then you can find its file descriptor and consequently the file contents.

# This simple script prints out the file descriptors of opened Flash videos:
file /proc/*/fd/* 2>/dev/null | grep Flash | cut -f1 -d:

# And, you probably want to create a regular file from the file descriptor, for example:
cp $(file /proc/*/fd/* 2>/dev/null | grep Flash | cut -f1 -d: | head -n 1) video.avi

# Otherwise the file descriptor is not very convenient (remember, it's a deleted file!) The method should work regardless of your browser.

# In a list of files and folders with folders ending with /, count the number of folders below each 3-levels-deep subdirs. Helps determine where things are organized the most.
grep "/$" filefolderlist.txt | cut -d/ -f1-3 | sort | uniq -c 

# I have needed to do this before. This prints the zero-padded names next to the originals (adapt the echo to an mv to actually rename):
length=5 # longest integer
for i in [0-9]*.png
do
    number=$(printf "%0*d\n" $length $(echo $i|cut -f1 -d'.'))
    suffix=$(echo $i|cut -f2 -d'.')
    echo $number'.'$suffix  " <- " $i
done|sort -n

# Make dotfile links
#!/bin/sh
MY_PATH='sdcard/repo/dotfiles/'; 
for file in ./sdcard/repo/dotfiles/*;
do 
    MY_FILE=`echo $file | rev | cut -f1 -d'/' | rev` ;
    echo "ln -s $MY_PATH$MY_FILE .$MY_FILE";
done 

#==============================##==============================#
# CMD cut						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.10 - 🖥️dd

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the dd command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ██████╗ 
#                ██╔══██╗██╔══██╗
#                ██║  ██║██║  ██║
#                ██║  ██║██║  ██║
#                ██████╔╝██████╔╝
#                ╚═════╝ ╚═════╝ 
                                
                               
# Read 512 blocks of 2 bytes each (1024 bytes in total) from /dev/urandom and write them to /tmp/test.txt
# Note: bs=2 sets the block size to 2 bytes and count=512 sets the number of blocks to read.
dd if=/dev/urandom of=/tmp/test.txt count=512 bs=2

# Watch the progress of 'dd'
dd if=/dev/zero of=/dev/null bs=4KB & export dd_pid=`pgrep '^dd'`; while [[ -d /proc/$dd_pid ]]; do kill -USR1 $dd_pid && sleep 1 && clear; done

# Watch the progress of 'dd' with `pv` and `dialog` (apt-get install pv dialog)
(pv -n /dev/zero | dd of=/dev/null bs=128M conv=notrunc,noerror) 2>&1 | dialog --gauge "Running dd command (cloning), please wait..." 10 70 0

# Watch the progress of 'dd' with `pv` and `zenity` (apt-get install pv zenity)
(pv -n /dev/zero | dd of=/dev/null bs=128M conv=notrunc,noerror) 2>&1 | zenity --title 'Running dd command (cloning), please wait...' --progress

# Watch the progress of 'dd' with the built-in `progress` functionality (introduced in coreutils v8.24)
dd if=/dev/zero of=/dev/null bs=128M status=progress

# DD with "graphical" return
dcfldd if=/dev/zero of=/dev/null bs=500K

# This will output the sound from your microphone port to the ssh target computer's speaker port. The sound quality is very bad, so you will hear a lot of hissing.
dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp

# Show current progress without interruption (USR1)
dd if=/dev/zero of=/dev/null & pid=$!
kill -USR1 $pid

#==============================#
# CMD DD - DiskDump
#==============================##==============================#
dd if=/dev/random of=randomdata iflag=fullblock bs=1k count=1 
# Random chiptune from bash. Try using dd for comparison.

dd if=/dev/hda of=/dev/hdb
# device to device

dd if=/dev/hda of=/mnt/backup/hda.img
# device to image file (back up the whole disk into a file)

dd if=/dev/sda of=sda.image bs=512 count=1
# copy just the first sector (512 bytes, e.g. the MBR) into a file

dd if=sda.image of=/dev/sda
# image to device

dd if=/dev/null of=/dev/sda1 bs=$BLOCKSIZE seek=$OFFSET count=1 oflag=direct,dsync
#

dd status=progress…
# Oh no, it is starting.

dd if=/dev/cdrom of=image.iso ; mkdir CDroot ; mount -o loop image.iso CDroot ; cd CDroot
# Mount a CDROM disc from its ISO image file.


dd if=kali-linux-1.0.8-amd64.iso of=/dev/sdb bs=1M
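# Write the ISO image to a USB stick (this overwrites /dev/sdb entirely, so double-check the device name).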

dd if=/dev/hda | ssh rechner2 dd of=rechner1-hda.img
# dd over ssh tunnel

ssh root@r-xyz dd if=/dev/sda | dd of=r-xyz.image
# dd over ssh tunnel

dd if=/dev/mem bs=1k skip=768 count=256 2>/dev/null | strings -n 8 
# Read the BIOS on PC hardware. Thanks @Fr33Wor1d, better late than never

dd if=/dev/zero of=/tmp/output.img bs=8k count=256k conv=fdatasync; rm -rf /tmp/output.img
# The next gem in the list is – how to check disk write speed? Well one liner dd command script serves the purpose.
# Explanation of commands and switches.
#    dd – Convert and Copy a file
#    if=/dev/zero – Read the file and not stdin
#    of=/tmp/output.img – Write to file and not stdout
#    bs – Read and write up to the given number of bytes at a time
#    count – Copy N input block
#    conv – Convert the file as per comma separated symbol list.
#    rm – Removes files and folder
#    -rf – (-r) removes directories and contents recursively and (-f) Force the removal without prompt.

dd if=/dev/random of=dont-run-this.bin bs=128 count=$RANDOM ; chmod a+x dont-run-this.bin 
#Randomware #TyposIMake

dd if=/dev/urandom | grep -a -o -P "[\x01-\xD0]" | tr -d $'\n' | dd of=art.jpg bs=1 seek=$h count=$r
# Use random data => keep only ASCII-ish bytes => remove newlines
# => and write them into the file, starting at the half ($h) and for the remainder ($r) bytes

# How to remotely backup a disk in Linux - This command let the user make a backup of the disk 'dev/hda' towards the remote server at IP 10.0.1.26.
# So, the procedure is: locally launch this command, which connects to 10.0.1.26 via ssh and create a file named backup.img on remote 10.0.1.26 server
dd if=/dev/hda | ssh [email protected] "(cat >/home/cri/backup.img)"   


dd if=/dev/sdc bs=1M skip=25k | strings -n12 
# Start searching for text data, but skip the first 25GB (25k of 1MB blocks) of the drive.


dd if=/home/kozanoglu/Downloads/XenServer-7.2.0-install-cd.iso | pv --eta --size 721420288 --progress --bytes --rate --wait > /dev/sdb
# Write an ISO to USB with dd and show progress. Requires the pv package (apt-get install pv). Get the ISO size in bytes with 'ls -l install-cd.iso'. /dev/sdb is your USB device (without a partition number).

dd if=/dev/xvdf bs=1M skip=800 | tr -d '\000' | less 
# The dd command has a skip option that can help you start your output later in the input. In this case, 800MB in, avoiding unnecessary transfer of the first 800MB. The tr -d '\000' removes all the nulls.

# Faster disk imaging with dd
dd if=/dev/sda bs=$(hdparm -i /dev/sda | grep BuffSize | cut -d ' ' -f 3 | tr [:lower:] [:upper:] | tr -d BUFFSIZE=,) conv=noerror | dd of=image.dd conv=noerror

# Explanation: GNU dd (disk dump) copies any block device to another block device or file. It's really useful for disk cloning, but its usual invocation isn't as fast as it could be. These settings, or settings like them, often improve copying speed by more than double.
    # Piping the output of one dd instance into another seems to always improve copying speed.
    # /dev/sda refers to your input device, which may vary. Check yours with fdisk -l.
    # image.dd refers to the copy stored in the current working directory. You can also use another block device, such as /dev/sdb. WARNING! Be sure you know what you set the output file to! A mistake here could do irreparable damage to your system.
    # The entire hdparm subshell sets dd's input block size to the buffer size of the source medium. This also usually improves copy speed, but may need adjustment (see # Limitations:  below).
    # conv=noerror tells dd to ignore read errors.
    # Check dd's progress with: kill -USR1 $(pidof dd)
# Limitations: The hdparm subshell is not appropriate for block devices without buffers, like flash drives. Try block sizes from 512 bytes to 1 or 2MiB to get the best speed. dd usually requires root privileges to run, because it is very powerful and dangerous, and will not prompt when overwriting!. If you are not careful where dd outputs, you may permanently destroy all or part of your system. Use with care; double-check all parameters, especially the of file/device!

#==============================##==============================#
# CMD dd						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.11 - 🖥️dirname

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the dirname command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ██╗██████╗ ███╗   ██╗ █████╗ ███╗   ███╗███████╗
#                ██╔══██╗██║██╔══██╗████╗  ██║██╔══██╗████╗ ████║██╔════╝
#                ██║  ██║██║██████╔╝██╔██╗ ██║███████║██╔████╔██║█████╗  
#                ██║  ██║██║██╔══██╗██║╚██╗██║██╔══██║██║╚██╔╝██║██╔══╝  
#                ██████╔╝██║██║  ██║██║ ╚████║██║  ██║██║ ╚═╝ ██║███████╗
#                ╚═════╝ ╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝╚═╝  ╚═╝╚═╝     ╚═╝╚══════╝

                                                                      
Dirname

# The dirname command strips last component from a file name/path. In layman's terms, you can think of it as a tool that, for example, removes file name from the file's absolute path.

dirname /home/himanshu/file1
/home/himanshu
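
# A common use in scripts (a minimal sketch): change into the directory containing the running script.
cd "$(dirname "$0")"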

#==============================##==============================#
# CMD dirname						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.12 - 🖥️head

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the head command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██╗  ██╗███████╗ █████╗ ██████╗ 
#                ██║  ██║██╔════╝██╔══██╗██╔══██╗
#                ███████║█████╗  ███████║██║  ██║
#                ██╔══██║██╔══╝  ██╔══██║██║  ██║
#                ██║  ██║███████╗██║  ██║██████╔╝
#                ╚═╝  ╚═╝╚══════╝╚═╝  ╚═╝╚═════╝ 

                                               
# To show the first 10 lines of file
head file

# To show the first N lines of file
head -n N file

# To show the first N bytes of file
head -c N file
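
# A common combination (a sketch): show lines 16 through 20 of a file
head -n 20 file | tail -n 5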

head -1 data.csv | tr , $'\n' | nl
# Print the number and column name to help write awk expressions.

(head -5; tail -5) < log
# Show the first and last 5 lines of the file 'log'.


# Log analysis: build up the pipeline step by step
head -n 3 access.log | awk '{ print $7 }'
head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }'
head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' 
head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' | awk -F '/' '{ print $1 }' | awk -F ':' '{ print $1}'

cat access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' | awk -F '/' '{ print $1 }' | awk -F ':' '{ print $1}' | usort

#==============================##==============================#
# CMD HEAD						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.13 - 🖥️ln

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the ln command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██╗     ███╗   ██╗
#                ██║     ████╗  ██║
#                ██║     ██╔██╗ ██║
#                ██║     ██║╚██╗██║
#                ███████╗██║ ╚████║
#                ╚══════╝╚═╝  ╚═══╝
                                  
                                 
# To create a symlink:
ln -s path/to/the/target/directory name-of-symlink

# Symlink, while overwriting existing destination files
ln -sf /some/dir/exec /usr/bin/exec
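
# For comparison, omitting -s creates a hard link (both names then point to the same inode; the paths are placeholders):
ln path/to/the/target/file name-of-hardlink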

#==============================##==============================#
# CMD LN						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.14 - 🖥️ls

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the ls command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██╗     ███████╗
#                ██║     ██╔════╝
#                ██║     ███████╗
#                ██║     ╚════██║
#                ███████╗███████║
#                ╚══════╝╚══════╝
                                

# Displays everything in the target directory
ls path/to/the/target/directory

# Displays everything including hidden files
ls -a

# Displays all files, along with the size (with unit suffixes) and timestamp
ls -lh 

# Display files, sorted by size
ls -S

# Display directories only
ls -d */

# Display directories only, include hidden
ls -d .*/ */

#==============================#
# CMD LS 
#==============================##==============================#

\ls 
# This will run ls without using any alias called ls that might be in place. You can do this with any command.

ls -trl
# List files by time of last modification, oldest first

ls -turl
# Show files by the time they were last accessed

ls -c
# List files by the date of the last change to their status information (ctime)

ls -f
# Disable the default sorting and also show . and ..

ls -S
# Sort by file size

ls -t
# Sort by the date of the last modification

ls -U
# No sorting

ls -u
# Sort by access time

ls -X
# Sort by file extension

# List Files Based on Modification Time
# The below command lists files in long listing format, and sorts files based on modification time, newest first. To sort in reverse order, use '-r' switch with this command.
ls -lt

# List Files Based on Last Access Time
# Listing of files in directory based on last access time, i.e. based on time the file was last accessed, not modified.
ls -ltu

# List Files Based on Time of Last Status Change
# Listing of files in a directory based on the last change to the file's status information, the 'ctime'. This lists first the file whose status information (owner, group, permissions, size, etc.) was changed most recently. If the '-a' switch is used with the commands above, they also list and sort hidden files, and the '-r' switch lists the output in reverse order.

# For more in-depth sorting, such as sorting the output of the find command, the 'sort' command proves more helpful, since the output may contain not just the file name but any fields the user desires.

ls -ltc
# Sort by ctime (time of last status change), newest first

ls -ld */*/
# Quickly list the directories that are two levels down without having to do something more complex with 'find'.

### Sorting Output of ls -l based on Date
# This command sorts the output of 'ls -l' command based on 6th field month wise, then based on 7th field which is date, numerically.
ls -l | sort -k6M -k7n

ls | grep xargs | xargs grep ls
#

ls -lart | tail
# The most recently modified entries end up at the bottom of the listing

ls -al|awk '{s+=$5}END{print s}'
# Sum the size field (in bytes) of everything listed in the current directory

ls -ldtr `find -type f`
# Long list all regular files found recursively, sorted by modification time, oldest first

ls -l /usr/local/ | grep '^d' |wc -l
# Count the subdirectories in /usr/local

ls -la --full-time |tr -s " " |cut -f6 -d " "|cut -c1-7 | sort | uniq -c
# Make month histogram of dates of files in current directory.

ls -lh | head -1
# Print human readable total size of just the files in the current directory. :)

ls -ltrah
# Print detailed list (-l) of all (-a) files reverse sorted (-r) by last modified time (-t) and with human readable size (-h)

ls -X 
# will group files by extension

ls -d .*/ */ 
# Show only the directories in the current directory. The / at the end of the wildcards make this work.


ls -S
# List files in descending order of size: 


ls -Sr1 | while IFS=$'\n' read -r file; do xz "$file"; done
#Compress files with xz in PWD according to size, starting with smallest.

ls -ld ????
# Long list the files/directories with only 4 characters by using 4 match any single character patterns (?). 


# Produce stats on how many photos you took on each day to help find the ones in large batches.
ls -1 20170*.JPG| cut -c1-8 | uniq -c 


ls -ld .*/ */ 
# Long list (-l) only the directories in the current directory. .*/ and */ are utilizing your shell's glob matching ability.

# How to Find Recent or Today’s Modified Files in Linux
###############################################################
### 1. Using the ls command, you can only list today’s files in your home folder as follows, where:
ls -al --time-style=+%D | grep "$(date +%D)"

ls -alX --time-style=+%D | grep "$(date +%D)"
# In addition, you can sort the resultant list alphabetically by including the -X flag

ls -alS --time-style=+%D | grep "$(date +%D)"
# You can also list based on size (largest first) using the -S flag
#   -a – list all files including hidden files
#   -l – enables long listing format
#   --time-style=FORMAT – shows time in the specified FORMAT
#   +%D – show/use date in %m/%d/%y format

## 2. Again, it is possible to use the find command which is practically more flexible and offers plenty of options than ls, for the same purpose as below.
#   -maxdepth level is used to specify the level (in terms of sub-directories) below the starting point (current directory in this case) to 
# 			which the search operation will be carried out.
#   -newerXY, this works if timestamp X of the file in question is newer than timestamp Y of the file reference. X and Y represent any of the
# 			letters below:
#       a – access time of the file reference
#       B – birth time of the file reference
#       c – inode status change time of reference
#       m – modification time of the file reference
#       t – reference is interpreted directly as a time
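
# A worked example of the options above (assuming GNU find): list regular files
# directly in the current directory (-maxdepth 1) whose modification time is newer
# than the start of today (-newermt is the "t" form of -newerXY applied to mtime).
find . -maxdepth 1 -type f -newermt "$(date +%Y-%m-%d)"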

ls -l | sort -k6M -k7n
## Sorting Output of ls -l Based on Date
# This sorts the output of 'ls -l' on the 6th field (month), then numerically on the 7th field (day of month).

ls -lt
## List Files Based on Modification Time
# The below command lists files in long listing format, and sorts files based on modification time, newest first. To sort in reverse order, use
# '-r' switch with this command.

ls -ltu
## List Files Based on Last Access Time
# Listing of files in directory based on last access time, i.e. based on time the file was last accessed, not modified.

## List Files Based on Last Status Change Time (ctime)
# Listing of files in a directory based on the last change time of the file's status information, i.e. the 'ctime'. This lists first the files whose status information (owner, group, permissions, size, etc.) was most recently changed. If the '-a' switch is used with the above commands, they also list and sort hidden files in the current directory, and the '-r' switch lists the output in reverse order.

ls -ltc
# For more in-depth sorting, such as sorting the output of find, 'sort' proves more helpful than ls, since the output may contain not just the file name but any fields the user desires.

# gzip options worth remembering:
#   -v  -> verbose mode: report the name and compression ratio of each file.
#   -f  -> force: if a file named filename.gz already exists, it will be replaced.
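# Example combining both options (access.log is just a placeholder file name):
# compress it verbosely, overwriting access.log.gz if it already exists.
gzip -vf access.log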

#   find building blocks (path, search criteria, action):
#
#   Path                            Criteria    Action
#   Current directory   (.)         -name       -print
#   Home directory      (~)         -type       -exec
#   Parent directory    (..)        -links
#   Absolute path       /root/bin   -size
#   Relative path       bin/tmp/    -perm
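
# A small example combining one entry from each column (the path and pattern are placeholders):
# search an absolute path by name and print the matches.
find /root/bin -name '*.conf' -print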

ls *.jpg | grep -n ""  | sed 's,.*,0000&,' | sed 's,0*\(...\):\(.*\).jpg,mv "\2.jpg" "image-\1.jpg",' | sh
# rename all jpg files with a prefix and a counter 

ls 2017-0[5-9]-??.wav |cut -c 1-11 |date -f- +%A |sort|uniq -c|sort -nr 
# Stats for most common DOW for these wav files between May and Sept

ls -l 091?17*.jpg 
# The ? matches any one character in that place, * matches 0 or more of any character.

ls -Al Pictures/ | grep ^- | wc -l 
# Count the number of files in the Pictures directory, including hidden files, but skipping things like directories, symlinks and the total line. You can also use 'find Pictures/ -maxdepth 1 -type f | wc -l'
# Equivalent find variants (here run against an example directory named 't'), counting one printed character per file:
find t -maxdepth 1 -type f -printf '1\n' | wc -l
	# sample output: 1
# or even
find t -maxdepth 1 -type f -printf '1' | wc -c
	# sample output: 1

	
ls -d .*/ */ 
# Show only the directories in the current directory. The / at the end of the glob wildcards make this trick work. Use LANG=C ls -d .*/ */ to put the hidden ones before the regular ones.

ls | md5sum 
# It's quick and dirty, but you can run this on two or more different directories across systems to *quickly* see if you have the same file names in each. The same hash value indicates you have the same file names. This doesn't compare contents of files of course.

# Determines latest pdf file downloaded by firefox in ~/Downloads directory
ls -tr ~/Downloads/*.pdf|tail -1

ls -l . /lib/modules 
# Lists CWD and /lib/modules. It's not just there for decoration; you can use '.' (the current directory) in your commands.

ls -ld */ 
# Using a / at the end of a glob pattern is a great trick for matching just directories. Long list non-hidden directories only.

ls -latrh
# Print detailed list (-l) of all (-a) files reverse sorted (-r) by last modified time (-t) and with human readable size (-h)

# Parsing and finding symbolic links in multiple paths
# This will list symbolic links in multiple paths (using the wildcard * for nested directories), at level 1 from the main directory, and parse the result: awk prints the 11th space-separated field, which is the link target.
ls -l `find ABC*/code/target-*/TARGET -maxdepth 1 -type l -name "*"` | awk '{print $11}'

ls -la --full-time | tr -s " " | cut -f6 -d " " | sort | uniq -c 
# Make histogram of dates of files in current directory. Thx @amenthes_de

ls -Sr1 | while IFS=$'\n' read -r file ; do gzip -v9 "$file" ; done 
# Compress files in CWD according to size, starting with smallest.

ls -ld *log* 
# Use the -d option when you want to see the directory itself instead of what is inside when you use wildcards like this.

ls -d foo* 
# If your file globbing matches subdirectory entries, it's helpful to use the -d option to list just the directory entries themselves.

# Get file exts of top 50 files by filesize in a folder (BSD)
# From http://superuser.com/questions/633752/how-to-extract-an-audio-track-from-an-mp4-video-file-on-windows
ls -s | sort | tail -r -n 50 | cut -d'.' -f 2 | sort | uniq

# List files sorted by size and print the total size in a human-readable format, without needing sort, awk or other commands.
ls -sSh /path | head

# Tree-like output in ls
ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/   /' -e 's/-/|/'
# Explanation: This one-liner initially does a recursive listing of the current directory: ls -R. Any output other than the directory names, identified by : at the very end of each line (hence :$), is filtered out: grep ":$". Finally there is a little sed magic replacing each hierarchy level (/) with dashes (-).
# Limitations: Works for me with Bash under Linux, Mac OS X, Solaris.

#==============================##==============================#
# CMD LS						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.15 - 🖥️mkdir

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mkdir command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

 #               ███╗   ███╗██╗  ██╗██████╗ ██╗██████╗ 
 #               ████╗ ████║██║ ██╔╝██╔══██╗██║██╔══██╗
 #               ██╔████╔██║█████╔╝ ██║  ██║██║██████╔╝
 #               ██║╚██╔╝██║██╔═██╗ ██║  ██║██║██╔══██╗
 #               ██║ ╚═╝ ██║██║  ██╗██████╔╝██║██║  ██║
 #               ╚═╝     ╚═╝╚═╝  ╚═╝╚═════╝ ╚═╝╚═╝  ╚═╝
                                                      
                                                      
# Create a directory and all its parents
mkdir -p foo/bar/baz

# Create foo/bar and foo/baz directories
mkdir -p foo/{bar,baz}

# Create the foo/bar, foo/baz, foo/baz/zip and foo/baz/zap directories
mkdir -p foo/{bar,baz/{zip,zap}}

#==============================#
# CMD MKDIR 
#==============================##==============================#
mkdir -p foo/bar
# Make directories more than one level deep in one command

mkdir dir && cd $_
# Create a directory and change into it

mkdir -p /var/lib/libvirt/images/vm1archer/mytemplates/libvirt
# Create a deeply nested directory path (all missing parents included) in a single command.

mkdir fun; touch fun/{R,r}{E,e}{A,a}{D,d}{M,m}{E,e};echo hello >fun/rEadME;zip -r fun\.zip fun
# Send those new Windows bash users a gift.

# To run a second command with the same arguments as the previous command, use '!*' to repeat all arguments, '!:2' to use just the second argument, or '!$' for the final argument.
cd /home/user/foo
	cd: /home/user/foo: No such file or directory
mkdir !*
# expands to: mkdir /home/user/foo
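
# More of these word designators in action (notes.txt and /tmp/backup/ are hypothetical names):
cp notes.txt /tmp/backup/
ls -l !$
# expands to: ls -l /tmp/backup/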

sudo mkdir /mnt/sd{b..e}{1..9} 
# Make a bunch of mount point directories all at once. All combos of sdb1 through sde9 inclusive.

mkdir /usr/local/src/bash/{old,new,dist,bugs}

# Try the following to create the above example of a very complex directory structure in a subfolder of ~/testdir instead of /
mkdir -p ~/testdir/{bin,sbin,home/{jane,will/{work,play},zeb},tmp,lib,usr/{bin,lib},var} 

mkdir -p /home/{a,b}

mkdir -p /home/{a/{a1,a2,a3},b/{b1,b2,b3}}

mkdir -p /tmp/a/b/c && cd $_ 
# $_ is a shell special variable that expands to the last argument given in the prev command. There are other ways too to reference the last arg. The document http://bit.ly/1tgtnM5  explains the differences between $_, !$ and Meta+.

mkdir PNG && find . -maxdepth 1 -name '*.svg' | while IFS=$'\n' read f ; do inkscape "$f" --export-png="PNG/${f%%.svg}.png"; done 
# SVG 2 PNG in CWD. Using Inkscape's command line functionality to convert SVG documents into PNG images. GUI CLI FTW!

mkdir {dir1,dir2}/{sub1,sub2} 
# Makes dir1 and dir2, each containing sub1 and sub2.

mkdir [[folder]] && cd $_
# Create and access directory at the same time

	
# Organise images by portrait and landscape
mkdir "portraits"; mkdir "landscapes"; for f in ./*.jpg; do WIDTH=$(identify -format "%w" "$f" 2>/dev/null); HEIGHT=$(identify -format "%h" "$f" 2>/dev/null); if [[ "$HEIGHT" -gt "$WIDTH" ]]; then mv "$f" portraits/ ; else mv "$f" landscapes/ ; fi; done
# Explanation: 
    # First makes directories for portraits and landscapes
    # Loops through all files in the current directory with the extension .jpg; feel free to change this to .png or .jpeg if necessary
    #     Gets the width and height of the current image using the identify command
    #     If height > width (numeric comparison with -gt), move it to the portraits folder, otherwise move it to landscapes
# Limitations: This relies on the identify command, which comes with ImageMagick and is available on most systems. It does not check for square images, although it could easily be extended to test whether HEIGHT and WIDTH are equal; square images are currently put with the landscape images.

mkdir -p /path/folder{1..4}
# Create multiple subfolders in one command.

mkdir -p /path/{folder1,folder2,folder3,folder4}
# Create multiple subfolders in one command. Instead of typing separate commands to create various subfolders, we can create multiple subfolders by listing them between brackets and separated by commas. Show Sample Output:

        # /path/folder1
        # /path/folder2
        # /path/folder3
        # /path/folder4

# Move each *.mp4 file into its own folder (named after the part of the filename before the first '-')
for file in *.mp4;
  do
        folder=$(echo "$file" | cut -d'-' -f1 | rev | cut -b 2- | rev);
        mkdir -p "$folder";
        mv -v "$file" "$folder";
  done

#==============================##==============================#
# CMD MKDIR						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.16 - 🖥️mkfifo

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mkfifo command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███╗   ███╗██╗  ██╗███████╗██╗███████╗ ██████╗ 
#                ████╗ ████║██║ ██╔╝██╔════╝██║██╔════╝██╔═══██╗
#                ██╔████╔██║█████╔╝ █████╗  ██║█████╗  ██║   ██║
#                ██║╚██╔╝██║██╔═██╗ ██╔══╝  ██║██╔══╝  ██║   ██║
#                ██║ ╚═╝ ██║██║  ██╗██║     ██║██║     ╚██████╔╝
#                ╚═╝     ╚═╝╚═╝  ╚═╝╚═╝     ╚═╝╚═╝      ╚═════╝ 
                                                               
                                                              

# The mkfifo command is used to create named pipes.

mkfifo [pipe-name]
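
# A minimal sketch of using a named pipe (/tmp/mypipe and out.gz are example names):
# one process writes into the fifo while another reads from it.
mkfifo /tmp/mypipe
gzip -c < /tmp/mypipe > out.gz &    # reader: compresses whatever arrives through the pipe
cat /var/log/syslog > /tmp/mypipe   # writer: blocks until a reader has opened the pipe
rm /tmp/mypipe                      # a fifo is removed like any ordinary file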

#==============================##==============================#
# CMD MKFIFO						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.17 - 🖥️mknod

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mknod command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███╗   ███╗██╗  ██╗███╗   ██╗ ██████╗ ██████╗ 
#                ████╗ ████║██║ ██╔╝████╗  ██║██╔═══██╗██╔══██╗
#                ██╔████╔██║█████╔╝ ██╔██╗ ██║██║   ██║██║  ██║
#                ██║╚██╔╝██║██╔═██╗ ██║╚██╗██║██║   ██║██║  ██║
#                ██║ ╚═╝ ██║██║  ██╗██║ ╚████║╚██████╔╝██████╔╝
#                ╚═╝     ╚═╝╚═╝  ╚═╝╚═╝  ╚═══╝ ╚═════╝ ╚═════╝ 
                

                                                            
mknod /dev/random c 1 8
# Make a character device node (major number 1, minor number 8, i.e. /dev/random).

mknod pipe p
# Make a named pipe (FIFO). For character (c) and block (b) devices the general form is:
# mknod device-name device-type major-number minor-number

# create /dev/null if accidentally deleted or for a chroot
mknod -m 0666 /dev/null c 1 3

# This is sample output - yours may be different.
# - name: Mknod /dev/null to Chroot
    /bin/mknod -m 0666 /dev/null       c 1 3
# - name: Mknod /dev/random to Chroot
    /bin/mknod -m 0666 /dev/random     c 1 8
# - name: Mknod /dev/urandom to Chroot
    /bin/mknod -m 0666 /dev/urandom    c 1 9

mknod -m 0666 /dev/null_test c 1 3 
stat /dev/null_test

File: '/dev/null_test'
Size: 0  Blocks: 0 IO Block: 4096 character special file
Device: 6h/6d	Inode: 603  Links: 1 Device type: 1,3
Access: (0666/crw-rw-rw-) Uid: (0/ root) Gid: ( 0/ root)

# Don't forget to run restorecon for SELinux after creating device nodes:
restorecon /dev/null

#==============================##==============================#
# CMD MKNOD						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.18 - 🖥️mount

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mount command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███╗   ███╗ ██████╗ ██╗   ██╗███╗   ██╗████████╗
#                ████╗ ████║██╔═══██╗██║   ██║████╗  ██║╚══██╔══╝
#                ██╔████╔██║██║   ██║██║   ██║██╔██╗ ██║   ██║   
#                ██║╚██╔╝██║██║   ██║██║   ██║██║╚██╗██║   ██║   
#                ██║ ╚═╝ ██║╚██████╔╝╚██████╔╝██║ ╚████║   ██║   
#                ╚═╝     ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═══╝   ╚═╝   
                                                                
                                                                
                                                               
# To mount / partition as read-write in repair mode:
mount -o remount,rw /

# Bind mount path to a second location
mount --bind /origin/path /destination/path

# To mount Usb disk as user writable:
mount -o uid=username,gid=usergroup /dev/sdx /mnt/xxx

# To mount a remote NFS directory
mount -t nfs example.com:/remote/example/dir /local/example/dir

# To mount an ISO
mount -o loop disk1.iso /mnt/disk

#==============================#
# CMD mount
#==============================##==============================#
mount 10.11.2.23:/check_mk-backup /mnt
# Mount the check_mk backup share from the NFS server 10.11.2.23 on /mnt.

mount | column -t
# Show the currently mounted filesystems in a neatly aligned, table-like layout.

# Format input into multiple columns, like a table; useful for pretty-printing
mount | column -t
# Explanation: column is a utility for formatting text. With the -t flag it detects the number of columns in the input so it can format the text into a table-like format. For more details see man column.

mount -t iso9660 -o loop image.iso /mnt/isoimage 
# mount an ISO image file onto a directory -  Where image.iso is the image file and you want to mount it to /mnt/isoimage.

# Mount NFS via an /etc/fstab entry:
#   10.144.32.75:/BACK/Cisco   /mnt/backup/   nfs   rw,sync,hard,intr 0 0

# mount from VMware ESXi
esxcli storage nfs add -H nfsshare.linux.lxu.io -s /Statusfiles -v statusfiles

# target host
nfsshare.linux.lxu.io has address 10.11.15.88

# mount command
mount -t nfs 10.11.15.88:/Statusfiles /root/bin/statusfiles
umount /root/bin/statusfiles

#==============================##==============================#
# CMD mount
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.19 - 🖥️mv

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the mv command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███╗   ███╗██╗   ██╗
#                ████╗ ████║██║   ██║
#                ██╔████╔██║██║   ██║
#                ██║╚██╔╝██║╚██╗ ██╔╝
#                ██║ ╚═╝ ██║ ╚████╔╝ 
#                ╚═╝     ╚═╝  ╚═══╝  
                

                                  
#==============================#
# CMD MV - move
#==============================##==============================#
mv -i 
# moves files, asking for confirmation before overwriting an existing file.

mv foo.{old,new}
# Rename foo.old to foo.new

mv ./dir/*/*.pdb ./pdb/ ; rm -r ./dir/

mv {E,e}ecummings.txt
# Change the case (to lowercase) of the first letter E of a filename using brace expansion.

mv Picture{,-of-my-cat}.jpg
# I find brace expansion useful for renaming files. This cmd expands to "mv Picture.jpg Picture-of-my-cat.jpg"

mv image-file-with-query-string.jpg{?query-string=Z29vZCBqb2IK,} 
# Getting rid of query string in filename by surrounding it with {,}

# Adding Prefix to File name
# Good old bracket expansion :-) For large numbers of files, "rename" will spare you the for-loop, or the find/exec...
mv {,prefix_}yourfile.txt

# Fix time-stamped filenames of JPEG images according to the EXIF date the photo was taken
# For each *.jpg or *.JPG file in the current directory, extract the date the photo was taken from its EXIF metadata. Then replace the date stamp, which is assumed to exist in the filename, by the date the photo was taken. A trick from https://unix.stackexchange.com/a/9256 is used to split the date into its components.
(IFS=': '; for i in *.(#i)jpg; do set $(exiv2 -K 'Exif.Image.DateTime' -Pv $i 2> /dev/null); mv -v $i "$1-$2-$3${i#[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]}"; done)
    # Sample output
	# '2020-04-12 DSC_0146.JPG' -> '2017-12-26 DSC_0146.JPG'
	# '2020-04-12 DSC_0147.JPG' -> '2017-12-28 DSC_0147.JPG'

# Add timestamp of photos created by the “predictive capture" feature of Sony's Xperia camera app at the beginning of the filename - The "predictive capture" feature of Sony's Xperia camera app hides the date stamp deeply inside the filename. This command adds another date stamp at the beginning of the filename.
(setopt CSH_NULL_GLOB; cd /path/to/Camera\ Uploads; for i in DSCPDC_000*; do mv -v $i "$(echo $i | perl -lpe 's/(DSCPDC_[0-9]{4}_BURST)([0-9]{4})([0-9]{2})([0-9]{2})/$2-$3-$4 $1$2$3$4/')"; done)
    # Sample output
	    # 'DSCPDC_0000_BURST20191215123205830.JPG' -> '2019-12-15 DSCPDC_0000_BURST20191215123205830.JPG'
	    # 'DSCPDC_0000_BURST20200119191047162.JPG' -> '2020-01-19 DSCPDC_0000_BURST20200119191047162.JPG'

# Add date stamp to filenames of photos by Sony Xperia camera app - Sony's Xperia camera app creates files without time-stamped names. Thus, after deleting files on the phone, the same names will be reused. When uploading the photos to a cloud storage, this means that files will be overwritten. Running this command after every sync of uploaded photos with the computer prevents this.
(setopt CSH_NULL_GLOB; cd /path/to/Camera\ Uploads; for i in DSC_* MOV_*; do mv -v $i "$(date +%F -d @$(stat -c '%Y' $i)) $i"; done)
	# Sample output
	    # 'DSC_0075.JPG' -> '2020-04-11 DSC_0075.JPG'
	    # 'DSC_0076.JPG' -> '2020-04-11 DSC_0076.JPG'

#==============================##==============================#
# CMD MV - move
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.20 - 🖥️pwd

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the pwd command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ██╗    ██╗██████╗ 
#                ██╔══██╗██║    ██║██╔══██╗
#                ██████╔╝██║ █╗ ██║██║  ██║
#                ██╔═══╝ ██║███╗██║██║  ██║
#                ██║     ╚███╔███╔╝██████╔╝
#                ╚═╝      ╚══╝╚══╝ ╚═════╝ 
                                          
                

pwd 
# print working directory

pwd ; pwd -P
# Quick way to see if your pwd path contains any symlinks in it. If so there would be a difference between the two lines.

# Get top level of path
echo '/'`pwd | cut -d'/' -f2`

#==============================##==============================#
# CMD PWD						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.21 - 🖥️readlink

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the readlink command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#  ██████╗ ███████╗ █████╗ ██████╗ ██╗     ██╗███╗   ██╗██╗  ██╗
#  ██╔══██╗██╔════╝██╔══██╗██╔══██╗██║     ██║████╗  ██║██║ ██╔╝
#  ██████╔╝█████╗  ███████║██║  ██║██║     ██║██╔██╗ ██║█████╔╝ 
#  ██╔══██╗██╔══╝  ██╔══██║██║  ██║██║     ██║██║╚██╗██║██╔═██╗ 
#  ██║  ██║███████╗██║  ██║██████╔╝███████╗██║██║ ╚████║██║  ██╗
#  ╚═╝  ╚═╝╚══════╝╚═╝  ╚═╝╚═════╝ ╚══════╝╚═╝╚═╝  ╚═══╝╚═╝  ╚═╝

# Find the target path a symlink is pointing to
# Explanation: Sure, you could figure out the link target from the output of ls -l a_symbolic_link_to_somewhere too, but the output of readlink is simply the target of the symbolic link itself, so it is cleaner and easier to read.
readlink a_symbolic_link_to_somewhere
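
# Related (GNU readlink): -f prints the canonical absolute path, resolving every symlink
# and relative path component along the way.
readlink -f a_symbolic_link_to_somewhere
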
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.22 - 🖥️rename

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the rename command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ███████╗███╗   ██╗ █████╗ ███╗   ███╗███████╗
#                ██╔══██╗██╔════╝████╗  ██║██╔══██╗████╗ ████║██╔════╝
#                ██████╔╝█████╗  ██╔██╗ ██║███████║██╔████╔██║█████╗  
#                ██╔══██╗██╔══╝  ██║╚██╗██║██╔══██║██║╚██╔╝██║██╔══╝  
#                ██║  ██║███████╗██║ ╚████║██║  ██║██║ ╚═╝ ██║███████╗
#                ╚═╝  ╚═╝╚══════╝╚═╝  ╚═══╝╚═╝  ╚═╝╚═╝     ╚═╝╚══════╝
                                                                     
                

# Lowercase all files and folders in current directory
rename 'y/A-Z/a-z/' *

rename 's/.html$/.php/' *.html
# Rename replaces string X in a set of file names with string Y. -> This will change the extension of every .html file in your CWD to .php.

rename 's/^unwanted//' *.jpg
# Remove the prefix 'unwanted' from the beginning of each filename with .jpg suffix in CWD.

rename -v 's/^([0-9])_/0\1_/' *.flac
# Rename all single leading digit flac files so that they have a padding 0 for easier sorting.

exiv2 -k -F rename *.jpg
# Use the exiv2 EXIF program to rename your jpg files according to their exif date/time data.

rename 's/^hospital\.php\?loc=(\d{4})$/hospital_$1/' hospital.php*
# Rename files in batch 

# Rename all non-hidden files under current directory by replacing spaces with underscores. Uses Larry Wall's 'rename'
rename 's/ /_/g' *

rename 's/_(\d{4})(\d{2})(\d{2}).txt/_$1-$2-$3.txt/' *_????????.txt 
# Rename set of files with non-hyphened date at the end to have hyphens.

# Rename all files to lower case - rename is a really powerful tool to, as its name suggests, rename files
rename 'y/A-Z/a-z/' *
# Sample output:
	# aETEBezhrhe.txt REHTY.txt rthureioght.txt
                # becomes :
	# aetebezhrhe.txt rehty.txt rthureioght.txt

# If your system has the rename command (Linux), then a shortcut to do the exact same thing is with:
rename 's/\d+/sprintf("%04d", $&)/e' *.jpg
# It handles special characters better too.

# Remove new lines from files and folders
rename 's/[\r\n]//g' *
# Explanation: This searches all files and folders in the current directory for names containing a newline character and removes the newline from the name.

#==============================##==============================#
# CMD RENAME					       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.23 - 🖥️rm

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the rm command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ███╗   ███╗
#                ██╔══██╗████╗ ████║
#                ██████╔╝██╔████╔██║
#                ██╔══██╗██║╚██╔╝██║
#                ██║  ██║██║ ╚═╝ ██║
#                ╚═╝  ╚═╝╚═╝     ╚═╝
                

                                   
# Remove files and subdirs
rm -rf path/to/the/target/

# Ignore non existent files
rm -f path/to/the/target

# Remove a file with his inode
find /tmp/ -inum 6666 -exec rm -i '{}' \;

#==============================#
# CMD RM - remove
#==============================##==============================#
rm -frv somestuff 2>&1 | tee remove.log 
# When you want to see the output (including stderr) of your removal AND save it.

rm ./-file-starting-with-dash
# another way to handle files starting with a - in CWD is to prefix them with ./

rm "$( ls -1dt /netdumps/*.pcap | tail -1 )"
# Remove the oldest .pcap file in the /netdumps directory.

rm -rf "${MEETINGS[@]}"
# Remove meetings, really fast. (But make sure they are properly quoted)
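# A minimal sketch of the array form (the array contents here are hypothetical);
# quoting "${MEETINGS[@]}" keeps names containing spaces intact:
MEETINGS=("Monday standup.ics" "Quarterly review.ics")
rm -rf "${MEETINGS[@]}"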

rm -i
# Delete files, asking for confirmation

rm ./- 
# Or rm -- - when you need to remove a file named - after you attempted to see if a program could write to STDOUT (It couldn't)

# To remove a directory that contains other files or directories
rm -r mydir

# do not receive a prompt for each file being removed
rm -rf mydir

#==============================##==============================#
# CMD RM						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.24 - 🖥️rmdir

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the rmdir command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ██████╗ ███╗   ███╗██████╗ ██╗██████╗ 
#                ██╔══██╗████╗ ████║██╔══██╗██║██╔══██╗
#                ██████╔╝██╔████╔██║██║  ██║██║██████╔╝
#                ██╔══██╗██║╚██╔╝██║██║  ██║██║██╔══██╗
#                ██║  ██║██║ ╚═╝ ██║██████╔╝██║██║  ██║
#                ╚═╝  ╚═╝╚═╝     ╚═╝╚═════╝ ╚═╝╚═╝  ╚═╝
                                                      
                                                      
                                                      

# The rmdir command allows you to delete empty directories.

rmdir [dir-name]
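
# A short sketch (directory names are hypothetical): rmdir only removes empty directories;
# -p also removes the now-empty parent directories.
mkdir -p project/old/tmp
rmdir project                  # fails: 'Directory not empty' - project still contains old/
rmdir -p project/old/tmp       # removes tmp, then the now-empty old and project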

#==============================##==============================#
# CMD RMDIR						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.25 - 🖥️stat

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the stat command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ███████╗████████╗ █████╗ ████████╗
#                ██╔════╝╚══██╔══╝██╔══██╗╚══██╔══╝
#                ███████╗   ██║   ███████║   ██║   
#                ╚════██║   ██║   ██╔══██║   ██║   
#                ███████║   ██║   ██║  ██║   ██║   
#                ╚══════╝   ╚═╝   ╚═╝  ╚═╝   ╚═╝   
                                                  
                                                

# The stat command displays status related to a file or a file-system.

stat test.txt
File: ‘test.txt’
Size: 20 Blocks: 8 IO Block: 4096 regular file
Device: 801h/2049d Inode: 284762 Links: 2
Access: (0664/-rw-rw-r--) Uid: ( 0/ root) Gid: ( 0/ root)
Access: 2017-03-03 12:41:27.791206947 +0530
Modify: 2017-02-28 16:05:15.952472926 +0530
Change: 2017-03-02 11:10:00.028548636 +0530
Birth: -
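
# The file-system side (GNU coreutils stat): -f reports the status of the file system
# containing the given path rather than the file itself.
stat -f /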

# Produce stats on how many images you took on each day, regardless of strange filenaming schemes.
stat -c %y *.jpg | cut -c1-10 | uniq -c 

stat filename_ext
# stat works on a file of every kind; it outputs the statistics (file or file-system status) for the given name, e.g. 'stat abc.pdf'.

# display numerical values for file permissions
stat -c '%a %n' *

#==============================##==============================#
# CMD STAT						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.26 - 🖥️tail

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the tail command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ████████╗ █████╗ ██╗██╗     
#                ╚══██╔══╝██╔══██╗██║██║     
#                   ██║   ███████║██║██║     
#                   ██║   ██╔══██║██║██║     
#                   ██║   ██║  ██║██║███████╗
#                   ╚═╝   ╚═╝  ╚═╝╚═╝╚══════╝
                

														  
# To show the last 10 lines of file
tail file

# To show the last N lines of file
tail -n N file

# To show the last lines of file starting with the Nth
tail -n +N file

# To show the last N bytes of file
tail -c N file

# To show the last 10 lines of file and to wait for file to grow
tail -f file

#==============================#
# CMD TAIL
#==============================##==============================#
tail -f udp.log |gawk '{printf("%s %s\n",strftime("%Y-%m-%d_%T", $1),$0)}'
# Prefix the epoch time in column 1 with the local time.

tail -f foo.log|egrep --line-buffered --color=auto 'ERROR|WARN|$'
# tail log & highlight errors (if your grep supports --color) 

tail -f *.log
# You can actually follow more than one log at once and get new updates on them. Use -q to not print filename header.

tail -f access_log | awk '$1~"googlebot"'
# Wait for the friendly Googlebot to pay your site a visit. 

tail -f /dev/ttyACM0 |gawk '{print strftime("%Y-%m-%d %T") " " $0}' |tee temperature.log
# Arduino temp sensor to timed logfile and view.

tail -f /var/log/syslog | grep CRON
# grep for all cron in syslog following

tail -F 
# will monitor a file, such as a log file, until you type Ctrl-c.

tail -F /var/log/syslog | awk '{printf("\033[%dm%s\033[0m\n",31+NR%2,$0)}'
# Holiday sysLog! It's ready for the new year too. ;)

tail -F syslog |while read -r line;do printf "\033[38;5;%dm%s\033[0m\n" $(($RANDOM%255)) "$line";done
# Random color per log line.

tail !^
# In bash, !^ expands to the second word (i.e. the first argument) of the previous command.
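# Example (the log file name is arbitrary):
head /var/log/syslog
tail !^
# expands to: tail /var/log/syslog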

tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
# Count number of hits per IP address in last 2000 lines of apache logs and print the IP and hits if hits > 20

tail -n100000 large.log | head 
# Get a sample of 10 lines from a large log file, 100,000 lines from the end of the file.

rsstail -u $FEEDURL -n 50 | while read line; do mail -s "FeedUpdate" $user <<<"$line"; done
# Sending new RSS entries to email 

tail -f /var/log/messages | toilet -f term --gay
# Log in Rainbow Colors

tail -F some_log_file.log | grep --line-buffered the_thing_i_want
# Grep live log tailing

tail -F some_log_file.log | grep the_thing_i_want
# Grep live log tailing, tracks file open/close. "-F" will continue tailing if file is closed and another file opened with same name. This is handy for tailing log files that segment while watching them without having to issue the command again.

tail -f some_log_file.log | grep the_thing_i_want
# Grep live log tailing

tail -f some_log_file.log | grep --line-buffered the_thing_i_want
# Grep live log tailing

# Tail all /var/log text log files - Tails all the human-readable /var/log files and stays on the same filename to work with log rotations.    
tail --follow=name --retry $(ls -altr $(find /var/log -type f  |grep -v -e "gz$" -e "mysql/.*relay" -e "[0-9]$" | xargs file | grep ASCII | cut -d: -f1) | awk '{ print $9 }') &  

tail -F firewall.log |while read -r line;do printf "\033[38;5;%dm%s\033[0m\n" $(($RANDOM%255)) "$line";done 
# Random color per log line.

# Is there a trick to get a one-liner like this to run, without the '>' truncating the file before tail reads it?
tail -n1001 path/to/file > path/to/samefile
# You can use tee: tail -n1001 path/to/file | tee path/to/samefile - however, without '>/dev/null' at the end, tee will also print the content on stdout, so if that extra output is a problem (and you can't add '>/dev/null' to your command) then tee isn't the best solution.
tail -n1001 path/to/file | echo > path/to/samefile
# Does NOT work: echo ignores its stdin, so this just empties the file.
tail -n 1001 <<<$(<file1) >file1
# Works fine: the file is read into memory before the redirection truncates it.

tail -f messages | awk '/ec:8a:dc:09:e1:2f/' 
# I find it easier to use awk to search for lines with a string when using tail -f than trying to use grep options or the stdbuf program.

# `tail -f` a file until text is seen
tail -f /path/to/file.log | sed '/^Finished: SUCCESS$/ q'
# Explanation: 
	tail -f until this exact line is seen:
	Finished: SUCCESS
# The exit condition does not have to be an exact line, it could just well be a simple pattern:
... | sed '/Finished/ q'

# Realtime lines per second in a log file, with history - This is similar to standard `pv`, but it retains the rate history instead of only showing the current rate. This is useful for spotting changes. To do this, -f is used to force pv to output, and stderr is redirected to stdout so that `tr` can swap the carriage returns for new lines. (Doesn't work correctly in zsh for some reason: tail's output isn't redirected to /dev/null like it is in bash.)
tail -f access.log | pv -l -i10 -r -f 2>&1 >/dev/null | tr /\\r \ \\n
    # Sample output
	    # [2.00  s]
	    # [2.00  s]
	    # [4.20  s]
	    # [20.4  s]
	    # [8.20  s]
	    # [11.6  s]

#==============================##==============================#
# CMD TAIL
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.27 - 🖥️touch

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the touch command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ████████╗ ██████╗ ██╗   ██╗ ██████╗██╗  ██╗
#                ╚══██╔══╝██╔═══██╗██║   ██║██╔════╝██║  ██║
#                   ██║   ██║   ██║██║   ██║██║     ███████║
#                   ██║   ██║   ██║██║   ██║██║     ██╔══██║
#                   ██║   ╚██████╔╝╚██████╔╝╚██████╗██║  ██║
#                   ╚═╝    ╚═════╝  ╚═════╝  ╚═════╝╚═╝  ╚═╝
                

touch -r oldfile newfile
# set access/modification times of newfile to those of oldfile. 
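
# A few more common uses (the file names here are placeholders):
touch newfile.txt                      # create an empty file, or just update its timestamps if it already exists
touch -c maybe.txt                     # -c: do not create the file if it does not exist
touch -d "2023-01-01 00:00" file.txt   # set access and modification times to an explicit date (GNU touch)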

#==============================##==============================#
# CMD TOUCH						       #
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

2.28 - 🖥️tree

➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember it. This cheatsheet explains the tree command with important options and switches using examples.

▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁

#                ████████╗██████╗ ███████╗███████╗
#                ╚══██╔══╝██╔══██╗██╔════╝██╔════╝
#                   ██║   ██████╔╝█████╗  █████╗  
#                   ██║   ██╔══██╗██╔══╝  ██╔══╝  
#                   ██║   ██║  ██║███████╗███████╗
#                   ╚═╝   ╚═╝  ╚═╝╚══════╝╚══════╝
                

                                                 
# To display a recursive directory tree
tree

# To make tree output contents from path `/foo/bar`
tree /foo/bar

# To make tree omit any empty directories from the output
tree --prune

# To list directories only (`-d`), and at a max depth of two levels (`-L`)
tree -d -L 2

#==============================#
# CMD TREE
#==============================##==============================#
tree | convert label:@- /home/avi/tree.png
# While writing tutorials, I often need to produce output in image format. This command combination does that for me: it renders the output of the tree command (say, for the /etc/x11 directory) as an image.

tree -fash
# Directory structure as a tree view, with the size of each file.

tree -isafF | grep -v /$ | sort -k2nr | head
# A bit more elaborate, but the output sorted by size in bytes is worth seeing.

# Jump to the home dir and list all hidden and non-hidden files/subdirectories, with full paths, that are not older than 3 days.
# Number of days back: change/append '\|'$[$(date +%Y%j)-x] expressions as needed, or specify any n-th day before today for a single day (replace x with 3, 4, 5, whatever; above, 1 and 2 give yesterday and the day before, and 0 for today is not necessary, so it is left out).
# To narrow the result to *.pdf, *.png, *.jpg, *.txt, *.doc, *.sh or any other type of file, pipe to grep at the end of the command.
# Even shorter: cd && day=3;for a in $(seq $day -1 0);do tree -aicfnF --timefmt %Y%j-%d-%b-%y|grep $[$(date +%Y%j)-$a];done
# Here you only need to change the variable 'day' to set how many days back to list - it is set to three days back above (the seq command is arranged so the oldest entries are listed first).
cd && tree -aicfnF --timefmt %Y%j-%d-%b-%y|grep $(date +%Y%j)'\|'$[$(date +%Y%j)-1]'\|'$[$(date +%Y%j)-2]
#This is sample output - yours may be different.
[2019215-03-Aug-19]  ./.config/session/
[2019215-03-Aug-19]  ./.config/session/konsole_2b5b0f704-9942-428b-a3d8-8054efd32b18_1564837706_57870
[2019215-03-Aug-19]  ./.config/session/konsole_23e1f9567-a8ff-42ad-a879-76f7bafb56c3_1564840981_89854
[2019215-03-Aug-19]  ./.config/session/konsole_2e2aea661-fd41-4510-b0b9-1d2c69ef99c7_1564864156_611707
[2019215-03-Aug-19]  ./.config/konsolerc
[2019216-04-Aug-19]  ./.config/dconf/
[2019216-04-Aug-19]  ./.config/dconf/user
[2019216-04-Aug-19]  ./.config/briši_me.pdf

#==============================##==============================#
# CMD TREE
#==============================##==============================#
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░

  █║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌

              ██╗ ██╗ ██████╗  ██████╗ ██╗  ██╗███████╗██████╗
             ████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
             ╚██╔═██╔╝██║  ██║██║   ██║ ╚███╔╝ █████╗  ██║  ██║
             ████████╗██║  ██║██║   ██║ ██╔██╗ ██╔══╝  ██║  ██║
             ╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
              ╚═╝ ╚═╝ ╚═════╝  ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═════╝

               █║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌

░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░