🖥️find
➡️This is a command-line reference manual for commands and command combinations that you don’t use often enough to remember. This cheatsheet explains the find command with important options and switches, using examples.
▁ ▂ ▃ ▄ ꧁ 🔴☠ COMMANDLINE-KUNGFU WITH CHEATSHEETS ☠🔴꧂▅ ▃ ▂ ▁
# ███████╗██╗███╗ ██╗██████╗
# ██╔════╝██║████╗ ██║██╔══██╗
# █████╗ ██║██╔██╗ ██║██║ ██║
# ██╔══╝ ██║██║╚██╗██║██║ ██║
# ██║ ██║██║ ╚████║██████╔╝
# ╚═╝ ╚═╝╚═╝ ╚═══╝╚═════╝
# To find files by case-insensitive extension (ex: .jpg, .JPG, .jpG):
find . -iname "*.jpg"
# To find directories:
find . -type d
# To find files:
find . -type f
# To find files by octal permission:
find . -type f -perm 777
# To find files with setuid bit set:
find . -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
# To find files with extension '.txt' and remove them:
find ./path/ -name '*.txt' -exec rm '{}' \;
# To find files with extension '.txt' and search for a string in them:
find ./path/ -name '*.txt' | xargs grep 'string'
# To find files bigger than 5 MB and sort them by size (ls -S already sorts by size):
find . -size +5M -type f -print0 | xargs -0 ls -Ssh
# To find files bigger than 20 MB and list them:
find . -type f -size +20000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
# To find files modified more than 7 days ago and list file information
find . -type f -mtime +7 -ls
# To find symlinks owned by a user and list file information
find . -type l -user username -ls
# To search for and delete empty directories
find . -type d -empty -exec rmdir {} \;
# To search for directories named build at a max depth of 2 directories
find . -maxdepth 2 -name build -type d
# To search all files who are not in .git directory
find . ! -iwholename '*.git*' -type f
# To find all files that have the same node (hard link) as MY_FILE_HERE
find . -type f -samefile MY_FILE_HERE 2>/dev/null
# To find all files in the current directory and modify their permissions
find . -type f -exec chmod 644 {} \;
# To find files with extension '.txt' and edit all of them with vim
# vim is started only once for all files
find . -iname '*.txt' -exec vim {} \+
# To find all files with extension '.png' and rename them by changing extension to '.jpg' (base name is preserved)
find . -type f -iname "*.png" -exec bash -c 'mv "$0" "${0%.*}.jpg"' {} \;
#==============================#
# CMD FIND
#==============================#
# Again, it is possible to use the find command, which is far more flexible than ls and offers plenty of options, for the same purpose as below.
#==============================#
-maxdepth level is used to specify the level (in terms of sub-directories) below the starting point (the current directory in this case) to which the search will descend.
-newerXY succeeds if timestamp X of the file being examined is newer than timestamp Y of the reference file. X and Y can be any of the letters below:
a – access time of the file reference
B – birth time of the file reference
c – inode status change time of reference
m – modification time of the file reference
t – reference is interpreted directly as a time
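# For example, to find files whose access time (X=a) is newer than the modification
# time (Y=m) of a reference file (a quick sketch; ref.txt is a hypothetical file):
find . -neweram ref.txt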
# This means that, only files modified on 2016-12-06 will be considered:
find . -maxdepth 1 -newermt "2016-12-06"
# Important: Use the correct date format as reference in the find command above; if you use a wrong format, you will get an error like the one below:
find . -maxdepth 1 -newermt "12-06-2016"
find: I cannot figure out how to interpret '12-06-2016' as a date or time
# Alternatively, use the correct formats below:
find . -maxdepth 1 -newermt "12/06/2016"
# OR
find . -maxdepth 1 -newermt "12/06/16"
find path criteria action
find . -name "*.log" -exec ls -l {} \;
find . -user root
# The above command lists all the files owned by user ‘root’ in the current directory and below.
find -size +100M
# The find command lists all the files in the current directory above the specified size (here 100 MB), recursively.
### Find all the log files inside a folder/subfolder and delete or archive them.
find . -name "*.log" -type f -exec rm {} \;
find . -name "*.log" -type f -exec gzip {} \;
# Kill a process if it's not updating files (no new files in the last 5 min)
[[ $( find $HOME/output -type f -mmin -5 | wc -l ) -eq 0 ]] && pkill -f '/usr/bin/whatever'
### Let's say you want to delete all txt files which are older than 60 days from the /tmp folder.
#==============================#
find /tmp -name "*.text" -type f -mtime +60 -exec rm -f {} \;
### Now we want to delete/archive some set of files which are 3 months old and whose size is bigger than 1 MB.
#==============================#
find /path -name "*.log" -type f -mtime +90 -size +1024k -exec rm -f {} \;
### Find out the total size of all log files in a folder which are more than 30 days old.
#==============================#
find /logs -name '*.log' -type f -mtime +30 -exec du -am {} \;
# The above command will list files with their size in MB. Now you can apply logic to add up the first column values with awk to get the total size.
# du -ak afile shows file size in KB
# du -am afile shows file size in MB
# likewise, du -ag afile shows file size in GB.
### Count the files in a directory which are anything but log files (*.log).
#==============================#
find /path ! -name "*.log" -type f -exec ls -l {} \; | wc -l
### List out all the directories in a folder recursively whose name is "archive"
#==============================#
find . -name "archive" -type d
#"." points to current directory.
### Archive any files from the /tmp folder which are older than 2 days, but exclude the files which are already zipped.
#==============================#
find /tmp/ ! -name "*.zip" -type f -mtime +2 -exec gzip -vf {} \;
# Here, gzip has below options:
# -v -> verbose mode.
# -f -> if there is already a file with the name filename.gz, it will be replaced.
#
# Path                      Criteria   Action
# Current directory (.)     -name      -print
# Home directory (~)        -type      -exec
# Parent directory (..)     -links
# Absolute path /root/bin   -size
# Relative path bin/tmp/    -perm
# Sorting Files based on Month
# Here, we use the find command to find all files under the root (‘/’) directory and print, for each, the month in which the file was last accessed followed by the filename. Of that complete result, we list the top 11 lines (the first is blank because of the leading \n).
find / -type f -printf "\n%Ab %p" | head -n 11
# The below command sorts the output using the first field as the key, specified by '-k1', treating it as a month name, as specified by the 'M' after it.
find / -type f -printf "\n%Ab %p" | head -n 11 | sort -k1M
### Sort Files Based on Date
# Here, again we use the find command to find all the files in the root directory, but now we print the result as: the last date the file was accessed, the last time it was accessed, and then the filename. Of that we take the top 11 lines.
find / -type f -printf "\n%AD %AT %p" | head -n 11
# The below sort command first sorts on the last digit of the year, then on the last digit of the month in reverse order, and finally on the first field. Here, '1.8' means the 8th character of the first field and the 'n' after it means numerical sort, while 'r' indicates reverse-order sorting.
find / -type f -printf "\n%AD %AT %p" | head -n 11 | sort -k1.8n -k1.1nr -k1
### Sorting Files Based on Time
# Here, again we use the find command to list the top 11 files in the root directory, printing the result in the format: last time the file was accessed, then the filename.
find / -type f -printf "\n%AT %p" | head -n 11
# The below command sorts the output based on the first character of the first field, which is the first digit of the hour.
find / -type f -printf "\n%AT %p" | head -n 11 | sort -k1.1n
#==============================#
# EXAMPLES: Let's try to understand this command using some example commands and their interpretations.
#==============================#
find ~ -name abc -print
# The above command will search for a file named "abc" from the home directory downward and print its path.
find . -type d -name somedir -print
# The above command will search for a directory named "somedir" from the current directory downward and print the matches on the console.
find / -type f -links 1 -print
# The above command will search for and print all the files which are regular files (not directories) and have exactly one link.
find .. -perm 644 -exec rm {} \;
# The above command will search for and remove all the files having permission 644 (rw-r--r--) from the parent of the current directory downward. Note: here "{}" indicates that the output of find becomes the argument of the rm command, ";" marks the end of the rm command, and "\" is used to avoid the special meaning of ";" in the shell.
find ~ -size 5c -exec cat {} \;
# The above command will search for and print the content of each file of size 5 bytes from the home directory downward.
find . -type f -name somename -exec rm -i {} \;
# The above command will search for and remove all the files named "somename" from the current directory downward, interactively.
# Note: with the "-i" option in -exec, the rm command itself prompts for each file. If we use the -ok action instead, find implicitly runs the command interactively. See the next example:
find / -type f -name somename -ok rm {} \;
# The above command will search for and remove all the files named "somename" from the root directory downward, interactively.
find -name
# To do a search by file names using find, -name is case-sensitive
find -iname
# and -iname is case-insensitive
find / -name '-*'
# Find any files that start with a -. These can end up setting options to commands when you do stuff like du -sh *
find */ | cut -d/ -f1 | uniq -c
# Print how many files are inside each directory under the current one.
find www -name '*.php' -exec egrep -l 'bin/(identify|convert|mogrify|montage)\b' {} +
# Start looking for vulnerable code using ImageMagick
find . -type d
# To search for directories with find
find -type f
# To search for files
find -maxdepth
# Limit find's depth of search with the -maxdepth option. When giving numeric arguments to the find utility, + means greater than and - means less than; e.g. +7 means greater than 7. Boolean operators in find: -not, -or. There's no 'and' because 'and' is implicit when two tests are listed.
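# For example (a sketch combining these): regular files at most 2 levels deep,
# modified more than 7 days ago, excluding .log files:
find . -maxdepth 2 -type f -mtime +7 -not -name '*.log'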
find /etc/apt/sources.list* -type f -name '*.list' -exec bash -c 'echo -e "\n=== $1 === ";grep -v -e ^# -e ^$ ${1}' _ {} \;
find /tmp -type f -size 5c -print
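# Find files in /tmp that are exactly 5 bytes in size (the 'c' suffix means bytes)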
find . -name -exec grep "phrase" {} \;
# Find all files with given name (you can use Bash expansion if youd like), and Grep for a phrase
find . -name -exec grep "phrase" {} \; -print
# To display the filename that contained a match, use -print
find . -name "<pattern>" -exec grep -Hn "phrase" {} \;
# Or, use Grep options to print the filename and line number for each match - The string `{}` is replaced by the current filename being processed everywhere it occurs in the arguments to the command. See the `find` man page for more information.
find . -atime -$(date +%j) -type f
# Find files you haven't accessed so far this year in a directory. Requires atime attributes.
find /etc -type f -printf "%T@ %T+ %p" | sort -n
# Find last modified files on a filesystem
find /etc -type f -printf "%T@ %T+ %p \n" | sort -n | less
#
find . -type f -mtime 0
#
find . -print | sort | sed 's;[^/]*/;|---;g;s;---|; |;g'
# Generate output similar to 'tree' without using tree.
find . -type f -exec bash -c 'mv "$1" "./$RANDOM"' - {} \;
# Random mass-rename - Assign random names to all files in a folder (including subfolders!) - Note that this is a somewhat expensive operation, so it might take a few seconds for large numbers of files.
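# A collision-safer sketch of the same idea: let mktemp generate the unique names (GNU coreutils assumed; this variant renames each file within its own directory instead of moving everything to .):
find . -type f -exec bash -c 'mv "$1" "$(mktemp -p "$(dirname "$1")" XXXXXXXX)"' - {} \;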
find . -type f -name "*.php" -exec php -l {} \; | grep -v 'No syntax errors'
# Check PHP syntax - Here's a simple one-liner you can use to syntax-check all PHP files in your working directory.
find $1 -type d | xargs du -sm | sort -g
# Show the sizes (in MB) of ALL directories in the tree, sorted; $1 is the starting directory.
# To make it more obvious: this sorts a bunch of log files together and orders them correctly based on the date field in Apache CLF format.
find . -name '*s_log*'|xargs cat|sort -k4n -t' ' -k 4.9,4.12n -k 4.5,4.7M -k 4.2,4.3n -k 4.14,4.15n -k 4.17,4.18n -k 4.20,4.21n >mergedlogs
find /var/www -perm -o+w -a -not -type l -ls
# Find files and directories under /var/www that are world writable. Exclude symbolic links.
find /etc -type f -mtime +2 -mtime -30 -ls
# Long list files under /etc that were modified between 2 and 30 days ago.
find ~/ -mindepth 2 -type f -ls | sort -n -r -k 7 | head -20
# Show the 20 largest files at least 2 subdirectories down from your home dir
find . -maxdepth 1 -type d -ls
# Long list only the directories under the current directory.
find / -perm /u+s,g+s -ls
# Find any files or directories on your system that are suid or sgid. Older versions of find can try -perm +u+s
find . -printf "%TY %p\n" | grep ^2013
# Get a list of all files last modified in 2013. Useful for passing to xargs or while loop
find / -size +100M -exec du -h {} \;
# find all files larger than 100MB and display their human readable size.
find . -name \*.git 2>/dev/null|grep -oP '^(.*)(?=\/\.git$)'|while read l;do pushd "$l";git status;popd;printf "\n";done
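# Run git status in every git working copy below the current directory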
find . -name \*.[ch] -exec grep -sl "PATTERN" {} \;
# Search for PATTERN in .c and .h file.
find / -type f | sed 's,/[^/]*$,,' |sort |uniq -c | awk '$1>=33000'
# Find directories that have 33000 or more files in them.
find . -maxdepth 1 -mtime +$((365*5)) -exec du -sb {} \; |awk '{s+=$1}END{print s}'
# Total bytes used by 5+ year old directories in CWD
find . -xdev -type f -mtime +$((365*7)) -print0|xargs -0 du -bsc|awk '/\ttotal$/{s+=$0}END{print s}'
# Total bytes of files older than ~7 yr
find . -exec file -b --mime-type {} + | sort | uniq -c | sort -nr
# Make stats of the top file types in this directory and below.
find . -maxdepth 1 -type f -printf '%TY-%Tm\n' | sort | uniq -c
# counts files in the current path by modification month.
find music -name '*.mp3' -mtime +365 -a -size +10M -ls
# Find and long list mp3 files in Music dir older than a year and larger than 10MB.
# find /usr -name '*.wav' -size -75 > snds; for i in $(seq 1 13 600); do at "now + $i minute" <<<'play "$(shuf snds|head -1)" >/dev/null 2>&1'; done
# Queue a random short .wav file to play every 13 minutes for the next ~10 hours, via at
find . -maxdepth 1 -daystart -type f -name '*.jpg' -mtime -$( date +%j ) -exec mv -v {} 2015/ \;
# Move current year pics to 2015 directory.
find . -xdev -ls | sort -n -k 7 | tail -5
# Quickly find the largest 5 files in the CWD tree without crossing filesystem boundaries.
find -name "*.xml" | while IFS=$'\n' read f ; do xmllint --format "$f" > tmp.xml && mv -v tmp.xml "$f"; done
# Format XMLs.
find . -empty -type d
# List of empty subdirectories of the current directory.
find . -type f -size +10000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
# Finds all files under . that are larger than 10 MB and prints name and size
find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail | ccze -h > /var/www/farblogs/index.html
# Builds an HTML page with the tails of all text log files
find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail -f | ccze -A
# Shows all logs in color, following them as they grow
# Find and replace on specific files
find . -name '*.php' -exec sed -i -e 's#<?#<?php#' {} \;
find /var/log -type f -exec file {} \; | grep 'text' | cut -d' ' -f1 | sed -e's/:$//g' | grep -v '[0-9]$' | xargs tail -f
# Monitor logs in Linux using tail - Works in Ubuntu; it should work on most Linux machines, and on other Unixes tail should be capable of handling more than one file with the '-f' option. This command line simply takes the log files which are text files and whose names do not end with a number, and continuously monitors them. Putting an alias in .profile makes it more convenient.
find /home/user -name '*.ksh' | xargs chmod 744
# Change permissions on files of a specific type in Linux
find /home/user -name '*.ksh' | xargs ls -l
find $mountpoint -xdev -type d -size +300k
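# Find directories (without crossing filesystems) whose directory file itself exceeds 300 KB, i.e. directories holding a huge number of entries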
find . | cpio -oH newc | gzip -c > /boot/initrd-test.img
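# Pack the current tree into a gzipped newc-format cpio archive, e.g. for building a test initrd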
find . -name "node_modules" -exec rm -rf '{}' \;
# Recursively remove "node_modules" directories
find . -maxdepth 1 -iname '*.mp3' -exec eyeD3 -G podcast \{} \;
# tag all mp3 in PWD as genre podcast.
find ./music -name \*.mp3 -exec cp {} ./new \;
# Backslash-escaping the * glob instead of quoting the expression.
find / -please-find-the-file-i-want
# Sometimes, there are files that find can't find no matter how many options you try.
find /var/log -readable -ls
# Find files under /var/log that are readable by the current user. Takes groups and ACLs into account.
find / \( -path /proc -o -path /sys \) -prune -o -print
# Search the file system, but don't descend into the /sys or /proc directories.
## Say, you are asked to clean up your account. How will you find the biggest files in your account?
$ find . -type f -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5
# What the above command does:
1. It finds all the files and does a long listing of them (find & ls -l).
2. Only the file size and the filename are kept ($5, $NF).
3. Sorted on the basis of file size (biggest files at the top).
4. The 5 biggest files get displayed (head -5).
# Note: Do not run the above command where there is a huge number of files present; it will take a long time to respond. Also, in the awk command, $5 denotes the file size in Linux. It might be different in other *nix flavors.
### There could be cases when the user is specifically interested only to find big files above a particular size, say above 100MB:
$ find . -type f -size +100M -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5
# In the above find command, the size switch is used to find files on the basis of size. '+100M' indicates files bigger (+) than 100MB.
### Similarly, to find files of size between 100MB and 200MB:
$ find . -type f -size +100M -size -200M -exec ls -l '{}' \; | awk '{print $5, $NF}' | sort -nr | head -5
# +100M indicates files bigger than 100MB, -200M indicates files smaller than 200MB. In other words, we will get files of size between 100 and 200MB.
The notations to specify in the size switch of the find command are:
For greater than 50KB, +50k (small k)
greater than 50MB, +50M (big M)
greater than 5GB, +5G (big G)
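# For example (a sketch): find files between 50 KB and 5 MB under /var/log:
find /var/log -type f -size +50k -size -5M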
# Find files on a specific file system - If you know the file name and the file system but are not sure of the exact folder path, you can use this syntax. In the below example, I am searching for the messages file in the /var file system.
find /var -name messages
/var/log/messages
# Tips: if you don’t know the file system name, you can search from the / level, but keep in mind it may take time if you have a large number of file systems.
find / -name messages
/var/log/messages
# If you don’t know the exact file name, you can also use a wildcard pattern to search. For ex - to search for error_log you may try
find / -name 'error_*'
/var/log/httpd/error_log
# How about searching for a file name in lower or upper case, in other words ignoring case? Well, you can use -iname instead of -name. For ex:
find / -iname MESSAGES
/var/log/messages
# Let’s take a look at one more real-time scenario. If you know the file type and want to search all of them - for ex, if you are working on WebSphere, you may want to search all files ending with .out - then you can try
find / -name '*.out'
## Find files based on ownership and permissions - Having files with 777 permissions is dangerous, as anyone can edit or delete them, so as a System Administrator you may want to put a scan in place to find any files with 777 permissions. For ex - to show any files having 777 permissions under the /opt file system.
find /opt/ -type f -perm 777
/opt/testing
/opt/SystemOut.log
find /opt/ -type f -perm 777 -exec ls -ltr {} \;
-rwxrwxrwx 1 root root 0 Jul 19 03:35 /opt/testing
-rwxrwxrwx 1 root root 0 Jul 19 03:36 /opt/SystemOut.log
# Tips: how about printing file ownership and time stamp on the same command line? The ls -ltr example above does exactly that.
find /opt/ -type f -perm 777 -exec chmod 755 {} \;
# You may also change permissions from 777 to 755 in a single find command. Obviously, you can adjust 755 to any other permissions you may like.
# How about finding files which are owned by root or a different user? This is very helpful if you are having issues starting services because a previous start was done as root. For ex - if tomcat is owned by a user called “tomcatapp” and for some reason you have started it as root, guess what will happen when you restart next time as “tomcatapp”? It won’t start, because the ownership of some files has changed to root and now “tomcatapp” can’t modify/delete those files. So this becomes very handy in that situation. Here is how you can search for any file owned by root in a specific file system.
find /opt/ -user root
# Note: performing this find syntax at the / level will return very many files/folders, so you may want to keep it under control by targeting a specific file system.
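# A hedged sketch of the fix for the tomcat scenario above (path and user are hypothetical): hand root-owned files back to the service account:
find /opt/tomcat -user root -exec chown tomcatapp:tomcatapp {} \;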
## Find files older than particular days
# File system housekeeping is essential for production support, and often you have to deal with this syntax to find logs which are older than (let’s say) 60 days. The below example finds access.log files older than 60 days in the /opt file system.
find /opt/ -name access.log -mtime +60
# Tips: if you decide to find and delete in the same command line, you can do it like below. This will find access.log files older than 60 days in the /opt file system and delete them.
find /opt/ -name access.log -mtime +60 -exec rm {} \;
# While this is very handy, you may want to list the files before you delete them. To do so:
find /opt/ -name access.log -mtime +60 -exec ls -ltr {} \;
## Find large files - Sometimes you may have to deal with frequent file system cleanup because an application writes a large number of logs due to a code issue, etc. Let’s take an example of searching for files greater than 1 GB in the /opt file system.
find /opt/ -size +1G
# Tips: If you know that all files in /opt/ of more than 1 GB can be deleted, then you can just have find and delete on the same line.
find /opt/ -size +1G -exec rm {} \;
find . -name "*.[ch]" -print | xargs tar -cvf <name_of_output_file>
find /var/oracle/etl/incoming -name '*.dat' -mtime +7 -exec echo rm {} \;
find /home/backups -mtime +30 -type f -exec rm -rf {} \;
find . -mtime +3 -exec rm {} ';'
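# Note: the first of the three commands above only echoes the rm (a dry run); the other two really delete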
find .
# Find all files under .
find . -type d
# Find all subdirectories.
find . -iregex ".*\(bas\|cls\|frm\)"
# Find all Visual Basic code files in a directory; -iregex matches a case-insensitive regular expression - the backslashes are necessary to protect special characters from the shell
find . -iregex ".*\(bas\|cls\|frm\)" -exec grep NIBRS \{\} \;
# Find all VB files containing a given string Note you must escape the {} and the ; because we are at the shell
find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep "^*'" \{\} \;
# Find all VB files containing comment lines - Note you must escape the {} and the ; because we are at the shell
find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'" \{\} \;
# Find all VB files containing NON comment lines - Note you must escape the {} and the ; because we are at the shell
find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'|^[[:space:]]*$" \{\} \;
# Find all VB files containing NON comment, NON blank lines in a directory
find . -iregex ".*\(bas\|cls\|frm\)" -exec egrep -v "^*'|^[[:space:]]*$" \{\} \; | wc
# Count the code in a directory hierarchy
find . -iregex ".*\(java\|html\|txt\)" -exec wc \{\} \; | gawk '{ print $1 "\t" $4; sum += $1 } END { print "--------"; print sum "\tTOTAL" }'
# Sum the line counts of all the code files in a directory
find `perl -e 'print "@INC"'` -name '*.pm' -print
# Find all Perl modules - From Active Perl documentation
find . -name "*.library" -print0 | xargs -0 sed -i '' -e 's/foo:\/Drive_Letter:/foo:\/bar\/baz\/xyz/g'
find . -name Root -exec sed -i 's/1.2.3.4\/home/foo.com\/mnt/' {} \;
find . -name Root -print0 | xargs -0 sed -i '' -e 's/1.2.3.4\/home/foo.com\/mnt/'
find ./ -type f -exec sed -i " 's/string1/string2/' {} \;
"#
find . -name "*.txt" -print0 | xargs -0 sed -i '' -e 's/foo/bar/g'
# Recursively find and replace in files
find . -type f -name "*.txt" -exec sed -i'' -e 's/foo/bar/g' {} +
find . -name 'lxu*' -type d -exec bash -c 'mv "$1" "${1/\/123_//}"' -- {} \;
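# Rename matched directories via bash parameter expansion on "$1" (here: replacing '/123_' with '/' in the path)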
find . -type f -name ".wato"
find /var/www -perm -o+w -a -not -type l -ls
# Find files and directories under /var/www that are world writable. Exclude symbolic links.
find ! -name "*.pdb" -delete
find -type f ! -name "*.pdb" -delete
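# The first form also tries to delete matching directories (and fails on non-empty ones); adding -type f restricts deletion to regular files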
fdupes -r dir > dupes.txt
# Find file duplicates in 'dir' recursively based on size and md5sum and log them to dupes.txt.
fdupes -r Pictures > dupes.txt
# Find file duplicates in 'Pictures' recursively based on size and md5sum and log them to dupes.txt.
find . -name "node_modules" -exec rm -rf '{}' \;
# Recursively remove "node_modules" directories
find . -name "node_modules" -exec rm -rf '{}' +
# First iteration: the '+' terminator batches arguments (doesn't call rm for every match)
find . -name "node_modules" -delete
# Second iteration: find's builtin -delete (note: -delete only removes files and empty directories, so it won't remove non-empty node_modules trees)
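# A sketch of a variant that avoids descending into the trees it is about to delete (-prune stops recursion at each match):
find . -name node_modules -type d -prune -exec rm -rf {} +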
find -wholename "*/query/*.json"
# Example of matching against the whole path with -wholename
find . -type f -name 'file*' -execdir mv {} {}_renamed ';'
# Renaming multiple files using find
find [YOURDIR] -type d -exec chmod 755 {} \;
find [YOURDIR] -type f -exec chmod 644 {} \;
# Change Directory and File Permissions Properly For Linux Web Server
# Sets file permissions to 644 and directory permissions to 755.
find . -name '*.xml' -exec grep -e my_grep_data {} \; -print
# How to grep the results of find using -exec in Linux
# This Linux one-line command is useful when you want to grep the output of a find. The combination of grep and find is done via the -exec action.
find . -type f ! -path "./.git/*" -exec sh -c "head -n 1 {} | egrep -a 'bin/bash|bin/sh' >/dev/null" \; -print -exec shellcheck {} \;
# ShellCheck all the bash/sh scripts under a specific directory, excluding version control
# This is a commodity one-liner that uses ShellCheck to assure some quality on bash and sh scripts under a specific directory. It ignores the files in the .git directory.
# Just substitute "./.git/*" with "./.svn/*" for older and boring centralized version control.
# Just substitute ShellCheck with "rm" if your scripts are crap and you want to get rid of them :)
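# Show the 10 largest files (size in bytes, then path), recursively: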
find . -printf '%s %p\n'| sort -nr | head -10
find /var/www/web3/web/chat/ -printf '%s %p\n'| sort -nr | head -10
find /var/www/web3/web/chat/ -type f -printf '%s %p\n'| sort -nr | head -10
# You can also skip the directories and only show files, as in the command below.
# or
find /var/www/web3/web/chat/ -type f -iname "*.jpg" -printf '%s %p\n'| sort -nr | head -10
find . | xargs grep -ls 'string'
# To search for instances of a string inside all files within a directory (recursive)
find -type f -name "*.avi" -print0 | xargs -0 mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
# Get the total length of all video / audio in the current dir (and below) in H:m:s
# change the *.avi to whatever you want to match, you can remove it altogether if you want to check all files.
find -type f -iregex '.*\.\(mkv\|mp4\|wmv\|flv\|webm\|mov\|dat\|flv\)' -print0 | xargs -0 mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
# Get the total length of all videos in the current dir in H:m:s
# Use case insensitive regex to match files ending in popular video format extensions and calculate their total time. (traverses all files recursively starting from the current directory)
find / -type d | while read i; do ls "$i" | wc -l | tr -d \\n; echo " -> $i"; done | sort -n
# Show file counts per directory. - Useful when you try to find huge directories that elevate system CPU (vmstat -> sy)
find . -name '*.log' | xargs ls -hlt > /tmp/logs.txt && vi /tmp/logs.txt
# Find the most recently modified logs
find . -type f -exec stat --format '%Y :%y %n' {} \; | sort -nr | cut -d: -f2- | head
# List files by modified date - Script to list files in a directory recursively by last modified date.
find . -type f -perm /o=r -print0 | xargs -0 grep -l password=
# Find world readable files under CWD that have "password=" in them.
find ./* -type f -exec sed -i 's/oldtext/newtext/g' {} \;
# Linux Search & Replace - A recursive search and replace linux command, for those massive file changes you don't feel like doing by hand. This method doesn't create any backup files
# or
find . -type f | xargs perl -pi~ -e 's/oldtext/newtext/g;'
# this method creates backup files
find ./path_to_search -type f -name "*the_pattern*" -exec rm -i {} \;
# Delete Files Recursively - Sift through a bunch of directories and delete only specific files.
find / -iname '*droids*' 2> /dev/null
# If you want to avoid error messages in the output of find such as "Permission denied", just redirect STDERR (The 2> part) to /dev/null.
find . -maxdepth 1 -mindepth 1 -print0 | xargs -0 -n 1 -I % cmp % /DUPDIR/% 2>/dev/null
# Compare directories (using cmp to compare files byte by byte) to find files of the same name that differ
# Compare the content of the files in the current directory with files of the same name in the duplicate directory. Pop Quiz: You have a duplicate of a directory with files of the same name that might differ. What do you do? You could use diff to compare the directories, but that's boring and it isn't as clever as find -print0 with xargs. Note: You must omit the stderr redirect 2>/dev/null to see the list of files missing from DUPDIR, if any. Hint: Redirect stderr to a new file to produce a more readable list of files that are missing from DUPDIR. Warning: This doesn't tell you if DUPDIR contains files not found in the current directory, so don't delete DUPDIR. Sample output (yours may differ):
./DIFFER.PNG /DUPDIR/./DIFFER.PNG differ: char 59, line 3
cmp: /DUPDIR/./NOMATCH.PNG: No such file or directory
find www/ -type f -execdir chmod -v o+r {} \; -o -type d -execdir chmod -v o+rx {} \;
# Add read permissions for files and read/execute permissions for directories under the www directory.
find /home -mtime -1 -size +100M -ls
# Try to figure out what recently used file might have just filled up the /home partition by finding files modified in the last day that are larger than 100M just to narrow it down.
find /dir/to/search -maxdepth 1 -name "foo*.jpg"|wc -l
# Count files in a directory with wildcards. Remove the '-maxdepth 1' option if you want to count in subdirectories as well
find / -iname "manifest.json" -exec sed 's/\"update_url\": \"http/\"update_url\": \"hxxp/g' -i.bak '{}' \;
# Disable updates for installed Chrome plugins. This will ensure you don't get nagged by updates and also protects you from watering-hole attacks! Please be sure your plugins don't have any security issues! Backups are manifest.json.bak
# Recursive find and replace file extension / suffix (mass rename files) - Find recursively all files in ~/Notes with the extension '.md' and pipe them via xargs to the rename command, which here replaces every '.md' with '.txt' (existing files will not be overwritten).
find ~/Notes -type f -iname '*.md' -print0 | xargs -0 rename --no-overwrite .md .txt
# Convert LibreOffice files (.odt, .odg and others) to .pdf - find and convert all LibreOffice files to PDF without the LibreOffice GUI
find /home/foo/Documents/ -type f -iname "*.odt" -exec libreoffice --headless --convert-to pdf '{}' \;
# Fulltext search in multiple OCR'd pdfs
find /path -name '*.pdf' -exec sh -c 'pdftotext "{}" - | grep --with-filename --label="{}" --color "your pattern"' \;
# Simple command to erase all the folders with a given name -> This is how to remove all folders with a given name (e.g. "CVS") starting from a root folder ('.' is the current folder)
find . -name <fileName> -type d -print0 | xargs -0 rm -fr
e.g.
find . -name CVS -type d -print0 | xargs -0 rm -fr
# Find all files ending with ".swp"
find . -name \*.swp -type f
# Find all files ending with ".swp" and delete them
find . -name \*.swp -type f -delete
# Find all files, not in hidden directory, that contain the pattern "TRANDESCID" and also any of the patterns "41", "42", "45", and "50"
find . -not -path '*/\.*' -type f -exec grep -iq TRANDESCID {} \; -exec grep -il -e 41 -e 42 -e 45 -e 50 {} \;
# Keep track of setuid / setgid executables. The following command lists all setuid and setgid files on the system:
find / -type f -perm /6000
# To list all setuid and setgid files that are world writable, execute the following command:
find / -type f -perm /6000 -a -perm -0002
# zgrep across multiple files
find "$(pwd)" -name "file-pattern*.gz" -exec zgrep -H 'pattern' {} \;
# Replace recursively in a folder with sed
find <folder> -type f -exec sed -i 's/my big String/newString/g' {} +
# Find ASCII files and extract IP addresses
find . -type f -exec grep -Iq . {} \; -exec grep -oE "(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)" {} /dev/null \;
# Delete all files by extension
# This is a correction to https://www.commandlinefu.com/commands/view/22134 Use `-name` instead of `-iname`, because case-sensitivity is probably important when we're dealing with filenames. It's true that extensions are often capitalised (e.g., "something.JPG"), so choose whatever's appropriate. What is essential, however, is the quoting of the name pattern, so the shell does not expand it incorrectly. Finally, `-delete` is clearer.
find / -name "*.jpg" -delete
# Find all file extensions in the current dir.
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u
find . -printf "%TY %p\n" | grep ^2006
# Get a list of all files last modified in 2006. Useful for passing to awk then xargs or for loop
find . -maxdepth 1 -daystart -type f -name '*.jpg' -mtime -$( date +%j ) -exec mv -v {} 2013/ \;
# Move current year pics to 2013 directory.
find . -cnewer cutoff -type f -name '2014*'
# Find files named 2014* that are newer than the change time of the file named 'cutoff'.
find . -type f -mmin -60
# Find files below the current directory that have changed within the last 60 minutes.
find {,/usr}/{,s}bin -name '??'
# Use brace expansion to check all your bin and /usr/bin dirs at once for any two letter command.
# Find all log files modified more than 24 hours ago and compress each into a zip file
find . -type f -mtime +1 -name "*.log" -exec zip -m {}.zip {} \; >/dev/null &
# Explanation:
# -type f: only files
# -mtime +n: file's data was last modified more than n*24 hours ago
# -name "*.log": files with the .log extension; replace with another pattern as needed
# zip -m {}.zip: move each file into its own zip archive
# >/dev/null &: discard the output and run in the background.
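# Before running destructive -exec actions like the zip -m above, preview the match set first:
find . -type f -mtime +1 -name "*.log"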
find / -type f ! -regex '^/\(dev\|proc\|run\|sys\).*' | sed 's@^\(.*\)/[^/]*$@\1@' | sort | uniq -c | sort -n | tail -n 10
# Find the top 10 directories containing the highest number of files. It can be used to pinpoint the path(s) where the largest number of files reside when running out of free i-nodes.
find ./ -type f -name "somefile.txt" -exec sed -i -e 's/foo/bar/g' {} \;
# Recursive search and replace - Replaces a string matching a pattern in one or several files found recursively in a particular folder.
find . -print0 | xargs -0 -P 40 -n 1 sh -c 'ffmpeg -i "$1" 2>&1 | grep "Duration:" | cut -d " " -f 4 | sed "s/.$//" | tr "." ":"' - | awk -F ':' '{ sum1+=$1; sum2+=$2; sum3+=$3; sum4+=$4 } END { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 }'
# Count the total amount of hours of your music collection First the find command finds all files in your current directory (.). This is piped to xargs to be able to run the next shell pipeline in parallel. The xargs -P argument specifies how many processes you want to run in parallel, you can set this higher than your core count as the duration reading is mainly IO bound. The -print0 and -0 arguments of find and xargs respectively are used to easily handle files with spaces or other special characters. A subshell is executed by xargs to have a shell pipeline for each file that is found by find. This pipeline extracts the duration and converts it to a format easily parsed by awk. ffmpeg reads the file and prints a lot of information about it, grep extracts the duration line. cut and sed cut out the time information, and tr converts the last . to a : to make it easier to split by awk. awk is a specialized programming language for use in shell scripts. Here we use it to split the time elements in 4 variables and add them up. Show Sample Output:
# 1036:17687:2689.686985895
# Find files with interesting file extensions
find . -type f \( -name "*.MOV" -o -name "*.avi" -o -name "*.flv" -o -name "*.m4v" -o -name "*.mov" -o -name "*.mp4" -o -name "*.wmv" \) > vid_list
# List by file modified time recursively (top 20)
find . -type f -print0 | xargs -0 stat -f "%m %N" | sort -rn | head -20 | cut -f2- -d" "
# Remove OSX bits
find . -name ".DS_Store" -print0 | xargs -0 rm -rf
find . -name "._" -print0 | xargs -0 rm -rf
# Find most recently modified files recursively (BSD/OSX style)
find . -type f -print0 | xargs -0 stat -f "%m %N" | sort -rn | head -10 | cut -f2- -d" " | more
# Find Flash videos stored by browsers on a Mac
find /private/ 2>/dev/null | grep /Flash
# Explanation: When you watch a flash video like youtube in a browser, the video file is saved on your harddisk at a temporary location. And, if you watch a video and then another video in the same window, the first one will be deleted.
# Limitations:
# Might not work with all browsers.
# Does not work with all websites (for example IMDB).
# Does not work with an anonymous window in Chrome.
# Create and restore backups using cpio
find . -xdev -print0 | cpio -oa0V | gzip > path_to_save.cpio.gz
# Explanation: To restore:
# gzip -cd path_to_save.cpio.gz | cpio -imV
# Why not use tar instead? cpio is slightly more accurate!
# find . -xdev -print0 finds all files and directories without crossing over to other partitions and prints a null delimited list of filenames
# cpio -oa0V takes the list of files to archive from stdin and creates an archive file preserving timestamps and permissions
# cpio -imV extracts the files and directories from stdin while preserving timestamps and permissions
# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f | perl -ne 'chomp(@files = <>); my $p = 9; foreach my $f (sort { (stat($a))[$p] <=> (stat($b))[$p] } @files) { print scalar localtime((stat($f))[$p]), "\t", $f, "\n" }' | tail
# Explanation:
# find path_to_dir -type f prints all the files in the directory tree
# chomp(@files = <>); reads all the lines into an array
# stat($a) is an array of interesting info about a file. Index 7 is size, 8 is access time, 9 is modification time, etc. (See man perlfunc for details and search for stat EXPR.)
# sort { (stat($a))[9] <=> (stat($b))[9] } @files sorts the files by modification time
# print scalar localtime((stat($f))[9]), "\t", $f, "\n" - prints the modification time formatted nicely, followed by a tab and the filename
## Alternative one-liners:
# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f -mtime -7 -print0 | xargs -0 ls -lt | head
# Explanation:
# find /path/to/dir -type f -mtime -7 -print0 prints all the files in the directory tree that have been modified within the last 7 days, with null character as the delimiter
# xargs -0 ls -lt expects a null delimited list of filenames and will sort the files by modification time, in descending order from most recent to oldest
# Since we are looking for the most recent files, with head we get the first 10 lines only
# Note that if there are too many files in the output of find, xargs will run multiple ls -lt commands and the output will be incorrect. This is because the maximum command line length is getconf ARG_MAX and if this is exceeded xargs has to split the execution to multiple commands. So depending on your use case you may need to tweak the -mtime parameter to make sure there are not too many lines in the output.
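# A sketch that sidesteps the xargs batching caveat entirely, using GNU find's -printf to emit timestamps itself:
find /path/to/dir -type f -mtime -7 -printf '%T@ %p\n' | sort -rn | head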
# Remove spaces recursively from all subdirectories of a directory
find /path/to/dir -type d | tac | while read LINE; do target=$(dirname "$LINE")/$(basename "$LINE" | tr -d ' '); echo mv "$LINE" "$target"; done
# Explanation:
# find path_to_dir -type d finds all the subdirectories
# tac reverses the order. This is important to make "leaf" directories come first!
# target=... stuff constructs the new name, removing spaces from the leaf component and keeping everything before that the same
# echo mv ... for safety you should run with "echo" first, if the output looks good then remove the "echo" to really perform the rename
# Limitations: In UNIX or BSD there is no tac. There you can use tail -r instead.
# Recursively remove all empty sub-directories from a directory tree
find . -depth -type d -empty -exec rmdir {} \;
# Explanation: Recursively remove all empty sub-directories from a directory tree using just find. No need for tac (-depth does that), no need for xargs as the directory contents changes on each call to rmdir. We're not reliant on the rmdir command deleting just empty dirs, -empty does that.
# Limitations: Will make many calls to rmdir without using xargs, which bunches commands into one argument string, which is normally useful, but -empty /could/ end up being more efficient since only empty dirs will be passed to rmdir, so possibly fewer executions in most cases (searching / for example).
##
## Related one-liners
# Recursively remove all empty sub-directories from a directory tree
find . -type d | tac | xargs rmdir 2>/dev/null
# Explanation:
# find will output all the directories
# tac reverses the ordering of the lines, so "leaf" directories come first
# The reordering is important, because rmdir removes only empty directories
# We redirect error messages (about the non-empty directories) to /dev/null
# Limitations: In UNIX and BSD systems you might not have tac, you can try the less intuitive tail -r instead.
# How to find all hard links to a file
find /home -xdev -samefile file1
# Explanation: Note: replace /home with the location you want to search - Source: http://linuxcommando.blogspot.com/2008/09/how-to-find-and-delete-all-hard-links.html
# Remove all the versioned-but-empty directories from a Subversion checkout
find . -name .svn -type d | while read ss; do dir=$(dirname "$ss"); test $(ls -a "$dir" | wc -l) == 3 && echo "svn rm \"$dir\""; done
# Explanation: Empty directories in version control stink. Most probably they shouldn't be there. Such directories have a single subdirectory in them named ".svn", and no other files or subdirectories.
# The "find" searches for files files named .svn that are directories
# The "while" assigns each line in the input to the variable ss
# The "dirname" gets the parent directory of a path, the quotes are necessary for paths with spaces
# ls -a should output 3 lines if the directory is in fact empty: ".", "..", and ".svn"
# If the test is true and there are precisely 3 files in the directory, echo what we want to do
# If the output of the one-liner looks good, pipe it to | sh to really execute
# Execute different commands with find depending on file type
find /path/to/dir -type d -exec chmod 0755 '{}' \; -o -type f -exec chmod 0644 '{}' \;
# Explanation:
# -type d -exec chmod 0755 '{}' \; for each directory, run chmod 0755
# \; is to mark the end of the -exec
# {} is replaced with the filename, we enclosed it in single quotes like this '{}' to handle spaces in filenames
# -o logical OR operator
# -type f -exec chmod 0644 '{}' \; for each regular file, run chmod 0644
# Replace symlinks with the actual files they are pointing at
find /path/to/dir -type l -exec sh -c 'cp --remove-destination "$(readlink "{}")" "{}"' \;
# Explanation:
# All the double quoting is necessary to handle filenames with spaces.
# Calling sh with -exec is necessary to evaluate readlink for each symlink
# Limitations: The BSD implementation of cp does not have the --remove-destination flag.
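# A more portable sketch for such systems (same assumption as the command above: the symlink targets resolve from the current directory): remove the link first, then copy the target:
find /path/to/dir -type l -exec sh -c 't=$(readlink "$1") && rm "$1" && cp -p "$t" "$1"' - {} \;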
# Create a visual report of the contents of a usb drive
find /path/to/drive -type f -exec file -b '{}' \; -printf '%s\n' | awk -F , 'NR%2 {i=$1} NR%2==0 {a[i]+=$1} END {for (i in a) printf("%12u %s\n",a[i],i)}' | sort -nr
# Explanation: versorge asks: I have a bunch of USB volumes lying around and I would like to get a quick summary of what is on the drives: how much space is taken up by PDF, image, text or executable files. This could be output as a text summary, or a pie chart. This one-liner produces a list like this:
# 5804731229 FLAC audio bitstream data
# 687302212 MPEG sequence
# 99487460 data
# 60734903 PDF document
# 55905813 Zip archive data
# 38430192 ASCII text
# 32892213 gzip compressed data
# 24847604 PNG image data
# 16618355 XML 1.0 document text
# 13876248 JPEG image data
# The find command locates all regular files (-type f) below the given directory, which could be a mounted USB stick or any other directory. For each one, it runs the file -b command with the filename to print the file type; if this succeeds, it also prints the file size (-printf '%s\n'). This results in a list containing a file type on one line, followed by the file size on the next.
# The awk script takes this as input. The GNU file command often produces very specific descriptions such as GIF image data, version 87a, 640 x 480 - to generalize these, we set the field separator to be a comma with the -F option. Referencing $1 then only uses what is to the left of the first comma, giving us a more generic description like GIF image data.
# In the awk script, the first pattern-action pair NR%2 {i=$1} applies to each odd-numbered line, setting the variable i to be the file type description. The even-numbered lines are handled by NR%2==0 {a[i]+=$1}, adding the value of the line (which is the file size) to the array variable a[i]. This results in an array indexed by file type, with each array member holding the cumulative sum of bytes for that type. The END { ... } pattern-action pair finally prints out a formatted list of the total size for each file type. At the end of the line, the sort command sorts the list, putting the file types with the largest numbers at the top.
# Limitations: This one-liner uses the -b option to file and the -printf primary of find - these are supported by the GNU utilities but may not work elsewhere. It can also take a long time to run, since it needs to open and analyze every file below the given directory.
# Count the total number of hours of your music collection
find . -type f -print0 | xargs -0 -P 40 -n 1 sh -c 'ffmpeg -i "$1" 2>&1 | grep "Duration:" | cut -d " " -f 4 | sed "s/.$//" | tr "." ":"' - | awk -F ':' '{ sum1+=$1; sum2+=$2; sum3+=$3; sum4+=$4; if (sum4 > 100) { sum3+=1; sum4=0 }; if (sum3 > 60) { sum2+=1; sum3=0 }; if (sum2 > 60) { sum1+=1; sum2=0 } if (NR % 100 == 0) { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 } } END { printf "%.0f:%.0f:%.0f.%.0f\n", sum1, sum2, sum3, sum4 }'
# Store the output of find in an array
mapfile -d $'\0' arr < <(find /path/to -print0)
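# Usage sketch: adding -t strips the delimiter from each entry (bash 4.4+ supports -d ''), and quoting the array expansion keeps names with spaces intact:
mapfile -d '' -t arr < <(find /path/to -print0)
printf 'found: %s\n' "${arr[@]}"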
# Find all log files last modified more than 48 hours ago (-mtime +1), and zip them (zip -m moves each file into its own archive)
find . -type f -mtime +1 -name "*.log" -exec zip -m {}.zip {} \; >/dev/null
# Find the most recently modified files in a directory and all subdirectories
find /path/to/dir -type f | perl -e 'chomp(@files = <>); my $p = 9; foreach my $f (sort { (stat($a))[$p] <=> (stat($b))[$p] } @files) { print scalar localtime((stat($f))[$p]), "\t", $f, "\n" }' | tail
# Explanation: find path_to_dir -type f prints all the files in the directory tree
# chomp(@files = <>); reads all the lines into an array
# stat($a) is an array of interesting info about a file. Index 7 is size, 8 is access time, 9 is modification time, etc. (See man perlfunc for details and search for stat EXPR.)
# sort { (stat($a))[9] <=> (stat($b))[9] } @files sorts the files by modification time
# print scalar localtime((stat($f))[9]), "\t", $f, "\n" - prints the modification time formatted nicely, followed by a tab and the filename
# How can I find all of the distinct file extensions in a folder hierarchy?
# It works as follows:
# Find all files from current folder
# Prints extension of files if any
# Make a unique sorted list
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u
# just for reference: if you want to exclude some directories from searching (e.g. .svn), use
find . -type f -path '*/.svn*' -prune -o -print | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort -u
# A variation, this shows the list with counts per extension:
find . -type f | perl -ne 'print $1 if m/\.([^.\/]+)$/' | sort | uniq -c | sort -n
# No need for the pipe to sort, awk can do it all:
find . -type f | awk -F. '!a[$NF]++{print $NF}'
find . -type f -name "*.*" | awk -F. \'!a[$NF]++{print $NF}\'
# A sed variant; the directory part is stripped first so that dots in directory names do not confuse the extension match (files without a dot pass through as-is):
find . -type f | sed -e 's,.*/,,' -e 's/.*\.//' | sort -u
# If you want totals (how many times each extension was seen):
find . -type f | sed -e 's,.*/,,' -e 's/.*\.//' | sort | uniq -c | sort -rn
# Non-recursive (single folder):
for f in *.*; do printf "%s\n" "${f##*.}"; done | sort -u
# Find everything with a dot and show only the suffix.
find . -type f -name "*.*" | awk -F. '{print $NF}' | sort -u
# if you know all suffixes have 3 characters then
find . -type f -name "*.???" | awk -F. '{print $NF}' | sort -u
# or, with sed, show all suffixes with one to four characters. Change {1,4} to the range of characters you are expecting in the suffix.
find . -type f | sed -n 's/.*\.\(.\{1,4\}\)$/\1/p'| sort -u
# Unhide all hidden files in the current directory.
find . -maxdepth 1 -type f -name '.*' | sed -e 's,^\./\.,,' | sort | xargs -I{} mv ".{}" "{}"
# Explanation: This will remove the leading dot from all files in the current directory using mv, effectively "unhiding" them. It will not affect subdirectories.
# Limitations: Relies on the -I replace option of xargs, which is in POSIX, so this should work beyond GNU/Linux.
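# A portable sketch using only shell globs (no find or xargs assumed); the glob .[!.]* matches hidden entries other than . and .., and ${f#.} drops the leading dot:
for f in .[!.]*; do [ -f "$f" ] && mv "$f" "${f#.}"; done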
## Related one-liners
# Print file owners and permissions of a directory tree
find /path/to/dir1 -printf "%U %G %m %p\n" > /tmp/dir1.txt
# Explanation: The command simply traverses the specified directory tree and for each file and directory it prints the UID of the owner, the GID of the group, the permission bits and the path. To compare file owners and permissions of two directory trees you can run this command for each directory, save the output in two files and then compare them using diff or similar. See man find for an explanation of all the possible symbols you can use with -printf
# Limitations: The -printf option does not exist in find on Solaris 10.
# Get only the latest version of a file from across multiple directories.
find . -name 'filename' | xargs -r ls -tc | head -n1
# Explanation: Shows the latest file (by last change of the file status information, ls -c) matching the given pattern; substitute your own pattern for 'filename', e.g. custlist*.xls. We use ls to do the sorting (-t) and head to pick the top one. xargs is given the -r option so that ls is not run if there is no match.
# Limitations: The filesystem needs to support ctime. Does not depend on a consistent naming scheme.
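# A null-safe sketch of the same idea for names with spaces (assumes GNU find and xargs):
find . -name 'filename' -print0 | xargs -0 -r ls -tc | head -n1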
# Get only the latest version of a file from across multiple directories
find . -name custlist\* | perl -ne '$path = $_; s?.*/??; $name = $_; $map{$name} = $path; END { print $map{(sort(keys(%map)))[-1]} }'
# Explanation: The purpose of the one-liner is to find the "latest" version of the custlist_*.xls file from among multiple versions in directories and sub-directories, for example:
./c/custlist_v1.003.xls
./c/custlist_v2.001.xls
./d/b/custlist_v1.001.xls
./d/custlist_v1.002.xls
# Let's decompose the one-liner to the big steps:
# find . -name custlist\* -- find the files matching the target pattern
# ... | perl -ne '...' -- run perl, with the input wrapped around in a while loop so that each line in the input is set in the variable $_
# $path = $_; s?.*/??; $name = $_; -- save the full path in $path, and cut off the subdirectory part to get to the base name of the file and save it in $name
# $map{$name} = $path; -- build a mapping of $name to $path
# (sort(keys(%map)))[-1] -- sort the keys of the map and take the last element, which is custlist_v2.001.xls in this example (a negative index counts from the end, so no separate element counter is needed)
# END { print $map{...} } -- at the end of all input data, print the path of the latest version of the file
# Limitations: Even if the latest version of the file appears multiple times in the directories, the one-liner will print only one of the paths. This could be fixed though if needed.
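# A shorter awk sketch of the same idea: track the lexicographically largest basename and print its path (assumes the version strings sort correctly as plain text):
find . -name 'custlist*' | awk -F/ '$NF > max { max = $NF; path = $0 } END { print path }'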
# Find all files recursively with specified string in the filename and output any lines found containing a different string.
find . -name '*conf*' -exec grep -Hni 'matching_text' {} \; > matching_text.conf.list
# Explanation: find . -name '*conf*' In the current directory, recursively find all files with 'conf' in the filename. (The pattern is quoted so the shell cannot expand it before find sees it.)
# -exec grep -Hni 'matching_text' {} \; When a file is found matching the find above, execute the grep command to find all lines within the file containing 'matching_text'.
# Here are what each of the grep switches do:
# grep -i ignore case.
# grep -H print the filename
# grep -n print the line number
# > matching_text.conf.list Direct the grep output to a text file named 'matching_text.conf.list'
# Remove .DS_Store files that you happened to stage in the repository by mistake
find . -name .DS_Store -exec git rm --ignore-unmatch --cached {} +
# Explanation: Removes the files from the git index (staging area) without deleting them from disk: --cached leaves the working-tree copies alone, and --ignore-unmatch avoids an error when no matching file is tracked.
# Check if a file exists and has a size greater than X
[[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]] && echo true || echo false
# Explanation:
# The find takes care of two things at once: it checks that the file exists and that its size is greater than 51200 bytes.
# We redirect stderr to /dev/null to hide the error message if the file does not exist.
# The output of find will be non-blank if the file matched both conditions, otherwise it will be blank
# The [[ ... ]] evaluates to true or false if the output of find is non-blank or blank, respectively
# You can use this in if conditions like:
if [[ $(find /path/to/file -type f -size +51200c 2>/dev/null) ]]; then
    # somecmd
fi
# Find files that are not executable
find /some/path -type f ! -perm -111 -ls
# Explanation: The key is writing the parameter of -perm correctly. The value -111 means that all execution bits must be set: user and group and other too. By negating this pattern with ! we get files that miss any of the execution bits. If you want to be more specific, for example find files that are not executable specifically by the owner, you could do like this:
find /some/path -type f ! -perm -100 -ls
# The -ls option is to print the found files using a long listing format similar to the ls command.
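# For contrast, a slash instead of a dash matches ANY of the listed bits, so this sketch (GNU find syntax) negates -perm /111 to list files with no execute bit set at all:
find /some/path -type f ! -perm /111 -ls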
# Md5sum the last 5 files in a folder
find /directory1/directory2/ -maxdepth 1 -type f | sort | tail -n 5 | xargs md5sum
# Explanation:
# find lists the files, no recursion, no directories, with full path
# sort list files alphabetically
# tail keep only the last 5 files
# xargs send the list as arguments to md5sum
# md5sum calculate the md5sum for each file
# Limitations: Cannot handle spaces or newlines in file or directory names, because xargs splits its input on whitespace by default.
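# A null-delimited sketch that handles such names, assuming GNU coreutils (sort -z and tail -z are GNU extensions):
find /directory1/directory2/ -maxdepth 1 -type f -print0 | sort -z | tail -z -n 5 | xargs -0 md5sum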
# Change the encoding of all files in a directory and subdirectories
find . -type f -name '*.java' -exec sh -c 'iconv -f cp1252 -t utf-8 "$1" > converted && mv converted "$1"' -- {} \;
# Explanation: The parameters of find:
# . -- search in the current directory, and its subdirectories, recursively
# -type f -- match only files
# -name '*.java' -- match only filenames ending with .java
# -exec ... \; -- execute command
# The command to execute is slightly complicated, because iconv does not rewrite the original file but prints the converted content on stdout. To update the original file we need 2 steps:
# Convert and save to a temp file
# Move the temp file to the original
# To do these steps, we use a sh subshell with -exec, passing a one-liner to run with the -c flag, and passing the name of the file as a positional argument with -- {}.
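# A variant sketch using mktemp, so that a file literally named "converted" in the working directory cannot be clobbered:
find . -type f -name '*.java' -exec sh -c 't=$(mktemp) && iconv -f cp1252 -t utf-8 "$1" > "$t" && mv "$t" "$1"' -- {} \;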
# Deletes orphan vim undo files
find . -type f -iname '*.un~' | while read UNDOFILE ; do FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ) ; [[ -e "$FILE" ]] || rm "$UNDOFILE" ; done
# Explanation:
# find -type f -iname '*.un~' finds every vim undo file and outputs the path to each on a separate line. At the beginning of the while loop, each of these lines is assigned to the variable $UNDOFILE with while read UNDOFILE. In the body of the loop, the file each undo-file should be tracking is computed and assigned to $FILE with FILE=$( echo "$UNDOFILE" | sed -r -e 's/.un~$//' -e 's&/\.([^/]*)&/\1&' ). If $FILE does not exist ([[ -e "$FILE" ]]), the undo-file is removed with rm "$UNDOFILE".
# Limitations:
# sed may not support the -r flag in every flavour of UNIX. The flag can be removed, though, as long as the parentheses in -e 's&/\.([^/]*)&/\1&' are escaped (but as it stands the one-liner is more readable).
# Find recent logs that contain the string "Exception"
find . -name '*.log' -mtime -2 -exec grep -Hc Exception {} \; | grep -v :0$
# Explanation:
# The find:
# -name '*.log' -- match files ending with .log
# -mtime -2 -- match files modified within the last 2 days
# -exec CMD ARGS \; -- for each file found, execute command, where {} in ARGS will be replaced with the file's path
# The grep:
# -c is to print the count of the matches instead of the matches themselves
# -H is to print the name of the file, as grep normally won't print it when there is only one filename argument
# The output lines will be in the format path:count. Files that didn't match "Exception" will still be printed, with 0 as count
# The second grep filters the output of the first, excluding lines that end with :0 (= the files that didn't contain matches)
# Extra tips:
# Change "Exception" to the typical relevant failure indicator of your application
# Add -i for grep to make the search case insensitive
# To make the find match strictly only files, add -type f
# Schedule this as a periodic job, and pipe the output to a mailer, for example | mailx -s 'error counts' [email protected]
# Limitations: The -H flag of grep may not work in older operating systems, for example older Solaris. In that case use ggrep (GNU grep) instead, if it exists.
# List status of all GIT repos
find ~ -name ".git" 2> /dev/null | sed 's/\/.git/\//g' | awk '{print "-------------------------\n\033[1;32mGit Repo:\033[0m " $1; system("git --git-dir="$1".git --work-tree="$1" status")}'
# Explanation:
# List all .git dirs
# Trim .git parts
# Run git --git-dir=X.git --work-tree=X status with awk
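# A plainer loop sketch that does the same thing, one repo per iteration (status -s prints the short format):
find ~ -name .git -type d 2>/dev/null | while read -r g; do
  printf '== %s ==\n' "${g%/.git}"
  git --git-dir="$g" --work-tree="${g%/.git}" status -s
done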
# Make a file index including subdirectories mail1 - mail8, but exclude anything 3 levels below /home/vmail in each directory
find mail[1-8] -path 'mail[1-8]/home/vmail/*/*/*' -prune -o -ls > fileindex.txt
# Find sparse files: prints the sparseness and path of any sparse files (files that occupy less actual disk space than their total size because the filesystem stores large runs of zero bytes efficiently). Uses a multi-character field separator (=:=) to keep the two fields unambiguous. Sample run below:
find -type f -printf "%S=:=%p\n" 2>/dev/null | gawk -F'=:=' '{if ($1 < 1.0) print $1,$2}'
# calc "x = 2 + 2; ++x"
# 5
# calc "10 / 16"
# 0.625
# calc "sqrt(2)"
# 1.4142136
# calc "(1.2 + 3.4) ^ 56"
# 1.3014832e+37
find /var/log -type f -printf "%S=:=%p\n" 2>/dev/null | gawk -F'=:=' '{if ($1 < 1.0) print $1,$2}'
# 7.88593e-08 /var/log/with space
# 1.11717e-07 /var/log/lastlog
# Find files less than an hour old (-mmin -60) in your home directory (~/) and below without crossing into other partitions (-xdev), and long-list them (-ls)
find ~/ -mmin -60 -xdev -ls
# You can use this substitution to show all setuid bins in your PATH:
find ${PATH//:/ } -perm /u=s
# Find files on the local filesystem modified between 2018-03-20 and 2018-04-10 (not newer than). See newerXY in the man page. Thanks for the idea from your website @nixcraft. And yes, I found the file.
find / -xdev -newermt 2018-03-20 \! -newermt 2018-04-10 -type f -ls
# Instead of bash parameter expansion, a sed substitution like this should also work:
echo "$PATH" | sed 's/:/ /g'
# For the same example from above:
find $(echo $PATH | sed 's/:/ /g') -perm /u+s
# Interactively delete any file in the folder that is not an mp3
find ./ -type f ! -name "*.mp3" -exec rm -i {} \;
# Executing a command kept in a string variable in bash: the command is stored as a string, run with eval, and its output captured in a variable.
cmd="find . -name '*.zip'"
echo "Command to execute: $cmd"
res=$(eval "$cmd")
echo "RESULT = $res"
# find in different folders (-type f sits outside the parentheses; -a binds tighter than -o, so putting it inside would apply the file test to dir1 only)
find . -type f \( -path '*/dir1/*' -o -path '*/dir2/*' -o -path '*/dir3/*' -o -path '*/dir4/*' \)
# find in different directories and then exec ls on each match
find . \( -name "dir1" -o -name "dir2" \) -exec ls '{}' \;
# Remember, find has a case-insensitive way to search for filenames: -iname
find . -iname '*expenses*'
# Find non-standard files in the MySQL data directory - these files should be removed to keep the size of the data directory under control. If you exclude the known important file types like frm and MYD then whatever is left can be either moved or deleted.
find . -type f -not -name "*.frm" -not -name "*.MYI" -not -name "*.MYD" -not -name "*.TRG" -not -name "*.TRN" -not -name "db.opt"
# find and remove old compressed backup files - remove all compressed files in the /home/ folder not modified in the last 10 days (the age test sits outside the parentheses so it applies to both name patterns)
find /home -type f \( -name "*.sql.gz" -o -name "*.tar.gz" \) -mtime +10 -exec rm -f {} \;
# find and remove old backup files - remove all files in the /home/ folder that start with bk_all_dbProdSlave and were not modified in the last 2 days
find /home/ -name bk_all_dbProdSlave_\* -mtime +2 -exec rm -f {} \;
# Check if the same table name exists across different databases - a useful command for MySQL
find . -name "withdrownblocks.frm" | sort -u | awk -F'/' '{print $3}' | wc -l
# Sample output
# 3   (the count of databases containing the table)
# Find dupe files by checking md5sum
find /glftpd/site/archive -type f | grep '([0-9]\{1,9\})\.[^.]\+$' | parallel -n1 -j200% md5sum | awk 'x[$1]++ { print $2 " :::"}' | sed 's/^/Dupe: /g' | sed 's,Dupe,\x1B[31m&\x1B[0m,'
# Recursively remove all "node_modules" folders
find . -name "node_modules" -exec rm -rf '{}' +
# Find files/dirs modified within a given period
find . -type d -newermt "2019-01-01" \! -newermt "2019-02-01" -exec ls -ld {} \;
# Graphical tree of sub-directories with files - the command finds every item within the directory and edits the output so that subdirectories and files are displayed much like the tree command
find . -print | sed -e 's;[^/]*/;|-- ;g;s;-- |; |;g'
# Sample output
# .
# |-- vmware-tools-distrib
# | |-- installer
# | | |-- upstart-job.conf
# | |-- lib
# | | |-- modules
# | | | |-- source
# | | | | |-- legacy
# | | | | | |-- vsock.tar
# | | | | | |-- vmhgfs.tar
# | | | | | |-- vmmemctl.tar
# | | | | | |-- vmxnet3.tar
# | | | | | |-- vmci.tar
# | | | | | |-- vmxnet.tar
# | | | | | |-- vmblock.tar
# | | | | | |-- pvscsi.tar
# |-- _cafenv-appconfig_
# |-- [email protected]
# |-- vmware-root_2083-2126394392
# |-- listfiletemp
# |-- .ICE-unix
# |-- C
# List only empty directories and delete safely (= ask for each) - will delete empty directories and sub-directories (hidden ones too, i.e. names starting with a dot). The 'rm' command is used instead of 'rmdir' to give the possibility of asking for confirmation before deleting; it is not wise to delete all empty directories in the /etc folder, for example. Replace the dot in 'find .' with any other starting directory instead of the current one. In 'rm -i -R', 'i' stands for ask before delete and 'R' for delete the folder recursively (or the folder itself if it is empty)
find . -type d -empty -exec rm -i -R {} \;
# find all executable files across the entire tree - handy after compiling a downloaded source tree, as an easy way to find all executable products. (Issuing find without arguments lists the full paths of all directories, sub-directories and files in the entire current tree.)
find -executable -type f
# A similar command is
tree -aicfnF
# find and delete files older than x days (7 here); add -r to rm if directories should be removed too
find /tmp/* -mtime +7 -exec rm {} \;
find . -type d -newermt "2019-01-01" \! -newermt "2019-02-01" -exec ls -ld {} \;
# Find all log files modified 24 hours ago, and zip them
find . -type f -mtime +1 -name "*.log" -exec zip -m {}.zip {} \; >/dev/null
find /path/to/files* -mtime +5 -exec rm {} \;
# Bash: automatically delete numbered duplicate files (e.g. "file(1).txt") from a folder, recursively
find . -name "*\([0-9]\).*" -delete
#==============================##==============================#
# CMD FIND
#==============================##==============================#
Cheatsheets are an excellent complement to other information sources like Linux man-pages, Linux help, or How-To’s and tutorials, as they provide compact and easily accessible information. While man-pages and detailed tutorials often contain comprehensive explanations and extensive guides, cheatsheets summarize the most important options for the command find in a clear format. This allows users to quickly access the needed information for find without having to sift through lengthy texts. Especially in stressful situations or for recurring tasks, cheatsheets for find are a valuable resource to work efficiently and purposefully.
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░
█║▌│║█║▌★ KALI ★ PARROT ★ DEBIAN 🔴 PENTESTING ★ HACKING ★ █║▌│║█║▌
██╗ ██╗ ██████╗ ██████╗ ██╗ ██╗███████╗██████╗
████████╗██╔══██╗██╔═══██╗╚██╗██╔╝██╔════╝██╔══██╗
╚██╔═██╔╝██║ ██║██║ ██║ ╚███╔╝ █████╗ ██║ ██║
████████╗██║ ██║██║ ██║ ██╔██╗ ██╔══╝ ██║ ██║
╚██╔═██╔╝██████╔╝╚██████╔╝██╔╝ ██╗███████╗██████╔╝
╚═╝ ╚═╝ ╚═════╝ ╚═════╝ ╚═╝ ╚═╝╚══════╝╚═════╝
█║▌│║█║▌ WITH COMMANDLINE-KUNGFU POWER █║▌│║█║▌
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░