Table Of Contents
- Linux File Manipulation
  - File Manipulation
  - File Compression & Chunking
  - File Hashing
LINUX FILE MANIPULATION
FILE MANIPULATION
Compare files
diff <FILE_PATH_A> <FILE_PATH_B>
Force recursive deletion of directory
rm -rf <DIRECTORY_PATH>
Secure file deletion
shred -f -u <FILE_PATH>
Modify timestamp to match another file
touch -r <ORIGINAL_FILE_PATH> <MOD_FILE_PATH>
Modify file timestamp
touch -t <YYYYMMDDHHMM> <FILE_PATH>
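The timestamp format is [[CC]YY]MMDDhhmm[.ss]. A quick sketch (filename and date are hypothetical):
# Set the file's timestamp to 31 Jan 2024, 14:30:00
touch -t 202401311430.00 report.pdf
# Verify the new timestamp
ls -l report.pdf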
Count lines containing specific string
grep -c "<STRING>" <FILE_PATH>
Convert Linux-formatted file to Windows-compatible text file
awk 'sub("$", "\r")' <SOURCE_FILE_PATH> > <OUTPUT_FILE_PATH>
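If the dos2unix package is installed, its unix2dos companion tool performs the same conversion in place, without the awk pipeline:
unix2dos <FILE_PATH>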
Convert Windows-formatted file to Linux-compatible text file
dos2unix <FILE_PATH>
Search current and all subdirectories for all files that end with a specific extension
find . -type f -name "*.<FILE_EXTENSION>"
Search all files (including binary files) in the current directory and all subdirectories for a case-insensitive phrase
grep -Ria "<SEARCH_PHRASE>"
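Recent GNU grep versions default to the current directory when no path is given; to start the search elsewhere, pass the path explicitly (the path below is hypothetical):
grep -Ria "<SEARCH_PHRASE>" /var/log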
Return the line count of a target file
wc -l <FILE_PATH>
Search for setuid files
find / -perm -4000 -exec ls -ld {} \;
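Run as a non-root user this prints many "Permission denied" errors; a common variant (an assumption, not part of the original one-liner) limits results to regular files and discards the errors:
find / -type f -perm -4000 -exec ls -ld {} \; 2>/dev/null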
Determine file type
file <FILE_PATH>
Set/Unset immutable file
chattr +i <FILE_PATH>
chattr -i <FILE_PATH>
Generate random file (example: 3 MB file)
dd if=/dev/urandom of=<OUTPUT_FILE_PATH> bs=3145728 count=1
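The output size is bs multiplied by count, so bs=3145728 (3 MiB) with count=1 produces the 3 MB file. An equivalent sketch with a hypothetical output path:
# 1M block size x 3 blocks = 3 MiB of random data
dd if=/dev/urandom of=/tmp/random.bin bs=1M count=3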
FILE COMPRESSION & CHUNKING
Pack/unpack (archive) files using tar
# Compress:
tar -cf <OUTPUT_FILE>.tar <INPUT_PATH>
# Extract:
tar -xf <FILE_PATH>.tar
Compress and extract a .gz file using tar
# Compress:
tar -czf <OUTPUT_FILE>.tar.gz <INPUT_PATH>
# Extract:
tar -xzf <FILE_PATH>.tar.gz
Compress and extract a .bz2 file using tar
# Compress:
tar -cjf <OUTPUT_FILE>.tar.bz2 <INPUT_PATH>
# Extract:
tar -xjf <FILE_PATH>.tar.bz2
Compress and extract using gzip
# Compress:
gzip <INPUT_PATH>
# Extract:
gzip -d <FILE_PATH>.gz
Compress and extract using zip
# Compress:
zip -r <OUTPUT_FILE>.zip <INPUT_PATH>
# Extract:
unzip <FILE_PATH>.zip
Pack an executable using UPX
upx -9 -o <OUTPUT_FILE> <INPUT_PATH>
Compress and split a file into 3K chunks
dd if=<INPUT_PATH> bs=4M | gzip -c | split -b 3K - "<OUTPUT_FILE>.chunk"
Reassemble and restore a chunked file
cat <FILE_PATH>.chunk* | gzip -dc | dd of=<OUTPUT_PATH> bs=4M
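A round-trip sketch with hypothetical file names: chunk a file, reassemble it, and confirm the restored copy matches the original.
# Compress and split into 3K chunks named backup.chunkaa, backup.chunkab, ...
dd if=data.bin bs=4M | gzip -c | split -b 3K - "backup.chunk"
# Reassemble, decompress, and write the restored copy
cat backup.chunk* | gzip -dc | dd of=data_restored.bin bs=4M
# The two hashes should match
md5sum data.bin data_restored.bin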
FILE HASHING
Generate MD5 hash of a file
md5sum <FILE_PATH>
Generate MD5 hash of a string
echo "<STRING>" | md5sum
Generate SHA1 hash of a file
sha1sum <FILE_PATH>