Are you running out of storage space on your Linux system? One way to free up space is by compressing large files. Manually identifying and compressing big files can be tedious, but a simple Bash script can automate the process. In this blog, we’ll create a script that automatically compresses files larger than 10MB and moves them to an archive folder.
Why Automate File Compression?
If you have a system generating log files, backups, or other large files, automating compression can:
Save disk space by reducing file sizes.
Improve system performance.
Reduce manual effort in managing storage.
Bash Script to Compress Large Files
The Bash script below finds files larger than 10MB in a specified directory, compresses them, and moves them into an archive folder.
The Script:
#!/bin/bash
#$Revision:001$
#$Thu Feb 06

# Variables
BASE=/home/omkar
DEPTH=1

# Check if the directory is present or not
if [ ! -d "$BASE" ]; then
    echo "$BASE does not exist"
    exit 1
fi

# Create 'archive' folder if not present
if [ ! -d "$BASE/archive" ]; then
    mkdir "$BASE/archive"
fi

# Find and archive files larger than 10MB
# (-print0 / read -d '' keeps filenames containing spaces intact)
while IFS= read -r -d '' i; do
    echo "[$(date "+%Y-%m-%d %H:%M:%S")] Archiving $i ==> $BASE/archive"
    gzip "$i" || exit 1
    mv "$i.gz" "$BASE/archive" || exit 1
done < <(find "$BASE" -maxdepth "$DEPTH" -type f -size +10M -print0)
How It Works:
find "$BASE" -maxdepth "$DEPTH" -type f -size +10M locates files larger than 10MB.
The script checks whether the archive directory exists; if not, it creates it.
Each matching file is compressed using gzip and moved to the archive folder.
A timestamped message is printed for every file being archived.
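You can preview exactly which files the find command would match before running the full script. A minimal sketch against a throwaway directory (truncate creates sparse files of the stated size, so this costs almost no disk space):

```shell
# Create a temporary directory with one large and one small file
BASE=$(mktemp -d)
truncate -s 15M "$BASE/big.log"    # 15MB: matches -size +10M
truncate -s 1M  "$BASE/small.log"  # 1MB: does not match

# Only big.log should be listed
find "$BASE" -maxdepth 1 -type f -size +10M

rm -rf "$BASE"
```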
How to Use the Script
1. Save the Script
Copy and paste the script into a file, e.g., compress_large_files.sh:
nano compress_large_files.sh
Paste the script inside the editor and save it.
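Before moving on, you can ask Bash to parse the file without executing it, which catches typos early (this assumes the script was saved as compress_large_files.sh in the current directory):

```shell
# -n parses the script and reports syntax errors without running it
bash -n compress_large_files.sh && echo "syntax OK"
```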
2. Make the Script Executable
Run the following command to grant execute permissions:
chmod +x compress_large_files.sh
3. Run the Script
Execute the script manually:
./compress_large_files.sh
Alternatively, you can automate it using a cron job.
4. Automate with Cron (Optional)
To run the script daily at midnight, open your crontab for editing:
crontab -e
Then add:
0 0 * * * /path/to/compress_large_files.sh
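Cron jobs run silently in the background, so it can help to redirect the script's output to a log file. A sketch of such an entry (the log path here is just an example; pick one your user can write to):

```shell
# Run daily at midnight, appending stdout and stderr to a log file
0 0 * * * /path/to/compress_large_files.sh >> /home/omkar/compress.log 2>&1
```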
Expected Output
If your /home/omkar directory contains files larger than 10MB, the output would look like this:
[2025-02-12 14:30:45] Archiving /home/omkar/large_file1.log ==> /home/omkar/archive
[2025-02-12 14:30:45] Archiving /home/omkar/big_data.csv ==> /home/omkar/archive
[2025-02-12 14:30:45] Archiving /home/omkar/video.mp4 ==> /home/omkar/archive
After running the script:
The original files (large_file1.log, big_data.csv, video.mp4) will be compressed with a .gz extension.
The compressed files will be moved to /home/omkar/archive/.
If there are no files larger than 10MB, the script will complete without printing anything.
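If you ever need one of the originals back, gunzip reverses the compression. A minimal round trip in a throwaway directory, illustrating the gzip behavior the script relies on:

```shell
dir=$(mktemp -d)
echo "hello" > "$dir/file.log"

gzip "$dir/file.log"       # replaces file.log with file.log.gz
gunzip "$dir/file.log.gz"  # restores the original file.log

cat "$dir/file.log"        # -> hello
rm -rf "$dir"
```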
Possible Error Cases & Fixes
/home/omkar does not exist - Ensure the directory exists, or change BASE=/home/omkar to your actual directory.
Files not being found or compressed - Run find /home/omkar -maxdepth 1 -type f -size +10M separately to verify that large files exist.
Permissions issue - Run the script with sudo ./compress_large_files.sh if required.
Conclusion
With this simple Bash script, you can automatically compress large files, ensuring your system stays optimized and storage-efficient. Whether you manage servers or just want to keep your local machine tidy, automating file compression can save time and resources. Try it out and let me know how it works for you!
🚀 Happy scripting!