Copy the two newest files to another directory using a bash script

I'm trying to create a bash script that makes daily backups of a MySQL db and a web directory. It then needs to copy the two newest .tar.gz files to a weekly directory on day 0 of each week, to a monthly directory on day 1 of each month, and to a yearly directory on day 1 of each year.

I'm having trouble getting the "copy the two newest files" part to work.

What I have so far (I used the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base):

#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share

# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1

# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'

# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')

#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)

# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1

# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
        mkdir "$DIR/tmp"
        echo 'Created tmp directory...'
fi

# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
        mkdir "$DIR/tmp/weekly"
        echo 'Created daily directory...'
fi

# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
        mkdir "$DIR/tmp/weekly"
        echo 'Created weekly directory...'
fi

# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
        mkdir "$DIR/tmp/${YEAR}"
        echo 'Directory for current year created...'
fi

# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
        mkdir "$DIR/tmp/${YEAR}/$MONTH"
        echo 'Directory for current month created...'
fi

# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf  $DIR/tmp/database.sql
echo 'Made daily backup...'

# Check whether it's Sunday (0), if so, then copy most recent daily backup to weekly dir.
        if [ $DOW -eq 2 ] ; then
               cp $DAILY $DIR/tmp/weekly/
        fi
                echo 'Made weekly backup...'

# Check whether it's the first day of the year then copy two most recent daily backups to $YEAR folder
        if [ $DAY_OF_YEAR -eq 146 ] ; then
                cp $DAILY $DIR/tmp/${YEAR}/
        fi
                echo 'Made annual backup...'

# Check if it's the first day of the month, if so, copy the latest daily backups to the monthly folder
        if [ $DAY_OF_MONTH -eq 26 ] ; then
                cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
        fi
                echo 'Made monthly backup...'

# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo "Backup complete. Log can be found under $DIR/logs/."

For now I've commented out some parts while I try to get this working, and I've set the day/month/year checks to today's values so I can see the files being copied. I've also left in my commented-out earlier attempts at the $DAILY variable.

The problem is that when I run the script it returns the following:

./backup-rotation-script.sh                            
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...                                                       
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory   
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...                                                      
Backup complete. Log can be found under /path/to/backups/logs/.  

But when I check /path/to/backups/tmp/daily/ the files are there, and it clearly sees them, because it returns the file names in the error.

From what I can tell, this is because $DAILY (find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2) returns two results on one line? I'm guessing the easiest way to get this working is probably to create a for loop that copies the two results to the weekly/monthly/yearly directories?

I tried adding variants of:

for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
   cp $file /path/to/backups/tmp/weekly/
done

But that didn't go so well. :C
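I suspect the issue is that ls -1t prints bare file names without the directory, so cp then looks for them in the current working directory. If that's right, prefixing the path (untested, same directories as above) might be closer to what I need:

for file in $(ls -1t /path/to/backups/tmp/daily/ | head -n2)
do
   cp "/path/to/backups/tmp/daily/$file" /path/to/backups/tmp/weekly/
done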

Ideally I'd also like it to report when it fails, but I'm not that far yet. :)

Any help would be much appreciated!


asked by Khaito, 26.05.2015


Answers (1)


Never mind! I figured it out.

I removed the "daily" variable entirely and instead used the following for the copy:

find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
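Note that sort -rn here sorts on the path strings, which only picks the newest files because the file names begin with a sortable timestamp. A variant that sorts by actual modification time instead (untested sketch, relying on GNU find's %T@ format and on file names without spaces) would be:

find $DIR/tmp/daily/ -type f -name '*.tar.gz' -printf '%T@ %p\n' | sort -rn | head -n 2 | cut -d' ' -f2- | xargs -I{} cp {} $DIR/tmp/weekly/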

So the script now looks like this:

#!/bin/sh
# Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Edited/hacked/chopped/stuff by Khaito

# Redirect all script output to log file located in log directory with date in name.
exec 3>&1 4>&2
trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1

# Local Source
SOURCE=/path/to/source
# Create directories etc here
LOCAL=/path/to/backups
DIR=/path/to/backups/intranet
DIRD=/path/to/backups/intranet/daily
DIRW=/path/to/backups/intranet/weekly
DIRM=/path/to/backups/intranet/monthly

# Local Destination
DESTINATION=/path/to/network/share

# Database Backup User
DATABASE='dbname'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'

# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
DOW=$(date '+%u')
YEARMONTH=$(date +"%Y-%m-%B")

# Make Intranet Folder
if [ ! -d "$LOCAL/intranet" ]; then
        mkdir "$LOCAL/intranet"
        echo 'Intranet directory created...'
fi

# Make Daily Folder
if [ ! -d "$DIR/daily" ]; then
        mkdir "$DIR/daily"
        echo 'Daily directory created...'
fi

# Make Weekly Folder
if [ ! -d "$DIR/weekly" ]; then
        mkdir "$DIR/weekly"
        echo 'Weekly directory created...'
fi

# Make Folder For Current Month
if [ ! -d "$DIR/monthly" ]; then
        mkdir "$DIR/monthly"
        echo 'Monthly directory created...'
fi

# Make Folder For Current Year
if [ ! -d "$DIR/${YEAR}" ]; then
        mkdir "$DIR/${YEAR}"
        echo 'Directory for current year created...'
fi

# Tar the intranet files then dump the db, tar it then remove the original dump file.
tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
rm -rf  $DIR/database.sql
echo 'Made daily backup...'

# Check if it's Sunday (%u gives 7 for Sunday), if so, copy the two most recent daily files to the weekly folder.
        if [ $DOW -eq 7 ] ; then
                find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
        fi
                echo 'Made weekly backup...'

# Check if it's the first day of the month, if so, copy the two most recent daily files to the monthly folder
        if [ $DAY_OF_MONTH -eq 1 ] ; then
                find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
        fi
                echo 'Made monthly backup...'

# Check if it's the first day of the year, if so, copy the two most recent daily files to the current year folder
        if [ $DAY_OF_YEAR -eq 1 ] ; then
                find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
        fi
                echo 'Made annual backup...'

# Rsync the new files to the network share for backup to tape
rsync -hvrPt $DIR/* $DESTINATION

# Delete local backups
# find $DIRD -mtime +8 -exec rm {} \;
# find $DIRW -mtime +15 -exec rm {} \;
# find $DIRM -mtime +2 -exec rm {} \;
# find $DIR/${YEAR} -mtime +2 -exec rm {} \;

# Delete daily backups older than 7 days on network share
# find $INTRANETDESTINATION/daily -mtime +8 -exec rm {} \;
# Delete weekly backups older than 31 days on network share
# find $INTRANETDESTINATION/weekly -mtime +32 -exec rm {} \;
# Delete monthly backups older than 365 days on network share
# find $INTRANETDESTINATION/monthly -mtime +366 -exec rm {} \;

echo 'Backup complete. Log can be found under /path/to/logs/.'
answered by Khaito, 26.05.2015