I have checked for a solution here but cannot seem to find one. I am dealing with a very slow WAN connection, about 300 kB/sec. For my downloads I use a remote box and then download the files to my house. I am trying to run a cron job that rsyncs two directories between my remote and local servers every hour. I got everything working, but if there is a lot of data to transfer the rsync runs overlap, ending up with two instances copying the same file and thus sending duplicate data.
How can I make the cron job skip a run when the previous rsync is still going?
Recommended answer

Via the script you can create a "lock" file. If the file exists, the cron job should skip the run; otherwise it should proceed. Once the script completes, it should delete the lock file.
if [ -e /home/myhomedir/rsyncjob.lock ]; then
  echo "Rsync job already running...exiting"
  exit
fi

touch /home/myhomedir/rsyncjob.lock

# your code in here

# delete lock file at end of your job
rm /home/myhomedir/rsyncjob.lock
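Note that the test-then-touch approach above has two weak spots: a small race window between the `-e` check and the `touch`, and a stale lock file left behind if the script is killed before the `rm`. A common alternative on Linux is `flock(1)`, which takes the lock atomically on a file descriptor and releases it automatically when the process exits. A minimal sketch (the lock path and rsync command are placeholders for your own):

```shell
#!/bin/sh
# Hypothetical lock path; pick any writable location.
LOCKFILE=/home/myhomedir/rsyncjob.lock

# flock -n fails immediately (instead of waiting) if another instance
# already holds the lock, so an overlapping cron run simply exits.
(
  flock -n 9 || { echo "Rsync job already running...exiting"; exit 1; }

  # your rsync command in here, e.g.:
  # rsync -az remote:/downloads/ /home/myhomedir/downloads/
) 9>"$LOCKFILE"
```

The lock is tied to file descriptor 9 for the lifetime of the subshell, so it is released even if the job crashes mid-transfer; no cleanup `rm` is needed.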