I work on a server system that does not allow me to store more than 50 gigabytes of files. My application takes 20 minutes to generate a file. Is there any way I can move all files that are more than 30 minutes old from the source to the destination? I tried rsync:

rsync -avP source/folder/ user@destinationIp:dest/folder
but this does not remove the files from my server, so I still run into the storage limit.
Secondly, if I use the mv command, files that are still being generated also get moved to the destination folder, and the program fails.
You can use find along with -exec for this:
Replace /sourcedirectory and /destination/directory/ with the source and target paths you need.
find /sourcedirectory -maxdepth 1 -mmin +30 -type f -exec mv "{}" /destination/directory/ \;
What the command basically does: it looks for regular files (-type f) directly inside /sourcedirectory (-maxdepth 1, so it does not descend into subdirectories) that were last modified more than 30 minutes ago (-mmin +30), and moves them to the target directory. If you want to match on the time the file was last accessed instead, use -amin +30.
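As a quick sanity check, here is a minimal sketch you can run before pointing the command at real data. The paths /tmp/src and /tmp/dst are throwaway names used only for illustration, and it assumes GNU touch, which accepts relative date strings with -d:

# Create a test source and destination directory.
mkdir -p /tmp/src /tmp/dst
# One file back-dated 45 minutes, one freshly created.
touch -d '45 minutes ago' /tmp/src/old.log
touch /tmp/src/fresh.log
# Move only regular files modified more than 30 minutes ago.
find /tmp/src -maxdepth 1 -mmin +30 -type f -exec mv "{}" /tmp/dst/ \;
ls /tmp/src   # should list only fresh.log
ls /tmp/dst   # should list only old.log

Because a file that is still being written gets its modification time updated, -mmin +30 also sidesteps the problem of mv grabbing files your application is still generating.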
Or, if you want to find files modified within a range, you can use something like -mmin +30 -mmin -35, which matches files modified more than 30 but less than 35 minutes ago, as shown in the full command below.
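For example, keeping the same placeholder paths and filters as above (an assumption; adjust to your setup), the full command for that range would be:

find /sourcedirectory -maxdepth 1 -type f -mmin +30 -mmin -35 -exec mv "{}" /destination/directory/ \;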
References from the man page:
-amin n
File was last accessed n minutes ago.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
-mmin n
File's data was last modified n minutes ago.
-mtime n
File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
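To see that rounding in action, here is a small sketch with throwaway files under /tmp (illustrative names, GNU touch assumed):

# 36 hours truncates to 1 full 24-hour period, so -mtime +1
# (strictly more than 1 period) does NOT match this file.
touch -d '36 hours ago' /tmp/fileA
# 49 hours truncates to 2 full periods, so -mtime +1 matches.
touch -d '49 hours ago' /tmp/fileB
find /tmp/fileA /tmp/fileB -mtime +1   # prints only /tmp/fileB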