Linux Archive (http://www.linux-archive.org/)
-   Red Hat Linux (http://www.linux-archive.org/red-hat-linux/)
-   -   Cron to delete files older than X days (http://www.linux-archive.org/red-hat-linux/655188-cron-delete-files-older-than-x-days.html)

raj sourabh 04-12-2012 08:37 AM

Cron to delete files older than X days
 
Hi,

I am facing a problem while running a cron job to delete files older than X
days in a particular folder. Initially I had configured the cron as below:

[oracle@TESTDBS daily_backup]$ crontab -l
10 3 * * * /home/oracle/backup.sh
@daily find /home/oracle/daily_backup/* -mtime +2 -exec rm {} ;

But this deletes everything from the backup folder and I don't see any files
in it.
But when I run "find /home/oracle/daily_backup/* -mtime +2 -exec rm {} ;"
manually, nothing happens. I mean it doesn't delete anything.

What could be the issue here? I even tried the commands below, but with no
success:

[oracle@TESTDBS daily_backup]$ ls -altr
-rw-r----- 1 oracle dba 123801600 Apr 12 12:16 120412_full_dbexp.dmp
-rw-r--r-- 1 oracle dba 116275 Apr 12 12:16 120412_expdp_full.log
drwx------ 12 oracle dba 4096 Apr 12 12:21 ..
drwxrwxrwx 2 root root 4096 Apr 12 12:21 .
-rw-r--r-- 1 oracle dba 20 Oct 10 2012 test.txt
[oracle@TESTDBS daily_backup]$
[oracle@TESTDBS daily_backup]$
[oracle@TESTDBS daily_backup]$ find /home/oracle/daily_backup/ -mtime +30 -delete
[oracle@TESTDBS daily_backup]$ ls
120412_expdp_full.log 120412_full_dbexp.dmp test.txt
[oracle@TESTDBS daily_backup]$
[oracle@TESTDBS daily_backup]$ ls -altr
total 121156
-rw-r----- 1 oracle dba 123801600 Apr 12 12:16 120412_full_dbexp.dmp
-rw-r--r-- 1 oracle dba 116275 Apr 12 12:16 120412_expdp_full.log
drwx------ 12 oracle dba 4096 Apr 12 12:21 ..
drwxrwxrwx 2 root root 4096 Apr 12 12:21 .
-rw-r--r-- 1 oracle dba 20 Oct 10 2012 test.txt
[oracle@TESTDBS daily_backup]$
[oracle@TESTDBS daily_backup]$ find /home/oracle/daily_backup -type f -mtime +2 -exec rm {} ;
[oracle@TESTDBS daily_backup]$ ls -altr
total 121156
-rw-r----- 1 oracle dba 123801600 Apr 12 12:16 120412_full_dbexp.dmp
-rw-r--r-- 1 oracle dba 116275 Apr 12 12:16 120412_expdp_full.log
drwx------ 12 oracle dba 4096 Apr 12 12:21 ..
drwxrwxrwx 2 root root 4096 Apr 12 12:21 .
-rw-r--r-- 1 oracle dba 20 Oct 10 2012 test.txt
[oracle@TESTDBS daily_backup]$

Regards,
Raj
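
For reference, the whole thing could also be written as two ordinary crontab
lines, so the cleanup runs shortly after the backup instead of at whatever time
@daily maps to. This is only a sketch, assuming GNU find; the 03:30 slot is an
arbitrary choice, and -type f restricts the match to regular files so the
directory itself is never touched:

10 3 * * * /home/oracle/backup.sh
30 3 * * * find /home/oracle/daily_backup/ -type f -mtime +2 -exec rm {} \;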

"Facundo M. de la Cruz" 04-12-2012 08:59 PM

Cron to delete files older than X days
 

On 04/12/2012 05:37 AM, raj sourabh wrote:
> Hi,
>
> I am facing a problem while running a cron job to delete files older than X
> days in a particular folder. Initially I had configured the cron as below:
>
> [oracle@TESTDBS daily_backup]$ crontab -l
> 10 3 * * * /home/oracle/backup.sh
> @daily find /home/oracle/daily_backup/* -mtime +2 -exec rm {} ;
>
> But this deletes everything from the backup folder and I don't see any files
> in it.
> But when I run "find /home/oracle/daily_backup/* -mtime +2 -exec rm {} ;"
> manually, nothing happens. I mean it doesn't delete anything.
>
> What could be the issue here? I even tried the commands below, but with no
> success:
>
> [...]


You can try with the -ctime option.

The syntax is OK, but if any process updates the file content, the mtime
is updated too. The -ctime option looks for files whose metadata has not
been updated for X days.
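
A minimal sketch of that variant, assuming GNU find and keyed on the inode
change time instead of the modification time:

find /home/oracle/daily_backup -type f -ctime +2 -exec rm {} \;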

Regards.

--
Facundo M. de la Cruz (tty0)
IT Consultant
RHCSA/RHCE

GPG fingerprint: DF2F 514A 5167 00F5 C753 BF3B D797 C8E1 5726 0789

"Programming today is a race between software engineers striving to
build bigger and better idiot-proof programs, and the Universe trying to
produce bigger and better idiots. So far, the Universe is winning." -
Rich Cook


Cameron Simpson 04-15-2012 03:06 AM

Cron to delete files older than X days
 
On 12Apr2012 17:59, Facundo M. de la Cruz <fmdlc@code4life.com.ar> wrote:
| You can try with the -ctime option.
|
| The syntax is OK, but if any process updates the file content, the mtime
| is updated too. The -ctime option looks for files whose metadata has not
| been updated for X days.

But remember that ctime (last change to the inode) is updated on chmod,
chown, file rename, etc.
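
A quick way to see the difference on one of the files from the listing above,
assuming GNU coreutils stat:

$ stat -c 'mtime: %y   ctime: %z' test.txt
$ chmod g+w test.txt                          # metadata-only change, no data written
$ stat -c 'mtime: %y   ctime: %z' test.txt    # ctime has moved forward, mtime has not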

For debug purposes, put a -ls in the find ahead of the "-exec rm". That
way it will recite the file (and mtime) before removal.

And of course, if your cron job removed stuff, it will already be gone
when you come in to look later with your manual find.
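
Putting those two points together, the cron line itself can record what it was
about to delete, so there is something to inspect afterwards. A sketch only;
the log path is invented here, and the redirection works because cron passes
the line to /bin/sh:

30 3 * * * find /home/oracle/daily_backup -type f -mtime +2 -ls -exec rm {} \; >> /home/oracle/prune_backup.log 2>&1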

Cheers,
--
Cameron Simpson <cs@zip.com.au> DoD#743
http://www.cskk.ezoshosting.com/cs/

I took that Reading Dynamics course, and it really works. I read _War and
Peace_ in an hour last night. It's about Russia. - W. Allen, ca. 1962


