Ohhh, that sounds much better.
On WF I had put together a few shell scripts, scheduled via cron, to concatenate and archive these so I didn't lose logs older than 7 days. It was a whole thing to set up (largely because of the file naming), but it seems to have been working well ever since. You (@sean) even helped create one of the scripts! (Thanks again!)
A few questions (just to make sure):
So these nginx_access.txt-20201027.gz files still get deleted after 7 days?
If I wanted to retain ALL log files (long-term archiving), I guess it would be pretty simple to set up a cron job to copy each one to another directory?
- ex: Run once per day:
cp *_access.txt-*.gz /archive
and cp *_error.txt-*.gz /archive
Hmm... I might need to get a little cleverer to account for files that have already been copied. Or I guess just letting cp overwrite into /archive would be OK?
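For what it's worth, here's a minimal sketch of what such a daily job might look like. The paths (LOG_DIR, ARCHIVE_DIR) are assumptions to adjust for your own setup; the key idea is cp's -n ("no clobber") flag, which skips files that already exist in the destination, so re-runs never re-copy or overwrite an archived file:

```shell
#!/bin/sh
# Sketch of a daily log-archive job.
# LOG_DIR and ARCHIVE_DIR are placeholder paths -- point them at wherever
# your rotated logs actually live.
LOG_DIR="${LOG_DIR:-$HOME/logs}"
ARCHIVE_DIR="${ARCHIVE_DIR:-$HOME/archive}"

mkdir -p "$ARCHIVE_DIR"

# cp -n skips any file that already exists in the destination, so files
# copied on a previous run are left untouched. 2>/dev/null quiets the
# error cp emits when a glob matches nothing yet; || true keeps cron from
# mailing a failure in that case.
cp -n "$LOG_DIR"/*_access.txt-*.gz "$LOG_DIR"/*_error.txt-*.gz \
    "$ARCHIVE_DIR"/ 2>/dev/null || true
```

If -n isn't available on your system, rsync --ignore-existing does the same job. You'd schedule it with a crontab line along the lines of `0 3 * * * /path/to/archive-logs.sh` (hypothetical path).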
I know I can (and will) try some of this myself. But I don't have enough logs yet, and it'll take at least 7 days to see a complete cycle. Which is why I'm asking now instead of just doing it :-) Also, it might help others. Speaking of:
+1 Request for all of these details to be added to: http://help.opalstack.com (No rush :-)
Thank you!