Simple archiving to S3 for log files

Richard Benson · 02 April 2013 · IT Pros, Administration

When operating a large number of cloud servers, many of which have only small amounts of local storage, growing log files can become a problem. Most countries also have laws requiring service providers to retain logs for specified periods of time. Manually fetching these logs from each server is time-consuming, and becomes even harder when you may not even know how many servers you have at any one time.

To solve this problem in our case, the obvious approach was to upload the logs to cloud-based storage and then delete them from the local machine once done. There didn't seem to be a simple existing tool to manage this, so we decided to create our own small application for the task.


Download the zip file and extract it to a folder on the server, then follow the usage guide below.


Run S3Archive.exe once; this will create a default config file and exit. Then edit the XML file, adding one folder element for each path you wish to send to S3.


  • includeOpen - Also upload files that are currently open. Defaults to false, as web servers usually hold an open file handle on the current log.
  • deleteOnUpload - Delete each file after it has been uploaded. There is currently no verification, so use with caution.
  • recursive - Process all subdirectories, or just the specified directory.
  • path - The local path to the directory to scan.
  • bucket - The S3 bucket to store files in. The bucket must already exist; it is not currently created for you.
  • basePath - A path prepended to each file's S3 key, allowing you to use a single bucket for multiple sources.
  • pattern - The file search pattern to apply across all included directories.
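
Putting these options together, a config might look something like the following. The root element name, the IIS log path, and the bucket name here are all illustrative assumptions; check the generated default config for the exact schema.

```xml
<!-- Illustrative example only; the generated default file shows the exact schema. -->
<s3archive>
  <folder path="C:\inetpub\logs\LogFiles\W3SVC1"
          bucket="my-log-archive"
          basePath="webserver1/iis"
          pattern="*.log"
          recursive="true"
          includeOpen="false"
          deleteOnUpload="true" />
</s3archive>
```

With a layout like this, adding a second folder element with a different basePath would let one bucket hold logs from several servers without key collisions.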

Usage in the cloud

For servers that run full-time, this can be set to run on a schedule based on the volume of traffic you receive and how frequently you want to access the collected logs.  If you are using auto-scaling environments, bake the application into your AMIs and schedule it to run on shutdown as well as on a timed basis.  That way all logs are captured even in rapidly changing environments.
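
As a sketch, the timed run can be set up on Windows with the built-in schtasks command. The install path and hourly interval below are assumptions; adjust them to wherever you extracted the zip and to your own traffic volume.

```shell
REM Example only: run the archiver hourly (path is an assumption)
schtasks /Create /TN "S3Archive" /TR "C:\Tools\S3Archive\S3Archive.exe" /SC HOURLY

REM For the shutdown run, add S3Archive.exe as a shutdown script via
REM gpedit.msc -> Computer Configuration -> Windows Settings -> Scripts (Startup/Shutdown)
```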

Open Source

As with many of our other projects, the code is open source and we welcome suggestions, additions and bug fixes.
