You can automatically archive log data using our built-in S3 archive feature; see this page for more information.
Logs can also be archived on demand for a specific date range via the User menu -> "Batch Export to S3" option.
- With both options, log data is exported in JSON format.
- You don't need to set an FQDN for the bucket name or specify a region: create a normal bucket and specify the plain bucket name in the /scalyr/logs config file.
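As an illustration, the bucket reference in the /scalyr/logs config file might look like the sketch below. The field name `s3_bucket` is illustrative, not confirmed; check the S3 archive documentation for the exact key.

```json
{
  // Plain bucket name only — not an FQDN such as
  // "my-dataset-archive.s3.amazonaws.com", and no region field.
  // "s3_bucket" is a hypothetical key name for this example.
  "s3_bucket": "my-dataset-archive"
}
```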
The DataSet S3 archive includes a retry mechanism, so it's normal to see errors (e.g. NotFoundException) persist for a while after the bucket name is fixed. Those errors come from retries against the earlier, incorrect S3 bucket configuration. As long as logs from your DataSet account are being archived to the configured S3 bucket, the errors will eventually disappear.