Introduction
Logs imported into DataSet from different data sources share a similar structure. The example searches below may need additional refinement to return the specific events you are interested in.
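Any of the filter searches below can also be submitted programmatically. A minimal sketch using the DataSet query API (POST to /api/query); the payload field names follow the public REST API, and API_TOKEN is a placeholder you would replace with a read-log token:

```python
import json
import urllib.request

API_URL = "https://app.scalyr.com/api/query"

def build_query(filter_expr, max_count=100, start_time="1h"):
    """Build the JSON payload for a log-event query."""
    return {
        "queryType": "log",
        "filter": filter_expr,
        "maxCount": max_count,
        "startTime": start_time,
    }

def run_query(api_token, filter_expr):
    """Send the query to DataSet and return the decoded JSON response."""
    payload = build_query(filter_expr)
    payload["token"] = api_token  # placeholder: supply a real read-log token
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example payload for the CloudTrail search from this article:
payload = build_query("source='cloudtrail'")
```

The same pattern works for every `source='...'` filter in the sections that follow.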
Example Searches
AWS CloudTrail
source='cloudtrail'
PowerQuery: Display errors. Truncate lengthy error messages (if applicable)
source='cloudtrail' errorCode=*
| let errMsg = len(errorMessage) >= 80 ? substr(errorMessage, 0, 72) + " ..." : errorMessage
| group count=count() by awsRegion,errMsg,eventName
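PowerQueries are submitted to a separate endpoint as a single multi-line string. A sketch of assembling the query above for the /api/powerQuery endpoint (the payload shape here is an assumption based on the documented PowerQuery API; check the API reference before relying on these field names):

```python
# Assemble the multi-line PowerQuery text exactly as shown in the article.
def build_powerquery():
    lines = [
        "source='cloudtrail' errorCode=*",
        '| let errMsg = len(errorMessage) >= 80 ? substr(errorMessage, 0, 72) + " ..." : errorMessage',
        "| group count=count() by awsRegion, errMsg, eventName",
    ]
    return "\n".join(lines)

# Assumed request shape for POST /api/powerQuery; "token" (a read-log
# API token) would be added before sending.
payload = {
    "query": build_powerquery(),
    "startTime": "24h",
}
```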
AWS RDS
source='rds' dbInstance='<dbinstance>' logfile='<logfile>'
Configure a parser to extract fields from the log.
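DataSet parsers are JSON-style configurations in which `$name$` tokens capture fields and surrounding text is matched literally. A minimal sketch for a Postgres-style RDS log line; the format string and field names are illustrative assumptions, not a tested parser, and must be adjusted to your RDS engine's actual log layout:

```
// Illustrative sketch only - adapt the format string to your log lines.
{
  formats: [
    // e.g. "2024-01-01 12:00:00 UTC [1234] LOG: connection received"
    { format: "$timestamp$ $tz$ [$pid$] $severity$: $message$" }
  ]
}
```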
Azure EventHub
source='eventhub' eventHubName='<name>' namespaceHostname='<hostname>'
Logs are in JSON format; configure a parser to extract the fields.
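For JSON events, a DataSet parser can hand the whole message body to the built-in JSON parser with the `${parse=json}$` directive. A minimal sketch, assuming each event body is a single JSON object:

```
{
  formats: [
    { format: "${parse=json}$" }  // parse the entire message as JSON
  ]
}
```

Each top-level JSON key then becomes a searchable field, which is what the Pub/Sub and Okta examples below (e.g. `resource.labels.region`, `outcome.result`) rely on.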
GitHub
source='github'
PowerQuery: Review GitHub activity by 5 minute intervals
source='github' !(user="") // only return logs with populated user field
| group count=count() by timestamp=timebucket('5m'), user, repo, action, pull_request_title
Google Pub/Sub
source='pubsub' logfile='<logfile>'
PowerQuery: Analyze log frequency by resource type and user
source='pubsub' resource.labels.region=* user=*
| group count=count() by resource.type,resource.labels.region,user
| sort -count
Okta
source='okta'
PowerQuery: Evaluate logs by selected fields
source='okta' tag='oktaimport'
| parse "\\[\\{\"alternateId\":\"$alternateId$\",\"displayName\":\"$displayName$\"" from target
| group count=count() by outcome.result,actor.type,alternateId,displayName,client.geographicalContext.country
S3 Import
source='s3Bucket' region='<awsregion>' logfile='<logfile>'