Introduction
Logs can be uploaded to DataSet by installing our Logstash output plugin and modifying the Logstash configuration file (i.e., logstash.conf). Please see our official documentation for more information.
In this article, we will set up a Docker container to demonstrate the configuration.
Configuration
Install the Logstash Plugin into a Docker Container
The easiest way to configure the Logstash plugin is to build a custom Logstash image:
Modify the Dockerfile
# https://github.com/elastic/logstash-docker
FROM docker.elastic.co/logstash/logstash:6.8.10
# Add your logstash plugins setup here
RUN bin/logstash-plugin install logstash-output-scalyr
Build the custom Logstash image with DataSet output plugin
docker build -t scalyr-logstash .
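To verify that the plugin made it into the image, you can list the installed plugins (the grep filter is just one convenient way to narrow the output):
# List installed plugins inside the new image and look for the Scalyr output plugin
docker run --rm scalyr-logstash bin/logstash-plugin list | grep scalyr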
Modify Logstash Configuration
You should now have a Logstash Docker image with the tag scalyr-logstash. You can add your DataSet API key and other attributes to the configuration file.
Here is an example config that reads a file input (i.e., scalyr-plugin.log) through Logstash and imports the log events into DataSet:
Important: The reserved message field is the only field that can be parsed by a DataSet parser. Searches are also applied to the message field by default (when no attribute is explicitly specified).
input {
  file {
    path => "/usr/share/logstash/scalyr-genlog/scalyr-plugin.log"
    start_position => "beginning"
    # Do not persist the read position; re-read the file on restart (handy for testing)
    sincedb_path => "/dev/null"
    type => "plugin"
  }
  stdin { }
}

filter {
  mutate {
    # Attributes sent along with each event to DataSet
    add_field => { "parser" => "logstash_parser" }
    add_field => { "serverHost" => "my hostname" }
    # The file input records the source path in "path"; expose it as "logfile"
    rename => { "path" => "logfile" }
    # Ensure the event text lands in the reserved "message" field
    rename => { "data" => "message" }
  }
}

output {
  if [type] == "plugin" {
    scalyr {
      api_write_token => "<LOG_WRITE_API_TOKEN>"
    }
  }
  # Echo each event to the container's stdout for debugging
  stdout { codec => rubydebug }
}
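Tip: rather than hard-coding the token in the config file, you can use Logstash's environment-variable substitution. This is a sketch; SCALYR_API_WRITE_TOKEN is simply an environment variable name we chose for this example:
output {
  scalyr {
    # Logstash resolves ${VAR} references from the environment at startup
    api_write_token => "${SCALYR_API_WRITE_TOKEN}"
  }
}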
Run the Logstash Container and Send Messages to DataSet
We use docker-compose to manage the Logstash container. The relevant service definition looks like this:
logstash:
  image: scalyr-logstash:latest
  volumes:
    - ./logstash/pipeline:/usr/share/logstash/pipeline
    - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml
    - ./logstash/scalyr-genlog:/usr/share/logstash/scalyr-genlog
  command: sh -c "logstash -f /usr/share/logstash/pipeline/logstash.conf"
  ports:
    - "5000:5000"
    - "9600:9600"
  networks:
    - front
  logging:
    driver: "json-file"
    options:
      max-size: "2g"
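With the compose file in place, start the container:
docker-compose up -d logstash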
After the Logstash DataSet container starts successfully, we append a few lines to the input file and confirm that the messages were ingested into DataSet.
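For example (the file path follows the volume mapping above; the log line content is arbitrary):
# Append a test line to the watched file on the host
echo "test message $(date)" >> ./logstash/scalyr-genlog/scalyr-plugin.log
# Watch the rubydebug output from the stdout plugin as events are sent
docker-compose logs -f logstash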
Output
[Screenshot: the ingested log events displayed in the DataSet UI]