Nginx/Filebeat Installation
Method 1: Docker Compose
services:
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.6.1
    container_name: filebeat
    user: root
    environment:
      # Skip the ownership check on the bind-mounted filebeat.yml.
      - strict.perms=false
    volumes:
      - './filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro'
      - './filebeat/data:/usr/share/filebeat/data:rw'
      # Share the Nginx log directory so Filebeat can read the access logs.
      - './nginx/log:/var/log/nginx'
  nginx:
    image: nginx
    volumes:
      - './nginx/log:/var/log/nginx'
    ports:
      - "80:80"
      - "443:443"
    user: root
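The ./nginx/log directory is bind-mounted into both containers, so Filebeat reads the same files Nginx writes; the filebeat.yml mounted read-only is the configuration built in Method 2 below. As a routine pre-flight check (not part of the original steps), docker compose config validates the YAML and prints the resolved file:
docker compose config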
Method 2: Configure Manually
Install and Configure Nginx
1. Install the Nginx server. These commands are for CentOS 7:
sudo yum install epel-release
sudo yum install nginx
sudo systemctl start nginx
sudo firewall-cmd --permanent --zone=public --add-service=http
sudo firewall-cmd --permanent --zone=public --add-service=https
sudo firewall-cmd --reload
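To verify that Nginx is serving and the firewall rules took effect, a quick check from the host (expect an HTTP 200 and a Server: nginx response header):
curl -I http://localhost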
2. Configure the access and error logs if you want a custom configuration (see the sketch after this step); otherwise, by default, the log files are located at
/var/log/nginx/*.log
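If you do want custom logging, the access_log and error_log directives in /etc/nginx/nginx.conf control the paths and formats. A minimal sketch of the relevant http block (the "main" format shown is the one shipped in the default nginx.conf):
http {
    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';

    access_log /var/log/nginx/access.log main;
    error_log  /var/log/nginx/error.log warn;
}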
Install and Configure Filebeat
1. Install Filebeat
sudo rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch
sudo yum install filebeat
sudo systemctl enable filebeat
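Note that yum can only resolve the filebeat package if the Elastic repository is defined. If the install fails with a missing package error, create /etc/yum.repos.d/elastic.repo with the 7.x repository definition from Elastic's documentation:
[elastic-7.x]
name=Elastic repository for 7.x packages
baseurl=https://artifacts.elastic.co/packages/7.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md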
2. Configure Filebeat
Configure Input as Nginx and Output as Kafka
Configure the Nginx logs as a Filebeat input stream. You can see where the filebeat.yml file is located on different systems here. If you are using Docker, the file is at ./filebeat/filebeat.yml:
sudo vim /etc/filebeat/filebeat.yml
# ============================== Filebeat inputs ===============================

filebeat.inputs:
- type: log
  paths:
    - "/var/log/nginx/*"
  # Custom fields attached to every event; fields_under_root promotes them
  # to top-level keys instead of nesting them under "fields".
  fields:
    parser: accessLog
    app: nginx
  fields_under_root: true

# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
fields:
  env: staging

# ================================== Outputs ===================================

output.kafka:
  # Uncomment codec.format to publish only the timestamp and message fields;
  # otherwise Filebeat serializes the whole event as JSON and publishes that
  # to Kafka.
  #codec.format:
  #  string: '%{[@timestamp]} %{[message]}'

  # Publish to the 'scalyr' topic on the broker(s) listed in hosts.
  hosts: ["<kafka_broker>:9092"]
  topic: 'scalyr'
  partition.round_robin:
    reachable_only: false

  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000

# ================================= Processors =================================

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - add_fields:
      target: app
      fields:
        name: nginx
        id: '574734885120952459'

# ================================== Logging ===================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]
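Before starting Filebeat, you can sanity-check the configuration file and the Kafka connection with Filebeat's built-in test subcommands:
sudo filebeat test config -c /etc/filebeat/filebeat.yml
sudo filebeat test output -c /etc/filebeat/filebeat.yml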
Start Filebeat
sudo filebeat -e -c /etc/filebeat/filebeat.yml
Or, if you used Method 1, start the Docker containers:
docker compose up -d filebeat
docker compose up -d nginx
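To watch Filebeat start up and confirm it begins harvesting the Nginx log files, tail the container logs:
docker compose logs -f filebeat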
Log Ingestion (Filebeat -> Kafka -> DataSet)
Visit the public IP of the Nginx server; each request is written to access.log:
24.23.157.7 - - [30/Sep/2020:08:27:54 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36" "-"
Filebeat converts this line into a JSON message and publishes it to Kafka:
{"@timestamp":"2020-09-30T08:27:54.000Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.6.1"},"ecs":{"version":"1.4.0"},"agent":{"version":"7.6.1","type":"filebeat","ephemeral_id":"e8627ae7-e847-44f7-8259-81e6d9cae045","hostname":"c1288dccce40","id":"9e43ca4b-accb-4386-a2cb-b3cb78bd22c1"},"cloud":{"project":{"id":"clear-canyon-240423"},"provider":"gcp","instance":{"id":"4240384044872722104","name":"kafka-0"},"machine":{"type":"e2-medium"},"availability_zone":"us-central1-a"},"message":"24.23.157.7 - - [30/Sep/2020:08:27:54 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36\" \"-\"","parser":"accessLog","app":{"name":"nginx","id":"574734885120952459"},"input":{"type":"log"},"fields":{"env":"staging"},"host":{"hostname":"c1288dccce40","architecture":"x86_64","os":{"platform":"centos","version":"7 (Core)","family":"redhat","name":"CentOS Linux","kernel":"3.10.0-1127.19.1.el7.x86_64","codename":"Core"},"containerized":true,"name":"c1288dccce40"},"log":{"offset":72012,"file":{"path":"/var/log/nginx/access.log"}}}
Continue to Installing DataSet Connector