How I did it - "Remote Logging with the F5 XC Global Log Receiver and Elastic"
Hello and welcome back to another edition of "How I did it". In this installment, we connect F5 Distributed Cloud (XC) Services to another of our Analytics partners, Elastic. To make that happen, we'll configure the F5 XC Global Log Receiver service and show how easy it is to send event data to an Elastic Stack.
Configuration Overview
The Elastic Stack
Elastic Stack, also known as ELK Stack, is a collection of open-source tools for log management, visualization, and analysis. It includes Elasticsearch for searching and analyzing data, Logstash for centralized logging, and Kibana for data visualization and exploration. Together, these tools provide a platform for monitoring and troubleshooting applications and systems.
There are multiple options for ingesting data into Elasticsearch, including the Elastic Agent, the various lightweight data shippers known as Beats, OpenTelemetry (OTEL), and Logstash. For this demonstration, I use Logstash for data collection.
F5 Distributed Cloud Services
F5 Distributed Cloud Services (XC) provides a global, cloud-native platform where customers can deploy, manage, and secure their applications regardless of whether the application resides in a public cloud, a private data center, or a colocation facility (see below). The platform provides a variety of ADN, MCN, and CDN services, including the Global Log Receiver service.
A global log receiver can be configured to securely send logs to a variety of endpoints, including Logstash, over HTTP(S).
Exposing Logstash
The central component of Logstash is its pipeline. A Logstash pipeline is a series of stages that ingest, transform, and output data. It typically includes input plugins to collect data, filter plugins to modify or enrich data, and output plugins to send the processed data to a storage or indexing system. An example of the pipeline configuration utilized for this demo is provided below.
input {
  http {
    port => 8080
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["https://127.0.0.1:9200"]
    user => "elastic"
    password => "********"
    codec => json
    index => "f5xc"
    ssl => true
    ssl_certificate_verification => false
    cacert => "/etc/logstash/elasticsearch-ca.pem"
  }
}
You will note that the input stage is configured to expose a basic HTTP listener on port 8080 - definitely NOT secure - which should not be exposed externally. However, rather than configuring secure connectivity (TLS/mTLS) on the listener and exposing the local Logstash instance directly, I've published the endpoint on F5 XC.
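For reference, securing the listener directly would look something like the sketch below. This is not the configuration I used; the certificate and key paths are illustrative assumptions, and the option names vary slightly between Logstash releases (newer versions use ssl_enabled in place of ssl on the http input plugin):

```
input {
  http {
    port => 8080
    # Illustrative TLS settings - paths are assumptions, not from the demo.
    # Newer Logstash releases name this option ssl_enabled.
    ssl => true
    ssl_certificate => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
  }
}
```

Publishing through F5 XC instead moves TLS termination, and the certificate management that comes with it, off the local Logstash instance.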
This way I'm able to securely expose and monitor Logstash (see below), take advantage of F5 XC security services such as WAF, bot, and DDoS detection, and reduce load on my local Elastic instance.
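As a quick smoke test (my own sketch, not part of the original walkthrough), a sample event can be POSTed to the pipeline's HTTP input. The URL and the event fields below are assumptions for illustration, not the actual F5 XC log schema:

```python
import json
from urllib import request

# Assumed endpoint: the local Logstash http input from the pipeline above.
# Swap in the F5 XC-published hostname to exercise the exposed endpoint.
LOGSTASH_URL = "http://localhost:8080"

def build_event():
    """Return a sample JSON event. Field names are illustrative only,
    not the real F5 XC event schema."""
    return {
        "src_ip": "203.0.113.10",
        "req_path": "/index.html",
        "rsp_code": 200,
    }

def send_event(event, url=LOGSTASH_URL):
    """POST the event as JSON to the Logstash http input; return the HTTP status."""
    body = json.dumps(event).encode("utf-8")
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as rsp:
        return rsp.status
```

A 200 response from the input plugin, followed by the event turning up in the f5xc index, confirms the path from listener to Elasticsearch end to end.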
With the Elastic endpoint exposed, ingesting logs from F5 XC is a simple matter of configuring a global log receiver.
Check it Out
Rather than walking you through every configuration step - links to guidance are below - how about a movie? The video below provides a brief walkthrough demonstrating integration of the F5 Distributed Cloud Services platform with Elastic.
Additional Links