How I did it - "Visualizing Data with F5 TS and Splunk"

The new Splunk Add-on for F5 BIG-IP includes several objects (modular inputs, CIM knowledge, etc.) that work to “normalize” incoming BIG-IP data for use with other Splunk apps, such as Splunk Enterprise Security and the Splunk App for PCI Compliance.

The add-on includes a mechanism for pulling network traffic data, system logs, system settings, performance metrics, and traffic statistics from the F5 BIG-IP platform using F5’s iControl API (see below).

But what I'm really excited about is that the add-on now integrates with F5 Telemetry Streaming (TS). With TS I can declaratively aggregate, normalize, and push JSON-formatted BIG-IP statistics and events to a variety of third-party analytics vendors.

For the remainder of this article, we’ll take a look at how I integrate F5 TS with Splunk Enterprise. I’ll be working with an existing BIG-IP deployment as well as a newly deployed Splunk Enterprise instance. As an added bonus (and since it’s part of the article’s title), I’ll import a couple of custom dashboards (see below) to visualize the newly ingested telemetry data.

Oh! As an "Extra" added bonus, here is a link to a video walkthrough of this solution.

Installing the Splunk Add-on for F5 BIG-IP and Splunk CIM

Installing the Splunk F5 add-on is very simple. To make use of the add-on, I’ll also need to install Splunk’s Common Information Model (CIM) add-on.

1.    From the Splunk search page, I select ‘Apps’ → ‘Find More Apps’.

2.   I browse for “CIM” and select the Splunk Common Information Model add-on.

3.   I accept the license agreement, provide my Splunk account login credentials, and select ‘Login and Install’.

4.   I’ll repeat steps 2-3 to install the Splunk Add-on for F5 BIG-IP. 

Setup Splunk HTTP Event Collector

To receive incoming telemetry data into my Splunk Enterprise environment over HTTP/HTTPS, I will need to create an HTTP Event Collector.

1.    From the UI I select ‘Settings’ → ‘Data Inputs’. I select ‘HTTP Event Collector’ from the input list.

2.   Prior to creating a new event collector token, I must first enable token access for my Splunk environment. On the ‘HTTP Event Collector’ page, I select ‘Global Settings’. I set ‘All Tokens’ to enabled, set the default index and incoming port (8088 by default), and ensure SSL is enabled. I click ‘Save’ to exit.

3.    I select ‘New Token’, provide a name for the new collector, and select ‘Next’.

4.    On the ‘Input Settings’ tab, I select my allowed index(es), then select ‘Review’ and ‘Submit’.

5.    Once the token is created, I copy it for use with my F5 TS configuration.
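Before moving on, it’s worth making sure the new token actually accepts events. The following is a minimal sketch using curl; the Splunk hostname and token value are placeholders for my environment, and 8088 is simply the HEC default port.

    # Send a test event to the new HEC token (placeholder host and token)
    curl -k "https://splunk.example.com:8088/services/collector/event" \
      -H "Authorization: Splunk 11111111-2222-3333-4444-555555555555" \
      -d '{"event": "F5 TS connectivity test", "sourcetype": "_json"}'

A healthy token responds with {"text":"Success","code":0}.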

Configure Telemetry Streaming

With my Splunk environment ready to receive telemetry data, I now turn my attention to configuring the BIG-IP for telemetry streaming. Fortunately, thanks to F5’s Automation Toolchain, configuring the BIG-IP is quite simple.

1.    I’ll use Postman to POST an AS3 declaration to configure telemetry resources (telemetry listener, log publisher, logging profiles, etc.).

The above AS3 declaration (available here) deploys the BIG-IP objects required for pushing event data to a third-party vendor. Notably, it creates four (4) logging profiles that I’ll attach to my application’s virtual server.

2.    Still using Postman, I POST my TS declaration (sample). I will need to provide my Splunk HTTP Event Collector endpoint address/port as well as the token generated previously. (A curl sketch of both POSTs follows these steps.)
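For reference, here is a rough curl equivalent of both POSTs. This is only a sketch: the BIG-IP address, credentials, declaration file name, Splunk host, and token below are placeholders, and the full AS3 payload comes from the linked sample file rather than from this snippet.

    # 1. Deploy the AS3 declaration (telemetry listener, log publisher, logging profiles)
    curl -sk -u admin:admin -H "Content-Type: application/json" \
      -d @ts_as3_logging.json \
      "https://192.0.2.10/mgmt/shared/appsvcs/declare"

    # 2. Deploy the TS declaration pointing at the Splunk HTTP Event Collector
    curl -sk -u admin:admin -H "Content-Type: application/json" \
      "https://192.0.2.10/mgmt/shared/telemetry/declare" \
      -d '{
        "class": "Telemetry",
        "My_Listener": {
          "class": "Telemetry_Listener",
          "port": 6514
        },
        "My_Consumer": {
          "class": "Telemetry_Consumer",
          "type": "Splunk",
          "host": "splunk.example.com",
          "protocol": "https",
          "port": 8088,
          "passphrase": {
            "cipherText": "<HEC token from the previous section>"
          }
        }
      }'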

Associate Logging Profiles to Virtual Server

The final step in configuring the BIG-IP for telemetry streaming is associating the logging profiles I just created with my existing virtual server. In addition to system telemetry, these logging profiles, when assigned to a virtual server, will send LTM, AVR, and ASM telemetry. (A roughly equivalent tmsh sketch follows these steps.)

1.    From the BIG-IP management UI, I select ‘Local Traffic’ → ‘Virtual Servers’ → <virtual>.

2.    Under ‘Configuration’ I select ‘Advanced’, scroll down, and select the HTTP, TCP, and request logging profiles previously created. I select ‘Update’ at the bottom of the page to save my changes.

3.   From the top of the virtual server page, I select ‘Security’ → ‘Policies’. From the policy settings page, I can see that there is an existing WAF policy associated with my application. To enable ASM logging, I select the previously created ASM logging profile from the available logging profiles and select ‘Update’ to save my changes.
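If you prefer the command line to the GUI, roughly equivalent tmsh commands look like the following. This is a sketch under assumptions: the virtual server and profile names are placeholders, so substitute the names actually created by the AS3 declaration.

    # Attach the logging-enabled HTTP, TCP, and request logging profiles (placeholder names)
    tmsh modify ltm virtual my_app_vs profiles add { telemetry_http_profile telemetry_tcp_profile telemetry_traffic_log_profile }

    # Attach the ASM security logging profile to the virtual server
    tmsh modify ltm virtual my_app_vs security-log-profiles add { telemetry_asm_security_log_profile }

    # Persist the change
    tmsh save sys config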

With the configuration process complete, I should now start seeing event data in my Splunk environment.
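One quick way to check from the command line is Splunk’s search REST API. This sketch assumes the default management port (8089), placeholder credentials, and that events landed in the index chosen for the HEC token; the sourcetype filter is an assumption and may differ in your environment.

    # Count recent F5 telemetry events by sourcetype (placeholder host and credentials)
    curl -sk -u admin:changeme "https://splunk.example.com:8089/services/search/jobs/export" \
      --data-urlencode search='search index=main sourcetype=f5:telemetry* earliest=-15m | stats count by sourcetype' \
      -d output_mode=json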

Import Dashboards

“Ok, so I have event data streaming into my Splunk environment; now what?” 

Since I have installed the Splunk F5 add-on, I can integrate my “normalized” data with other data sources to populate various Splunk applications like Splunk Enterprise Security and the Splunk App for PCI Compliance. Likewise, I can use dashboards to visualize my telemetry data as well as monitor BIG-IP resources/processes. To finish up, I’ll use the following steps to create custom dashboards visualizing BIG-IP metrics and Advanced WAF (formerly ASM) attack information.

1.    From the Splunk Search page, I navigate to the Dashboards page by selecting ‘Dashboards’.

2.   I select ‘Create New Dashboard’ from the Dashboards page.

3.   I provide a name for the new dashboard and select ‘Create Dashboard’. The dashboard name (the ID will remain unchanged) will be updated in the next step, where I replace the newly created dashboard’s XML source with one of the community-supported dashboard XML files here.

4.   On the ‘Edit Dashboard’ screen I select ‘Source’ to edit the dashboard XML. I replace the existing XML data with the contents of the ‘advWafInsights.xml’ file. I select ‘Save’ to install the new dashboard.

 

5.    I’ll repeat steps 1-4 using ‘bigipSystemMetrics.xml’ to install the BIG-IP metrics dashboard.
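As an alternative to pasting XML through the UI, the same dashboards can be pushed with Splunk’s REST API. A minimal sketch, assuming placeholder admin credentials, the ‘search’ app context, and that the two community XML files have been downloaded locally:

    # Create the Advanced WAF dashboard from the downloaded XML
    curl -sk -u admin:changeme \
      "https://splunk.example.com:8089/servicesNS/admin/search/data/ui/views" \
      -d name=advWafInsights \
      --data-urlencode eai:data@advWafInsights.xml

    # Repeat for the BIG-IP system metrics dashboard
    curl -sk -u admin:changeme \
      "https://splunk.example.com:8089/servicesNS/admin/search/data/ui/views" \
      -d name=bigipSystemMetrics \
      --data-urlencode eai:data@bigipSystemMetrics.xml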

 

Additional Links

·     F5 Telemetry Streaming

·     Splunk Add-on for F5 BIG-IP

·     Splunk Common Information Model 

·     F5 Automation Toolchain
