
Problem this snippet solves:

In many cases, bad IP address lists generated by a SIEM (ELK, Splunk, IBM QRadar) need to be uploaded to F5 devices so the addresses can be blocked, but BIG-IQ cannot be used to push data group changes to the F5 devices.

1. A workaround is to use the BIG-IQ script option to make all the F5 devices check a file on a source server and update the information in the external data group. I hope F5 adds an option to BIG-IQ to schedule when the scripts run; otherwise, a cron job on the BIG-IQ may trigger the script feature, which will make the data group refresh its data (sounds like the Matrix).

https://clouddocs.f5.com/training/community/big-iq-cloud-edition/html/class5/module1/lab6.html

Example command to run in the BIG-IQ script feature:

tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt

https://support.f5.com/csp/article/K17523
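
For completeness, a rough sketch of how the external data group could be created in the first place, reusing the same object name and URL as the example above (see K17523 for the exact workflow on your version):

tmsh create sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt
tmsh create ltm data-group external ban_ip external-file-name ban_ip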

 

2. You can also set up the command with a cron job on the BIG-IP devices if you don't have BIG-IQ, as you just need a Linux server to host the data group files.
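
For example, a cron entry on the BIG-IP itself could look roughly like this (the hourly schedule and the paths are only an illustration):

# re-fetch the external data group file from the web server every hour
0 * * * * /usr/bin/tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt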

 

3. Also, without BIG-IQ, an Ansible playbook can be used to manage many data groups on the F5 devices; I have added the Ansible playbook code below.

 

4. If you have AFM, then you can use custom feed lists to upload the external data without the need for Ansible or BIG-IQ. ASM supports IP Intelligence, but no custom feeds can be used:

https://techdocs.f5.com/kb/en-us/products/big-ip-afm/manuals/product/big-ip-afm-getting-started-14-1...

How to use this snippet:

I wrote my code after reading:

https://docs.ansible.com/ansible/latest/collections/f5networks/f5_modules/bigip_data_group_module.ht...

https://support.f5.com/csp/article/K42420223 

If you want to have an automatic timeout, you need to use the iRule table command (but you can't edit that with the REST API, so see the article below as a workaround), which writes into RAM memory and supports an automatic timeout and lifetime for each entry. There is a nice article for that, and I added a comment about a possible bug resolution, so read the comments!

https://devcentral.f5.com/s/articles/populating-tables-with-csv-data-via-sideband-connections 

Another way is, on the server where you save the data group info, to add a bash script that deletes old entries from time to time with a cron job. For example (I tested this), just write each data group line/text entry with, for example, an IP address and next to it the date it was added:

 

cutoff=$(date -d 'now - 30 days' '+%Y-%m-%d')
awk -v cutoff="$cutoff" '$2 >= cutoff { print }' <in.txt >out.txt && mv out.txt in.txt
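
For the awk filter to work, each line of in.txt needs the date it was added as its second field in YYYY-MM-DD format, so the file would look roughly like this (the addresses and dates are made up):

192.0.2.15 2022-02-01
198.51.100.7 2022-03-05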

 

https://stackoverflow.com/questions/38571524/remove-line-in-text-file-with-bash-if-the-date-is-older...

Code:

---
- name: Create or modify data group
  hosts: all
  connection: local

  vars:
    provider:
      password: xxxxx
      server: x.x.x.x
      user: xxxxx
      validate_certs: no
      server_port: 443

  tasks:
    - name: Create a data group of IP addresses from a file
      bigip_data_group:
        name: block_group
        records_src: /var/www/files/bad.txt
        type: address
        provider: "{{ provider }}"
      notify:
        - Save the running configuration to disk

  handlers:
    - name: Save the running configuration to disk
      bigip_config:
        save: yes
        provider: "{{ provider }}"
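
Assuming the playbook is saved as, for example, datagroup.yml and the BIG-IP devices are listed in an inventory file (both file names are just placeholders), it could be run with:

ansible-playbook -i inventory datagroup.yml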

Tested this on version:

15.1
