
Problem this snippet solves:

In many cases, lists of bad IP addresses generated by a SIEM (ELK, Splunk, IBM QRadar) need to be uploaded to the F5 devices to be blocked, but BIG-IQ can't be used to push data group changes to the F5 devices.

1. A workaround is to use the BIG-IQ script option to make all the F5 devices check a file on a source server and update the information in the external data group. I hope F5 adds an option to BIG-IQ to schedule when scripts run; otherwise a cron job on the BIG-IQ may trigger the script feature that makes the data group refresh its data (sounds like the Matrix).

Example command to run in the BIG-IQ script feature:

tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt


2. If you don't have BIG-IQ, you can also run the command as a cron job directly on the BIG-IP devices, as you only need a Linux server to host the data group files.
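On a BIG-IP, such a cron entry could look like this (the schedule, data group name, and URL below are examples; add it with "crontab -e" as root):

```shell
# Refresh the external data group from the file server every hour (example schedule)
0 * * * * tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt
```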


3. Also without BIG-IQ, an Ansible playbook can be used to manage many data groups on the F5 devices; I have added the Ansible playbook code below.


4. If you have AFM, you can use custom feed lists to pull the external data without needing Ansible or BIG-IQ. ASM supports IP Intelligence, but no custom feeds can be used:

How to use this snippet:

I based my code on the following reading:

If you want entries to time out automatically, you need the iRule table command, which writes to RAM and supports an automatic timeout and lifetime for each entry (but a table can't be edited with the REST API, so see the article below for a workaround). There is a nice article on this, and I added a comment there about a possible bug resolution, so read the comments!

Another way is to add a bash script, run by cron on the server where you host the data group file, that periodically deletes old entries (I tested this). Just write each data group line with, for example, an IP address followed by the date it was added:


cutoff=$(date -d 'now - 30 days' '+%Y-%m-%d')
awk -v cutoff="$cutoff" '$2 >= cutoff { print }' <in.txt >out.txt && mv out.txt in.txt
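A runnable sketch of that age-out approach (the file name, IP addresses, and dates are examples; "date -d" is GNU date, as found on Linux):

```shell
#!/bin/sh
# Each data group entry: a value (e.g. an IP) followed by the date it was added.
cat > in.txt <<'EOF'
10.0.0.1 2020-01-15
10.0.0.2 2099-12-31
EOF

# Drop every entry older than 30 days; plain string comparison works
# because YYYY-MM-DD dates sort lexicographically.
cutoff=$(date -d 'now - 30 days' '+%Y-%m-%d')
awk -v cutoff="$cutoff" '$2 >= cutoff { print }' <in.txt >out.txt && mv out.txt in.txt

cat in.txt
```

Only the entry newer than the cutoff survives in in.txt.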

Code :



- name: Create or modify data group
  hosts: all
  connection: local

  vars:
    provider:
      password: xxxxx
      server: x.x.x.x
      user: xxxxx
      validate_certs: no
      server_port: 443

  tasks:
    - name: Create a data group of IP addresses from a file
      bigip_data_group:
        name: block_group
        records_src: /var/www/files/bad.txt
        type: address
        provider: "{{ provider }}"
      notify:
        - Save the running configuration to disk

  handlers:
    - name: Save the running configuration to disk
      bigip_config:
        save: yes
        provider: "{{ provider }}"
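Assuming the playbook above is saved as datagroup.yml and an inventory file exists (both names are placeholders), it can be run like this:

```shell
# The bigip_* modules ship in the f5networks.f5_modules collection
ansible-galaxy collection install f5networks.f5_modules
ansible-playbook -i inventory datagroup.yml
```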

Tested this on version:



There are Ansible modules provided by F5 to handle data groups. Especially with internal data groups, I had issues with the key/value separator and whitespace. That's why I tend to stick to internal data groups only as long as they aren't too big.

Instead of using the F5 provided modules you might want to consider direct calls to the iControl REST API.

List existing internal data groups, e.g.:

GET /mgmt/tm/ltm/data-group/internal?$select=name,type
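Assuming a reachable management IP and credentials (placeholders below), the call can be made with curl; note that the $ in the query string must be single-quoted in the shell:

```shell
# Placeholder host/credentials; -k skips TLS verification (lab use only)
curl -sk -u admin:secret \
  'https://x.x.x.x/mgmt/tm/ltm/data-group/internal?$select=name,type'
```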

Create a new, empty data group, e.g. of type string:

POST /mgmt/tm/ltm/data-group/internal

{
    "description": "Test Datagroup",
    "name": "datagroup_virtual_lab.bit",
    "type": "string"
}

Replace the content of a data group (the record name/value below are placeholders), e.g.:

PATCH /mgmt/tm/ltm/data-group/internal/<datagroup-name>

{
    "description": "Test Datagroup",
    "name": "datagroup_virtual_lab.bit",
    "type": "string",
    "records": [
        { "name": "key1", "data": "value1" }
    ]
}

Add records to an existing data group, e.g.:

PATCH /mgmt/tm/ltm/data-group/internal/<datagroup-name>?options=records add { <key-name> { data <data-value> } }
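When sent over HTTP, the options string has to be URL-encoded. A sketch with curl, with placeholder host, credentials, data group name, and record values:

```shell
# "records add { key1 { data value1 } }" URL-encoded into the options parameter
curl -sk -u admin:secret -X PATCH \
  'https://x.x.x.x/mgmt/tm/ltm/data-group/internal/datagroup_virtual_lab.bit?options=records%20add%20%7B%20key1%20%7B%20data%20value1%20%7D%20%7D'
```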


Version history
Last update: 10-Mar-2022 07:40