Automate Data Group updates on many BIG-IP devices using BIG-IQ, Ansible, or Terraform
Problem this snippet solves:
In many cases, lists of bad IP addresses generated by a SIEM (ELK, Splunk, IBM QRadar) need to be uploaded to the F5 devices so the addresses can be blocked, but BIG-IQ cannot be used to push data group changes to the F5 devices.
1. A workaround is to use the BIG-IQ script option to make all the F5 devices check a file on a source server and update the information in the external data group. I hope F5 adds an option to BIG-IQ to schedule when the scripts run; otherwise, a cron job on the BIG-IQ can trigger the script feature, which will make the data group refresh its data (sounds like the Matrix).
https://clouddocs.f5.com/training/community/big-iq-cloud-edition/html/class5/module1/lab6.html
Example command to run in the BIG-IQ script feature:
tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt
https://support.f5.com/csp/article/K17523
2. If you don't have BIG-IQ, you can also set the command as a cron job on the BIG-IP devices themselves, as you just need a Linux server to host the data group files.
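For example, a root crontab entry on the BIG-IP could refresh the external data group once an hour (the URL and schedule below are placeholders; adjust them to your environment):

```shell
# Root crontab on the BIG-IP: re-fetch the source file every hour (placeholder URL)
0 * * * * tmsh modify sys file data-group ban_ip type ip source-path https://x.x.x.x/files/bad_ip.txt
```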
3. Also without BIG-IQ, an Ansible playbook can be used to manage many data groups on the F5 devices; I have added the playbook code below. And with the Windows Subsystem for Linux you can now run Ansible on Windows!
4. If you have AFM, you can use custom feed lists to pull the external data without the need for Ansible or BIG-IQ. ASM supports IP Intelligence, but no custom feeds can be used there.
How to use this snippet:
I wrote my code after reading:
https://support.f5.com/csp/article/K42420223
If you want entries to time out automatically, you need the iRule table command, which writes entries to RAM and supports an automatic timeout and lifetime for each entry. Tables can't be edited through the REST API, but the article below shows a workaround using a sideband connection; I have added a comment there about a possible bug resolution, so read the comments!
https://devcentral.f5.com/s/articles/populating-tables-with-csv-data-via-sideband-connections
Another way is to add a bash script, run by cron on the server where you host the data group file, that deletes old entries from time to time (I tested this). Just write each data group line with, for example, an IP address and next to it the date it was added.
cutoff=$(date -d 'now - 30 days' '+%Y-%m-%d')
awk -v cutoff="$cutoff" '$2 >= cutoff { print }' <in.txt >out.txt && mv out.txt in.txt
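A quick end-to-end check of the expiry filter with two synthetic entries (the file name and dates are made up; the -d option requires GNU date):

```shell
# Two synthetic entries: one long expired, one far in the future
printf '10.0.0.1 2020-01-01\n10.0.0.2 2099-01-01\n' > in.txt
# Keep only lines whose date (second column) is within the last 30 days
cutoff=$(date -d 'now - 30 days' '+%Y-%m-%d')
awk -v cutoff="$cutoff" '$2 >= cutoff { print }' <in.txt >out.txt && mv out.txt in.txt
cat in.txt
```

Only the 2099 entry should survive the filter.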
Ansible is a great automation tool that makes changes only when the configuration is actually modified, so even if you run the same playbook twice (a playbook is the main config file and it contains many tasks), the second run changes nothing (the same is true for Terraform). Ansible supports "for" loops but calls them "loop" (in earlier versions "with_items" was used) and "if/else" conditions but calls them "when", just to confuse us, and the conditions and loops are placed at the end of the task, not at the start 😀 A loop is good if you want to apply the same config to multiple devices with only some variables changing, and "when" is nice, for example, to apply different tasks to different F5 TMOS versions or to F5 devices with different provisioned modules.
Code :
---
- name: Create or modify data group
  hosts: all
  connection: local

  vars:
    provider:
      password: xxxxx
      server: x.x.x.x
      user: xxxxx
      validate_certs: no
      server_port: 443

  tasks:
    - name: Create a data group of IP addresses from a file
      bigip_data_group:
        name: block_group
        records_src: /var/www/files/bad.txt
        type: address
        provider: "{{ provider }}"
      notify:
        - Save the running configuration to disk

  handlers:
    - name: Save the running configuration to disk
      bigip_config:
        save: yes
        provider: "{{ provider }}"
The "notify" triggers the handler task after the main task is done as there is no point in saving the config before that and the handler runs only on change,
Tested this on version:
15.1
Also, F5 now has a Terraform provider, and together with Visual Studio Code you can edit your code on Windows and deploy it from the editor itself!
Visual Studio Code will even open the terminal for you, where you can select the folder where the Terraform code will be saved; after you have added the code, run terraform init, terraform plan and terraform apply. VS Code even has a plugin for writing F5 iRules. Terraform's files are called ".tf" files, and Terraform providers are like the Ansible inventory file (Ansible may also have a provider object in the playbook rather than the inventory file): they make the connection, and then resources (like Ansible tasks) create the configuration.
Useful links for Visual Studio Code and Terraform:
https://registry.terraform.io/providers/F5Networks/bigip/1.16.0/docs/resources/bigip_ltm_datagroup
https://www.youtube.com/watch?v=Z5xG8HLwIh4
For more advanced Terraform stuff like for loops and if/count conditions:
https://blog.gruntwork.io/terraform-tips-tricks-loops-if-statements-and-gotchas-f739bbae55f9
Code :
You may also need to add the resource below to save the config; with "depends_on" it will run after the data group is created. This is like the handler in Ansible that runs after the task is done. Also note that Terraform sometimes creates resources in parallel, not task after task like Ansible.
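A minimal sketch of what this could look like, using the bigip_ltm_datagroup and bigip_command resources from the F5 Terraform provider (the addresses, credentials, group name, and record values below are placeholders):

```hcl
provider "bigip" {
  address  = "x.x.x.x"
  username = "xxxxx"
  password = "xxxxx"
}

# Internal data group of type ip (placeholder name and records)
resource "bigip_ltm_datagroup" "block_group" {
  name = "/Common/block_group"
  type = "ip"

  record {
    name = "10.1.1.1"
  }
}

# Save the running configuration only after the data group exists,
# similar to the Ansible handler above
resource "bigip_command" "save_config" {
  commands   = ["save sys config"]
  depends_on = [bigip_ltm_datagroup.block_group]
}
```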
Tested this on version:
16.1
Ansible and Terraform can now be used for AS3 deployments, like BIG-IQ's "applications", as they push the F5 declarative templates to the F5 device; nowadays even F5 AWAF/ASM and SSLO (SSL Orchestrator) support declarative configuration.
For more info:
https://www.f5.com/company/blog/f5-as3-and-red-hat-ansible-automation
https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/playbook_tutorial.html
https://clouddocs.f5.com/products/orchestration/terraform/latest/userguide/as3-integration.html
https://support.f5.com/csp/article/K23449665
https://clouddocs.f5.com/training/fas-ansible-workshop-101/3.3-as3-asm.html
https://www.youtube.com/watch?v=Ecua-WRGyJc&t=105s
There are Ansible modules provided by F5 to handle data groups. Especially with internal data groups I had issues with the key/value separator and whitespace. That's why I tend to stick to internal data groups only as long as they aren't too big.
Instead of using the F5 provided modules you might want to consider direct calls to the iControl REST API.
List existing internal data groups, e.g.:
GET /mgmt/tm/ltm/data-group/internal?$select=name,type
Create an empty new data group, e.g. of type string:
POST /mgmt/tm/ltm/data-group/internal
{ "description": "Test Datagroup", "name": "datagroup_virtual_lab.bit", "type": "string" }
Replace the content of a data group, e.g.:
PATCH /mgmt/tm/ltm/data-group/internal/<datagroup-name>
{ "description": "Test Datagroup", "name": "datagroup_virtual_lab.bit", "type": "string", "records": [ { "name":"key1", "data":"data1" }, { "name":"key2", "data":"data2" } ] }
Add records to an existing data group, e.g.:
PATCH /mgmt/tm/ltm/data-group/internal/<datagroup-name>?options=records add { <key-name> { data <data-value> } }
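As a small sketch, the "records" list used by the PATCH body above can be built from a plain dictionary; the helper name below is mine, and the request itself would still be sent with curl or any HTTP client against the endpoints shown above:

```python
# Hypothetical helper: build the "records" list for a data-group PATCH body.
import json

def records_payload(entries):
    # iControl REST represents each record as {"name": <key>, "data": <value>}
    return [{"name": k, "data": str(v)} for k, v in sorted(entries.items())]

body = json.dumps({"records": records_payload({"key1": "data1", "key2": "data2"})})
print(body)
```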