
Problem this snippet solves:

This Ansible playbook can be used to copy a test ASM policy from the dev/preproduction environment to the production environment, as part of a continuous integration and continuous delivery (CI/CD) workflow.

How to use this snippet:

Use the playbook by replacing the vars marked "xxx" with the connection values for your F5 device. With "vars_prompt:" you enter the policy name at execution time; the preprod policy name is "{{ asm_policy }}_preprod" and the prod policy name is "{{ asm_policy }}_prod". For example, if you enter "test" at the prompt, the names will be test_preprod and test_prod. If you are using the paid version of Ansible Tower, you can use Jenkins or Bamboo to push the variables (I have not tested this yet).



There is also a task that deletes the old ASM policy file saved on the server, because I saw that the Ansible modules have issues overwriting existing files during export. That task is named "Ansible delete file example", and I have added localhost to the group "internal".




https://docs.ansible.com/ansible/latest/collections/f5networks/f5_modules/index.html



Also, after importing the policy file the bug https://support.f5.com/csp/article/K25733314 is hit, so the last two tasks deactivate and then activate the production policy.



A nice example that I based my own on is:


https://support.f5.com/csp/article/K42420223



You can also put the connection vars in the hosts file, as per K42420223:



 vars:

   provider:

     password: "{{ bigip_password }}"

     server: "{{ ansible_host }}"

     user: "{{ bigip_username }}"

     validate_certs: no

     server_port: 443





Example hosts:


[bigip]


f5.com




[bigip:vars]


bigip_password=xxx

bigip_username=xxx

ansible_host=xxx
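Assuming the inventory above is saved as "hosts" and the playbook as "asm_policy.yml" (both filenames are my own, for illustration), a typical run would look like this:

```shell
# Run the playbook against the inventory file; the vars_prompt section
# will ask for the ASM policy name before any tasks execute.
ansible-playbook -i hosts asm_policy.yml
```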



The policy is exported in binary format ("binary: yes"); otherwise there is an issue importing it afterwards. Also, when importing, the "force: yes" option overwrites an existing policy with the same name.



See the comments for my example of using host groups; this way your dev environment can be on one F5 device and the policy exported from it will be imported on another F5 device that serves as production. When not using "all" for hosts, you need set_fact so that you are prompted only once for the policy name and the value is then shared between plays.

Code :

---



- name: Exporting and importing the ASM policy
  hosts: all
  connection: local
  become: yes


  vars:
    provider:
      password: xxx
      server: xxxx
      user: xxxx
      validate_certs: no
      server_port: 443

  vars_prompt:

    - name: asm_policy
      prompt: What is the name of the ASM policy?
      private: no

  tasks:


   - name: Ansible delete file example
     file:
       path: "/home/niki/asm_policy/{{ asm_policy }}"
       state: absent
     when: inventory_hostname in groups['internal']


   - name: Export policy in binary format
     bigip_asm_policy_fetch:
       name: "{{ asm_policy }}_preprod"
       file: "{{ asm_policy }}"
       dest: /home/niki/asm_policy/
       binary: yes
       provider: "{{ provider }}"


   - name: Override existing ASM policy
     bigip_asm_policy_import:
       name: "{{ asm_policy }}_prod"
       source: "/home/niki/asm_policy/{{ asm_policy }}"
       force: yes
       provider: "{{ provider }}"

     notify:
        - Save the running configuration to disk


   - name: Task - deactivate policy
     bigip_asm_policy_manage:
         name: "{{ asm_policy }}_prod"
         state: present
         provider: "{{ provider }}"
         active: no


   - name: Task - activate policy
     bigip_asm_policy_manage:
         name: "{{ asm_policy }}_prod"
         state: present
         provider: "{{ provider }}"
         active: yes

  handlers:
     - name: Save the running configuration to disk
       bigip_config:
        save: yes
        provider: "{{ provider }}"

Tested this on version:

13.1
Comments

Example of using separate plays: the first play deletes the old local ASM policy file, and then, just for the group "bigip", the second play exports and imports the ASM policy. If you are using different F5 devices for production and preproduction, just make different host groups and separate plays (one play for preprod that exports the ASM policy and one play for prod that imports it).

 

 

---

 

- name: Deleting old files

 hosts: all

 connection: local

 

 

 vars_prompt:

 

   - name: asm_policy

     prompt: What is the name of the ASM policy?

     private: no

 

 

 tasks:

 

 

  - name: Ansible delete file example

    file:

      path: "/home/niki/asm_policy/{{ asm_policy }}"

      state: absent

    when: inventory_hostname in groups['internal']

 

  - set_fact:
      asm_fact: "{{ asm_policy }}"

 

 

- name: Import and export the ASM policy

 hosts: bigip

 connection: local

 become: yes

 

 

 vars:

   provider:

     password: "{{ bigip_password }}"

     server: "{{ ansible_host }}"

     user: "{{ bigip_username }}"

     validate_certs: no

     server_port: 443

 

 

 

 tasks:

 

 

  - name: Export policy in binary format

    bigip_asm_policy_fetch:

      name: "{{ asm_fact }}_preprod"

      file: "{{ asm_fact }}"

      dest: /home/niki/asm_policy/

      binary: yes

      provider: "{{ provider }}"

 

 

  - name: Override existing ASM policy

    bigip_asm_policy_import:

      name: "{{ asm_fact }}_prod"

      source: "/home/niki/asm_policy/{{ asm_fact }}"

      force: yes

      provider: "{{ provider }}"

 

    notify:

       - Save the running configuration to disk

 

 

  - name: Task - deactivate policy

    bigip_asm_policy_manage:

        name: "{{ asm_fact }}_prod"

        state: present

        provider: "{{ provider }}"

        active: no

 

 

  - name: Task - activate policy

    bigip_asm_policy_manage:

        name: "{{ asm_fact }}_prod"

        state: present

        provider: "{{ provider }}"

        active: yes

 

 handlers:

    - name: Save the running configuration to disk

      bigip_config:

       save: yes

       provider: "{{ provider }}"

 

 

 

 

 

 

 

Another way to share the policy name variable between hosts, again with facts, is to attach it to a dummy host; this way you don't need to use "all" in the first play to attach the fact to all the hosts.

 

 

 

 

 

 

 

---

 

- name: Deleting old files

 hosts: internal

 connection: local

 

 

 vars_prompt:

 

   - name: asm_policy

     prompt: What is the name of the ASM policy?

     private: no

 

 

 tasks:

 

 

  - name: Ansible delete file example

    file:

      path: "/home/niki/asm_policy/{{ asm_policy }}"

      state: absent

 

 

  - name: set a variable

    set_fact:

       shared_variable: "{{ asm_policy }}"

  - name: add variables to dummy host

    add_host:

       name: "variable_holder"

       shared_variable: "{{ shared_variable }}"

 

 

- name: Import and export the ASM policy

 hosts: bigip

 connection: local

 become: yes

 

 

 vars:

   asm_fact: "{{ hostvars['variable_holder']['shared_variable'] }}"

   provider:

     password: "{{ bigip_password }}"

     server: "{{ ansible_host }}"

     user: "{{ bigip_username }}"

     validate_certs: no

     server_port: 443

 

 

 

 

Version history
Last update:
‎02-Aug-2021 08:13