f5networks / f5-ansible-bigip
Declarative Ansible collection for managing F5 BIG-IP/BIG-IQ.
Currently, deploying AS3 and TS declarations only allows you to either add/update or remove the declaration passed to the module. That may have been fine in the past, but everything is now moving towards defining a desired state rather than tracking the running state outside the BIG-IP and making changes based on it.
There are currently state: present and state: absent. It would be great if there could be a third, state: desired, meaning you define a desired state and the module deletes everything on the BIG-IP that is not in the declaration while adding/updating anything that changed.
If that is not possible, it would at least be helpful if the module returned the parts that did not change during state: present, so you could then run state: absent with the returned value: retrieve the running state from the BIG-IP, diff it against the desired state, run state: present on the parts that are in the declaration and state: absent on the parts that have been deleted.
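The diffing workflow described above can be sketched in a few lines. This is an illustrative sketch, not part of the collection: it assumes tenants are the top-level keys of an ADC declaration whose objects carry "class": "Tenant", and it only computes which tenants would need state: present versus state: absent.

```python
def plan_tenant_changes(running, desired):
    """Sketch of a hypothetical 'state: desired' mode: diff the tenants
    currently on the BIG-IP against the desired declaration."""
    def tenants(decl):
        # tenant objects are top-level keys whose value is {"class": "Tenant", ...}
        return {k for k, v in decl.items()
                if isinstance(v, dict) and v.get('class') == 'Tenant'}

    keep = tenants(desired)            # deploy these with state: present
    remove = tenants(running) - keep   # remove these with state: absent
    return sorted(keep), sorted(remove)
```

The returned "remove" list is what the feature request asks the module to compute internally (or at least return) so the caller does not have to track the running state itself.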
Module: f5networks.f5_bigip.bigip_as3_deploy (collection version 1.12.0)
Ansible version: 2.14.1
BIG-IP version: 16.1.3.3
[defaults]
collections_paths=./collections
filter_plugins=./filter_plugins
module_utils=./module_utils
roles_path=./roles
inventory=./inventory
N/A
bigip_as3_deploy assumes that a declaration starts with the AS3 class, but AS3 allows declarations to omit the AS3 class and start directly with the ADC class. Many example declarations in the Postman collection on the release page of f5-appsvcs-extension omit the AS3 class and use the ADC class directly.
Below is a playbook with a full repro of the issue. Obviously the &bigip_provider anchor needs to be modified to match the test bed.
---
- name: "Repro"
  hosts: all
  gather_facts: false
  connection: httpapi
  vars:
    provider: &bigip_provider
      server: "10.1.1.5"
      user: "admin"
      password: "Secret_245"
      validate_certs: "no"
      server_port: 443
    ansible_host: "{{ provider.server }}"
    ansible_user: "{{ provider.user }}"
    ansible_httpapi_password: "{{ provider.password }}"
    ansible_httpapi_port: "{{ provider.server_port }}"
    ansible_network_os: "f5networks.f5_bigip.bigip"
    ansible_httpapi_use_ssl: "yes"
    ansible_httpapi_validate_certs: "{{ provider.validate_certs }}"
  tasks:
    - name: "F5 Collection Versions"
      ansible.builtin.debug:
        msg: >-
          Ansible version:{{ ansible_version.full }},
          f5networks.f5_bigip:{{ lookup('community.general.collection_version', 'f5networks.f5_bigip') }},
          f5networks.f5_modules:{{ lookup('community.general.collection_version', 'f5networks.f5_modules') }}

    - name: "(Test #1) Create Declaration"
      ansible.builtin.set_fact:
        # Copied from example Postman collection: https://github.com/F5Networks/f5-appsvcs-extension/releases/tag/v.3.42.0
        # Name: example-service-generic
        # This declaration works perfectly fine when POSTed to https://{{host}}/mgmt/shared/appsvcs/declare (eg. via Postman)
        declaration: |-
          {
            "class": "ADC",
            "schemaVersion": "3.5.0",
            "id": "Service_Generic",
            "Sample_misc_02": {
              "class": "Tenant",
              "Application": {
                "class": "Application",
                "generic_virtual": {
                  "class": "Service_Generic",
                  "virtualAddresses": [
                    "192.0.2.140"
                  ],
                  "virtualPort": 8080
                }
              }
            }
          }

    - name: "(Test #1) AS3 send declaration with ADC"
      f5networks.f5_bigip.bigip_as3_deploy:
        content: "{{ declaration }}"
        state: present
      ignore_errors: yes # continue with next task to demonstrate repro

    - name: "(Test #2) Wrap declaration in AS3 class"
      ansible.builtin.set_fact:
        # Copied from example Postman collection: https://github.com/F5Networks/f5-appsvcs-extension/releases/tag/v.3.42.0
        # Name: example-service-generic
        declaration_wrapped_in_AS3_class: >-
          {
            "class": "AS3",
            "persist": false,
            "declaration": {
              "class": "ADC",
              "schemaVersion": "3.5.0",
              "id": "Service_Generic",
              "Sample_misc_02": {
                "class": "Tenant",
                "Application": {
                  "class": "Application",
                  "generic_virtual": {
                    "class": "Service_Generic",
                    "virtualAddresses": [
                      "192.0.2.140"
                    ],
                    "virtualPort": 8080
                  }
                }
              }
            }
          }

    - name: "(Test #1) AS3 send declaration wrapped in AS3 class"
      f5networks.f5_bigip.bigip_as3_deploy:
        content: "{{ declaration_wrapped_in_AS3_class }}"
        state: present
The expectation is that bigip_as3_deploy deploys the provided AS3 declaration without requiring it to be explicitly wrapped in the AS3 class. The AS3 API itself fully supports this use case.
$ ansible-playbook -i inventory/dev play-f5as3-issueX.yml -l b16
PLAY [Repro] ****************************************************************************************************************************************************************************************************************************************************************************************************
TASK [F5 Collection Versions] ***********************************************************************************************************************************************************************************************************************************************************************************
ok: [b16] => {
"msg": "Ansible version:2.14.1, f5networks.f5_bigip:1.12.0, f5networks.f5_modules:1.22.0"
}
TASK [(Test #1) Create Declaration] *****************************************************************************************************************************************************************************************************************************************************************************
ok: [b16]
TASK [(Test #1) AS3 send declaration with ADC] ******************************************************************************************************************************************************************************************************************************************************************
fatal: [b16]: FAILED! => {"changed": false, "msg": "{'code': 422, 'errors': ['/action: should be object'], 'declarationFullId': '', 'message': 'declaration is invalid'}"}
...ignoring
TASK [(Test #2) Wrap declaration in AS3 class] ******************************************************************************************************************************************************************************************************************************************************************
ok: [b16]
TASK [(Test #1) AS3 send declaration wrapped in AS3 class] ******************************************************************************************************************************************************************************************************************************************************
ok: [b16]
PLAY RECAP ******************************************************************************************************************************************************************************************************************************************************************************************************
b16 : ok=5 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=1
The root cause lies in the exists() method: it inserts (or overwrites) the action property without checking whether the AS3 class is actually used. This injects the action property into the ADC class, which results in a validation error.
This can easily be fixed by checking for the AS3 class and wrapping the actual declaration in an AS3 class when it isn't used:
if declaration.get('class') == 'AS3':  # declaration uses AS3 class
    declaration['action'] = 'dry-run'
else:  # declaration needs to be wrapped in AS3 class to perform a dry-run
    declaration = {
        'class': 'AS3',
        'persist': False,
        'action': 'dry-run',
        'declaration': declaration,
    }
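For illustration, the same check can be written as a small standalone helper. This is a sketch of the proposed fix, not the collection's actual exists() code, and the name ensure_as3_wrapper is invented here:

```python
def ensure_as3_wrapper(declaration, action=None):
    """Return a declaration guaranteed to start with the AS3 class.

    If the input already uses the AS3 class it is passed through (copied, so
    the caller's dict is not mutated); otherwise it is wrapped, mirroring
    what the AS3 endpoint itself accepts for bare ADC declarations.
    """
    if declaration.get('class') == 'AS3':
        wrapped = dict(declaration)
    else:
        wrapped = {'class': 'AS3', 'persist': False, 'declaration': declaration}
    if action is not None:
        wrapped['action'] = action  # e.g. 'dry-run' for an existence check
    return wrapped
```

With this helper, the dry-run existence check works the same whether the user supplied a declaration starting with the AS3 class or the ADC class.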
bigip_ssl_pkcs12
ansible [core 2.12.4]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/jobl/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/jobl/venv/lib/python3.8/site-packages/ansible
ansible collection location = /home/jobl/.ansible/collections:/usr/share/ansible/collections
executable location = /home/jobl/venv/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
When updating certificates with a P12 protected by a password, cert_pass is not sent in the params of the API call.
I fixed it as follows (though I'm new to Ansible, so I'm really not sure it's the right way to do it):
On line 126, replace passphrase with cert_pass and add the missing comma; and on line 209, replace Parameters.returnables with Parameters.api_attributes.
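The gist of the fix can be pictured with a minimal sketch. The helper below is hypothetical (the module's real upload code differs); the only point carried over from the report is that the module's cert_pass option must be forwarded as the REST call's passphrase parameter instead of being dropped. The sourcePath key is an illustrative guess.

```python
def build_install_params(name, source, cert_pass=None):
    # hypothetical payload builder for the PKCS12 install call;
    # the bug was that cert_pass never made it into the request params
    params = {'name': name, 'sourcePath': source}
    if cert_pass is not None:
        params['passphrase'] = cert_pass  # forward cert_pass to the API
    return params
```

Without the passphrase key, BIG-IP cannot decrypt the P12 and answers with the "Bad password" error shown in the traceback below.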
tasks:
  - name: Install PKCS12 cert and key
    bigip_ssl_pkcs12:
      source: "/P12/xxxx.p12"
      state: present
      cert_pass: xxxxxx
      name: 'hello'
The certificate is updated.
The full traceback is:
File "/tmp/ansible_bigip_ssl_pkcs12_payload_aozt8jkr/ansible_bigip_ssl_pkcs12_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_ssl_pkcs12.py", line 385, in main
File "/tmp/ansible_bigip_ssl_pkcs12_payload_aozt8jkr/ansible_bigip_ssl_pkcs12_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_ssl_pkcs12.py", line 232, in exec_module
File "/tmp/ansible_bigip_ssl_pkcs12_payload_aozt8jkr/ansible_bigip_ssl_pkcs12_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_ssl_pkcs12.py", line 248, in present
File "/tmp/ansible_bigip_ssl_pkcs12_payload_aozt8jkr/ansible_bigip_ssl_pkcs12_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_ssl_pkcs12.py", line 267, in create
File "/tmp/ansible_bigip_ssl_pkcs12_payload_aozt8jkr/ansible_bigip_ssl_pkcs12_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_ssl_pkcs12.py", line 309, in install_on_device
fatal: [hostxxx.be]: FAILED! => {
    "changed": false,
    "invocation": {
        "module_args": {
            "attributes": null,
            "cert_pass": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "force": false,
            "group": null,
            "mode": null,
            "name": "hello",
            "owner": null,
            "partition": "Common",
            "selevel": null,
            "serole": null,
            "setype": null,
            "seuser": null,
            "source": "/P12/xxxx.p12",
            "state": "present",
            "unsafe_writes": false
        }
    },
    "msg": "{'code': 400, 'message': 'Key management library returned bad status: -28, Bad password', 'errorStack': [], 'apiError': 26214401}"
}
f5networks.f5_bigip.bigip_device_info
ansible [core 2.12.7]
config file = /home/horol/DEV/mvsr/mvdc-ansible/ansible.cfg
configured module search path = ['/home/horol/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/horol/.local/lib/python3.8/site-packages/ansible
ansible collection location = /home/horol/.ansible/collections/ansible_collections:/usr/share/ansible/collections
executable location = /home/horol/.local/bin/ansible
python version = 3.8.10 (default, Jun 22 2022, 20:18:18) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3
Build 0.0.12
Edition Final
Date Tue Jun 7 19:57:05 PDT 2022
Ubuntu 20.04.4 LTS
Getting device info (f5networks.f5_bigip.bigip_device_info) finishes with this message:
TASK [> 01: Getting device facts] **********************************************
fatal: [dcb_bigip_aci_01]: FAILED! => {"changed": false, "module_stderr": "command timeout triggered, timeout value is 30 secs.\nSee the timeout setting options in the Network Debug and Troubleshooting Guide.", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"}
fatal: [dcb_bigip_aci_02]: FAILED! => {"changed": false, "module_stderr": "command timeout triggered, timeout value is 30 secs.\nSee the timeout setting options in the Network Debug and Troubleshooting Guide.", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"}
We tested this from several hosts running Ansible, with the same result. The task used:
- name: "> 01: Getting device facts"
  f5networks.f5_bigip.bigip_device_info:
    gather_subset:
      - devices
  # delegate_to: localhost
  # no_log: true
  register: device_facts
  tags: always
When ICMP to the device is enabled, the task above works correctly; but when ICMP is disallowed, we get a timeout and no response from the BIG-IP device.
It would be good to get a result from the BIG-IP even when ICMP to the device is not allowed and only HTTPS is permitted. In some environments ICMP is disabled while HTTPS is allowed for REST connections.
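Since only the REST port needs to be reachable, a plain TCP probe of the management port is a better liveness check than ping in such environments. A small sketch, independent of the collection:

```python
import socket

def mgmt_port_reachable(host, port=443, timeout=5.0):
    """Probe the REST management port over TCP.

    Unlike an ICMP ping, this succeeds in environments where ICMP to the
    device is blocked but HTTPS is allowed for REST connections.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, ...
        return False
```

Checking reachability this way (instead of via ICMP) matches what the httpapi connection actually needs to work.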
Ansible module: bigip_sslo_config_topology
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
While the policy does appear to get attached to the LTM VIP, it is not rendered in the SSLO UI.
Using this declaration:
- name: Create SSLO Topology
  bigip_sslo_config_topology:
    name: "l3inboundapp"
    topology_type: "inbound_l3"
    dest: "10.0.2.200/32"
    port: 443
    ssl_settings: "sslconfig"
    security_policy: "ssloP_sslopolicy"
    vlans:
      - "/Common/external"
    snat: "automap"
    pool: "/Common/webapp"
Navigate to Local Traffic - Virtual Servers in the UI and edit the topology LTM VIP. The SSLO security policies will be listed here.
Navigate to SSL Orchestrator - Configuration in the UI and edit the topology. The topology will be stuck on the policy page and will not allow any edits or the ability to select an existing policy.
Ansible module: bigip_sslo_config_policy
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
Module line 1126 should likely read "server_port_match", as "client_port_match" is listed twice:
('condition_type', 'client_port_match', ['condition_option_ports'], True),
('condition_type', 'client_port_match', ['condition_option_ports'], True),
No options in the policy module to create/re-create this behavior.
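The duplicated tuple above can be caught mechanically. Below is an illustrative sketch (not the module's code) showing the corrected pair plus a helper that flags repeated condition values in a required_if-style list:

```python
from collections import Counter

# corrected list: the second entry reads server_port_match instead of
# repeating client_port_match
required_if = [
    ('condition_type', 'client_port_match', ['condition_option_ports'], True),
    ('condition_type', 'server_port_match', ['condition_option_ports'], True),
]

def duplicate_conditions(rules):
    """Each rule is (parameter, value, required_params, one_of); a value
    appearing twice for the same parameter shadows the other entry."""
    counts = Counter((param, value) for param, value, _, _ in rules)
    return [pair for pair, n in counts.items() if n > 1]
```

Running the helper over the original (buggy) list would report the duplicated ('condition_type', 'client_port_match') pair.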
f5networks.f5_bigip 1.4.0 bigip_device_info
Not related to Ansible version, Ansible 2.9.25 used
BigIP 14.1.4.4 build 0.0.4
no specific, see playbook
RHEL 7.9 but issue not OS related
Cannot gather bigip_device_info data when using a user with Auditor rights. Please improve the collection so that bigip_device_info works with read-only rights.
To reproduce: on the BIG-IP, set the user's rights to Auditor, then run the testing playbook.
- name: test
  hosts: f5bigip
  connection: httpapi
  gather_facts: false
  # Connection Info
  vars:
    ansible_user: "ansible"
    ansible_httpapi_port: 443
    ansible_network_os: f5networks.f5_bigip.bigip
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    - name: Collect information about virtual servers, client ssl profiles and server ssl profiles
      f5networks.f5_bigip.bigip_device_info:
        gather_subset:
          - virtual-servers
          - irules
As we are not using modules other than bigip_device_info, it is sufficient for us to have less than administrator rights. When running the playbook with Auditor rights, we get:
TASK [Collect information about virtual servers, client ssl profiles and server ssl profiles] *******************************************************************************************************************
task path: /home/vacmille/ansible/f5-testv2.yml:18
<f5-inventory-host> attempting to start connection
<f5-inventory-host> using connection plugin httpapi
<f5-inventory-host> local domain socket does not exist, starting it
<f5-inventory-host> control socket path is /home/vacmille/.ansible/pc/1a5ab3cca9
<f5-inventory-host> local domain socket listeners started successfully
<f5-inventory-host> loaded API plugin ansible_collections.f5networks.f5_bigip.plugins.httpapi.bigip from path /home/vacmille/ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/httpapi/bigip.py for network_os f5networks.f5_bigip.bigip
<f5-inventory-host>
<f5-inventory-host> local domain socket path is /home/vacmille/.ansible/pc/1a5ab3cca9
<f5-inventory-host> Using network group action f5networks.f5_bigip.bigip for f5networks.f5_bigip.bigip_device_info
<f5-inventory-host> ANSIBLE_NETWORK_IMPORT_MODULES: disabled
<f5-inventory-host> ANSIBLE_NETWORK_IMPORT_MODULES: module execution time may be extended
<f5-inventory-host> ESTABLISH LOCAL CONNECTION FOR USER: vacmille
<f5-inventory-host> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod `"&& mkdir "` echo /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417 `" && echo ansible-tmp-1639993589.098304-27742-168608236643417="` echo /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417 `" ) && sleep 0'
<avx-bigip01.dhl.com> Attempting python interpreter discovery
<f5-inventory-host> EXEC /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'/usr/bin/python'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'python3.6'"'"'; command -v '"'"'python3.5'"'"'; command -v '"'"'python2.7'"'"'; command -v '"'"'python2.6'"'"'; command -v '"'"'/usr/libexec/platform-python'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python'"'"'; echo ENDFOUND && sleep 0'
<f5-inventory-host> EXEC /bin/sh -c '/usr/bin/python && sleep 0'
Using module file /home/vacmille/ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py
<f5-inventory-host> PUT /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/tmpw8f4c9yd TO /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417/AnsiballZ_bigip_device_info.py
<f5-inventory-host> EXEC /bin/sh -c 'chmod u+x /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417/ /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417/AnsiballZ_bigip_device_info.py && sleep 0'
<f5-inventory-host> EXEC /bin/sh -c '/usr/bin/python /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417/AnsiballZ_bigip_device_info.py && sleep 0'
<f5-inventory-host> EXEC /bin/sh -c 'rm -f -r /home/vacmille/.ansible/tmp/ansible-local-277360fly9cod/ansible-tmp-1639993589.098304-27742-168608236643417/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
WARNING: The below traceback may *not* be related to the actual failure.
File "/tmp/ansible_f5networks.f5_bigip.bigip_device_info_payload_pXwFRq/ansible_f5networks.f5_bigip.bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16786, in main
File "/tmp/ansible_f5networks.f5_bigip.bigip_device_info_payload_pXwFRq/ansible_f5networks.f5_bigip.bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16504, in exec_module
File "/tmp/ansible_f5networks.f5_bigip.bigip_device_info_payload_pXwFRq/ansible_f5networks.f5_bigip.bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16581, in execute_managers
File "/tmp/ansible_f5networks.f5_bigip.bigip_device_info_payload_pXwFRq/ansible_f5networks.f5_bigip.bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/client.py", line 153, in packages_installed
raise F5ModuleError(response['contents'])
fatal: [avx-bigip01.dhl.com]: FAILED! => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python"
    },
    "changed": false,
    "invocation": {
        "module_args": {
            "gather_subset": [
                "virtual-servers",
                "irules"
            ]
        }
    },
    "msg": "{u'message': u'Authorization failed: user=https://localhost/mgmt/shared/authz/users/ansible resource=/mgmt/shared/iapp/global-installed-packages verb=GET uri:http://localhost:8100/mgmt/shared/iapp/global-installed-packages referrer:165.72.0.72 sender:165.72.0.72', u'code': 401, u'referer': u'165.72.0.72', u'restOperationId': 79201901, u'kind': u':resterrorresponse'}"
}
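The traceback points at packages_installed in module_utils/client.py, which raises on any error response. A hypothetical, more tolerant variant could treat a 401 on the package-listing endpoint as "no packages discoverable" instead of aborting; everything below, including the items/packageName field names, is an illustrative guess, not the collection's code:

```python
class F5ModuleError(Exception):
    pass

def packages_installed(get_response):
    """get_response() stands in for the GET on
    /mgmt/shared/iapp/global-installed-packages."""
    response = get_response()
    if response['code'] == 401:
        # read-only accounts (e.g. Auditor) cannot see this endpoint;
        # degrade gracefully instead of failing the whole module
        return []
    if response['code'] != 200:
        raise F5ModuleError(response['contents'])
    return [p['packageName'] for p in response['contents'].get('items', [])]
```

With this behavior, gather_subset targets that need no package information (such as virtual-servers and irules) could still be collected by an Auditor account.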
f5networks.f5_bigip.bigip_as3_deploy
ansible [core 2.12.2]
config file = /home/cloud/ansible.cfg
configured module search path = ['/home/cloud/plugins/modules']
ansible python module location = /home/cloud/.venv/core212/lib/python3.8/site-packages/ansible
ansible collection location = /home/cloud/.ansible/collections:/usr/share/ansible/collections
executable location = /home/cloud/.venv/core212/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.0.3
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.3
Build 0.21.11
Edition Engineering Hotfix
Date Wed Apr 28 09:38:30 PDT 2021
Hotfix List
ID940185-2 ID976505-2 ID755976-4 ID959629-2 ID981069-1 ID941625-1
ID1000973-3 ID746861-3 ID880289 ID975809-1 ID948717-3
N/A
N/A
Referring to the Ansible built-in connection variable ansible_host from within the required collection variable ansible_host causes the playbook to fail with a "recursive error".
---
- name: AS3
  hosts: "{{ targets }}"
  connection: httpapi
  gather_facts: false
  vars_files:
    - vault.yaml
  vars:
    ansible_host: "{{ ansible_host }}"
    ansible_user: admin
    ansible_httpapi_password: "{{ vault_password }}"
    ansible_httpapi_port: 443
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    - name: Deploy or Update AS3
      f5networks.f5_bigip.bigip_as3_deploy:
        content: "{{ lookup('file', 'as3.json') }}"
The playbook should complete with no errors, as its parameters are taken from the official collection documentation. Instead, it fails with a "recursive error" caused by referring to the ansible_host variable.
TASK [Deploy or Update AS3] *************************************************************************************************************************************************************************************
Friday 12 August 2022 14:30:13 +0000 (0:00:00.162) 0:00:00.162 *********
fatal: [qa-de-1a-f512-02]: FAILED! => {"msg": "An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while templating '{{ ansible_host }}'.
<repeating output omitted>
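The failure mode is easy to reproduce with a toy template resolver: since ansible_host is defined as "{{ ansible_host }}", every expansion step re-resolves the same variable and never terminates. The sketch below is a deliberately simplified model, not Ansible's templar:

```python
import re

def resolve(value, variables, depth=0, max_depth=10):
    """Expand {{ name }} references recursively, like a (much simplified)
    templar.  A self-referential definition recurses until the depth
    limit trips, mirroring Ansible's 'recursive loop detected' error."""
    if depth > max_depth:
        raise RecursionError('recursive loop detected while templating %r' % value)
    def repl(match):
        return str(resolve(variables[match.group(1)], variables, depth + 1, max_depth))
    if isinstance(value, str):
        return re.sub(r'\{\{\s*(\w+)\s*\}\}', repl, value)
    return value
```

The workaround in the original playbook above is the same idea in reverse: name the play variable something other than ansible_host (here, provider.server) and reference that.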
bigip_sslo_config_ssl
ansible [core 2.12.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3.2
Build 0.0.4
Edition Point Release 2
Date Wed Sep 14 08:12:07 PDT 2022
9.3.41
No specific system/ansible configuration changes
Ubuntu 20.04
Python 3.8.10
Forward proxy "block_expired" and "block_untrusted" should default to "no" (false) unless specified. Also, these settings seem to have no effect unless you run the playbook twice. The correct settings should be "drop" and "ignore", with "ignore" being the default for a forward proxy and "drop" the default for a reverse proxy.
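The defaulting behavior the report asks for can be stated compactly. This is a sketch of the expected behavior as described above, not the module's implementation; the parameter names mirror the playbook options:

```python
def _pick(value, default):
    # user-supplied booleans map to explicit settings; None means "not given"
    if value is None:
        return default
    return 'drop' if value else 'ignore'

def cert_validation_defaults(proxy_type, block_untrusted=None, block_expired=None):
    """Per the report: 'ignore' should be the default for a forward proxy
    and 'drop' for a reverse proxy, applied only when no value was given."""
    default = 'ignore' if proxy_type == 'forward' else 'drop'
    return _pick(block_untrusted, default), _pick(block_expired, default)
```

This makes the defaults depend on proxy_type while still honoring any value the playbook sets explicitly.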
---
# Reference: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/bigip_sslo_config_ssl_module.html#bigip-sslo-config-ssl-module-2
- name: Create SSLO SSL Outbound Configuration
  hosts: all
  gather_facts: False
  collections:
    - f5networks.f5_bigip
  connection: httpapi
  vars:
    #ansible_host: "172.16.1.83"
    ansible_httpapi_port: 443
    ansible_user: "admin"
    ansible_httpapi_password: "admin"
    ansible_network_os: f5networks.f5_bigip.bigip
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    ## import cert/key
    - name: Import CA cert/key
      bigip_ssl_key_cert:
        key_content: "{{ lookup('file', 'certs/subrsa.f5labs.com.pemk') }}"
        key_name: subrsa.f5labs.com
        cert_content: "{{ lookup('file', 'certs/subrsa.f5labs.com.crt') }}"
        cert_name: subrsa.f5labs.com

    ## SSL Configuration (simple)
    - name: Create an SSLO outbound SSL config
      bigip_sslo_config_ssl:
        name: "sslconfig"
        state: "absent"
        client_settings:
          proxy_type: "forward"
          ca_cert: "/Common/subrsa.f5labs.com.crt"
          ca_key: "/Common/subrsa.f5labs.com.key"
        server_settings:
          block_expired: false
          block_untrusted: false
bigip_sslo_config_ssl (all versions)
There is no way to control TLS processing options, e.g. to enable TLS 1.3. This is expressed in the "enabledSSLProcessingOptions" array in the JSON.
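For illustration only: if the module exposed that array, enabling TLS 1.3 could look like the sketch below. Only the enabledSSLProcessingOptions name comes from the report; the shape of each entry is a guess, not the SSLO schema.

```python
def enable_tls13(ssl_json):
    """Illustrative sketch: idempotently add a TLS 1.3 entry to the
    enabledSSLProcessingOptions array mentioned in the report.
    The {'name': ..., 'value': ...} entry shape is an assumption."""
    opts = ssl_json.setdefault('enabledSSLProcessingOptions', [])
    if not any(o.get('name') == 'TLSv1.3' for o in opts):
        opts.append({'name': 'TLSv1.3', 'value': ''})
    return ssl_json
```

The point is simply that the module would need a parameter mapping down to this array for users to control TLS processing.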
bigip_sslo_service_layer2 (>= 9.3)
The Inline L2 service definition does not provide a vendor_info option, as described in: f5-ssl-orchestrator-service (Inline Level 2)
None
bigip_fast_application
ansible --version
ansible [core 2.13.5]
config file = /ansible/ansible.cfg
configured module search path = ['/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.10/site-packages/ansible
ansible collection location = /ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.10.6 (main, Aug 23 2022, 08:25:41) [GCC 10.2.1 20210110]
jinja version = 3.1.2
libyaml = True
python3 --version
Python 3.10.6
Product: BIG-IP
Version: 16.1.2.1
Build: 0.0.10
Sequence: 16.1.2.1-0.0.10.0
- name: "FAST declaration"
  set_fact:
    fast_declaration:
      tenant_name: TestTenant
      application_name: TestApp
      virtual_port: 80
      virtual_address: 10.1.2.3
      server_port: 80
      server_addresses:
        - 10.1.2.100
        - 10.1.2.101
  tags:
    - f5-fast-test

- name: "FAST declaration debug"
  debug:
    msg: "{{ fast_declaration }}"
  tags:
    - f5-fast-test

- name: "FAST create"
  f5networks.f5_bigip.bigip_fast_application:
    template: "examples/simple_http"
    application: "{{ fast_declaration.application_name }}"
    tenant: "{{ fast_declaration.tenant_name }}"
    content: "{{ fast_declaration }}"
    state: "create"
  tags:
    - f5-fast-test

- name: "FAST present"
  f5networks.f5_bigip.bigip_fast_application:
    template: "examples/simple_http"
    application: "{{ fast_declaration.application_name }}"
    tenant: "{{ fast_declaration.tenant_name }}"
    content: "{{ fast_declaration }}"
    state: "present"
  tags:
    - f5-fast-test
The f5networks.f5_bigip.bigip_fast_application module fails with the below error.
TASK [f5fast : FAST present] *******************************************************************************************************************************************************************************************************
task path: /ansible/roles/myrole/tasks/main.yml:61
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
<10.1.1.5> ESTABLISH HTTP(S) CONNECT FOR USER: admin TO https://10.1.1.5:443
fatal: [bigip16]: FAILED! => {
"changed": false,
"module_stderr": "list indices must be integers or slices, not str",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"
}
To reproduce, use the Ansible tasks listed above. Expected behavior: no module failure.
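The error message "list indices must be integers or slices, not str" suggests the module indexes a list response with a string key somewhere. A defensive lookup, sketched below with invented field names (tenant, name), would tolerate both shapes:

```python
def find_fast_application(payload, tenant, application):
    """Defensive lookup: a FAST endpoint may return either a list of
    application objects or a dict keyed by tenant (the field names used
    here are illustrative guesses, not the documented schema).
    Indexing a list with a string is exactly what raises
    'list indices must be integers or slices, not str'."""
    if isinstance(payload, list):
        for item in payload:
            if item.get('tenant') == tenant and item.get('name') == application:
                return item
        return None
    return payload.get(tenant, {}).get(application)
```

Whatever the module's real parsing code looks like, an isinstance check like this at the point of failure would turn the crash into a clean "not found" result.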
Module: f5networks.f5_bigip.bigip_fast_application (collection version 1.12.0)
$ ansible --version
ansible [core 2.14.1]
config file = /ansible/ansible.cfg
configured module search path = ['/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /ansible/.local/lib/python3.10/site-packages/ansible
ansible collection location = /ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.10.6 (main, Aug 23 2022, 08:25:41) [GCC 10.2.1 20210110] (/usr/local/bin/python)
jinja version = 3.1.2
libyaml = True
# cat /VERSION
Product: BIG-IP
Version: 16.1.3.2
Build: 0.0.4
Sequence: 16.1.3.2-0.0.4.0
BaseBuild: 0.0.4
Edition: Point Release 2
Date: Wed Sep 14 08:12:07 PDT 2022
Built: 220914081207
Changelist: 3632323
JobID: 1383561
$ cat ansible.cfg
[defaults]
collections_paths=./collections
filter_plugins=./filter_plugins
module_utils=./module_utils
roles_path=./roles
inventory=./inventory
# https://stackoverflow.com/questions/61948417/how-to-measure-and-display-time-taken-for-tasks-when-running-ansible-playbook
#callbacks_enabled = profile_tasks
N/A
---
- name: "Repro"
hosts: all
gather_facts: false
connection: httpapi
vars:
provider: &bigip_provider
server: "10.1.1.4"
user: "admin"
password: "Secret_245"
validate_certs: "no"
server_port: 443
ansible_host: "{{ provider.server }}"
ansible_user: "{{ provider.user }}"
ansible_httpapi_password: "{{ provider.password }}"
ansible_httpapi_port: "{{ provider.server_port }}"
ansible_network_os: "f5networks.f5_bigip.bigip"
ansible_httpapi_use_ssl: "yes"
ansible_httpapi_validate_certs: "{{ provider.validate_certs }}"
tasks:
- name: "(0) FAST declaration"
ansible.builtin.set_fact:
fast_declaration: >-
{
"tenant_name": "MyTenantName",
"application_name": "MyAppName",
"virtual_port": 443,
"virtual_address": "192.0.2.202",
"WAF_policy_path": "/Common/this_waf_policy_does_not_exist",
"server_port": 80,
"server_address": [
"192.0.2.180"
],
"certificate": "-----BEGIN CERTIFICATE-----\nMIIDrjCCApagAwIBAgIEGFse8zANBgkqhkiG9w0BAQsFADCBmDELMAkGA1UEBhMC\nVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMRIwEAYDVQQKEwlNeUNv\nbXBhbnkxCzAJBgNVBAsTAklUMR4wHAYDVQQDExVsb2NhbGhvc3QubG9jYWxkb21h\naW4xKTAnBgkqhkiG9w0BCQEWGnJvb3RAbG9jYWxob3N0LmxvY2FsZG9tYWluMB4X\nDTIyMTIxMzE4NTQ0M1oXDTMyMTIxMDE4NTQ0M1owgZgxCzAJBgNVBAYTAlVTMQsw\nCQYDVQQIEwJXQTEQMA4GA1UEBxMHU2VhdHRsZTESMBAGA1UEChMJTXlDb21wYW55\nMQswCQYDVQQLEwJJVDEeMBwGA1UEAxMVbG9jYWxob3N0LmxvY2FsZG9tYWluMSkw\nJwYJKoZIhvcNAQkBFhpyb290QGxvY2FsaG9zdC5sb2NhbGRvbWFpbjCCASIwDQYJ\nKoZIhvcNAQEBBQADggEPADCCAQoCggEBALKLQBSjlGxbtuYPDPW+HCPt7+RjoUGC\nab0bGdEtOUxfCDnKKxT2GBjYrfvH7S/xtbdaI1cJbA0qPEhQljNyPudPoSunQ7D2\nl3ka/27jL/FKHL+/svkgLG4dlMVWpDhYKq8DaYdb6iI5qZFUfy95hn2QjkXm0Vrn\nmo7PiDPbnTVTGU6CsPQzmANdN0J5CR6l30ORMXKBwc9mYLUxHHyHI5HSIKrsrj9B\n4PNKDK7OJaYb0d21uuVNyZ281jbnKJs2W54YbhTGVGKcKRGR9Rbfd4LzcDgBtIt0\nZSPfrPu82s3CydqfqctZbg22NE2CUVzyK9yWM0pghghOUPSI2cIrwEsCAwEAATAN\nBgkqhkiG9w0BAQsFAAOCAQEAGdDW+kZde88JarcMfA/I78eDn9j4szGJn6gvCsHZ\niLJsugOxj6udrNXKn6NuekNoQXMae0kXVqVmGZouX8lzrn0I6+bR1TEkBmc+v9bF\nLXk096iUBaKPSAkoNOFeHQE/XRu4kk6TqedMneZvs2/725um+9kkPZde4luwCfk6\nqrki7MtAL9xhuxUVFpISjvsBQRwCg0ckQ1YvKYf+s7i9/4fBZB25biGzmVK7cEPx\nAPVjOHNlYwHCa9bp7SX4tuUrqGyZ8ib3OHxFdXVZzBKsNe02zldpDQ7ZyPlUzhhh\nwl3HfQXADQYtf5rIwsFxgrSDIS2U7oD39poucodCEzc2PQ==\n-----END CERTIFICATE-----",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCyi0AUo5RsW7bm\nDwz1vhwj7e/kY6FBgmm9GxnRLTlMXwg5yisU9hgY2K37x+0v8bW3WiNXCWwNKjxI\nUJYzcj7nT6Erp0Ow9pd5Gv9u4y/xShy/v7L5ICxuHZTFVqQ4WCqvA2mHW+oiOamR\nVH8veYZ9kI5F5tFa55qOz4gz2501UxlOgrD0M5gDXTdCeQkepd9DkTFygcHPZmC1\nMRx8hyOR0iCq7K4/QeDzSgyuziWmG9HdtbrlTcmdvNY25yibNlueGG4UxlRinCkR\nkfUW33eC83A4AbSLdGUj36z7vNrNwsnan6nLWW4NtjRNglFc8ivcljNKYIYITlD0\niNnCK8BLAgMBAAECggEAMvgnufycwXZJN1HynDDCbcteIXADt+TX9MFI1Hs5kUDL\n41uAgwJiDK3GtUr0viwdeRNFZXJuIy/8d5Rx3Ivvwy7rTr/4RguPYGZBp1E5/YLv\nxBmgqRfzNxhAwTkjtmYNAVtTA+5MX6rganmZuV7S8wOSaggjmfTmHYDHXC6EqZTs\n+BIALRKLYwHzriKM04zDCDcPzG9Wt3ZDLXBBAzWwtMOCDJXRE0PSQXl8C5p6IiLY\najRXXHVqrF4Q10rid6kJKsNKuXMOvoeguAeaPJlY0oOco7hUNmJjf78qKWBM7VuK\nwu0O0FPbRl/oVs9eLzBfEn2lOcjhF9LpyagM4k1fKQKBgQD48OFyk5USjfnmzlQk\nl9wb3/cYyZKpKVIDQ/2qtfRnuYKtSqmayyQwLaxlzbOKKvBenwsTF259Onz0Pt0/\nmJsFWv3DMJDh3QyCrTvoZnnyw3PT3NUN7cGiQQi0SH5M9mvF8gIRZEyG8LmPKJkE\niDW6PC95Y/N1zyaI3tOTnNDrtwKBgQC3m1d80Ce+kJzsz6CP/XD28qi5OeD5HJAp\njzPSRv6dGy1vyiwXv76nN4vQwzZqUo7g+9rJEiec0iDmpMelPkXHJteLZWi3QiN+\nfwqcMj4eYrFVZ2hdkKZhKI99zbAQ8fKJmfB6dVOYjfQOUUpF/tT17ozaAkGe5xHI\nwtuz7354DQKBgQCXIz/tHTb7feFERO6HDP/gmJhfnzoApAqb2vKuaywIsXNqHJNe\nXIkLCx/I6xte/nTTLcI+hBJby1/DtksDanZryPOaRukfh+IpkF132oedYRb4gPGF\nNF1EUjGjqwOrXEzQb/7bakagApTWGrLUMpJUEGhOTeWpF+xwWsCftSyOfwKBgQC3\nkrw9UY17TfFoIAuEC70HWwTw9PqHd1R4CPKiGlN11vdt3vCI6jB/1dyX5KYiVdr+\n/TD5eopalAlLMZNfFs0DWkWF3OV+3MTKM9Dy7JUJIln1bsd9TSPc3oXhHWcc+hsq\nEtzKQ0ZKsBtEuWgOZcSdA16WlkzvyE4SsSijVh/XfQKBgFCGWIJZ+QFw+hFCNbpW\nSKFpVeEQ49DjoiL1k0LjszOFLG6eI1X9MhELV0PgjosoYU4SUTgsnfQBUfxKlGvN\nZoxgntmaI1fk1o86Zw7QLpG5t5RSDde+xscvS1nEDs12zyijw5Me3NG/ShJ4XpIn\ntv20DKWvUIrJD2BhQNTdbWmx\n-----END PRIVATE KEY-----"
}
- name: "(1) FAST create application"
f5networks.f5_bigip.bigip_fast_application:
template: examples/simple_waf
content: "{{ fast_declaration }}"
tenant: MyTenantName
application: MyAppName
state: create
register: f5_fast_response
- name: "(2) FAST response debug"
ansible.builtin.debug:
var: f5_fast_response
The f5networks.f5_bigip.bigip_fast_application
module is expected to catch runtime errors in the FAST task as well.
Therefore the task "(1) FAST create application" above
is expected to fail.
The playbook tasks succeed.
ansible-playbook -i inventory/repro play-f5fast-issue-repro.yml -vvvv
ansible-playbook [core 2.14.1]
config file = /ansible/ansible.cfg
configured module search path = ['/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /ansible/.local/lib/python3.10/site-packages/ansible
ansible collection location = /ansible/collections
executable location = /usr/local/bin/ansible-playbook
python version = 3.10.6 (main, Aug 23 2022, 08:25:41) [GCC 10.2.1 20210110] (/usr/local/bin/python)
jinja version = 3.1.2
libyaml = True
Using /ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /ansible/inventory/repro/hosts.yml as it did not pass its verify_file() method
script declined parsing /ansible/inventory/repro/hosts.yml as it did not pass its verify_file() method
Parsed /ansible/inventory/repro/hosts.yml inventory source with yaml plugin
Loading collection f5networks.f5_bigip from /ansible/collections/ansible_collections/f5networks/f5_bigip
Loading callback plugin default of type stdout, v2.0 from /ansible/.local/lib/python3.10/site-packages/ansible/plugins/callback/default.py
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: play-f5fast-issue-repro.yml *********************************************************************************************************************************************************************************************************************************************************
Positional arguments: play-f5fast-issue-repro.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/ansible/inventory/repro',)
forks: 5
1 plays in play-f5fast-issue-repro.yml
PLAY [Repro] **********************************************************************************************************************************************************************************************************************************************************************************
TASK [(0) FAST declaration] *******************************************************************************************************************************************************************************************************************************************************************
task path: /ansible/play-f5fast-issue-repro.yml:22
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /ansible/collections/ansible_collections/ansible/netcommon
<10.1.1.4> attempting to start connection
<10.1.1.4> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /usr/local/bin/ansible-connection
<10.1.1.4> local domain socket does not exist, starting it
<10.1.1.4> control socket path is /ansible/.ansible/pc/2db776cd4c
<10.1.1.4> redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
<10.1.1.4> Loading collection ansible.netcommon from /ansible/collections/ansible_collections/ansible/netcommon
<10.1.1.4> Loading collection f5networks.f5_bigip from /ansible/collections/ansible_collections/f5networks/f5_bigip
<10.1.1.4> local domain socket listeners started successfully
<10.1.1.4> loaded API plugin ansible_collections.f5networks.f5_bigip.plugins.httpapi.bigip from path /ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/httpapi/bigip.py for platform type f5networks.f5_bigip.bigip
<10.1.1.4>
<10.1.1.4> local domain socket path is /ansible/.ansible/pc/2db776cd4c
ok: [bigip] => {
"ansible_facts": {
"fast_declaration": "{\n \"tenant_name\": \"MyTenantName\",\n \"application_name\": \"MyAppName\",\n \"virtual_port\": 443,\n \"virtual_address\": \"192.0.2.202\",\n \"WAF_policy_path\": \"/Common/this_waf_policy_does_not_exist\",\n \"server_port\": 80,\n \"server_address\": [\n \"192.0.2.180\"\n ],\n \"certificate\": \"-----BEGIN CERTIFICATE-----\\nMIIDrjCCApagAwIBAgIEGFse8zANBgkqhkiG9w0BAQsFADCBmDELMAkGA1UEBhMC\\nVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMRIwEAYDVQQKEwlNeUNv\\nbXBhbnkxCzAJBgNVBAsTAklUMR4wHAYDVQQDExVsb2NhbGhvc3QubG9jYWxkb21h\\naW4xKTAnBgkqhkiG9w0BCQEWGnJvb3RAbG9jYWxob3N0LmxvY2FsZG9tYWluMB4X\\nDTIyMTIxMzE4NTQ0M1oXDTMyMTIxMDE4NTQ0M1owgZgxCzAJBgNVBAYTAlVTMQsw\\nCQYDVQQIEwJXQTEQMA4GA1UEBxMHU2VhdHRsZTESMBAGA1UEChMJTXlDb21wYW55\\nMQswCQYDVQQLEwJJVDEeMBwGA1UEAxMVbG9jYWxob3N0LmxvY2FsZG9tYWluMSkw\\nJwYJKoZIhvcNAQkBFhpyb290QGxvY2FsaG9zdC5sb2NhbGRvbWFpbjCCASIwDQYJ\\nKoZIhvcNAQEBBQADggEPADCCAQoCggEBALKLQBSjlGxbtuYPDPW+HCPt7+RjoUGC\\nab0bGdEtOUxfCDnKKxT2GBjYrfvH7S/xtbdaI1cJbA0qPEhQljNyPudPoSunQ7D2\\nl3ka/27jL/FKHL+/svkgLG4dlMVWpDhYKq8DaYdb6iI5qZFUfy95hn2QjkXm0Vrn\\nmo7PiDPbnTVTGU6CsPQzmANdN0J5CR6l30ORMXKBwc9mYLUxHHyHI5HSIKrsrj9B\\n4PNKDK7OJaYb0d21uuVNyZ281jbnKJs2W54YbhTGVGKcKRGR9Rbfd4LzcDgBtIt0\\nZSPfrPu82s3CydqfqctZbg22NE2CUVzyK9yWM0pghghOUPSI2cIrwEsCAwEAATAN\\nBgkqhkiG9w0BAQsFAAOCAQEAGdDW+kZde88JarcMfA/I78eDn9j4szGJn6gvCsHZ\\niLJsugOxj6udrNXKn6NuekNoQXMae0kXVqVmGZouX8lzrn0I6+bR1TEkBmc+v9bF\\nLXk096iUBaKPSAkoNOFeHQE/XRu4kk6TqedMneZvs2/725um+9kkPZde4luwCfk6\\nqrki7MtAL9xhuxUVFpISjvsBQRwCg0ckQ1YvKYf+s7i9/4fBZB25biGzmVK7cEPx\\nAPVjOHNlYwHCa9bp7SX4tuUrqGyZ8ib3OHxFdXVZzBKsNe02zldpDQ7ZyPlUzhhh\\nwl3HfQXADQYtf5rIwsFxgrSDIS2U7oD39poucodCEzc2PQ==\\n-----END CERTIFICATE-----\",\n \"private_key\": \"-----BEGIN PRIVATE 
KEY-----\\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCyi0AUo5RsW7bm\\nDwz1vhwj7e/kY6FBgmm9GxnRLTlMXwg5yisU9hgY2K37x+0v8bW3WiNXCWwNKjxI\\nUJYzcj7nT6Erp0Ow9pd5Gv9u4y/xShy/v7L5ICxuHZTFVqQ4WCqvA2mHW+oiOamR\\nVH8veYZ9kI5F5tFa55qOz4gz2501UxlOgrD0M5gDXTdCeQkepd9DkTFygcHPZmC1\\nMRx8hyOR0iCq7K4/QeDzSgyuziWmG9HdtbrlTcmdvNY25yibNlueGG4UxlRinCkR\\nkfUW33eC83A4AbSLdGUj36z7vNrNwsnan6nLWW4NtjRNglFc8ivcljNKYIYITlD0\\niNnCK8BLAgMBAAECggEAMvgnufycwXZJN1HynDDCbcteIXADt+TX9MFI1Hs5kUDL\\n41uAgwJiDK3GtUr0viwdeRNFZXJuIy/8d5Rx3Ivvwy7rTr/4RguPYGZBp1E5/YLv\\nxBmgqRfzNxhAwTkjtmYNAVtTA+5MX6rganmZuV7S8wOSaggjmfTmHYDHXC6EqZTs\\n+BIALRKLYwHzriKM04zDCDcPzG9Wt3ZDLXBBAzWwtMOCDJXRE0PSQXl8C5p6IiLY\\najRXXHVqrF4Q10rid6kJKsNKuXMOvoeguAeaPJlY0oOco7hUNmJjf78qKWBM7VuK\\nwu0O0FPbRl/oVs9eLzBfEn2lOcjhF9LpyagM4k1fKQKBgQD48OFyk5USjfnmzlQk\\nl9wb3/cYyZKpKVIDQ/2qtfRnuYKtSqmayyQwLaxlzbOKKvBenwsTF259Onz0Pt0/\\nmJsFWv3DMJDh3QyCrTvoZnnyw3PT3NUN7cGiQQi0SH5M9mvF8gIRZEyG8LmPKJkE\\niDW6PC95Y/N1zyaI3tOTnNDrtwKBgQC3m1d80Ce+kJzsz6CP/XD28qi5OeD5HJAp\\njzPSRv6dGy1vyiwXv76nN4vQwzZqUo7g+9rJEiec0iDmpMelPkXHJteLZWi3QiN+\\nfwqcMj4eYrFVZ2hdkKZhKI99zbAQ8fKJmfB6dVOYjfQOUUpF/tT17ozaAkGe5xHI\\nwtuz7354DQKBgQCXIz/tHTb7feFERO6HDP/gmJhfnzoApAqb2vKuaywIsXNqHJNe\\nXIkLCx/I6xte/nTTLcI+hBJby1/DtksDanZryPOaRukfh+IpkF132oedYRb4gPGF\\nNF1EUjGjqwOrXEzQb/7bakagApTWGrLUMpJUEGhOTeWpF+xwWsCftSyOfwKBgQC3\\nkrw9UY17TfFoIAuEC70HWwTw9PqHd1R4CPKiGlN11vdt3vCI6jB/1dyX5KYiVdr+\\n/TD5eopalAlLMZNfFs0DWkWF3OV+3MTKM9Dy7JUJIln1bsd9TSPc3oXhHWcc+hsq\\nEtzKQ0ZKsBtEuWgOZcSdA16WlkzvyE4SsSijVh/XfQKBgFCGWIJZ+QFw+hFCNbpW\\nSKFpVeEQ49DjoiL1k0LjszOFLG6eI1X9MhELV0PgjosoYU4SUTgsnfQBUfxKlGvN\\nZoxgntmaI1fk1o86Zw7QLpG5t5RSDde+xscvS1nEDs12zyijw5Me3NG/ShJ4XpIn\\ntv20DKWvUIrJD2BhQNTdbWmx\\n-----END PRIVATE KEY-----\"\n}"
},
"changed": false
}
TASK [(1) FAST create application] ************************************************************************************************************************************************************************************************************************************************************
task path: /ansible/play-f5fast-issue-repro.yml:39
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /ansible/collections/ansible_collections/ansible/netcommon
<10.1.1.4> attempting to start connection
<10.1.1.4> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /usr/local/bin/ansible-connection
<10.1.1.4> found existing local domain socket, using it!
<10.1.1.4> updating play_context for connection
<10.1.1.4>
<10.1.1.4> local domain socket path is /ansible/.ansible/pc/2db776cd4c
<10.1.1.4> Using network group action f5networks.f5_bigip.bigip for f5networks.f5_bigip.bigip_fast_application
<{{ provider.server }}> ANSIBLE_NETWORK_IMPORT_MODULES: enabled
<{{ provider.server }}> ANSIBLE_NETWORK_IMPORT_MODULES: found f5networks.f5_bigip.bigip_fast_application at /ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_fast_application.py
<{{ provider.server }}> ANSIBLE_NETWORK_IMPORT_MODULES: running f5networks.f5_bigip.bigip_fast_application
<{{ provider.server }}> ANSIBLE_NETWORK_IMPORT_MODULES: complete
ok: [bigip] => {
"application": "MyAppName",
"changed": null,
"content": {
"WAF_policy_path": "/Common/this_waf_policy_does_not_exist",
"application_name": "MyAppName",
"certificate": "-----BEGIN CERTIFICATE-----\nMIIDrjCCApagAwIBAgIEGFse8zANBgkqhkiG9w0BAQsFADCBmDELMAkGA1UEBhMC\nVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMRIwEAYDVQQKEwlNeUNv\nbXBhbnkxCzAJBgNVBAsTAklUMR4wHAYDVQQDExVsb2NhbGhvc3QubG9jYWxkb21h\naW4xKTAnBgkqhkiG9w0BCQEWGnJvb3RAbG9jYWxob3N0LmxvY2FsZG9tYWluMB4X\nDTIyMTIxMzE4NTQ0M1oXDTMyMTIxMDE4NTQ0M1owgZgxCzAJBgNVBAYTAlVTMQsw\nCQYDVQQIEwJXQTEQMA4GA1UEBxMHU2VhdHRsZTESMBAGA1UEChMJTXlDb21wYW55\nMQswCQYDVQQLEwJJVDEeMBwGA1UEAxMVbG9jYWxob3N0LmxvY2FsZG9tYWluMSkw\nJwYJKoZIhvcNAQkBFhpyb290QGxvY2FsaG9zdC5sb2NhbGRvbWFpbjCCASIwDQYJ\nKoZIhvcNAQEBBQADggEPADCCAQoCggEBALKLQBSjlGxbtuYPDPW+HCPt7+RjoUGC\nab0bGdEtOUxfCDnKKxT2GBjYrfvH7S/xtbdaI1cJbA0qPEhQljNyPudPoSunQ7D2\nl3ka/27jL/FKHL+/svkgLG4dlMVWpDhYKq8DaYdb6iI5qZFUfy95hn2QjkXm0Vrn\nmo7PiDPbnTVTGU6CsPQzmANdN0J5CR6l30ORMXKBwc9mYLUxHHyHI5HSIKrsrj9B\n4PNKDK7OJaYb0d21uuVNyZ281jbnKJs2W54YbhTGVGKcKRGR9Rbfd4LzcDgBtIt0\nZSPfrPu82s3CydqfqctZbg22NE2CUVzyK9yWM0pghghOUPSI2cIrwEsCAwEAATAN\nBgkqhkiG9w0BAQsFAAOCAQEAGdDW+kZde88JarcMfA/I78eDn9j4szGJn6gvCsHZ\niLJsugOxj6udrNXKn6NuekNoQXMae0kXVqVmGZouX8lzrn0I6+bR1TEkBmc+v9bF\nLXk096iUBaKPSAkoNOFeHQE/XRu4kk6TqedMneZvs2/725um+9kkPZde4luwCfk6\nqrki7MtAL9xhuxUVFpISjvsBQRwCg0ckQ1YvKYf+s7i9/4fBZB25biGzmVK7cEPx\nAPVjOHNlYwHCa9bp7SX4tuUrqGyZ8ib3OHxFdXVZzBKsNe02zldpDQ7ZyPlUzhhh\nwl3HfQXADQYtf5rIwsFxgrSDIS2U7oD39poucodCEzc2PQ==\n-----END CERTIFICATE-----",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCyi0AUo5RsW7bm\nDwz1vhwj7e/kY6FBgmm9GxnRLTlMXwg5yisU9hgY2K37x+0v8bW3WiNXCWwNKjxI\nUJYzcj7nT6Erp0Ow9pd5Gv9u4y/xShy/v7L5ICxuHZTFVqQ4WCqvA2mHW+oiOamR\nVH8veYZ9kI5F5tFa55qOz4gz2501UxlOgrD0M5gDXTdCeQkepd9DkTFygcHPZmC1\nMRx8hyOR0iCq7K4/QeDzSgyuziWmG9HdtbrlTcmdvNY25yibNlueGG4UxlRinCkR\nkfUW33eC83A4AbSLdGUj36z7vNrNwsnan6nLWW4NtjRNglFc8ivcljNKYIYITlD0\niNnCK8BLAgMBAAECggEAMvgnufycwXZJN1HynDDCbcteIXADt+TX9MFI1Hs5kUDL\n41uAgwJiDK3GtUr0viwdeRNFZXJuIy/8d5Rx3Ivvwy7rTr/4RguPYGZBp1E5/YLv\nxBmgqRfzNxhAwTkjtmYNAVtTA+5MX6rganmZuV7S8wOSaggjmfTmHYDHXC6EqZTs\n+BIALRKLYwHzriKM04zDCDcPzG9Wt3ZDLXBBAzWwtMOCDJXRE0PSQXl8C5p6IiLY\najRXXHVqrF4Q10rid6kJKsNKuXMOvoeguAeaPJlY0oOco7hUNmJjf78qKWBM7VuK\nwu0O0FPbRl/oVs9eLzBfEn2lOcjhF9LpyagM4k1fKQKBgQD48OFyk5USjfnmzlQk\nl9wb3/cYyZKpKVIDQ/2qtfRnuYKtSqmayyQwLaxlzbOKKvBenwsTF259Onz0Pt0/\nmJsFWv3DMJDh3QyCrTvoZnnyw3PT3NUN7cGiQQi0SH5M9mvF8gIRZEyG8LmPKJkE\niDW6PC95Y/N1zyaI3tOTnNDrtwKBgQC3m1d80Ce+kJzsz6CP/XD28qi5OeD5HJAp\njzPSRv6dGy1vyiwXv76nN4vQwzZqUo7g+9rJEiec0iDmpMelPkXHJteLZWi3QiN+\nfwqcMj4eYrFVZ2hdkKZhKI99zbAQ8fKJmfB6dVOYjfQOUUpF/tT17ozaAkGe5xHI\nwtuz7354DQKBgQCXIz/tHTb7feFERO6HDP/gmJhfnzoApAqb2vKuaywIsXNqHJNe\nXIkLCx/I6xte/nTTLcI+hBJby1/DtksDanZryPOaRukfh+IpkF132oedYRb4gPGF\nNF1EUjGjqwOrXEzQb/7bakagApTWGrLUMpJUEGhOTeWpF+xwWsCftSyOfwKBgQC3\nkrw9UY17TfFoIAuEC70HWwTw9PqHd1R4CPKiGlN11vdt3vCI6jB/1dyX5KYiVdr+\n/TD5eopalAlLMZNfFs0DWkWF3OV+3MTKM9Dy7JUJIln1bsd9TSPc3oXhHWcc+hsq\nEtzKQ0ZKsBtEuWgOZcSdA16WlkzvyE4SsSijVh/XfQKBgFCGWIJZ+QFw+hFCNbpW\nSKFpVeEQ49DjoiL1k0LjszOFLG6eI1X9MhELV0PgjosoYU4SUTgsnfQBUfxKlGvN\nZoxgntmaI1fk1o86Zw7QLpG5t5RSDde+xscvS1nEDs12zyijw5Me3NG/ShJ4XpIn\ntv20DKWvUIrJD2BhQNTdbWmx\n-----END PRIVATE KEY-----",
"server_address": [
"192.0.2.180"
],
"server_port": 80,
"tenant_name": "MyTenantName",
"virtual_address": "192.0.2.202",
"virtual_port": 443
},
"invocation": {
"module_args": {
"application": "MyAppName",
"content": {
"WAF_policy_path": "/Common/this_waf_policy_does_not_exist",
"application_name": "MyAppName",
"certificate": "-----BEGIN CERTIFICATE-----\nMIIDrjCCApagAwIBAgIEGFse8zANBgkqhkiG9w0BAQsFADCBmDELMAkGA1UEBhMC\nVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMRIwEAYDVQQKEwlNeUNv\nbXBhbnkxCzAJBgNVBAsTAklUMR4wHAYDVQQDExVsb2NhbGhvc3QubG9jYWxkb21h\naW4xKTAnBgkqhkiG9w0BCQEWGnJvb3RAbG9jYWxob3N0LmxvY2FsZG9tYWluMB4X\nDTIyMTIxMzE4NTQ0M1oXDTMyMTIxMDE4NTQ0M1owgZgxCzAJBgNVBAYTAlVTMQsw\nCQYDVQQIEwJXQTEQMA4GA1UEBxMHU2VhdHRsZTESMBAGA1UEChMJTXlDb21wYW55\nMQswCQYDVQQLEwJJVDEeMBwGA1UEAxMVbG9jYWxob3N0LmxvY2FsZG9tYWluMSkw\nJwYJKoZIhvcNAQkBFhpyb290QGxvY2FsaG9zdC5sb2NhbGRvbWFpbjCCASIwDQYJ\nKoZIhvcNAQEBBQADggEPADCCAQoCggEBALKLQBSjlGxbtuYPDPW+HCPt7+RjoUGC\nab0bGdEtOUxfCDnKKxT2GBjYrfvH7S/xtbdaI1cJbA0qPEhQljNyPudPoSunQ7D2\nl3ka/27jL/FKHL+/svkgLG4dlMVWpDhYKq8DaYdb6iI5qZFUfy95hn2QjkXm0Vrn\nmo7PiDPbnTVTGU6CsPQzmANdN0J5CR6l30ORMXKBwc9mYLUxHHyHI5HSIKrsrj9B\n4PNKDK7OJaYb0d21uuVNyZ281jbnKJs2W54YbhTGVGKcKRGR9Rbfd4LzcDgBtIt0\nZSPfrPu82s3CydqfqctZbg22NE2CUVzyK9yWM0pghghOUPSI2cIrwEsCAwEAATAN\nBgkqhkiG9w0BAQsFAAOCAQEAGdDW+kZde88JarcMfA/I78eDn9j4szGJn6gvCsHZ\niLJsugOxj6udrNXKn6NuekNoQXMae0kXVqVmGZouX8lzrn0I6+bR1TEkBmc+v9bF\nLXk096iUBaKPSAkoNOFeHQE/XRu4kk6TqedMneZvs2/725um+9kkPZde4luwCfk6\nqrki7MtAL9xhuxUVFpISjvsBQRwCg0ckQ1YvKYf+s7i9/4fBZB25biGzmVK7cEPx\nAPVjOHNlYwHCa9bp7SX4tuUrqGyZ8ib3OHxFdXVZzBKsNe02zldpDQ7ZyPlUzhhh\nwl3HfQXADQYtf5rIwsFxgrSDIS2U7oD39poucodCEzc2PQ==\n-----END CERTIFICATE-----",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCyi0AUo5RsW7bm\nDwz1vhwj7e/kY6FBgmm9GxnRLTlMXwg5yisU9hgY2K37x+0v8bW3WiNXCWwNKjxI\nUJYzcj7nT6Erp0Ow9pd5Gv9u4y/xShy/v7L5ICxuHZTFVqQ4WCqvA2mHW+oiOamR\nVH8veYZ9kI5F5tFa55qOz4gz2501UxlOgrD0M5gDXTdCeQkepd9DkTFygcHPZmC1\nMRx8hyOR0iCq7K4/QeDzSgyuziWmG9HdtbrlTcmdvNY25yibNlueGG4UxlRinCkR\nkfUW33eC83A4AbSLdGUj36z7vNrNwsnan6nLWW4NtjRNglFc8ivcljNKYIYITlD0\niNnCK8BLAgMBAAECggEAMvgnufycwXZJN1HynDDCbcteIXADt+TX9MFI1Hs5kUDL\n41uAgwJiDK3GtUr0viwdeRNFZXJuIy/8d5Rx3Ivvwy7rTr/4RguPYGZBp1E5/YLv\nxBmgqRfzNxhAwTkjtmYNAVtTA+5MX6rganmZuV7S8wOSaggjmfTmHYDHXC6EqZTs\n+BIALRKLYwHzriKM04zDCDcPzG9Wt3ZDLXBBAzWwtMOCDJXRE0PSQXl8C5p6IiLY\najRXXHVqrF4Q10rid6kJKsNKuXMOvoeguAeaPJlY0oOco7hUNmJjf78qKWBM7VuK\nwu0O0FPbRl/oVs9eLzBfEn2lOcjhF9LpyagM4k1fKQKBgQD48OFyk5USjfnmzlQk\nl9wb3/cYyZKpKVIDQ/2qtfRnuYKtSqmayyQwLaxlzbOKKvBenwsTF259Onz0Pt0/\nmJsFWv3DMJDh3QyCrTvoZnnyw3PT3NUN7cGiQQi0SH5M9mvF8gIRZEyG8LmPKJkE\niDW6PC95Y/N1zyaI3tOTnNDrtwKBgQC3m1d80Ce+kJzsz6CP/XD28qi5OeD5HJAp\njzPSRv6dGy1vyiwXv76nN4vQwzZqUo7g+9rJEiec0iDmpMelPkXHJteLZWi3QiN+\nfwqcMj4eYrFVZ2hdkKZhKI99zbAQ8fKJmfB6dVOYjfQOUUpF/tT17ozaAkGe5xHI\nwtuz7354DQKBgQCXIz/tHTb7feFERO6HDP/gmJhfnzoApAqb2vKuaywIsXNqHJNe\nXIkLCx/I6xte/nTTLcI+hBJby1/DtksDanZryPOaRukfh+IpkF132oedYRb4gPGF\nNF1EUjGjqwOrXEzQb/7bakagApTWGrLUMpJUEGhOTeWpF+xwWsCftSyOfwKBgQC3\nkrw9UY17TfFoIAuEC70HWwTw9PqHd1R4CPKiGlN11vdt3vCI6jB/1dyX5KYiVdr+\n/TD5eopalAlLMZNfFs0DWkWF3OV+3MTKM9Dy7JUJIln1bsd9TSPc3oXhHWcc+hsq\nEtzKQ0ZKsBtEuWgOZcSdA16WlkzvyE4SsSijVh/XfQKBgFCGWIJZ+QFw+hFCNbpW\nSKFpVeEQ49DjoiL1k0LjszOFLG6eI1X9MhELV0PgjosoYU4SUTgsnfQBUfxKlGvN\nZoxgntmaI1fk1o86Zw7QLpG5t5RSDde+xscvS1nEDs12zyijw5Me3NG/ShJ4XpIn\ntv20DKWvUIrJD2BhQNTdbWmx\n-----END PRIVATE KEY-----",
"server_address": [
"192.0.2.180"
],
"server_port": 80,
"tenant_name": "MyTenantName",
"virtual_address": "192.0.2.202",
"virtual_port": 443
},
"state": "create",
"template": "examples/simple_waf",
"tenant": "MyTenantName",
"timeout": 300
}
},
"template": "examples/simple_waf",
"tenant": "MyTenantName"
}
TASK [(2) FAST response debug] ****************************************************************************************************************************************************************************************************************************************************************
task path: /ansible/play-f5fast-issue-repro.yml:48
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /ansible/collections/ansible_collections/ansible/netcommon
<10.1.1.4> attempting to start connection
<10.1.1.4> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /usr/local/bin/ansible-connection
<10.1.1.4> found existing local domain socket, using it!
<10.1.1.4> ESTABLISH HTTP(S) CONNECTFOR USER: admin TO https://10.1.1.4:443
<10.1.1.4> updating play_context for connection
<10.1.1.4>
<10.1.1.4> local domain socket path is /ansible/.ansible/pc/2db776cd4c
ok: [bigip] => {
"f5_fast_response": {
"application": "MyAppName",
"changed": null,
"content": {
"WAF_policy_path": "/Common/this_waf_policy_does_not_exist",
"application_name": "MyAppName",
"certificate": "-----BEGIN CERTIFICATE-----\nMIIDrjCCApagAwIBAgIEGFse8zANBgkqhkiG9w0BAQsFADCBmDELMAkGA1UEBhMC\nVVMxCzAJBgNVBAgTAldBMRAwDgYDVQQHEwdTZWF0dGxlMRIwEAYDVQQKEwlNeUNv\nbXBhbnkxCzAJBgNVBAsTAklUMR4wHAYDVQQDExVsb2NhbGhvc3QubG9jYWxkb21h\naW4xKTAnBgkqhkiG9w0BCQEWGnJvb3RAbG9jYWxob3N0LmxvY2FsZG9tYWluMB4X\nDTIyMTIxMzE4NTQ0M1oXDTMyMTIxMDE4NTQ0M1owgZgxCzAJBgNVBAYTAlVTMQsw\nCQYDVQQIEwJXQTEQMA4GA1UEBxMHU2VhdHRsZTESMBAGA1UEChMJTXlDb21wYW55\nMQswCQYDVQQLEwJJVDEeMBwGA1UEAxMVbG9jYWxob3N0LmxvY2FsZG9tYWluMSkw\nJwYJKoZIhvcNAQkBFhpyb290QGxvY2FsaG9zdC5sb2NhbGRvbWFpbjCCASIwDQYJ\nKoZIhvcNAQEBBQADggEPADCCAQoCggEBALKLQBSjlGxbtuYPDPW+HCPt7+RjoUGC\nab0bGdEtOUxfCDnKKxT2GBjYrfvH7S/xtbdaI1cJbA0qPEhQljNyPudPoSunQ7D2\nl3ka/27jL/FKHL+/svkgLG4dlMVWpDhYKq8DaYdb6iI5qZFUfy95hn2QjkXm0Vrn\nmo7PiDPbnTVTGU6CsPQzmANdN0J5CR6l30ORMXKBwc9mYLUxHHyHI5HSIKrsrj9B\n4PNKDK7OJaYb0d21uuVNyZ281jbnKJs2W54YbhTGVGKcKRGR9Rbfd4LzcDgBtIt0\nZSPfrPu82s3CydqfqctZbg22NE2CUVzyK9yWM0pghghOUPSI2cIrwEsCAwEAATAN\nBgkqhkiG9w0BAQsFAAOCAQEAGdDW+kZde88JarcMfA/I78eDn9j4szGJn6gvCsHZ\niLJsugOxj6udrNXKn6NuekNoQXMae0kXVqVmGZouX8lzrn0I6+bR1TEkBmc+v9bF\nLXk096iUBaKPSAkoNOFeHQE/XRu4kk6TqedMneZvs2/725um+9kkPZde4luwCfk6\nqrki7MtAL9xhuxUVFpISjvsBQRwCg0ckQ1YvKYf+s7i9/4fBZB25biGzmVK7cEPx\nAPVjOHNlYwHCa9bp7SX4tuUrqGyZ8ib3OHxFdXVZzBKsNe02zldpDQ7ZyPlUzhhh\nwl3HfQXADQYtf5rIwsFxgrSDIS2U7oD39poucodCEzc2PQ==\n-----END CERTIFICATE-----",
"private_key": "-----BEGIN PRIVATE KEY-----\nMIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQCyi0AUo5RsW7bm\nDwz1vhwj7e/kY6FBgmm9GxnRLTlMXwg5yisU9hgY2K37x+0v8bW3WiNXCWwNKjxI\nUJYzcj7nT6Erp0Ow9pd5Gv9u4y/xShy/v7L5ICxuHZTFVqQ4WCqvA2mHW+oiOamR\nVH8veYZ9kI5F5tFa55qOz4gz2501UxlOgrD0M5gDXTdCeQkepd9DkTFygcHPZmC1\nMRx8hyOR0iCq7K4/QeDzSgyuziWmG9HdtbrlTcmdvNY25yibNlueGG4UxlRinCkR\nkfUW33eC83A4AbSLdGUj36z7vNrNwsnan6nLWW4NtjRNglFc8ivcljNKYIYITlD0\niNnCK8BLAgMBAAECggEAMvgnufycwXZJN1HynDDCbcteIXADt+TX9MFI1Hs5kUDL\n41uAgwJiDK3GtUr0viwdeRNFZXJuIy/8d5Rx3Ivvwy7rTr/4RguPYGZBp1E5/YLv\nxBmgqRfzNxhAwTkjtmYNAVtTA+5MX6rganmZuV7S8wOSaggjmfTmHYDHXC6EqZTs\n+BIALRKLYwHzriKM04zDCDcPzG9Wt3ZDLXBBAzWwtMOCDJXRE0PSQXl8C5p6IiLY\najRXXHVqrF4Q10rid6kJKsNKuXMOvoeguAeaPJlY0oOco7hUNmJjf78qKWBM7VuK\nwu0O0FPbRl/oVs9eLzBfEn2lOcjhF9LpyagM4k1fKQKBgQD48OFyk5USjfnmzlQk\nl9wb3/cYyZKpKVIDQ/2qtfRnuYKtSqmayyQwLaxlzbOKKvBenwsTF259Onz0Pt0/\nmJsFWv3DMJDh3QyCrTvoZnnyw3PT3NUN7cGiQQi0SH5M9mvF8gIRZEyG8LmPKJkE\niDW6PC95Y/N1zyaI3tOTnNDrtwKBgQC3m1d80Ce+kJzsz6CP/XD28qi5OeD5HJAp\njzPSRv6dGy1vyiwXv76nN4vQwzZqUo7g+9rJEiec0iDmpMelPkXHJteLZWi3QiN+\nfwqcMj4eYrFVZ2hdkKZhKI99zbAQ8fKJmfB6dVOYjfQOUUpF/tT17ozaAkGe5xHI\nwtuz7354DQKBgQCXIz/tHTb7feFERO6HDP/gmJhfnzoApAqb2vKuaywIsXNqHJNe\nXIkLCx/I6xte/nTTLcI+hBJby1/DtksDanZryPOaRukfh+IpkF132oedYRb4gPGF\nNF1EUjGjqwOrXEzQb/7bakagApTWGrLUMpJUEGhOTeWpF+xwWsCftSyOfwKBgQC3\nkrw9UY17TfFoIAuEC70HWwTw9PqHd1R4CPKiGlN11vdt3vCI6jB/1dyX5KYiVdr+\n/TD5eopalAlLMZNfFs0DWkWF3OV+3MTKM9Dy7JUJIln1bsd9TSPc3oXhHWcc+hsq\nEtzKQ0ZKsBtEuWgOZcSdA16WlkzvyE4SsSijVh/XfQKBgFCGWIJZ+QFw+hFCNbpW\nSKFpVeEQ49DjoiL1k0LjszOFLG6eI1X9MhELV0PgjosoYU4SUTgsnfQBUfxKlGvN\nZoxgntmaI1fk1o86Zw7QLpG5t5RSDde+xscvS1nEDs12zyijw5Me3NG/ShJ4XpIn\ntv20DKWvUIrJD2BhQNTdbWmx\n-----END PRIVATE KEY-----",
"server_address": [
"192.0.2.180"
],
"server_port": 80,
"tenant_name": "MyTenantName",
"virtual_address": "192.0.2.202",
"virtual_port": 443
},
"failed": false,
"template": "examples/simple_waf",
"tenant": "MyTenantName"
}
}
PLAY RECAP ************************************************************************************************************************************************************************************************************************************************************************************
bigip : ok=3 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Logs on the BIG-IP:
==> /var/log/restnoded/restnoded.log <==
Tue, 03 Jan 2023 19:59:10 GMT - warning: [appsvcs] {"message":"unable to digest declaration. Error: Unable to find specified WAF policy /Common/this_waf_policy_does_not_exist for /MyTenantName/MyAppName/serviceMain/policyWAF","level":"warning"}
...
Tue, 03 Jan 2023 19:59:11 GMT - info: FAST Worker [283]: Exiting gathering a list of tasks from the driver
Tue, 03 Jan 2023 19:59:11 GMT - fine: FAST Worker [283]: gathering a list of tasks from the driver took 1188ms to complete
Tue, 03 Jan 2023 19:59:11 GMT - fine: FAST Worker [283]: sending response after 1190ms
{
"method": "Get",
"path": "/shared/fast/tasks/6fa5ba37-8b4b-4ba8-873d-4c197c6aaad9",
"status": 200
}
Checking the actual task on the BIG-IP provides the relevant details:
# restcurl -s -u admin: https://localhost/mgmt/shared/fast/tasks/6fa5ba37-8b4b-4ba8-873d-4c197c6aaad9
{
"id": "6fa5ba37-8b4b-4ba8-873d-4c197c6aaad9",
"code": 422,
"message": "Unable to find specified WAF policy /Common/this_waf_policy_does_not_exist for /MyTenantName/MyAppName/serviceMain/policyWAF",
"name": "",
"parameters": {},
"tenant": "MyTenantName",
"application": "MyAppName",
"operation": "create",
"timestamp": "",
"host": "localhost",
"_links": {
"self": "/mgmt/shared/fast/tasks/6fa5ba37-8b4b-4ba8-873d-4c197c6aaad9"
}
}
The f5networks.f5_bigip.bigip_fast_application
module is expected to pick up the error or, at least, return the task id (6fa5ba37-8b4b-4ba8-873d-4c197c6aaad9
) so the status can be checked manually.
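Until the module surfaces this itself, the task status could be polled manually over the same iControl REST endpoint shown above. A workaround sketch, assuming the provider variables from the repro playbook and a hypothetical fast_task_id variable, and assuming FAST reports code 202 while the task is still running:

```yaml
- name: "Poll FAST task status (workaround sketch)"
  ansible.builtin.uri:
    url: "https://{{ provider.server }}/mgmt/shared/fast/tasks/{{ fast_task_id }}"
    method: GET
    user: "{{ provider.user }}"
    password: "{{ provider.password }}"
    validate_certs: false
    return_content: true
  delegate_to: localhost
  register: fast_task
  # Assumption: 202 means still in progress; anything >= 400 (e.g. the
  # 422 "Unable to find specified WAF policy" above) is a failure.
  until: fast_task.json.code is defined and fast_task.json.code != 202
  retries: 30
  delay: 10
  failed_when: fast_task.json.code is defined and fast_task.json.code >= 400
```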
bigiq_as3_deploy
ansible 2.10.8
...
python version = 3.8.5 (default, Jan 27 2021, 15:41:15) [GCC 9.3.0]
do not know
N/A
N/A
Cannot use an LDAP auth provider (for example, "ldap-my.mydomain.com") with the BIG-IQ API. When fetching an auth token, this Ansible module builds the URL incorrectly: https://github.com/F5Networks/f5-ansible-bigip/blob/devel/ansible_collections/f5networks/f5_bigip/plugins/httpapi/bigiq.py#L198-L202
It should not insert the provider name as argument {0}
; the resulting URL should be https://localhost/mgmt/cm/system/authn/providers/ldap/<UUID>/login
- name: AS3
  hosts: bigip
  connection: httpapi
  gather_facts: false
  tasks:
    - name: Deploy or Update
      f5networks.f5_bigip.bigiq_as3_deploy:
        content: "{{ lookup('file', 'declarations/as3.json') }}"
      vars:
        f5_provider: ldap-my.mydomain.com
      tags: [ deploy ]
Playbook runs, authenticates with the BigIQ api, and deploys the AS3 declaration
ansible.module_utils.connection.ConnectionError: Authentication process failed, server returned: {'code': 400, 'message': 'No login provider found.', 'originalRequestBody': '{"username":"XXXXXXXXX","loginReference":{"link":"https://localhost/mgmt/cm/system/authn/providers/ldap-my.mydomain.com/ac6514af-b16d-9201-aaea-60cea111daaa/login"},"generation":0,"lastUpdateMicros":0}', 'restOperationId': 162129111, 'errorStack': [], 'kind': ':resterrorresponse'}
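The fix would be to resolve the configured provider name to its provider type and UUID before building the login reference, instead of interpolating the name directly into the path. A sketch of the two URL shapes (the name-to-type/UUID lookup step is hypothetical; only the URL shapes come from this report):

```python
# Sketch of the suggested URL fix for the BIG-IQ httpapi plugin.
def build_login_reference(provider_type, provider_uuid):
    # Correct shape: /mgmt/cm/system/authn/providers/<type>/<UUID>/login
    return (
        "https://localhost/mgmt/cm/system/authn/providers/"
        "{0}/{1}/login".format(provider_type, provider_uuid)
    )

# What the collection currently does (simplified): the configured provider
# *name* is dropped straight into the path, which BIG-IQ rejects with
# "No login provider found."
broken = (
    "https://localhost/mgmt/cm/system/authn/providers/"
    "{0}/ac6514af-b16d-9201-aaea-60cea111daaa/login".format("ldap-my.mydomain.com")
)

fixed = build_login_reference("ldap", "ac6514af-b16d-9201-aaea-60cea111daaa")
```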
bigip_sslo_service_icap
>= 9.3
The ICAP service definition does not provide a cpmPolicies option, as described in: f5-ssl-orchestrator-service (ICAP Service)
None
bigip_sslo_service_icap
>= 9.3
The ICAP service definition does not provide a vendor_info option, as described in: f5-ssl-orchestrator-service (ICAP Service)
None
bigip_sslo_service_http
>= 9.3
The Inline HTTP service definition does not provide a vendor_info option, as described in: f5-ssl-orchestrator-service (HTTP Proxy)
None
CloudDocs documentation
ansible [core 2.11.7]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/ubuntu/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3/dist-packages/ansible
ansible collection location = /home/ubuntu/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.8.10 (default, Sep 28 2021, 16:10:42) [GCC 9.3.0]
jinja version = 2.10.1
libyaml = True
N/A
Installed v2 modules in local directory
collections_paths=./collections
In the CloudDocs installation instructions for the v2 Ansible modules, there is a reference to the f5devcentral repository for users who choose to install from GitHub. Since the f5devcentral repository is out of date, following it results in invalid module path errors and similar failures.
Suggested fixes:
https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/install_f5_bigip.html#install-from-github
ansible-galaxy collection install git+https://github.com/f5devcentral/f5-ansible-bigip.git#ansible_collections/f5networks/f5_bigip -p ./collections/
a complete installation of the v2 F5 Ansible modules
an incomplete installation of the v2 F5 Ansible modules
bigip_sslo_config_ssl
>= 9.3
SSLO 9.3 introduced "SNI Server Name (FQDN)" as a "serverName" (string) field under "GeneralSettings" in the JSON.
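A possible module-level mapping for the new field might look like the following (the option name is hypothetical; only the JSON field names come from the SSLO 9.3 schema described above):

```yaml
- name: SSLO SSL settings with SNI server name (hypothetical option)
  bigip_sslo_config_ssl:
    name: "demo_ssl"
    # Hypothetical option that would map to the GeneralSettings.serverName
    # (string) field in the SSLO 9.3 JSON; not supported per this report.
    sni_server_name: "app.example.com"
```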
Request: support a new "type" field in the Ansible velos_tenant module: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/velos_tenant_module.html#velos-tenant-module-2
This is a new option starting in F5OS 1.4 that determines support for BIG-IP Next.
https://clouddocs.f5.com/api/velos-api/F5OS-C-1.1.0-api.html#operation/data_f5_tenants_tenants_tenant_tenant_name_config_post
Add a type field with acceptable values (bigip and bigip-next are currently the only two available options).
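The requested option might look like this in a playbook (the type parameter below is the requested addition, not an existing module option):

```yaml
- name: Create a VELOS tenant with a deployment type (requested option)
  f5networks.f5_bigip.velos_tenant:
    name: "tenant1"
    # Requested new option; per this report, bigip and bigip-next are
    # currently the only two values the F5OS 1.4 API accepts.
    type: "bigip"
```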
None.
Ansible module: bigip_sslo_service_layer3
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
The layer 3 service module does not honor the defined subnet mask. For example:
- name: SSLO LAYER 3 (SNORT2)
bigip_sslo_service_layer3:
name: "SNORT2"
devices_to:
vlan: "/Common/dmz3"
self_ip: "10.0.6.27"
netmask: "255.255.255.0"
devices_from:
vlan: "/Common/dmz4"
self_ip: "10.0.7.23"
netmask: "255.255.255.0"
devices:
- ip: "{{ snort2_host }}"
The declaration defines dmz3 and dmz4 in separate /24 subnets, yet when the config is built, the resulting service self-IPs are all /25.
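To make the discrepancy concrete, a quick check with Python's standard ipaddress module (illustrative only, using the self-IPs from the task above) shows that 255.255.255.0 corresponds to /24, while the /25 actually deployed implies a 255.255.255.128 mask:

```python
import ipaddress

# The netmask supplied in the playbook should yield /24 networks.
net_to = ipaddress.ip_network("10.0.6.27/255.255.255.0", strict=False)
net_from = ipaddress.ip_network("10.0.7.23/255.255.255.0", strict=False)
print(net_to.prefixlen, net_from.prefixlen)  # 24 24

# The /25 self-IPs the module produces imply a different mask entirely.
print(ipaddress.ip_network("10.0.6.27/25", strict=False).netmask)  # 255.255.255.128
```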
bigip_sslo_config_policy
NA
The documentation should state that the policy_consumer field defaults to outbound if not specified.
Ansible module: bigip_sslo_config_policy
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
There are no configuration options for what to do with non-matching traffic in the SSLO security policy. This is the built-in "All Traffic" rule that provides actions for allow/block, intercept/bypass, and service chain selection.
No options in the policy module to create/re-create this behavior.
bigip_do_deploy
[gerace@europa ansible]$ ansible --version
ansible [core 2.11.4]
config file = /home/gerace/Documents/ansible/ansible.cfg
configured module search path = ['/home/gerace/Documents/ansible/library']
ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
ansible collection location = /home/gerace/Documents/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.6.8 (default, Nov 16 2020, 16:55:22) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
jinja version = 3.0.1
libyaml = True
[root@localhost:NO LICENSE:Standalone] config # tmsh show sys version
Sys::Version
Main Package
Product BIG-IP
Version 15.1.3.1
Build 0.0.18
Edition Point Release 1
Date Mon Jul 12 23:47:21 PDT 2021
ansible_host: "192.168.0.201"
ansible_user: "admin"
ansible_httpapi_password: "{{ bigip_admin_pass }}"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
CentOS7
Attempting to post a DO declaration to the BIG-IP results in the following error:
fatal: [notahost]: FAILED! => {"changed": false, "msg": "Server parameter cannot be None or missing, please provide a valid value"}
---
- hosts: all
collections:
- f5networks.f5_bigip
connection: httpapi
vars:
ansible_host: "192.168.0.201"
ansible_user: "admin"
ansible_httpapi_password: "{{ bigip_admin_pass }}"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
tasks:
- name: Deploy DO declaration
bigip_do_deploy:
content: "{{ lookup('file', '{{bigipVmName}}_do.json') }}"
Have the DO declaration posted to the BIG-IP
File lookup using /home/gerace/Documents/ansible/configs/bigTest01_do.json as file
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /home/gerace/Documents/ansible/collections/ansible_collections/ansible/netcommon
<192.168.0.201> attempting to start connection
<192.168.0.201> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /usr/local/bin/ansible-connection
<192.168.0.201> found existing local domain socket, using it!
<192.168.0.201> updating play_context for connection
<192.168.0.201>
<192.168.0.201> local domain socket path is /home/gerace/.ansible/pc/51465be718
<192.168.0.201> Using network group action bigip for bigip_do_deploy
<192.168.0.201> ANSIBLE_NETWORK_IMPORT_MODULES: disabled
<192.168.0.201> ANSIBLE_NETWORK_IMPORT_MODULES: module execution time may be extended
<192.168.0.201> ESTABLISH LOCAL CONNECTION FOR USER: gerace
<192.168.0.201> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc `"&& mkdir "` echo /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318 `" && echo ansible-tmp-1631577163.2544668-3717-65038104816318="` echo /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318 `" ) && sleep 0'
Using module file /home/gerace/Documents/ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_do_deploy.py
<192.168.0.201> PUT /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/tmpyybuze8x TO /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318/AnsiballZ_bigip_do_deploy.py
<192.168.0.201> EXEC /bin/sh -c 'chmod u+x /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318/ /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318/AnsiballZ_bigip_do_deploy.py && sleep 0'
<192.168.0.201> EXEC /bin/sh -c '/usr/bin/python3 /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318/AnsiballZ_bigip_do_deploy.py && sleep 0'
<192.168.0.201> EXEC /bin/sh -c 'rm -f -r /home/gerace/.ansible/tmp/ansible-local-3639bzbyljqc/ansible-tmp-1631577163.2544668-3717-65038104816318/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
File "/tmp/ansible_bigip_do_deploy_payload_8hjkdptf/ansible_bigip_do_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_do_deploy.py", line 325, in main
File "/tmp/ansible_bigip_do_deploy_payload_8hjkdptf/ansible_bigip_do_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_do_deploy.py", line 160, in __init__
File "/tmp/ansible_bigip_do_deploy_payload_8hjkdptf/ansible_bigip_do_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/bigip_local.py", line 20, in __init__
self.provider = self.merge_provider_params()
File "/tmp/ansible_bigip_do_deploy_payload_8hjkdptf/ansible_bigip_do_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/local.py", line 116, in merge_provider_params
self.merge_provider_server_param(result, provider)
File "/tmp/ansible_bigip_do_deploy_payload_8hjkdptf/ansible_bigip_do_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/local.py", line 132, in merge_provider_server_param
raise F5ModuleError('Server parameter cannot be None or missing, please provide a valid value')
fatal: [notahost]: FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"content": {
"$schema": "https://raw.githubusercontent.com/F5Networks/f5-declarative-onboarding/master/src/schema/1.10.0/base.schema.json",
"Common": {
"ansible": {
"class": "User",
"partitionAccess": {
"all-partitions": {
"role": "admin"
}
},
"password": "@n$1bl3",
"shell": "none",
"userType": "regular"
},
"class": "Tenant",
"defaultRoute": {
"class": "Route",
"gw": "10.0.10.1",
"mtu": 0,
"network": "default"
},
"external": {
"class": "VLAN",
"interfaces": [
{
"name": "1.2",
"tagged": false
}
],
"mtu": 1500,
"tag": 100
},
"external-self": {
"address": "10.0.10.126/24",
"allowService": "none",
"class": "SelfIp",
"trafficGroup": "traffic-group-local-only",
"vlan": "external"
},
"hostname": "bigTest01.f5demo.org",
"internal": {
"class": "VLAN",
"interfaces": [
{
"name": "1.1",
"tagged": false
}
],
"mtu": 1500,
"tag": 200
},
"internal-self": {
"address": "10.0.20.96/24",
"allowService": "default",
"class": "SelfIp",
"trafficGroup": "traffic-group-local-only",
"vlan": "internal"
},
"k8sadmn": {
"class": "User",
"partitionAccess": {
"all-partitions": {
"role": "admin"
}
},
"password": "k8sadmn",
"shell": "none",
"userType": "regular"
},
"myDns": {
"class": "DNS",
"nameServers": [
"8.8.8.8",
"8.8.4.4"
],
"search": [
"f5demo.org"
]
},
"myLicense": {
"class": "License",
"licenseType": "regKey",
"regKey": "LYIXU-MUWRM-HGADE-QGBMS-QSCUJJH"
},
"myNtp": {
"class": "NTP",
"servers": [
"128.138.140.44"
],
"timezone": "America/New_York"
},
"myProvisioning": {
"apm": "nominal",
"avr": "nominal",
"class": "Provision",
"ltm": "nominal"
},
"prometheus": {
"class": "User",
"partitionAccess": {
"all-partitions": {
"role": "admin"
}
},
"password": "pr0m3th3u$",
"shell": "none",
"userType": "regular"
}
},
"async": true,
"class": "Device",
"label": "BIG-IP declaration for bigTest01",
"schemaVersion": "1.9.0"
},
"provider": null,
"timeout": 150
}
},
"msg": "Server parameter cannot be None or missing, please provide a valid value"
}
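Based on the traceback, the failure comes from the provider-merging code path in module_utils/local.py, which is reached even though the play uses the httpapi connection. A simplified sketch of the check that raises (illustrative only, not the actual F5 module_utils source):

```python
# Simplified sketch of the server-parameter check the traceback points at.
# This is illustrative only, not the actual F5 module_utils source.
class F5ModuleError(Exception):
    pass

def merge_provider_server_param(result, provider):
    # With "provider": null in module_args (as in the output above), there is
    # no server to merge, so the module raises before attempting a connection.
    server = (provider or {}).get("server")
    if server is None:
        raise F5ModuleError(
            "Server parameter cannot be None or missing, "
            "please provide a valid value"
        )
    result["server"] = server

try:
    merge_provider_server_param({}, None)  # provider was null in the repro
except F5ModuleError as exc:
    print(exc)
```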
Ansible module: bigip_sslo_config_policy
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
The "ssl_forwardproxy_action" option should be renamed to "ssl_action". This was updated in the UI around 8.x to disambiguate the function in forward and reverse proxy scenarios.
No options in the policy module to create/re-create this behavior.
I'm getting started with this collection, and a Module Index would help a lot;
meanwhile, the page is rather empty:
https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/module_index.html
The same type of index is available for "f5_modules", but without one here it is hard to reason about the differences between the two collections:
https://clouddocs.f5.com/products/orchestration/ansible/devel/modules/module_index.html
Make the index available, including details of all modules.
Also, a clear page describing how to port a module from f5_modules to f5_bigip, i.e.:
https://clouddocs.f5.com/products/orchestration/ansible/devel/usage/porting-guides.html
bigip_sslo_service_http
core 2.12.5
16.1.3.1
9.3.41
Ubuntu 20.04
Python 3.8.10
While attempting to run the below playbook to create an http proxy on SSLO 9.3.4.1, I am met with a VLAN error. Interface 1.3 is not in use, nor are the 30 and 40 VLAN tags.
---
- name: Create SSLO service(s)
hosts: all
gather_facts: False
connection: httpapi
collections:
- f5networks.f5_bigip
vars:
ansible_host: "10.0.10.146"
ansible_httpapi_port: 443
ansible_user: "admin"
ansible_httpapi_password: "admin"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
tasks:
- name: SSLO HTTP service
bigip_sslo_service_http:
name: "http"
proxy_type: "explicit"
devices_to:
interface: "1.3"
tag: 30
self_ip: "198.19.96.7"
netmask: "255.255.255.128"
devices_from:
interface: "1.3"
tag: 40
self_ip: "198.19.96.245"
netmask: "255.255.255.128"
devices:
- ip: "198.19.96.30%0"
port: 3128
fatal: [localhost]: FAILED! => {"changed": false, "msg": "CREATE operation error: 0633f77e-2a1f-49dc-ae4c-73098a3e50e8 : [OrchestratorConfigProcessor] Deployment failed for Error: [HAAwareICRDeployProcessor] Error: transaction failed:01070256:3: Requested VLAN member (1) is not valid"}
all velos modules:
velos_partition - Manage VELOS chassis partitions
velos_partition_change_password - Provides access to VELOS chassis partition user authentication methods
velos_partition_image - Manage VELOS chassis partition images
velos_partition_interface - Manage network interfaces on VELOS chassis partitions
velos_partition_lag - Manage link aggregation groups (LAGs) on VELOS chassis partitions
velos_partition_vlan - Manage VLANs on VELOS chassis partitions
velos_partition_wait - Wait for a VELOS chassis partition to match a condition before continuing
velos_tenant - Manage VELOS tenants
velos_tenant_image - Manage VELOS tenant images
velos_tenant_wait - Wait for a VELOS condition before continuing
ansible [core 2.12.5]
f5networks.f5_bigip:1.8.0
Python v3.10.4
F5OS v1.3.2
Ubuntu 22.04
ansible [core 2.12.5]
config file = /home/lionel/PyCharmProjects/venv3-10/Ansible/ansible.cfg
configured module search path = ['/home/lionel/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/lionel/PyCharmProjects/venv3-10/lib/python3.10/site-packages/ansible
ansible collection location = /home/lionel/PyCharmProjects/venv3-10/Ansible/collections
executable location = /home/lionel/PyCharmProjects/venv3-10/bin/ansible
python version = 3.10.4 (main, Apr 2 2022, 09:04:19) [GCC 11.2.0]
jinja version = 3.0.3
libyaml = False
Ansible can't connect to a VELOS chassis or a VELOS partition using
ansible_network_os: f5networks.f5_bigip.bigip, as described in all the examples above,
when running the following playbook from:
https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/velos_partition_module.html#examples
VELOS partition created
ansible-playbook PlayBooks/f5v2/velospartition.yml -e "cible=Veloshttpapi" --tag=create -vvv
ansible-playbook [core 2.12.5]
config file = /home/lionel/PyCharmProjects/venv3-10/Ansible/ansible.cfg
configured module search path = ['/home/lionel/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/lionel/PyCharmProjects/venv3-10/lib/python3.10/site-packages/ansible
ansible collection location = /home/lionel/PyCharmProjects/venv3-10/Ansible/collections
executable location = /home/lionel/PyCharmProjects/venv3-10/bin/ansible-playbook
python version = 3.10.4 (main, Apr 2 2022, 09:04:19) [GCC 11.2.0]
jinja version = 3.0.3
libyaml = False
Using /home/lionel/PyCharmProjects/venv3-10/Ansible/ansible.cfg as config file
host_list declined parsing /home/lionel/PyCharmProjects/venv3-10/Ansible/inventory/velos as it did not pass its verify_file() method
script declined parsing /home/lionel/PyCharmProjects/venv3-10/Ansible/inventory/velos as it did not pass its verify_file() method
auto declined parsing /home/lionel/PyCharmProjects/venv3-10/Ansible/inventory/velos as it did not pass its verify_file() method
Parsed /home/lionel/PyCharmProjects/venv3-10/Ansible/inventory/velos inventory source with ini plugin
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: velospartition.yml ***********************************************************************************************************************************************************************************************
1 plays in PlayBooks/f5v2/velospartition.yml
PLAY [Manage Partitions] ***************************************************************************************************************************************************************************************************
META: ran handlers
TASK [Create Partition Tests] **********************************************************************************************************************************************************************************************
task path: /home/lionel/PyCharmProjects/venv3-10/Ansible/PlayBooks/f5v2/velospartition.yml:14
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
<10.154.79.40> ESTABLISH LOCAL CONNECTION FOR USER: lionel
<10.154.79.40> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l `"&& mkdir "` echo /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533 `" && echo ansible-tmp-1653920347.7348301-3526-159497453417533="` echo /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533 `" ) && sleep 0'
Using module file /home/lionel/PyCharmProjects/venv3-10/Ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py
<10.154.79.40> PUT /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/tmpmaqouklu TO /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py
<10.154.79.40> EXEC /bin/sh -c 'chmod u+x /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/ /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py && sleep 0'
<10.154.79.40> EXEC /bin/sh -c '/home/lionel/PyCharmProjects/venv3-10/bin/python3.10 /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py && sleep 0'
<10.154.79.40> EXEC /bin/sh -c 'rm -f -r /home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
File "/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py", line 107, in <module>
_ansiballz_main()
File "/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py", line 99, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py", line 47, in invoke_module
runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.velos_partition', init_globals=dict(_module_fqn='ansible_collections.f5networks.f5_bigip.plugins.modules.velos_partition', _modlib_path=modlib_path),
File "/usr/lib/python3.10/runpy.py", line 209, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/lib/python3.10/runpy.py", line 96, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py", line 673, in <module>
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py", line 666, in main
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py", line 421, in exec_module
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py", line 433, in present
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py", line 484, in exists
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/velos_client.py", line 24, in wrap
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/velos_client.py", line 43, in get
File "/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible/module_utils/connection.py", line 200, in __rpc__
ansible.module_utils.connection.ConnectionError: Invalid JSON response: <!DOCTYPE html>
<html>
<head>
<title>404 Not Found</title>
</head>
<body>
<h1>Not Found</h1>
The requested URL /mgmt/shared/authn/login was not found on this server.
<hr>
<address> Server at localhost:8008 </address>
</body>
</html>
fatal: [veloscontroller]: FAILED! => {
"changed": false,
"module_stderr": "Traceback (most recent call last):\n File \"/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py\", line 107, in <module>\n _ansiballz_main()\n File \"/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py\", line 99, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/home/lionel/.ansible/tmp/ansible-local-3522x96i0a6l/ansible-tmp-1653920347.7348301-3526-159497453417533/AnsiballZ_velos_partition.py\", line 47, in invoke_module\n runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.velos_partition', init_globals=dict(_module_fqn='ansible_collections.f5networks.f5_bigip.plugins.modules.velos_partition', _modlib_path=modlib_path),\n File \"/usr/lib/python3.10/runpy.py\", line 209, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/lib/python3.10/runpy.py\", line 96, in _run_module_code\n _run_code(code, mod_globals, init_globals,\n File \"/usr/lib/python3.10/runpy.py\", line 86, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py\", line 673, in <module>\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py\", line 666, in main\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py\", line 421, in exec_module\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py\", line 433, in present\n File 
\"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/velos_partition.py\", line 484, in exists\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/velos_client.py\", line 24, in wrap\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/module_utils/velos_client.py\", line 43, in get\n File \"/tmp/ansible_velos_partition_payload_m6be7875/ansible_velos_partition_payload.zip/ansible/module_utils/connection.py\", line 200, in __rpc__\nansible.module_utils.connection.ConnectionError: Invalid JSON response: <!DOCTYPE html>\n<html>\n<head>\n<title>404 Not Found</title>\n</head>\n<body>\n<h1>Not Found</h1>\nThe requested URL /mgmt/shared/authn/login was not found on this server.\n<hr>\n<address> Server at localhost:8008 </address>\n</body>\n</html>\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
PLAY RECAP *****************************************************************************************************************************************************************************************************************
veloscontroller : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
We need to change ansible_network_os: f5networks.f5_bigip.bigip to ansible_network_os: f5networks.f5_bigip.velos and set ansible_httpapi_port to 8888.
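For reference, a set of connection variables that works against VELOS would then look like this (host and credential values are placeholders):

```yaml
# VELOS chassis/partition connection vars (placeholder host and password)
ansible_host: "192.0.2.10"
ansible_user: "admin"
ansible_httpapi_password: "{{ velos_admin_pass }}"
ansible_network_os: f5networks.f5_bigip.velos   # not f5networks.f5_bigip.bigip
ansible_httpapi_port: 8888                      # VELOS REST API port, as noted above
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
```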
bigip_device_info
ansible [core 2.11.4]
config file = /home/lionel/PyCharmProjects/venv3-8-5/Ansible/ansible.cfg
configured module search path = ['/home/lionel/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/lionel/PyCharmProjects/venv3-8-5/lib/python3.8/site-packages/ansible
ansible collection location = /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections
executable location = /home/lionel/PyCharmProjects/venv3-8-5/bin/ansible
python version = 3.8.5 (default, Sep 13 2020, 10:59:50) [GCC 9.3.0]
jinja version = 3.0.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.3.1
Build 0.0.18
Edition Point Release 1
Date Mon Jul 12 23:47:21 PDT 2021
[defaults]
inventory = /home/lionel/PyCharmProjects/venv3-8-5/Ansible/inventory/hosts
roles_path = ./roles
collections_paths = ./collections
retry_files_enabled = False
host_key_checking = false
forks = 15
module_name = raw
vault_password_file = /home/lionel/PyCharmProjects/venv3-8-5/Ansible/.vault_pass.txt
interpreter_python = auto_silent
string_conversion_action = ignore
merge_multiple_cli_tags = True
[persistent_connection]
connect_timeout = 240
connect_retry_timeout = 30
command_timeout = 300
[ssh_connection]
pipelining = True
scp_if_ssh = smart
Ubuntu 20.04
Got a KeyError using the bigip_device_info module when the gather_subset parameter is set to all.
Using the following command:
ansible-playbook PlayBooks/bigipv2.yml -e '{"cible":"bigipa"}' -t fact -vvvv
See the playbook below:
---
- name: Backup F5 device
hosts: "{{ cible }}"
gather_facts: false
strategy: linear
connection: httpapi
collections:
- f5networks.f5_bigip
vars:
msg: "{{ lookup('pipe','date +%Y-%m-%d-%H-%M-%S') }}"
tasks:
- name: Collect BIG-IP information
bigip_device_info:
gather_subset:
- all
tags: fact
register: output_f5
- debug:
var: output_f5
tags: fact
Should get device info.
TASK [Collect BIG-IP information] ****************************************************************************************************************************************************************************
task path: /home/lionel/PyCharmProjects/venv3-8-5/Ansible/PlayBooks/bigipv2.yml:48
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections/ansible_collections/ansible/netcommon
<10.171.60.227> attempting to start connection
<10.171.60.227> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /home/lionel/PyCharmProjects/venv3-8-5/bin/ansible-connection
<10.171.60.227> local domain socket does not exist, starting it
<10.171.60.227> control socket path is /home/lionel/.ansible/pc/76fcc6d478
<10.171.60.227> redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
<10.171.60.227> Loading collection ansible.netcommon from /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections/ansible_collections/ansible/netcommon
<10.171.60.227> Loading collection f5networks.f5_bigip from /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections/ansible_collections/f5networks/f5_bigip
<10.171.60.227> local domain socket listeners started successfully
<10.171.60.227> loaded API plugin ansible_collections.f5networks.f5_bigip.plugins.httpapi.bigip from path /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/httpapi/bigip.py for network_os f5networks.f5_bigip.bigip
<10.171.60.227>
<10.171.60.227> local domain socket path is /home/lionel/.ansible/pc/76fcc6d478
<10.171.60.227> Using network group action bigip for bigip_device_info
<10.171.60.227> ANSIBLE_NETWORK_IMPORT_MODULES: disabled
<10.171.60.227> ANSIBLE_NETWORK_IMPORT_MODULES: module execution time may be extended
<10.171.60.227> ESTABLISH LOCAL CONNECTION FOR USER: lionel
<10.171.60.227> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg `"&& mkdir "` echo /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303 `" && echo ansible-tmp-1629272996.2444608-126560-40012778561303="` echo /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303 `" ) && sleep 0'
Using module file /home/lionel/PyCharmProjects/venv3-8-5/Ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py
<10.171.60.227> PUT /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/tmp1z2dqbzg TO /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py
<10.171.60.227> EXEC /bin/sh -c 'chmod u+x /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/ /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py && sleep 0'
<10.171.60.227> EXEC /bin/sh -c '/home/lionel/PyCharmProjects/venv3-8-5/bin/python /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py && sleep 0'
<10.171.60.227> EXEC /bin/sh -c 'rm -f -r /home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
File "/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py", line 100, in <module>
_ansiballz_main()
File "/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py", line 92, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_device_info', init_globals=dict(_module_fqn='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_device_info', _modlib_path=modlib_path),
File "/usr/local/lib/python3.8/runpy.py", line 207, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/local/lib/python3.8/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/local/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16797, in <module>
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16784, in main
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16502, in exec_module
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 16583, in execute_managers
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 9070, in exec_module
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 9077, in _exec_module
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 9081, in read_facts
File "/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py", line 9092, in read_collection_from_device
KeyError: 0
fatal: [bigipa]: FAILED! => {
"changed": false,
"module_stderr": "Traceback (most recent call last):\n File \"/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py\", line 100, in <module>\n _ansiballz_main()\n File \"/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py\", line 92, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/home/lionel/.ansible/tmp/ansible-local-1265559zhgh_rg/ansible-tmp-1629272996.2444608-126560-40012778561303/AnsiballZ_bigip_device_info.py\", line 40, in invoke_module\n runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_device_info', init_globals=dict(_module_fqn='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_device_info', _modlib_path=modlib_path),\n File \"/usr/local/lib/python3.8/runpy.py\", line 207, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/usr/local/lib/python3.8/runpy.py\", line 97, in _run_module_code\n _run_code(code, mod_globals, init_globals,\n File \"/usr/local/lib/python3.8/runpy.py\", line 87, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 16797, in <module>\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 16784, in main\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 16502, in exec_module\n File 
\"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 16583, in execute_managers\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 9070, in exec_module\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 9077, in _exec_module\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 9081, in read_facts\n File \"/tmp/ansible_bigip_device_info_payload_h2wey8w6/ansible_bigip_device_info_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_device_info.py\", line 9092, in read_collection_from_device\nKeyError: 0\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
PLAY RECAP ***************************************************************************************************************************************************************************************************
bigipa : ok=0 changed=0 unreachable=0 failed=1 skipped=0 rescued=0 ignored=0
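The KeyError: 0 raised in read_collection_from_device above suggests the module indexes the first element of a collection that came back empty, or as a mapping without numeric keys. A minimal, hypothetical sketch of a defensive read (only the function name comes from the traceback; the body is an assumption, not the module's actual code):

```python
def read_collection_from_device(response):
    """Defensively extract collection items from an iControl REST response.

    Hypothetical sketch: returns an empty list instead of raising KeyError
    when the 'items' key is absent or the payload is not a list.
    """
    items = response.get("items", [])
    if not isinstance(items, list):
        return []
    return items
```

A guard like this would turn the hard crash into an empty-facts result, which the caller could then report cleanly.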
bigip_ucs_fetch
Any
Any
The documentation states that the module supports a timeout parameter; however, the parameter is actually named async_timeout.
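A hedged example of what a corrected task might look like (the async_timeout name is taken from this report; src/dest values are placeholders):

```yaml
- name: Fetch a UCS archive (note async_timeout, not timeout)
  f5networks.f5_bigip.bigip_ucs_fetch:
    src: "backup.ucs"
    dest: "/tmp/backup.ucs"
    async_timeout: 300
```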
bigip_sslo_config_ssl
>= 9.3
SSLO 9.3 introduced "Default SNI" as "sniDefault" (boolean) field under "GeneralSettings" in the JSON.
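A sketch of where the field sits in the SSLO JSON, based only on the description above (casing and surrounding structure are assumptions):

```json
{
  "GeneralSettings": {
    "sniDefault": true
  }
}
```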
When a deployed iRule references other objects, the deployment can fail, but the bigip_as3_deploy
module does not return any useful error, just "declaration failed". When the same declaration is sent manually (using curl or Postman), the API returns a detailed error message.
For example, here is a simple declaration that fails because the iRule references a data group that does not exist:
{
  "action": "deploy",
  "class": "AS3",
  "persist": true,
  "declaration": {
    "class": "ADC",
    "id": "irule-test",
    "schemaVersion": "3.26.0",
    "irule-test": {
      "class": "Tenant",
      "prod": {
        "class": "Application",
        "template": "generic",
        "test-vs": {
          "class": "Service_HTTP",
          "iRules": ["testirulename"],
          "virtualAddresses": ["1.2.3.4"]
        },
        "testirulename": {
          "class": "iRule",
          "iRule": "when CLIENT_ACCEPTED { if { [class match \"test-key\" equals DoesNotExist] } { log local0. matching } }"
        }
      }
    }
  }
}
Results from using the bigip_as3_deploy module:
The full traceback is:
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 382, in main
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 205, in exec_module
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 223, in present
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 217, in upsert
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 298, in upsert_on_device
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_7lgru3rb/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 308, in wait_for_task
fatal: [se1cideltm01]: FAILED! => {
"changed": false,
"invocation": {
"module_args": {
"content": {
"action": "deploy",
"class": "AS3",
"declaration": {
"class": "ADC",
"id": "irule-test",
"irule-test": {
"class": "Tenant",
"prod": {
"class": "Application",
"template": "generic",
"test-vs": {
"class": "Service_HTTP",
"iRules": [
"testirulename"
],
"virtualAddresses": [
"1.2.3.4"
]
},
"testirulename": {
"class": "iRule",
"iRule": "when CLIENT_ACCEPTED { if { [class match \"test-key\" equals DoesNotExist] } { log local0. matching } }"
}
}
},
"schemaVersion": "3.26.0"
},
"persist": true
},
"state": "present",
"tenant": null,
"timeout": 300
}
},
"msg": "declaration failed"
}
And here are the results from curl:
$ curl -d @/tmp/test-as3.json -k -u 'admin:123456789' -s https://bigip/mgmt/shared/appsvcs/declare | jq .
{
  "results": [
    {
      "code": 422,
      "message": "declaration failed",
      "response": "01070151:3: Rule [/irule-test/prod/testirulename] error: Unable to find value_list (DoesNotExist) referenced at line 1: [class match \\\"test-key\\\" equals DoesNotExist]",
      "host": "localhost",
      "tenant": "irule-test",
      "runTime": 1071
    }
  ],
  "declaration": {
    "class": "ADC",
    "id": "irule-test",
    "schemaVersion": "3.26.0",
    "updateMode": "selective",
    "controls": {
      "archiveTimestamp": "2021-05-19T17:17:22.989Z"
    }
  },
  "code": 422
}
There's a nice error message telling me exactly what the problem is.
Please include the response/error message in the module error output when it fails.
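One possible shape for that improvement, sketched in plain Python (the helper name is an assumption; the field names match the AS3 response shown above):

```python
def as3_error_details(task_response):
    """Collect human-readable error details from an AS3 task response.

    AS3 reports per-tenant outcomes in a 'results' list; on failure the
    useful text lives in 'response', with 'message' as a fallback.
    """
    details = []
    for result in task_response.get("results", []):
        if result.get("code", 0) >= 400:
            text = result.get("response") or result.get("message", "unknown error")
            details.append("tenant %s: %s" % (result.get("tenant", "?"), text))
    return details
```

The module could append this to its failure message instead of raising the bare "declaration failed".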
Ansible module: bigip_sslo_config_policy
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Dockerfile:
FROM ubuntu:20.04
# Install components
RUN apt-get update && apt-get -y upgrade \
&& DEBIAN_FRONTEND="noninteractive" TZ="America/New_York" apt-get install -y tzdata \
&& ln -fs /usr/share/zoneinfo/America/New_York /etc/localtime \
&& dpkg-reconfigure --frontend noninteractive tzdata \
&& apt-get install -y gnupg software-properties-common curl awscli git python3-pip \
&& pip3 install ansible f5-sdk bigsuds netaddr objectpath isoparser lxml deepdiff \
&& curl -fsSL https://apt.releases.hashicorp.com/gpg | apt-key add - \
&& apt-add-repository "deb [arch=amd64] https://apt.releases.hashicorp.com $(lsb_release -cs) main" \
&& apt-get update && apt-get install -y terraform
# Configure Ansible
SHELL ["/bin/bash", "-c"]
RUN mkdir -p /sslo/ansible && cd /sslo/ansible \
&& mkdir -p inventory/{group_vars,host_vars} \
&& mkdir -p {library/modules,playbooks,files,roles,scripts,templates} \
&& touch {ansible.cfg,inventory/group_vars/all.yaml,inventory/host_vars/host1.yaml,playbooks/site.yaml,inventory/hosts} \
&& echo $'[defaults]\nhost_key_checking = False\nretry_files_enabled = False\ninventory = ./inventory/hosts\nlibrary = ./library\nroles_path = ./roles\ncollections_paths = ./collection\n' > ansible.cfg \
&& echo $'[all]\nlocalhost' > inventory/hosts \
&& ansible-galaxy collection install f5networks.f5_bigip
WORKDIR /sslo
The service_chain option requires explicit knowledge of the full object name (the ssloSC_ prefix). This will be confusing to users, as they won't know that SSLO has prepended the ssloSC_ prefix to the service chain name.
No options in the policy module to create/re-create this behavior.
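A small, hypothetical helper showing how the module could accept either form (the ssloSC_ prefix is taken from the issue; the function is illustrative, not part of the collection):

```python
def sslo_service_chain_ref(name, prefix="ssloSC_"):
    """Return the full SSLO service-chain object name.

    Adds the prefix SSLO uses internally unless the caller already
    supplied it, so users can pass the short name they created.
    """
    if name.startswith(prefix):
        return name
    return prefix + name
```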
bigip_as3_deploy
ansible 2.9.21
config file = /home/nahun/Documents/repo/ansible.cfg
configured module search path = ['/home/nahun/Documents/repo/modules']
ansible python module location = /home/nahun/Documents/ans-net-venv37/lib/python3.7/site-packages/ansible
executable location = /home/nahun/Documents/ans-net-venv37/bin/ansible
python version = 3.7.10 (default, May 17 2021, 18:26:26) [GCC 9.3.0]
I tried with Ansible 2.10.9 as well.
Sys::Version
Main Package
Product BIG-IP
Version 15.1.2.1
Build 0.0.10
Edition Point Release 1
Date Fri Jan 15 13:43:15 PST 2021
When removing an AS3 tenant using the bigip_as3_deploy
module, it throws an error. The module does finish removing the tenant from the BIG-IP, but still errors out.
Delete any AS3 declaration with state set to absent:
- name: delete as3 declaration
  f5networks.f5_bigip.bigip_as3_deploy:
    tenant: thetenant
    state: absent
This is due to a bug in the remove_from_device()
function: the period and interval parameters passed to wait_for_task
are swapped. Just swapping the period
and interval
arguments fixes it.
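A minimal reconstruction of the failure mode (the function and parameter names come from the traceback; the body is an assumption): the retry count must be an integer, so passing the float poll interval in its place makes range() fail.

```python
import time

def wait_for_task(task_id, interval, period, check=lambda task_id: True):
    """Poll `check` up to `period` times, sleeping `interval` seconds between tries."""
    for _ in range(0, period):   # `period` must be an integer retry count
        if check(task_id):
            return True
        time.sleep(interval)     # `interval` may be a fractional delay
    return False
```

Calling it with the two arguments swapped, e.g. wait_for_task(task, 3, 0.5), hands range() a float and raises the "'float' object cannot be interpreted as an integer" TypeError seen below.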
The full traceback is:
Traceback (most recent call last):
File "/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py", line 102, in <module>
_ansiballz_main()
File "/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_as3_deploy', init_globals=None, run_name='__main__', alter_sys=True)
File "/home/nahun/apps/lib/python3.7/runpy.py", line 205, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/home/nahun/apps/lib/python3.7/runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "/home/nahun/apps/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 389, in <module>
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 382, in main
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 207, in exec_module
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 227, in absent
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 233, in remove
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 343, in remove_from_device
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 303, in wait_for_task
TypeError: 'float' object cannot be interpreted as an integer
failed: [se1cideltm01] (item=pdj) => {
"ansible_loop_var": "item",
"changed": false,
"item": "pdj",
"module_stderr": "Traceback (most recent call last):\n File \"/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py\", line 102, in <module>\n _ansiballz_main()\n File \"/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py\", line 94, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File \"/home/nahun/.ansible/tmp/ansible-local-13381709y2pivz5/ansible-tmp-1621449542.7987194-1338205-137511340990263/AnsiballZ_bigip_as3_deploy.py\", line 40, in invoke_module\n runpy.run_module(mod_name='ansible_collections.f5networks.f5_bigip.plugins.modules.bigip_as3_deploy', init_globals=None, run_name='__main__', alter_sys=True)\n File \"/home/nahun/apps/lib/python3.7/runpy.py\", line 205, in run_module\n return _run_module_code(code, init_globals, run_name, mod_spec)\n File \"/home/nahun/apps/lib/python3.7/runpy.py\", line 96, in _run_module_code\n mod_name, mod_spec, pkg_name, script_name)\n File \"/home/nahun/apps/lib/python3.7/runpy.py\", line 85, in _run_code\n exec(code, run_globals)\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 389, in <module>\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 382, in main\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 207, in exec_module\n File 
\"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 227, in absent\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 233, in remove\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 343, in remove_from_device\n File \"/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_66emu0b3/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py\", line 303, in wait_for_task\nTypeError: 'float' object cannot be interpreted as an integer\n",
"module_stdout": "",
"msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
"rc": 1
}
bigip_sslo_config_topology
ansible [core 2.12.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3.2
Build 0.0.4
Edition Point Release 2
Date Wed Sep 14 08:12:07 PDT 2022
9.3.41
No specific system/ansible configuration changes
Ubuntu 20.04
Python 3.8.10
On deploy of bigip_sslo_config_topology, the following fatal error is reported:
fatal: [172.16.1.83]: FAILED! => {"changed": false, "msg": "CREATE operation error: 8eac8010-3857-4762-a74c-e57e69a70579 : [OrchestratorConfigProcessor] Deployment failed for Error: [BaseHAConfigProcessor (TopologyBaseHAConfigProcessor)] Error: Failed to resolve reference property appId = f5-ssl-orchestrator-tls, propertyValue: l"}
---
# Reference: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/bigip_sslo_config_topology_module.html#bigip-sslo-config-topology-module-2
- name: Create SSLO Outbound L3 Topology Configuration
  hosts: all
  gather_facts: False
  collections:
    - f5networks.f5_bigip
  connection: httpapi
  vars:
    #ansible_host: "172.16.1.83"
    ansible_httpapi_port: 443
    ansible_user: "admin"
    ansible_httpapi_password: "admin"
    ansible_network_os: f5networks.f5_bigip.bigip
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    ## Topology
    - name: Create an SSLO Outbound L3 Topology
      bigip_sslo_config_topology:
        name: "demo_out_L3"
        state: "present"
        topology_type: "outbound_l3"
        ssl_settings: "sslconfig"
        security_policy: "sslopolicy"
        vlans:
          - "/Common/client-vlan"
        snat: "automap"
        gateway: "iplist"
        gateway_list:
          - ip: "172.16.0.1"
bigip_as3_deploy
ansible [core 2.11.3]
config file = /home/horol/DEV/mvsr/mvdc-ansible/ansible.cfg
configured module search path = ['/home/horol/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/horol/VIRTUALENV/ansible2/lib/python3.8/site-packages/ansible
ansible collection location = /home/horol/.ansible/collections/ansible_collections:/usr/share/ansible/collections
executable location = /home/horol/VIRTUALENV/ansible2/bin/ansible
python version = 3.8.10 (default, Nov 26 2021, 20:14:08) [GCC 9.3.0]
jinja version = 3.0.1
libyaml = False
Sys::Version
Main Package
Product BIG-IP
Version 15.1.4
Build 0.0.47
Edition Final
Date Wed Aug 18 16:45:18 PDT 2021
Ubuntu 20.04.3 LTS
This is an AS3 JSON configuration for testing purposes. Deploying this JSON works (using the bigip_as3_deploy module), but I can't delete it using the same module (see 'STEPS TO REPRODUCE').
{
  "class": "AS3",
  "action": "deploy",
  "declaration": {
    "class": "ADC",
    "schemaVersion": "3.32.0",
    "id": "tpl-1.0",
    "label": "",
    "remark": "General template",
    "updateMode": "selective",
    "test-as3-1": {
      "class": "Tenant",
      "defaultRouteDomain": 5,
      "enable": true,
      "appA": {
        "class": "Application",
        "template": "generic",
        "web01-81": {
          "class": "Service_HTTP",
          "virtualPort": 81,
          "virtualAddresses": ["10.99.1.1"],
          "pool": "sf_web01"
        },
        "web02-80": {
          "class": "Service_HTTP",
          "virtualPort": 80,
          "virtualAddresses": ["10.99.1.2"],
          "pool": "sf_web01"
        },
        "sf_web01": {
          "class": "Pool",
          "monitors": [
            "http",
            {
              "use": "mon-http_web01"
            }
          ],
          "members": [
            {
              "servicePort": 80,
              "serverAddresses": ["192.99.1.1", "192.99.1.2"]
            }
          ]
        },
        "mon-http_web01": {
          "class": "Monitor",
          "monitorType": "http",
          "send": "HEAD / HTTP/1.0\\r\\n\\r\\n",
          "receive": "HTTP/1."
        }
      }
    }
  }
}
Unable to remove the tenant using the bigip_as3_deploy
module.
Removing the tenant with a REST call (e.g. Postman) works: DELETE https://bigip/mgmt/shared/appsvcs/declare/test-as3-1
or DELETE https://bigip/mgmt/shared/appsvcs/declare/test-as3-1?async=true
.
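For reference, the working REST call can be reduced to a URL builder (hypothetical helper; the endpoints match the DELETE calls above):

```python
def as3_delete_url(host, tenant, asynchronous=False):
    """Build the AS3 per-tenant DELETE endpoint used in the manual test above."""
    url = "https://%s/mgmt/shared/appsvcs/declare/%s" % (host, tenant)
    if asynchronous:
        url += "?async=true"
    return url
```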
- name: "Declaration DELETE"
  f5networks.f5_bigip.bigip_as3_deploy:
    tenant: "test-as3-1"
    state: absent
  tags: delete
I expect that the tenant should be removed correctly.
TASK [F5AS3-02: Declaration DELETE] ******************************************************************************************
task path: /home/horol/DEV/mvsr/mvdc-ansible/playbooks/f5as3.pb.yaml:82
The full traceback is:
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_rhg4qr9e/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 382, in main
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_rhg4qr9e/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 207, in exec_module
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_rhg4qr9e/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 227, in absent
File "/tmp/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload_rhg4qr9e/ansible_f5networks.f5_bigip.bigip_as3_deploy_payload.zip/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py", line 235, in remove
fatal: [dca_bigip_guesttst_02]: FAILED! => {
"ansible_facts": {
"discovered_interpreter_python": "/usr/bin/python3"
},
"changed": false,
"invocation": {
"module_args": {
"content": null,
"state": "absent",
"tenant": "test-as3-1",
"timeout": 300
}
},
"msg": "Failed to delete the resource."
}
bigip_ucs_fetch
TOWER 3.8.4, ANSIBLE 2.9.27
BIGIP 14.1.2.6
Using a single playbook where I have successfully used v1 modules from the respective collection, I am trying to selectively use v2 modules for particular tasks, without much success. I am setting the respective variables as follows:
Then I am invoking the bigip_ucs_fetch as follows:
However, I receive a server code 400 error with 'Authentication process failed'. The originalRequestBody was populated with the following:
{"username": redacted, "password": redacted, "loginProviderName": {"server_port": 443, "server": redacted, "user": redacted, "timeout": 300, "password": redacted, "validate_certs": "no"}}
There is a Java exception error with 'Expected a string but was BEGIN_OBJECT at line 1 column 98 path $.loginProviderName'.
What would be the correct format for the request body for v2 module usage here, and which variables am I missing? Can I even have a playbook leveraging both v1 and v2 modules?
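For what it's worth, the v2 (f5networks.f5_bigip) modules take connection details from httpapi connection variables rather than a provider dict; a sketch based on the httpapi playbooks elsewhere in this thread (values are placeholders):

```yaml
ansible_connection: httpapi
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
ansible_httpapi_port: 443
ansible_user: "admin"
ansible_httpapi_password: "secret"
```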
Ansible module: bigip_sslo_config_topology
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
Pool assignment for an inbound L3 topology doesn't seem to work. No pool is assigned to the topology VIP.
Using this declaration:
- name: Create SSLO Topology
  bigip_sslo_config_topology:
    name: "l3inboundapp"
    topology_type: "inbound_l3"
    dest: "10.0.2.200/32"
    port: 443
    ssl_settings: "sslconfig"
    security_policy: "ssloP_sslopolicy"
    vlans:
      - "/Common/external"
    snat: "automap"
    pool: "/Common/webapp"
The webapp pool should be assigned to the topology. Also, as a function of an "application mode" inbound topology, a pool is assigned and address/port translation is enabled.
The webapp pool is not assigned to the topology, and address/port translation is not enabled.
bigip_sslo_service_layer3
>= 9.3
The Inline L3 service definition does not provide a vendor_info option, as described in: f5-ssl-orchestrator-service (Inline Layer 3)
None
Ansible module: bigip_sslo_config_policy
ansible [core 2.12.4]
config file = /sslo/ansible/ansible.cfg
configured module search path = ['/sslo/ansible/library']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /sslo/ansible/collection
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.1
Build 0.0.6
Edition Point Release 0
Date Thu Oct 8 02:52:59 PDT 2020
ansible.cfg:
[defaults]
host_key_checking = False
retry_files_enabled = False
inventory = ./inventory/hosts
library = ./library
roles_path = ./roles
collections_paths = ./collection
Running Ansible inside Ubuntu:20.04 Docker container
fatal: [localhost]: FAILED! => {"changed": false, "module_stderr": "'pools'", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"}
---
- name: Modify SSLO Policy
  hosts: all
  gather_facts: False
  collections:
    - f5networks.f5_bigip
  connection: httpapi
  vars:
    ansible_host: "{{ansible_host}}"
    ansible_httpapi_port: 443
    ansible_user: "admin"
    ansible_httpapi_password: "{{ansible_httpapi_password}}"
    ansible_network_os: f5networks.f5_bigip.bigip
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    ## Security Policy
    - name: Modify SSLO policy
      bigip_sslo_config_policy:
        name: "sslopolicy"
        policy_consumer: "inbound"
        policy_rules:
          - name: "us_traffic"
            policy_action: "reject"
            #ssl_forwardproxy_action: "intercept"
            service_chain: "ssloSC_service_chain_1"
            conditions:
              - condition_type: "client_ip_geolocation"
                geolocations:
                  - type: "countryCode"
                    value: "US"
          - name: "all_ssl_traffic"
            policy_action: "allow"
            ssl_forwardproxy_action: "intercept"
            service_chain: "ssloSC_service_chain_2"
            conditions:
              - condition_type: "server_port_match"
                condition_option_ports:
                  - "443"
The expected result is an updated policy.
The above always returns an error message about 'pools'.
bigip_sslo_service_tap
ansible [core 2.12.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3.2
Build 0.0.4
Edition Point Release 2
Date Wed Sep 14 08:12:07 PDT 2022
9.3.41
No specific system/ansible configuration changes
Ubuntu 20.04
Python 3.8.10
The TAP service definition defaults to enabling port remap. This option should default to no port remap if not specified.
---
# Reference: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/bigip_sslo_service_tap_module.html#bigip-sslo-service-tap-module-2
- name: Create SSLO TAP Service Configuration
  hosts: all
  gather_facts: False
  collections:
    - f5networks.f5_bigip
  connection: httpapi
  vars:
    #ansible_host: "172.16.1.83"
    ansible_httpapi_port: 443
    ansible_user: "admin"
    ansible_httpapi_password: "admin"
    ansible_network_os: f5networks.f5_bigip.bigip
    ansible_httpapi_use_ssl: yes
    ansible_httpapi_validate_certs: no
  tasks:
    - name: SSLO TAP service with interface
      bigip_sslo_service_tap:
        name: "tap1"
        state: "present"
        devices:
          interface: "1.7"
          mac_address: "12:12:12:12:12:12"
bigip_sslo_config_policy
ansible [core 2.12.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3.2
Build 0.0.4
Edition Point Release 2
Date Wed Sep 14 08:12:07 PDT 2022
9.3.41
No specific system/ansible configuration changes
Ubuntu 20.04
Python 3.8.10
An SSLO security policy playbook will fail if the security policy already exists. This does not match the behavior of the other SSLO modules.
fatal: [172.16.1.83]: FAILED! => {"changed": false, "module_stderr": "'conditions'", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"}
---
# Reference: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/bigip_sslo_config_policy_module.html#bigip-sslo-config-policy-module-2
- name: Create SSLO Outbound Security Policy Configuration
hosts: all
gather_facts: False
collections:
- f5networks.f5_bigip
connection: httpapi
vars:
#ansible_host: "172.16.1.83"
ansible_httpapi_port: 443
ansible_user: "admin"
ansible_httpapi_password: "admin"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
tasks:
## Security Policy
- name: Create an SSLO security policy
bigip_sslo_config_policy:
name: "sslopolicy"
state: "absent"
policy_consumer: "outbound"
default_rule:
allow_block: "allow"
tls_intercept: "intercept"
service_chain: "service_chain_2"
policy_rules:
- name: "Pinners_Rule"
match_type: "match_all"
policy_action: "allow"
ssl_action: "bypass"
conditions:
- condition_type: "ssl_check"
- condition_type: "category_lookup_sni"
condition_option_category:
- "Pinners"
- name: "bypass_pii_traffic"
policy_action: "allow"
ssl_action: "bypass"
service_chain: "service_chain_1"
conditions:
- condition_type: "category_lookup_sni"
condition_option_category:
- "Financial Data and Services"
Hi, I have tested this new Ansible v2 version, but ran into issues talking to BIG-IQ.
Talking to the BIG-IP it seems to work fine, but to BIG-IQ it does not.
I can deploy and patch via the API, using Ansible or PHP, and using the old ATC module found on the DevCentral GitHub (also via the API).
But using the same JSON file with the new module towards BIG-IQ, I can't get it to work,
so for now I'm using the normal API functionality in Ansible to talk to BIG-IQ.
With the files (playbook and JSON declaration) below, I get the following message:
fatal: [localhost]: FAILED! => {"changed": false, "msg": "{'code': 422, 'message': 'status:422, body:{\"code\":422,\"message\":\"Invalid request value \\'[object Object]\\' (path: /declaration) : should have required property \\'class\\' {\\\\\"missingProperty\\\\\":\\\\\"class\\\\\"}\"}', 'originalRequestBody': '{\"code\":422,\"message\":\"Invalid request value \\'[object Object]\\' (path: /declaration) : should have required property \\'class\\' {\\\\\"missingProperty\\\\\":\\\\\"class\\\\\"}\"}', 'referer': '10.10.0.xxx', 'restOperationId': 57963195, 'errorStack': [], 'kind': ':resterrorresponse'}"}
I'm deploying from Ansible to BIG-IQ, which then pushes to the target BIG-IP.
Exactly the same JSON file works when issued as a direct API call (and it does not matter whether the call comes from Ansible, a PHP script, or even from within VS Code).
So somehow the v2 Ansible module expects a different JSON format, but I can't figure out what it is.
Do you have any clue, or could there be a fault in the BIG-IQ part of the plugin?
Stripping out the application name at the top, and the target part, lets me push this JSON file with the v2 Ansible module to the BIG-IP.
So it is only BIG-IQ related.
Please let me know your thoughts.
- hosts: all
collections:
- f5networks.f5_bigip
connection: httpapi
vars:
ansible_host: "10.10.0.xxx"
ansible_user: "admin"
ansible_httpapi_password: "xxxxx"
ansible_network_os: f5networks.f5_bigip.bigiq
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
tasks:
- name: Declaration test
bigiq_as3_deploy:
content: "{{ lookup('file', '1.json') }}"
#service_type: "as3"
{
"applicationName": "O-E",
"appSvcsDeclaration": {
"class": "AS3",
"action": "deploy",
"declaration": {
"class": "ADC",
"schemaVersion": "3.31.0",
"target": {
"address": "10.10.0.xxx"
},
"pco": {
"class": "Tenant",
"vip1_443": {
"class": "Application",
"template": "https",
"serviceMain": {
"pool": {
"use": "pl_vip1_80"
},
"snat": "auto",
"enable": true,
"iRules": [
{
"use": "iRule_vip1_443"
}
],
"remark": "lekker hoor",
"serverTLS": "clssl_vip1",
"profileTCP": {
"use": "tcp_vip1_443"
},
"virtualPort": 443,
"profileHTTP": {
"use": "http_vip1_443"
},
"virtualAddresses": [
"10.10.10.1"
],
"persistenceMethods": [
{
"use": "sticky-default_vip1"
}
],
"fallbackPersistenceMethod": {
"use": "sticky-fallback_vip1"
},
"serviceDownImmediateAction": "drop",
"class": "Service_HTTPS"
},
"mon_vip1_80": {
"send": "GET / HTTP/1.0\r\n\r\n",
"receive": "",
"receiveDown": "",
"class": "Monitor",
"monitorType": "http"
},
"pl_vip1_80": {
"members": [
{
"adminState": "enable",
"servicePort": 80,
"serverAddresses": [
"192.168.4.10"
]
}
],
"monitors": [
{
"use": "mon_vip1_80"
}
],
"class": "Pool"
},
"iRule_vip1_443": {
"iRule": {
"text": "when HTTP_REQUEST { log local0. \"test irule\" }"
},
"class": "iRule"
},
"sticky-default_vip1": {
"persistenceMethod": "cookie",
"class": "Persist"
},
"sticky-fallback_vip1": {
"persistenceMethod": "source-address",
"class": "Persist"
},
"clssl_vip1": {
"certificates": [
{
"certificate": "crt_vip1"
}
],
"class": "TLS_Server"
},
"crt_vip1": {
"privateKey": {
"bigip": "/Common/default.key"
},
"certificate": {
"bigip": "/Common/default.crt"
},
"class": "Certificate"
},
"tcp_vip1_443": {
"class": "TCP_Profile"
},
"http_vip1_443": {
"class": "HTTP_Profile"
}
}
}
}
}
}
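The 422 error above complains that `/declaration` is missing a `class` property, which suggests the outer wrapper object is being posted instead of the AS3 payload. As a hedged workaround sketch (not a confirmed fix; the `appSvcsDeclaration` key is taken from the JSON above), the inner AS3 declaration could be extracted before handing it to `bigiq_as3_deploy`:

```python
import json

def unwrap_as3(wrapper: dict) -> dict:
    """Return the inner AS3 declaration from a BIG-IQ wrapper object.

    If the object already starts with the AS3 class, it is returned as-is.
    """
    inner = wrapper.get("appSvcsDeclaration", wrapper)
    if inner.get("class") != "AS3":
        raise ValueError("declaration does not start with the AS3 class")
    return inner

# Minimal stand-in for the 1.json wrapper shown above.
wrapper = {
    "applicationName": "O-E",
    "appSvcsDeclaration": {
        "class": "AS3",
        "action": "deploy",
        "declaration": {"class": "ADC"},
    },
}
as3_only = unwrap_as3(wrapper)
print(json.dumps(as3_only))
```

The unwrapped JSON could then be written to a separate file and passed to the module's `content` parameter instead of the original wrapper.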
I've tried a number of playbooks (ucs_fetch, bigip_qkview, both imperative and declarative) and I get errors. If I revert to 15.1.8 they work fine.
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: urllib.error.URLError:
fatal: [f5 -> localhost]: FAILED! => changed=false
module_stderr: |-
Traceback (most recent call last):
File "/usr/local/lib/python3.8/urllib/request.py", line 1354, in do_open
h.request(req.get_method(), req.selector, req.data, headers,
File "/usr/local/lib/python3.8/http/client.py", line 1256, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/local/lib/python3.8/http/client.py", line 1302, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/local/lib/python3.8/http/client.py", line 1251, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/local/lib/python3.8/http/client.py", line 1011, in _send_output
self.send(msg)
File "/usr/local/lib/python3.8/http/client.py", line 951, in send
self.connect()
File "/usr/local/lib/python3.8/http/client.py", line 1418, in connect
super().connect()
File "/usr/local/lib/python3.8/http/client.py", line 922, in connect
self.sock = self._create_connection(
File "/usr/local/lib/python3.8/socket.py", line 808, in create_connection
raise err
File "/usr/local/lib/python3.8/socket.py", line 796, in create_connection
sock.connect(sa)
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/centos/.ansible/tmp/ansible-tmp-1669045074.1295142-18969-255249079795967/AnsiballZ_bigip_qkview.py", line 107, in <module>
_ansiballz_main()
File "/home/centos/.ansible/tmp/ansible-tmp-1669045074.1295142-18969-255249079795967/AnsiballZ_bigip_qkview.py", line 99, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/home/centos/.ansible/tmp/ansible-tmp-1669045074.1295142-18969-255249079795967/AnsiballZ_bigip_qkview.py", line 47, in invoke_module
runpy.run_module(mod_name='ansible_collections.f5networks.f5_modules.plugins.modules.bigip_qkview', init_globals=dict(_module_fqn='ansible_collections.f5networks.f5_modules.plugins.modules.bigip_qkview', _modlib_path=modlib_path),
File "/usr/local/lib/python3.8/runpy.py", line 207, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/local/lib/python3.8/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/local/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_qkview.py", line 607, in <module>
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_qkview.py", line 600, in main
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_qkview.py", line 224, in exec_module
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/modules/bigip_qkview.py", line 237, in is_version_less_than_14
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/module_utils/icontrol.py", line 551, in tmos_version
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/module_utils/bigip.py", line 31, in api
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/module_utils/bigip.py", line 52, in connect_via_token_auth
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/module_utils/icontrol.py", line 239, in post
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible_collections/f5networks/f5_modules/plugins/module_utils/icontrol.py", line 194, in send
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible/module_utils/urls.py", line 1446, in open
File "/usr/local/lib/python3.8/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/usr/local/lib/python3.8/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/usr/local/lib/python3.8/urllib/request.py", line 542, in _open
result = self._call_chain(self.handle_open, protocol, protocol +
File "/usr/local/lib/python3.8/urllib/request.py", line 502, in _call_chain
result = func(*args)
File "/tmp/ansible_bigip_qkview_payload_swf1ab3w/ansible_bigip_qkview_payload.zip/ansible/module_utils/urls.py", line 582, in https_open
File "/usr/local/lib/python3.8/urllib/request.py", line 1357, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error timed out>
module_stdout: ''
msg: |-
MODULE FAILURE
See stdout/stderr for the exact error
rc: 1
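The traceback above bottoms out in a plain socket timeout while the module obtains an auth token, so it is worth ruling out basic reachability of the management port before suspecting the collection. A small sketch (the address is a placeholder) that reproduces just the TCP connect the module attempts:

```python
import socket

def can_reach(host: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers timeouts and connection refusals
        return False

# Placeholder address; substitute the BIG-IP management IP here.
print(can_reach("192.0.2.10", 443, timeout=1.0))
```

If this returns False from the Ansible controller, the failure is a network/firewall issue rather than a module regression in 16.x.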
- name: Only create new UCS, no download
  hosts: f5
  gather_facts: false
  connection: local
  vars:
    provider:
      password: "{{ ansible_ssh_pass }}"
      server: "{{ private_ip }}"
      user: "{{ ansible_user }}"
      validate_certs: "no"
      timeout: 1600
  tasks:

- name: Fetch BIG-IP UCS
  hosts: f5
  connection: local
  gather_facts: False
  vars:
    provider:
      password: "{{ ansible_ssh_pass }}"
      server: "{{ private_ip }}"
      user: "{{ ansible_user }}"
      validate_certs: False
  tasks:
    - name: get current time on localhost
      command: date "+%H%M%S-%m%d%y"
      register: date
      delegate_to: localhost
      run_once: True

    - name: set filename var
      set_fact:
        ucs: "{{ 'test' + '-' + date.stdout + '-backup.ucs' }}"

    - name: Download a new UCS
      bigip_ucs_fetch:
        src: "{{ ucs }}"
        dest: "{{ '/tmp/ucs/' + ucs }}"
        provider: "{{ provider }}"
      delegate_to: localhost
Ansible module: bigiq_as3_deploy
ansible [core 2.12.3]
config file = /opt/ansible/win/ansible.cfg
configured module search path = ['/opt/ansible/shared/library']
ansible python module location = /opt/venvs/ansible50/lib/python3.8/site-packages/ansible
ansible collection location = /opt:/usr/share/ansible/collections
executable location = /opt/venvs/ansible50/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.0.3
libyaml = True
BIG-IP 14.1.4.5 Build 0.0.7 Point Release 5
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=20.04
DISTRIB_CODENAME=focal
DISTRIB_DESCRIPTION="Ubuntu 20.04.4 LTS"
f5networks.bigiq_as3_deploy
module content field that does not comply with the schema overlay that's being used
Task fails and prints the results to console
For example,
TASK FAILED =>
{'id': 'f837b0c2-b03e-4efe-8df5-45aaff82248c', 'results': [{'code': 422, 'declarationFullId': '', 'message': 'declaration is invalid according to provided schema overlay: data should NOT have additional properties'}], 'declaration': {}}
Task succeeds and playbook continues running. No error messages are displayed
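Until the module itself fails on a 422 result, the registered output can be checked explicitly (in a playbook this same test would go into a `failed_when` expression on the registered variable). A sketch of the check as pure logic, using the result dict mirroring the example output above:

```python
def deploy_failed(result: dict) -> bool:
    """True if any entry in the results list carries an HTTP error code."""
    return any(r.get("code", 0) >= 400 for r in result.get("results", []))

# The 422 schema-overlay rejection from the example task output above.
result = {
    "id": "f837b0c2-b03e-4efe-8df5-45aaff82248c",
    "results": [{
        "code": 422,
        "declarationFullId": "",
        "message": "declaration is invalid according to provided schema "
                   "overlay: data should NOT have additional properties",
    }],
    "declaration": {},
}
print(deploy_failed(result))
```

The equivalent playbook guard would be roughly `failed_when: deploy_result.msg.results | selectattr('code', 'ge', 400) | list | length > 0`, assuming the module returns the results list in its output.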
bigip_sslo_service_layer2
ansible [core 2.12.5]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/dist-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.8.10 (default, Mar 15 2022, 12:22:08) [GCC 9.4.0]
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 16.1.3.2
Build 0.0.4
Edition Point Release 2
Date Wed Sep 14 08:12:07 PDT 2022
9.3.41
No specific system/ansible configuration changes
Ubuntu 20.04
Python 3.8.10
Fatal error when trying to configure an inline L2 service
fatal: [172.16.1.83]: FAILED! => {"changed": false, "module_stderr": "'dict object' has no attribute 'service_subnet'", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error"}
---
# Reference: https://clouddocs.f5.com/products/orchestration/ansible/devel/f5_bigip/modules_2_0/bigip_sslo_service_layer2_module.html#bigip-sslo-service-layer2-module-2
- name: Create SSLO Inline L2 Service Configuration
hosts: all
gather_facts: False
collections:
- f5networks.f5_bigip
connection: httpapi
vars:
#ansible_host: "172.16.1.83"
ansible_httpapi_port: 443
ansible_user: "admin"
ansible_httpapi_password: "admin"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
tasks:
## Inline L2 service
- name: Create an SSLO L2 service
bigip_sslo_service_layer2:
name: "FEYE"
devices:
- name: "FEYE1"
ratio: 1
interface_in: "1.4"
interface_out: "1.5"
port_remap: 8080
When AS3 JSON file is deployed with f5networks.f5_bigip.bigip_as3_deploy module, the "error response" (result of the REST call in case of error) is not detailed. Example:
TASK [F5AS3-DEPLOY-02-021: Deploy JSON file to device] ******************************
skipping: [dca_bigip_guesttst_01]
fatal: [dca_bigip_guesttst_02]: FAILED! => {"changed": false, "msg": "declaration failed"}
When the same JSON file is deployed using Postman, the response is much clearer:
{
"code": 422,
"errors": [
"/test-as3-01/appB/vs_web2-443/profileTCP: should NOT have fewer than 2 properties"
],
"declarationFullId": "",
"message": "declaration is invalid"
}
Is it possible to get the same result from the Ansible module? I'd like to see what is happening, not only "declaration failed".
A working alternative to the bigip_as3_deploy module is the builtin uri module:
## 02 Deploy
- block:
# ## 02-01 Deploy JSON file to device
# - name: "{{pb_prfx}}-02-01: Deploy JSON file to device (bigip_as3_deploy module)"
# f5networks.f5_bigip.bigip_as3_deploy:
# content: "{{ lookup('file', JSON_FILE ) }}"
# register: DEPLOY_RESULT
# when:
# - active_member is defined
# - active_member == true
# tags:
# - deploy
# - deploy-debug
- name: "{{pb_prfx}}-02-02: Deploy JSON file to device (uri module)"
ansible.builtin.uri:
url: "https://{{ bigip_host }}:{{ provider.server_port }}/mgmt/shared/appsvcs/declare"
method: POST
body: "{{ lookup('file', JSON_FILE ) }}"
body_format: json
status_code: [200, 201, 202, 422]
timeout: 300
force_basic_auth: yes
user: "{{ provider.user }}"
password: "{{ provider.password }}"
validate_certs: "{{ provider.validate_certs }}"
register: DEPLOY_RESULT
failed_when: DEPLOY_RESULT.status != 200
no_log: true
when:
- active_member is defined
- active_member == true
tags:
- deploy
- deploy-debug
rescue:
- name: "(debug) ERROR: Deploy error"
debug:
msg:
- "RESULTS: {{ DEPLOY_RESULT }}"
tags:
- deploy-debug
- name: "ERROR: Deploy error"
debug:
msg:
- "msg: {{ DEPLOY_RESULT.msg }}"
- "results: {{ DEPLOY_RESULT.json.results }}"
tags:
- deploy-debug
- deploy
bigip_do_deploy
ansible [core 2.14.1]
config file = None
configured module search path = ['/home/rnot/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/rnot/.local/lib/python3.10/site-packages/ansible
ansible collection location = /home/rnot/.ansible/collections:/usr/share/ansible/collections
executable location = /home/rnot/.local/bin/ansible
python version = 3.10.6 (main, Nov 14 2022, 16:10:14) [GCC 11.3.0] (/usr/bin/python3)
jinja version = 3.1.2
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 15.1.7
Build 0.0.6
Edition Final
Date Thu Jul 28 01:41:23 PDT 2022
N/A
N/A
I would like to send a DO configuration with dryRun set to true to determine if the DO configuration would cause any changes or not.
Example from here:
{
"schemaVersion": "1.23.0",
"class": "Device",
"async": true,
"label": "my BIG-IP declaration for declarative onboarding",
"controls": {
"trace": true,
"traceResponse": true,
"dryRun": true
},
"Common": {
"class": "Tenant",
"mySystem": {
"class": "System",
"hostname": "bigip.example.com",
"cliInactivityTimeout": 1200,
"consoleInactivityTimeout": 1200,
"autoPhonehome": false
}
}
}
Ansible tasks:
- name: Deploy DO configuration
f5networks.f5_bigip.bigip_do_deploy:
content: "{{ lookup('ansible.builtin.template', 'do.json.j2', template_vars=do_config | default({})) }}"
register: f5bigip_do_task
tags: f5bigip
- name: Get DO task status
f5networks.f5_bigip.bigip_do_deploy:
task_id: "{{ f5bigip_do_task.task_id }}"
register: f5bigip_do_details
tags: f5bigip
When dryRun is set to true (the default is false) BIG-IP Declarative Onboarding sends the declaration through all validation checks but does not attempt to deploy the configuration on the target device. The response contains information on what would have been deployed (a diff between the existing configuration and what the declaration would deploy). This can be useful for testing and debugging declarations.
Results seem to be the same whether or not dryRun is true or false.
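One way to narrow this down is to confirm the dryRun flag actually survives templating before the declaration is submitted. A minimal sketch, assuming the rendered template matches the example declaration above:

```python
import json

def is_dry_run(declaration: str) -> bool:
    """True if a DO declaration has controls.dryRun set to true."""
    doc = json.loads(declaration)
    return bool(doc.get("controls", {}).get("dryRun", False))

# Stand-in for the output of the do.json.j2 template above.
rendered = json.dumps({
    "schemaVersion": "1.23.0",
    "class": "Device",
    "async": True,
    "controls": {"trace": True, "traceResponse": True, "dryRun": True},
})
print(is_dry_run(rendered))
```

If the rendered content passes this check but the module output still shows no diff, the flag is likely being dropped or ignored on the module side rather than in the template.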
bigip_as3_deploy
ansible [core 2.11.4]
config file = /home/gerace/Documents/ansible/ansible.cfg
configured module search path = ['/home/gerace/Documents/ansible/library']
ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
ansible collection location = /home/gerace/Documents/ansible/collections
executable location = /usr/local/bin/ansible
python version = 3.6.8 (default, Nov 16 2020, 16:55:22) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
jinja version = 3.0.1
libyaml = True
Sys::Version
Main Package
Product BIG-IP
Version 13.1.3.4
Build 0.0.5
Edition Point Release 4
Date Tue Jun 16 14:26:18 PDT 2020
ansible_user: "admin"
ansible_httpapi_password: "{{ bigip_admin_pass }}"
ansible_network_os: f5networks.f5_bigip.bigip
ansible_httpapi_use_ssl: yes
ansible_httpapi_validate_certs: no
Attempting to remove all tenant-based applications, which also includes objects in /Common/shared.
The play runs, but nothing is changed on the BIG-IP, and nothing is logged in /var/log/restnoded/restnoded.log.
[gerace@europa ansible]$ ansible-playbook another_test.yml -e host=bigtest01
PLAY [test kubernetes api] ************************************************************************************************************************************************************
TASK [Remove all existing AS3 configurations] *****************************************************************************************************************************************
ok: [bigtest01]
PLAY RECAP ****************************************************************************************************************************************************************************
bigtest01 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Add multiple tenants to a device (I tested with 2 partitions and configs in Common/shared).
Use the bigip_as3_deploy module from the Ansible v2 collection to remove all tenants.
tasks:
- name: Remove all existing AS3 configurations
bigip_as3_deploy:
state: absent
tenant: all
I'd expect the configurations from Common/shared and the two other tenants to be removed.
Nothing happened on the device. Nothing was logged to restnoded.log.
[gerace@europa ansible]$ ansible-playbook another_test.yml -e host=bigtest01 -vvvv
ansible-playbook [core 2.11.4]
config file = /home/gerace/Documents/ansible/ansible.cfg
configured module search path = ['/home/gerace/Documents/ansible/library']
ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
ansible collection location = /home/gerace/Documents/ansible/collections
executable location = /usr/local/bin/ansible-playbook
python version = 3.6.8 (default, Nov 16 2020, 16:55:22) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
jinja version = 3.0.1
libyaml = True
Using /home/gerace/Documents/ansible/ansible.cfg as config file
setting up inventory plugins
host_list declined parsing /home/gerace/Documents/ansible/hosts as it did not pass its verify_file() method
script declined parsing /home/gerace/Documents/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /home/gerace/Documents/ansible/hosts as it did not pass its verify_file() method
Parsed /home/gerace/Documents/ansible/hosts inventory source with yaml plugin
Loading collection f5networks.f5_bigip from /home/gerace/Documents/ansible/collections/ansible_collections/f5networks/f5_bigip
Loading callback plugin default of type stdout, v2.0 from /usr/local/lib/python3.6/site-packages/ansible/plugins/callback/default.py
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: another_test.yml ************************************************************************************************************************************************************
Positional arguments: another_test.yml
verbosity: 4
connection: smart
timeout: 10
become_method: sudo
tags: ('all',)
inventory: ('/home/gerace/Documents/ansible/hosts',)
extra_vars: ('host=bigtest01',)
forks: 5
1 plays in another_test.yml
PLAY [test kubernetes api] ************************************************************************************************************************************************************
Trying secret FileVaultSecret(filename='/home/gerace/Documents/ansible/.vault-pass.txt') for vault_id=default
META: ran handlers
TASK [Remove all existing AS3 configurations] *****************************************************************************************************************************************
task path: /home/gerace/Documents/ansible/another_test.yml:23
redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
Loading collection ansible.netcommon from /home/gerace/Documents/ansible/collections/ansible_collections/ansible/netcommon
<192.168.0.15> attempting to start connection
<192.168.0.15> using connection plugin ansible.netcommon.httpapi
Found ansible-connection at path /usr/local/bin/ansible-connection
<192.168.0.15> local domain socket does not exist, starting it
<192.168.0.15> control socket path is /home/gerace/.ansible/pc/a52775cbff
<192.168.0.15> redirecting (type: connection) ansible.builtin.httpapi to ansible.netcommon.httpapi
<192.168.0.15> Loading collection ansible.netcommon from /home/gerace/Documents/ansible/collections/ansible_collections/ansible/netcommon
<192.168.0.15> Loading collection f5networks.f5_bigip from /home/gerace/Documents/ansible/collections/ansible_collections/f5networks/f5_bigip
<192.168.0.15> local domain socket listeners started successfully
<192.168.0.15> loaded API plugin ansible_collections.f5networks.f5_bigip.plugins.httpapi.bigip from path /home/gerace/Documents/ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/httpapi/bigip.py for network_os f5networks.f5_bigip.bigip
<192.168.0.15>
<192.168.0.15> local domain socket path is /home/gerace/.ansible/pc/a52775cbff
<192.168.0.15> Using network group action bigip for bigip_as3_deploy
<192.168.0.15> ANSIBLE_NETWORK_IMPORT_MODULES: disabled
<192.168.0.15> ANSIBLE_NETWORK_IMPORT_MODULES: module execution time may be extended
<192.168.0.15> ESTABLISH LOCAL CONNECTION FOR USER: gerace
<192.168.0.15> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_ `"&& mkdir "` echo /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493 `" && echo ansible-tmp-1632849052.1256409-29043-228627444858493="` echo /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493 `" ) && sleep 0'
Using module file /home/gerace/Documents/ansible/collections/ansible_collections/f5networks/f5_bigip/plugins/modules/bigip_as3_deploy.py
<192.168.0.15> PUT /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/tmp2jvhcwty TO /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493/AnsiballZ_bigip_as3_deploy.py
<192.168.0.15> EXEC /bin/sh -c 'chmod u+x /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493/ /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493/AnsiballZ_bigip_as3_deploy.py && sleep 0'
<192.168.0.15> EXEC /bin/sh -c '/usr/bin/python3 /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493/AnsiballZ_bigip_as3_deploy.py && sleep 0'
<192.168.0.15> EXEC /bin/sh -c 'rm -f -r /home/gerace/.ansible/tmp/ansible-local-29027r6d2qgl_/ansible-tmp-1632849052.1256409-29043-228627444858493/ > /dev/null 2>&1 && sleep 0'
ok: [bigtest01] => {
"changed": false,
"invocation": {
"module_args": {
"content": null,
"state": "absent",
"tenant": "all",
"timeout": 300
}
}
}
META: ran handlers
META: ran handlers
PLAY RECAP ****************************************************************************************************************************************************************************
bigtest01 : ok=1 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
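When the module keeps reporting `ok` with no change, the removal can be cross-checked directly against the AS3 REST endpoint; the AS3 reference documents a DELETE on /mgmt/shared/appsvcs/declare as removing all tenants. A minimal sketch that only constructs the request (the host is a placeholder and authentication is omitted):

```python
import urllib.request

# Placeholder management address; substitute the BIG-IP from the log above.
host = "192.0.2.10"

# AS3 removes all tenants when /declare is DELETEd with no tenant filter.
req = urllib.request.Request(
    url=f"https://{host}/mgmt/shared/appsvcs/declare",
    method="DELETE",
)
print(req.get_method(), req.full_url)
```

If a manual DELETE does remove the tenants while the module's `state: absent` / `tenant: all` does not, that isolates the problem to how the module builds its removal request.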
velos_partition_image
ansible [core 2.13.3]
config file = None
configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.10/site-packages/ansible
ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.10.5 (main, Jul 25 2022, 15:52:08) [GCC 11.2.1 20220219]
jinja version = 3.1.2
libyaml = False
velos: 1.4.0-4112
N/A
Unable to initiate file transfer; the following error is received:
fatal: [controller4]: FAILED! => {"changed": false, "msg": "{'ietf-restconf:errors': {'error': [{'error-type': 'application', 'error-tag': 'malformed-message', 'error-path': '/f5-utils-file-transfer:file/transfer-status', 'error-message': ' Only configs/ diags/ diags/core/ diags/crash/ diags/shared/ images/ images/import/ images/staging/ log/ log/confd/ log/controller/ log/host/ mibs/ configs/ diags/shared/ configs/ diags/shared/ images/import/iso/ images/import/os/ images/import/services/ images/staging/ paths are allowed for File transfer status operation.'}]}}"}
- name: Create Partition
connection: httpapi
hosts: "{{lookup('env','controller_inventory_group') or 'controllers_test'}}"
collections:
- f5networks.f5_bigip
any_errors_fatal: true
tasks:
- name: Verify partition image is on Velos controller
velos_partition_image:
image_name: "{{ image_name }}"
protocol: https
        remote_host: "{{ server_name }}"
remote_path: "{{ uri_to_file }}"
state: present
{
"f5-utils-file-transfer:output": {
"result": "File transfer is initiated.(images/import/iso/F5OS-C-1.4.0-4112.PARTITION.CANDIDATE.iso)"
}
}
fatal: [controller4]: FAILED! => {"changed": false, "msg": "{'ietf-restconf:errors': {'error': [{'error-type': 'application', 'error-tag': 'malformed-message', 'error-path': '/f5-utils-file-transfer:file/transfer-status', 'error-message': ' Only configs/ diags/ diags/core/ diags/crash/ diags/shared/ images/ images/import/ images/staging/ log/ log/confd/ log/controller/ log/host/ mibs/ configs/ diags/shared/ configs/ diags/shared/ images/import/iso/ images/import/os/ images/import/services/ images/staging/ paths are allowed for File transfer status operation.'}]}}"}
It looks like the required field 'local-file' is missing: https://clouddocs.f5.com/api/velos-api/F5OS-C-1.1.0-api.html#operation/data_f5_utils_file_transfer_file_import_post
While the accepted list for this field is noted in the error, the correct input for ISO file imports is images/import/iso/.
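Since the error message enumerates the destination prefixes the F5OS file API accepts, a pre-flight check on the remote path can catch the problem before a transfer is attempted. A sketch, with the prefix list copied from the error above:

```python
# Allowed destination prefixes, as listed in the F5OS error message above.
ALLOWED_PREFIXES = (
    "configs/", "diags/", "diags/core/", "diags/crash/", "diags/shared/",
    "images/", "images/import/", "images/import/iso/", "images/import/os/",
    "images/import/services/", "images/staging/", "log/", "log/confd/",
    "log/controller/", "log/host/", "mibs/",
)

def valid_transfer_path(path: str) -> bool:
    """True if the path starts with a prefix the file-transfer API accepts."""
    return path.startswith(ALLOWED_PREFIXES)

# ISO partition images must land under images/import/iso/.
print(valid_transfer_path(
    "images/import/iso/F5OS-C-1.4.0-4112.PARTITION.CANDIDATE.iso"))
```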