BIG-IQ
BIG-IQ REST - Is it possible to expandSubcollections=true
Hi, I am trying to get a list of all virtual servers, with all of their configured objects, from our BIG-IQ. A request to https:///mgmt/cm/adc-core/working-config/ltm/virtual gives me a list of all virtual servers that the BIG-IQ knows about, but several parts such as pools, VLANs etc. are only reference links. The BIG-IP LTM API has an expandSubcollections parameter that, if set to true, resolves such references and gives you the whole story. I tried https:///mgmt/cm/adc-core/working-config/ltm/virtual?$top=2&expandSubcollections=true, but I still only got reference links in the result instead of the resolved data. This does not seem to be possible on a BIG-IQ, right? In the end, all I want is a JSON representation of all the configured virtual servers (which number in the thousands). Querying the LTM itself is not an option.
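
A minimal sketch of one workaround, assuming the working-config API really does ignore expandSubcollections: page through the collection and follow each embedded reference link yourself. The login endpoint and X-F5-Auth-Token header are standard iControl REST; the $skip paging parameter and the layout of the reference fields ("link" values pointing at localhost) are assumptions to verify against your own output. $BIGIQ, $USER and $PASS are placeholders.

# Requires jq. Get an auth token first (standard iControl REST login).
TOKEN=$(curl -sk -H "Content-Type: application/json" \
  -d "{\"username\":\"$USER\",\"password\":\"$PASS\"}" \
  "https://$BIGIQ/mgmt/shared/authn/login" | jq -r '.token.token')

SKIP=0
while true; do
  # Fetch one page of virtual servers (50 at a time).
  PAGE=$(curl -sk -H "X-F5-Auth-Token: $TOKEN" \
    "https://$BIGIQ/mgmt/cm/adc-core/working-config/ltm/virtual?\$top=50&\$skip=$SKIP")
  COUNT=$(echo "$PAGE" | jq '.items | length')
  [ "$COUNT" -eq 0 ] && break
  # Pull every embedded reference link out of this page and fetch it;
  # the links point at localhost, so swap in the real BIG-IQ host.
  echo "$PAGE" | jq -r '.items[] | .. | .link? // empty' | sort -u | while read -r LINK; do
    curl -sk -H "X-F5-Auth-Token: $TOKEN" "${LINK/localhost/$BIGIQ}"
  done
  SKIP=$((SKIP + 50))
done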

BigIQ integration with Cisco ACS (TACACS+)
I'm working with Big-IQ Central Manager and would like to authenticate against our TACACS+ (Cisco ACS) and use the RBAC capabilities; however, the documentation is slim at best. I'm getting the error "User has no roles or groups associations." I'm trying to compare what we set on our LTMs, which authenticate using remote roles defined in ACS (below), to what I have on our BigIQ.

On our LTMs:
1. No users defined locally
2. Authentication - Remote - TACACS+
3. Remote Role Groups
   a. Group Name = TAC-Auth
   b. Line Order = 20 (relative to our environment)
   c. Attribute String = F5-LTM-User-Info-1=TAC-Auth
   d. Remote Access = Enabled
   e. Assigned Role = Other = %F5-LTM-User-Role
   f. Partition Access = Other = %F5-LTM-Partition
   g. Terminal Access = Other = %F5-LTM-User-Console

On ACS (only giving one example):
Shell Profiles
1. F5-Device-TACAuth-Admin
2. Custom Attributes
   a. F5-LTM-User-Info-1 = TAC-Auth
   b. F5-LTM-User-Console = enable
   c. F5-LTM-User-Role = Administrator
   d. F5-LTM-Partition = All

BigIQ:
1. Auth Providers
   a. Name = NA_ACS
   b. Type = TACACS+
2. User Groups
   a. F5_Admin
   c. Authorization Attributes
      F5-BigIQ-User-Info = F5_Admin
      %F5-BigIQ-User-Role = Administrator

ACS - Note: my understanding is that since BigIQ doesn't use partitions or the Terminal/Console role, these might not be needed.
2. Custom Attributes
   a. F5-LTM-User-Info-1 = F5_Admin
   b. F5-LTM-User-Role = Administrator

Thank you in advance for any insight!
/jeff
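
Not a fix, but a sketch that may help narrow this down, assuming the REST login endpoint on your BIG-IQ accepts loginProviderName set to the name of the configured auth provider (NA_ACS in your case). The credentials below are placeholders.

curl -sk -H "Content-Type: application/json" \
  -d '{"username":"jdoe","password":"secret","loginProviderName":"NA_ACS"}' \
  "https://$BIGIQ/mgmt/shared/authn/login"
# A token in the response means ACS accepted the credentials, so the remaining
# problem is the user-group / authorization-attribute match on BIG-IQ; an
# authentication error means the provider itself is misconfigured.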

BIG-IQ 8.3 - no BIG-IQ Central Management option
Trying to build a BIG-IQ v8.3 on Hyper-V, but I keep running into an issue: I can license the box using a trial licence (all appears to be working as expected), create the master keys and reset the password, but as soon as I get to the System Personality step, the option for BIG-IQ Central Management is not available. It only presents me with the option of BIG-IQ Data Collection Device. If I skip the licence at step 1, I also get the option to create a License Manager, but that's not really very useful either. 🤨 The guide I am following is the F5 one - the BIG-IQ build guide - and I have assigned the VM 32 GB RAM and 8 cores, after initially trying it with half those figures, which I thought might be the issue, but still no joy. I have deleted the VM and recreated it using a new copy of the VHD file - same problem - so I am at a bit of a loss as to what to try next. Any suggestions would be much appreciated.

F5 - AS3 - BIGIQ / BIGIP SchemaVersion Misunderstanding
Dear community,
I was wondering about the AS3 version currently used to deploy my AS3 declaration on my BIG-IP target through BIG-IQ. BIG-IQ should install this current AS3 version on the F5 BIG-IP target when deploying the AS3 declaration.

Checking on my BIG-IQ: 3.44.0

curl -sk -H "Content-Type: application/json" -H "X-F5-Auth-Token: $TOKEN" -X GET "https://$BIGIQ/mgmt/shared/appsvcs/info"
{"version":"3.44.0","release":"3","schemaCurrent":"3.44.0","schemaMinimum":"3.0.0"}

Checking on my F5 BIG-IP: v3.44.0

# pwd
/var/config/rest/iapps/f5-appsvcs
# cat version
3.44.0-3

My current AS3 declaration (I'm manually forcing schemaVersion) through BIG-IQ:

{
  "class": "AS3",
  "action": "patch",
  "schemaVersion": "3.44.0",
  "patchBody": [
    {
      "class": "ADC",
      "schemaVersion": "3.44.0",
      "target": { "address": "X.X.X.X" },
      "op": "add",
      "path": "/Automation/APP_TEST_1.2.12.140_446",
      "value": {
        "class": "Application",
        "remark": "REFERENCE : NULL_REFERENCE_20241109215237",
        "schemaOverlay": "AS3-F5-HTTPS-PASSTHROUGH-lb-template-big-iq",
        .... etc
}

Application deployment logs from my BIG-IQ - at the bottom: "schemaVersion": "3.12.0". I don't understand why it is using this older schemaVersion; it should use the current 3.44.0. Is there any policy on BIG-IQ that can enforce this weird behavior?

{
  "id": "autogen_a4c95a0f-13e3-4078-92c3-3a8e6ea6f10c",
  "class": "ADC",
  "controls": {
    "class": "Controls",
    "userAgent": "BIG-IQ/8.3 Configured by API"
  },
  "Automation": {
    "class": "Tenant",
    "APP_TEST_1.2.12.140_446": {
      "class": "Application",
      "remark": "REFERENCE : NULL_REFERENCE_20241109215237",
      "template": "tcp",
      "serviceMain": {
        "pool": "/Automation/APP_TEST_1.2.12.140_446/HTTPS_443_pool",
        "class": "Service_TCP",
        "enable": true,
        "profileTCP": { "use": "/Automation/APP_TEST_1.2.12.140_446/HTTPS_443_tcp_profile" },
        "virtualPort": 446,
        "virtualAddresses": [ "1.2.12.140" ],
        "persistenceMethods": [ "source-address" ],
        "profileAnalyticsTcp": { "use": "/Automation/APP_TEST_1.2.12.140_446/Analytics_TCP_Profile" }
      },
      "HTTPS_443_pool": {
        "class": "Pool",
        "members": [
          {
            "adminState": "enable",
            "shareNodes": true,
            "servicePort": 443,
            "serverAddresses": [ "1.2.12.13" ]
          }
        ],
        "monitors": [ { "use": "/Automation/APP_TEST_1.2.12.140_446/HTTPS_443_monitor" } ],
        "loadBalancingMode": "least-connections-member"
      },
      "HTTPS_443_monitor": {
        "send": "GET /\r\n",
        "class": "Monitor",
        "receive": "none",
        "targetPort": 443,
        "monitorType": "http",
        "adaptiveWindow": 180,
        "adaptiveLimitMilliseconds": 1000,
        "adaptiveDivergencePercentage": 100
      },
      "Analytics_TCP_Profile": {
        "class": "Analytics_TCP_Profile",
        "collectCity": false,
        "collectRegion": true,
        "collectCountry": true,
        "collectNexthop": false,
        "collectPostCode": false,
        "collectContinent": true,
        "collectRemoteHostIp": false,
        "collectedByClientSide": true,
        "collectedByServerSide": true,
        "collectRemoteHostSubnet": true
      },
      "HTTPS_443_tcp_profile": {
        "class": "TCP_Profile",
        "synMaxRetrans": 3,
        "finWaitTimeout": 5
      }
    }
  },
  "updateMode": "selective",
  "schemaVersion": "3.12.0"
}

Thanks in advance for your help!
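
Not an answer, but a small sketch that might help confirm what BIG-IQ actually stored, assuming its bundled AS3 exposes the same GET endpoints AS3 does on BIG-IP (/mgmt/shared/appsvcs/info and /mgmt/shared/appsvcs/declare) and that $TOKEN holds a valid auth token. The jq filter just pulls the schemaVersion out of each stored ADC declaration.

# AS3 engine version on the BIG-IQ itself:
curl -sk -H "X-F5-Auth-Token: $TOKEN" "https://$BIGIQ/mgmt/shared/appsvcs/info"
# schemaVersion recorded in each declaration BIG-IQ holds for its targets
# (assumption: GET on /declare is supported on BIG-IQ as it is on BIG-IP):
curl -sk -H "X-F5-Auth-Token: $TOKEN" "https://$BIGIQ/mgmt/shared/appsvcs/declare" \
  | jq '.. | objects | select(.class? == "ADC") | {target, schemaVersion}'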

Big-IQ 8.2 questions
Hello, I am tasked with implementing BIG-IQ 8.2, but the documentation isn't clear about some things. I would like to consolidate the management of a number of different F5 load balancer sets, but there are some constraints, and I lack the BIG-IQ terminology to know what I am looking for, so I will give an example. I have F5 pairs in Colorado, Virginia, Utah, and Oregon.
- Colorado and Virginia are failover sites for each other.
- Utah and Oregon are also failover sites for each other.
But the Colorado/Virginia sites and the Utah/Oregon sites are significantly different. Separating them out into silos seems wrong. How do I keep these managed centrally? Do I create BIG-IP clusters, or device groups - one for Colorado and Virginia, and the other for Utah and Oregon? I am just confused and don't know where to start: I have been digging into the documentation, but I think I need to be reset on some of the basics.
--jason

BIG-IQ DNS TPS Per Geo Location
Hi, I recently deployed a BIG-IQ to manage all my F5 LTM and DNS tenants, and I'm reviewing the information shown on the different BIG-IQ dashboards. On the DNS dashboard there is a section named TPS Per Geo Location. For some reason I'm just seeing the world map, but with no data. Does anyone know how to enable information on this map?
Regards,

Need help in automating BigIQ session summary reports
I have been asked to work out a way of automating the CSV report from the BIG-IQ Monitoring Access dashboard. Under Access > Sessions > Session Summary, I have been filtering Network_Access as the AP result and then manually exporting the CSV there. Our security team, which does not have a Splunk server, is asking for this every 24 hours. Therefore, I am looking to see if there is a way I can have a scheduled job run this for me. The only things I am finding are configuration automation or automation dealing with ASM. Any help would be greatly appreciated.
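
I can't point to a documented endpoint for that exact dashboard export, but if you capture the request the GUI makes when you click the CSV export (browser dev tools), the same call can usually be scripted and dropped into cron. A rough sketch, where REPORT_PATH is a hypothetical placeholder for whatever URI your export actually uses, and the host and credentials are placeholders too:

#!/bin/bash
# Nightly session-summary pull. REPORT_PATH is hypothetical -- replace it
# with the URI the GUI export calls in your BIG-IQ version. Requires jq.
BIGIQ="bigiq.example.com"
REPORT_PATH="/mgmt/REPLACE_WITH_EXPORT_URI"
TOKEN=$(curl -sk -H "Content-Type: application/json" \
  -d '{"username":"svc_report","password":"CHANGE_ME"}' \
  "https://$BIGIQ/mgmt/shared/authn/login" | jq -r '.token.token')
curl -sk -H "X-F5-Auth-Token: $TOKEN" "https://$BIGIQ$REPORT_PATH" \
  -o "/var/reports/session-summary-$(date +%F).csv"
# Example cron entry to run it every night at 01:00:
# 0 1 * * * /usr/local/bin/pull_session_summary.sh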

BIG-IQ Archive Error
Hello, we currently have a BIG-IQ that was successfully performing backups to the backup server until the server team implemented some stricter SSH timeouts. From what I gather, the archive is failing because the BIG-IQ opens the SCP connection before the backup finishes. Once the backup has finished, it then proceeds to pass the username, but the SSH session has timed out by that time. Does anyone know of a way to change the order in which the BIG-IQ does the backup, i.e. change it from "SSH -> run backup -> pass SSH credentials -> copy file to backup server" to "run backup -> SSH -> pass SSH credentials -> copy file to backup server"?
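
Not something the post confirms, but if the SCP connection really does sit unauthenticated until the archive completes, the OpenSSH server option that governs that pre-authentication window is LoginGraceTime (default 120 seconds). A hedged sketch of relaxing it on the backup server, assuming it runs OpenSSH under systemd and you have root there:

# On the backup server: allow an unauthenticated connection to sit open
# for up to 30 minutes (0 would mean no limit) before sshd drops it.
echo "LoginGraceTime 1800" >> /etc/ssh/sshd_config
sshd -t && systemctl reload sshd   # validate the config, then reload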