Perl Ltm Config To Xml/Excel (version 3.1)
Problem this snippet solves: This code is an improvement of "Perl Ltm Config To Xml (version 2)", published on the previous DevCentral (https://devcentral.f5.com/s/articles/perl-ltm-config-to-xml-version-2-1009). It queries the components of a virtual server's configuration and prints them out in XML format. It recurses through partitions and prints the following data:
- VS_name
- VS_IP: IP and port
- RD (route domain)
- VS_state: status (e.g. green) and state (e.g. enabled)
- Profiles: name, type, context. For SSL profiles, additional info: certificate, chain, key, cipher list
- Persistence
- iRules: order and name
- Pool_name
- LB-method: method, minimum active members
- Member: name, IP, port, ratio, priority, status and state
- Monitors: name, interval, timeout. For HTTP/HTTPS monitors, additional info: Send and Receive strings
- SNAT_name
- SNAT_members
Version 3.1 adds a new option, "-s" or "-simplified", which combines multi-valued items such as the profile list or the pool members into a single XML field. The purpose of simplified mode is to get one virtual server per row in Excel, with lists shown in a single cell. This code has been tested on BIG-IP versions 12.1 and 14.1.
How to use this snippet:
Default mode: ./LTM-to-XML.pl bigip uid pwd > result.xml
Example for simplified mode and removal of partition information: ./LTM-to-XML.pl -s 127.0.0.1 uid pwd | sed "s/\/.*app\///g" | sed "s/\/Common\///g" > result.xml
The resulting XML can be dragged and dropped onto MS Excel and opened as an XML table.
Code: (see ZIP file)
Tested this on version: 12.1, 14.1
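For reference, here is a rough, hypothetical sketch of what a single virtual server might look like in simplified mode, with list-type values collapsed into single fields so that each <VirtualServer> maps to one Excel row. The element names and sample values below are illustrative assumptions only; the authoritative output format is defined by the script in the attached ZIP file.

```xml
<!-- Hypothetical simplified-mode record; tag names and values are illustrative only -->
<VirtualServers>
  <VirtualServer>
    <VS_name>/Common/vs_app_https</VS_name>
    <VS_IP>10.10.10.10:443</VS_IP>
    <RD>0</RD>
    <VS_state>green / enabled</VS_state>
    <Profiles>tcp; http; clientssl (cert app.crt, chain ca-bundle.crt, key app.key)</Profiles>
    <Persistence>cookie</Persistence>
    <iRules>1: irule_redirect; 2: irule_logging</iRules>
    <Pool_name>/Common/pool_app</Pool_name>
    <LB_method>round-robin, min active members 1</LB_method>
    <Members>node1 10.10.20.11:8080 ratio 1 prio 0 up/enabled; node2 10.10.20.12:8080 ratio 1 prio 0 up/enabled</Members>
    <Monitors>http, interval 5, timeout 16</Monitors>
    <SNAT_name>automap</SNAT_name>
  </VirtualServer>
</VirtualServers>
```

In the default (non-simplified) mode, each profile, member, and monitor would presumably appear as its own nested element instead of being concatenated into one field, which is why the "-s" option is the one that maps cleanly to a one-row-per-virtual-server Excel table.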
The BIG-IP Application Security Manager Part 5: XML Security
This is the fifth article in a 10-part series on the BIG-IP Application Security Manager (ASM). The first four articles in this series are:
- What is the BIG-IP ASM?
- Policy Building
- The Importance of File Types, Parameters, and URLs
- Attack Signatures
This fifth article in the series will discuss the basic concepts of XML and how the BIG-IP ASM provides security for XML.
XML Concepts
The Extensible Markup Language (XML) provides a common syntax for data transfer between systems. XML doesn't specify how to display data (HTML is used for that); rather, it is concerned with describing data that can be manipulated and presented using other languages. XML documents are built on a core set of basic nested structures, and developers can decide how tags are named and organized. XML is used extensively in web applications today, so it's important to have a basic understanding of it as well as a strong defense for this critical technology.
The XML specification (described in this W3C publication) defines an XML document to be well-formed when it satisfies a list of syntax rules provided in the specification. If an XML processor encounters a violation of these rules, it is required to stop processing the file and report the error. A valid XML document is defined as a well-formed document that also conforms to the rules of a schema like the Document Type Definition (DTD) or the newer and more powerful XML Schema Definition (XSD). It's important to have valid XML documents when implementing and using web services.
Web Service
A web service is any service that is available over a network and that uses standardized XML syntaxes. You've heard of the "...as a Service" offerings, right? Well, this is the stuff we're talking about, and XML plays a big role. On a somewhat tangential note, it seems like there are too many "as a Service" acronyms flying around right now...I really need to make up a hilarious one just for the heck of it. I'll let you know how that goes...
Anyway, back to reality...a web service architecture consists of a service provider, a service requestor, and a service registry. The service provider implements the service and publishes the service to the service registry using Universal Description, Discovery, and Integration (UDDI), which is an XML-based registry that allows users to register and locate web service applications. The service registry centralizes the services published by the service provider. The service requestor finds the service using UDDI and retrieves the Web Services Definition Language (WSDL) file, which consists of an XML-based interface used for describing the functionality offered by the web service. The service requestor is able to consume the service based on all the goodness found in the WSDL using the UDDI. Then, the service requestor can send messages to the service provider using a service transport like the Simple Object Access Protocol (SOAP). SOAP is a protocol specification for exchanging structured information when implementing web services...it relies on XML for its message format. Now you can see why XML is so closely tied to Web Services. All this craziness is shown in the diagram below. I know what you're thinking...it's difficult to find anything more exciting than this topic! (Picture copied from Wikipedia)
Because XML is used for data transfer in the web services architecture, it's important to inspect, validate, and protect XML transactions.
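To make the well-formed versus valid distinction (and the SOAP transport) a bit more concrete, here is a minimal sketch. The document, schema fragment, and envelope below are generic examples invented for illustration; they are not taken from any particular web service.

```xml
<!-- 1) A well-formed document: a single root element, every tag closed and properly nested -->
<order id="1001">
  <customer>Jane Doe</customer>
  <item sku="ABC-123" qty="2"/>
</order>

<!-- 2) An XSD fragment the document could be validated against; a "valid" document
     is one that is well-formed AND conforms to its schema (XSD, or the older DTD) -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="order">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="customer" type="xs:string"/>
        <xs:element name="item">
          <xs:complexType>
            <xs:attribute name="sku" type="xs:string"/>
            <xs:attribute name="qty" type="xs:positiveInteger"/>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
      <xs:attribute name="id" type="xs:positiveInteger"/>
    </xs:complexType>
  </xs:element>
</xs:schema>

<!-- 3) A minimal SOAP 1.1 envelope a service requestor might send, carrying the order as its payload -->
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <order id="1001">
      <customer>Jane Doe</customer>
      <item sku="ABC-123" qty="2"/>
    </order>
  </soap:Body>
</soap:Envelope>
```

The schema and WSDL enforcement performed by the ASM's XML profile, discussed next, is exactly this kind of validation, applied to traffic before it ever reaches the web service.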
Fortunately, the BIG-IP ASM can protect several applications including:
- Web services that use HTTP as a transport layer for XML data
- Web services that use encryption and decryption in HTTP requests
- Web services that require verification and signing using digital signatures
- Web applications that use XML for client-server data communications (e.g. Microsoft Outlook Web Access)
ASM Configuration
Before you can begin protecting your XML content, you have to create a security policy using the "XML and Web Services" option. After you create the security policy, you create an XML profile and associate it with the XML security policy. You can read more about creating policies in the Policy Building article in this series. To create an XML profile, you navigate to Application Security >> Content Profiles >> XML Profiles. When all this is done, the XML profile will protect XML applications in the following ways:
- Validate XML formatting
- Mask sensitive data
- Enforce compliance with XML schema files or WSDL documents
- Provide information leakage protection
- Offer XML encryption and XML signatures
- Offer XML content-based routing and XML switching
- Offer XML parser protection against DoS attacks
- Encrypt and decrypt parts of SOAP web services
Validation resources provide the ASM with critical information about the XML data or web services application that the XML profile is protecting. As discussed earlier, many XML applications have a schema file for validation (e.g. DTD or XSD) or a WSDL file that describes the language used to communicate with remote users. The XML profile is used to validate whether the incoming traffic complies with the predefined schemas or WSDL files. The following screenshot shows the configuration of the XML profile in the ASM. Notice all the different features it provides. You can download the all-important configuration files (WSDL), you can associate attack signatures with the profile (this protects against things like XML parser attacks -- XML Bombs or External Entity Attacks; see the sketch at the end of this article), you can allow/disallow meta characters, and you can configure sensitive data protection for a specific namespace and a specific element or attribute. Another really cool thing is that most of these features are turned on/off using simple checkboxes. This is really cool and powerful stuff! I won't bore you with all the details of each setting, but suffice it to say, this thing lets you do tons of great things in order to protect your XML data. Well, that does it for this ASM article. I hope this sheds some light on how to protect your XML data. And, if you're one of the users who implements anything "as a Service," make sure you protect all that data by turning on the BIG-IP ASM. The next time someone throws an XML bomb your way, you'll be glad you did!
Update: Now that the article series is complete, I wanted to share the links to each article. If I add any more in the future, I'll update this list.
- What is the BIG-IP ASM?
- Policy Building
- The Importance of File Types, Parameters, and URLs
- Attack Signatures
- XML Security
- IP Address Intelligence and Whitelisting
- Geolocation
- Data Guard
- Username and Session Awareness Tracking
- Event Logging
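As referenced above, here is a sketch of the classic entity-expansion "XML bomb" (often called "billion laughs") that the XML profile's parser protections and attack signatures are meant to catch. It is a generic textbook example, shown only to illustrate why parser protection matters; don't feed it to a parser that doesn't limit entity expansion.

```xml
<?xml version="1.0"?>
<!-- Each entity expands to ten copies of the previous one, so a request of a few hundred
     bytes balloons into an enormous in-memory document when the parser expands &lol4;
     (real attacks nest nine or ten levels deep). An External Entity (XXE) attack abuses
     the same DTD mechanism with SYSTEM entities that pull in local files or internal URLs. -->
<!DOCTYPE bomb [
  <!ENTITY lol  "lol">
  <!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
  <!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
  <!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
  <!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
]>
<bomb>&lol4;</bomb>
```

This is the sort of request the XML profile's "parser protection against DoS attacks" option and the associated attack signatures are there to stop before a back-end parser ever tries to expand it.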
Useful IT. Bringing Health Record Transfer into the 21st Century.
I read the Life as a Healthcare CIO blog on occasion, mostly because, as a former radiographer, health care records integration and other non-diagnostic IT use in healthcare is a passing interest of mine. At the last hospital I worked at, the systems didn't communicate – not even close, as in there was no effort to make them do so. This intrigues me, because since entering IT I have watched technology uptake in healthcare slowly ramp up on a curve well behind the rest of the business world. Oh, make no mistake, technology has been in overdrive on the equipment used, but things like systems interoperability and utilizing technology to make doctors', nurses', and techs' lives easier have just been slower to arrive in the medical world.
A huge chunk of the resistance is grounded in a very common-sense philosophy: "When people's lives are on the line you do not rush willy-nilly to the newest gadget." No one in healthcare says it that way – at least not to my knowledge – but that's the essence of what they think. I can think of a few businesses that could use that same mentality applied occasionally with a slightly different twist: "When the company's viability is on the line…" but that's a different blog.
Even with this very common-sense resistance, there has been a steady acceleration of technology uptake for things like patient records and prescriptions. It has been interesting to watch as someone on the outside with plenty of experience of the way hospitals worked when their systems were all silos. Healthcare IT is to be commended for things like electronic prescription pads and instant transfer of (now nearly all electronic) X-Rays to those who need them to care for the patient. Applying the "this can help with little impact on critical care" or even "this can help with positive impact on critical care and little risk of negative impact" viewpoint as a counter to the above-noted resistance has produced some astounding results.
A friend of mine from my radiographer days is manager of a Cardiac Cath Lab, and talking with him is just fun. "Dude, ninety percent of the pups coming out of Radiology schools can't set an exposure!" is evidence that diagnostic tools are continuing to take advantage of technology – in this case auto-detecting X-Ray exposure limits. He has more glowing things to say about the non-diagnostic growth of technology within any given organization. But outside the organization? Well, that's a completely different story.
The healthcare organization wants to keep your records safe and intact, and rarely even wants to let you touch them. That's mostly about the "intact" bit. Some people might want their records to not contain some portion – like their blood alcohol level when brought to the ER – and some people might inadvertently lose some portion of the record. While they're more than happy to send them on a referral, and willing to give you a copy if you're seeking a second opinion, these records all have one archaic quality. Paper.
If I want to buy a movie, I can go to Netflix, sign up, and stream it (at least many of them) to watch. If I want my medical records transferred to a specialist so I can get treatment before my left eye oozes out of its socket, they have to be copied, verified, and mailed. If they're short, or my eye is on the verge of falling out right this instant, then they might be faxed. But the bulk of records are mailed. Even overnight is another day lost in the treatment cycle.
Recently – the last couple of years – there has been a movement to replicate the records delivery process electronically. As time goes on, more and more of your medical records are being stored digitally. It saves room and time, and it makes it easier for a doctor to "request" your record should he need it in a hurry. It also makes it easier to track accidental or even intentional changes in records. While it doesn't happen as often as fear-mongers and ambulance chasers want you to believe, of course there are deletions and misplacements among the medical records of the 300 million US citizens. An electronic system never forgets, so while something as simple as a piece of paper falling out of a record could forever change it, in electronic form that can't happen. Even an intentional deletion can be "deleted" (as in it no longer shows up) yet still be stored with your other information, so that changes can be checked should the need ever arise.
The inevitable offshoot of electronic records is the ability to communicate them between hospitals. If you're in the ER in Tulsa and your normal doctor is in Manhattan, getting your records quickly and accurately could save your life. So it made sense that as the percentage of new records that were electronic grew, someone would start to put together a way to communicate them. No doubt you're familiar with the debate about national health information databases: a centralized location for records is a big screaming target from many people's perspective, while to others it is a potentially life-saving technological advancement (they're both right, but I think the infosec crowd has the stronger argument). But a smart group of people put together a project to facilitate doing electronically exactly what is being done today physically. Today, the process is that the patient (or another doctor) requests the records be sent, the records are pulled out, copied, mailed or faxed, and then a follow-up or "record received" communication occurs to ensure that the source doctor got your records where they belong. Electronically this equates to the same thing, but instead of "pulled out" you get "looked up", and instead of "mailed or faxed" you get "sent electronically". There's a lot more to it, but that's the gist of The Direct Project.
There are several reasons I got sucked into reading about this project. From a former healthcare worker's perspective, it's very cool to see non-diagnostic technology making a positive difference in healthcare; from a patient perspective, I would like the transfer of records to be as streamlined as possible; from the InfoSec perspective (I did a couple of brief stints in InfoSec), I like that it is not a massive database, but rather a "faster transit" mechanism; and from an F5 perspective, the possibilities for our gear to help make this viable were in my mind while reading. While Dr. Halamka has a lot of interesting stuff on his blog, this is one topic where I followed the links and read up on the details. It's a pretty cool initiative, and what may seem like very limiting scope assumptions holds true to the Direct Project's idea of replacing the transfer mechanism, not creating a centralized database. While they're not specifying formats to use during said transfer, they do list some recommended reading on that topic. What they do have is a registry of people who can receive records, and a system for transferring data over the wire.
They worry about DNS-style health-care provider lookups, transfer protocols, and encryption, which is certainly a large enough chunk for them to bite off, and then they show how they fit into the larger nation-wide healthcare electronic records efforts going on. I hope they get it right and that the system they're helping to build results in near-instantaneous, secure records transfers, but many inventions are a product of the time and society in which they live, and even if The Direct Project fails, something like it will eventually succeed. If you're in Healthcare IT, this is certainly a way to add value to the organization, and worth checking out. Meanwhile, I'm going to continue to delve into their work and the work of other organizations they've linked to and see if there isn't a way F5 can help. After all, we can compress, dedupe, and encrypt communications on the wire, and the entire system is about on-the-wire communications, so it seems like a perfectly logical route to explore. Though the patient care guy in me will be reading up as much as the IT guy, because healthcare was a very rewarding field that seriously needed a bit more non-diagnostic technology when I was doing it.