Microsoft PowerShell with iControl
This component is not supported; we recommend reviewing Joel Newton's PowerShell Module for iControlREST code submission first.

From the desk of Joe Pruitt (July 29, 2013)

When we shipped DC4, we started looking at Windows PowerShell and how we could build some integration points with our products. The first pass was a set of PowerShell script files that we introduced in the PowerShell Labs section of DevCentral. No sooner had we posted them than the requests started pouring in asking when we would provide native PowerShell CmdLets in addition to the function scripts. Well, I spent a little bit of time working some out and whipped up a good first rough draft. I've been holding on to these for a while now, but figured they would do better out in the wild than trapped in a folder on my laptop. So, last night I posted an installer for the first release of the iControl CmdLets for PowerShell. Here's a step-by-step guide to getting up and running with the new bits:

1. Download and install PowerShell from Microsoft.
2. Go to the PowerShell Labs page on DevCentral and select the "Download Now" link. This will download the CmdLet installer.
3. Run the iControlSnapInSetup.msi installer. This will install the SnapIn into the c:\program files\F5 Networks\iControlSnapIn directory.
4. Start PowerShell from the Windows Start menu.
5. Cd to the c:\program files\F5 Networks\iControlSnapIn directory.
6. Dot source the setup script (only once after the install):
   PS > . .\setupSnapIn.ps1
7. Load the SnapIn into the runtime:
   PS > Add-PSSnapIn iControlSnapIn
8. Initialize the iControl connection with the Initialize-F5.iControl CmdLet:
   PS > Initialize-F5.iControl -Hostname bigip_address -Credentials (Get-Credential)
9. Run the Get-F5.iControlCommands CmdLet to list all the available CmdLets:
   PS > Get-F5.iControlCommands
10. Try out some of the CmdLets:
   PS > Get-F5.LTMPool

Notes From the Legacy Download:

Comment made 08-Jun-2016 by Patrik Jonsson: Needed to add .NET 2.0 in Add/Remove Windows Features. Then it worked in Windows 10. The installation script should be changed to throw an error if the installutil file does not exist instead of quitting silently.

Comment made 22-Jun-2016 by Ken B: The problem I had getting this working was that I had to right-click the downloaded .zip file, select Properties, and click the "Unblock" button on the General tab. Then I had to copy the files from the .zip to a folder under c:\Program Files\f5\icontrol. Then I ran PowerShell as Administrator, ran .\setupSnapIn.ps1, and was *finally* able to run the "Add-PSSnapIn iControlSnapIn" command to get things going.

Comment made 17-Jan-2017 by Joel Newton: You can update the InstallPSSnapin.ps1 script to reference the .NET v4 install utility. Just replace the reference in setupSnapin.ps1 from $env:windir\Microsoft.Net\Framework${platform}\v2.0.50727\installUtil.exe to $env:windir\Microsoft.Net\Framework${platform}\v4.0.30319\installUtil.exe. I don't believe there are any plans to replace the snapin with a module. My recommendation would be to use the REST API if possible.

Comment made 05-Jul-2017 by Patrik Jonsson: You can also use Joel's module: PowerShell Module for the F5 LTM REST API.

Downloads:
v11.00.00 Released August 10, 2013
v11.04.01 Released December 02, 2013
v11.05.00 Released February 18, 2014
v11.06.00 Released August 28, 2014
v12.01.00 Released May 09, 2016
v13.00.00 Released March 21, 2017
v13.01.00 Released November 11, 2017

Introducing PoshTweet - The PowerShell Twitter Script Library
It's probably no surprise to those of you who follow my blog and tech tips here on DevCentral that I'm a fan of Windows PowerShell. I've written a set of CmdLets that allow you to manage and control your BIG-IP application delivery controllers from within PowerShell, and a whole set of articles around those CmdLets. I've been a Twitter user for a few years now, and over the holidays I noticed that Jeffrey Snover from the PowerShell team has hopped aboard the Twitter bandwagon, and that got me to thinking... since I live so much of my time in the PowerShell command prompt, wouldn't it be great to be able to tweet from there too? Of course it would!

HTTP Requests

So, last night I went ahead and whipped up a first draft of a set of PowerShell functions that allow access to the Twitter services. I implemented the functions based on Twitter's REST-based methods, so all that was really needed to get things going was to implement the HTTP GET and POST requests needed for the different API methods. Here's what I came up with.
function Execute-HTTPGetCommand() {
    param([string] $url = $null);

    if ( $url ) {
        [System.Net.WebClient]$webClient = New-Object System.Net.WebClient;
        $webClient.Credentials = Get-TwitterCredentials;
        [System.IO.Stream]$stream = $webClient.OpenRead($url);
        [System.IO.StreamReader]$sr = New-Object System.IO.StreamReader -argumentList $stream;
        [string]$results = $sr.ReadToEnd();
        $results;
    }
}

function Execute-HTTPPostCommand() {
    param([string] $url = $null, [string] $data = $null);

    if ( $url -and $data ) {
        [System.Net.WebRequest]$webRequest = [System.Net.WebRequest]::Create($url);
        $webRequest.Credentials = Get-TwitterCredentials;
        $webRequest.PreAuthenticate = $true;
        $webRequest.ContentType = "application/x-www-form-urlencoded";
        $webRequest.Method = "POST";
        $webRequest.Headers.Add("X-Twitter-Client", "PoshTweet");
        $webRequest.Headers.Add("X-Twitter-Version", "1.0");
        $webRequest.Headers.Add("X-Twitter-URL", "http://devcentral.f5.com/s/poshtweet");

        [byte[]]$bytes = [System.Text.Encoding]::UTF8.GetBytes($data);
        $webRequest.ContentLength = $bytes.Length;
        [System.IO.Stream]$reqStream = $webRequest.GetRequestStream();
        $reqStream.Write($bytes, 0, $bytes.Length);
        $reqStream.Flush();

        [System.Net.WebResponse]$resp = $webRequest.GetResponse();
        $rs = $resp.GetResponseStream();
        [System.IO.StreamReader]$sr = New-Object System.IO.StreamReader -argumentList $rs;
        [string]$results = $sr.ReadToEnd();
        $results;
    }
}

Credentials

Once those were completed, it was relatively simple to get the Status methods for public_timeline, friends_timeline, user_timeline, show, update, replies, and destroy going. But for several of those services, user credentials were required. I opted to store them in a script-scoped variable and provided a few functions to get/set the username/password for Twitter.
$script:g_creds = $null;

function Set-TwitterCredentials() {
    param([string]$user = $null, [string]$pass = $null);
    if ( $user -and $pass ) {
        $script:g_creds = New-Object System.Net.NetworkCredential -argumentList ($user, $pass);
    } else {
        $creds = Get-TwitterCredentials;
    }
}

function Get-TwitterCredentials() {
    if ( $null -eq $g_creds ) {
        trap {
            Write-Error "ERROR: You must enter your Twitter credentials for PoshTweet to work!";
            continue;
        }
        $c = Get-Credential;
        if ( $c ) {
            $user = $c.GetNetworkCredential().Username;
            $pass = $c.GetNetworkCredential().Password;
            $script:g_creds = New-Object System.Net.NetworkCredential -argumentList ($user, $pass);
        }
    }
    $script:g_creds;
}

The Status functions

Now that the credentials were out of the way, it was time to tackle the Status methods. These methods are a combination of HTTP GETs and POSTs that return an array of status entries. For those interested in the raw underlying XML that's returned, I've included the $raw parameter; when set to $true, it skips the user-friendly display and dumps the full XML response. This is handy if you want to customize the output beyond what I've done.
#----------------------------------------------------------------------------
# public_timeline
#----------------------------------------------------------------------------
function Get-TwitterPublicTimeline() {
    param([bool]$raw = $false);
    $results = Execute-HTTPGetCommand "http://twitter.com/statuses/public_timeline.xml";
    Process-TwitterStatus $results $raw;
}

#----------------------------------------------------------------------------
# friends_timeline
#----------------------------------------------------------------------------
function Get-TwitterFriendsTimeline() {
    param([bool]$raw = $false);
    $results = Execute-HTTPGetCommand "http://twitter.com/statuses/friends_timeline.xml";
    Process-TwitterStatus $results $raw;
}

#----------------------------------------------------------------------------
# user_timeline
#----------------------------------------------------------------------------
function Get-TwitterUserTimeline() {
    param([string]$username = $null, [bool]$raw = $false);
    if ( $username ) {
        $username = "/$username";
    }
    $results = Execute-HTTPGetCommand "http://twitter.com/statuses/user_timeline$username.xml";
    Process-TwitterStatus $results $raw;
}

#----------------------------------------------------------------------------
# show
#----------------------------------------------------------------------------
function Get-TwitterStatus() {
    param([string]$id, [bool]$raw = $false);
    if ( $id ) {
        # The URL must be built as a single string; an unparenthesized
        # "url" + $id + ".xml" would be passed as three arguments.
        $results = Execute-HTTPGetCommand "http://twitter.com/statuses/show/$id.xml";
        Process-TwitterStatus $results $raw;
    }
}

#----------------------------------------------------------------------------
# update
#----------------------------------------------------------------------------
function Set-TwitterStatus() {
    param([string]$status, [bool]$raw = $false);
    $encstatus = [System.Web.HttpUtility]::UrlEncode("$status");
    $results = Execute-HTTPPostCommand "http://twitter.com/statuses/update.xml" "status=$encstatus";
    Process-TwitterStatus $results $raw;
}
#----------------------------------------------------------------------------
# replies
#----------------------------------------------------------------------------
function Get-TwitterReplies() {
    param([bool]$raw = $false);
    $results = Execute-HTTPGetCommand "http://twitter.com/statuses/replies.xml";
    Process-TwitterStatus $results $raw;
}

#----------------------------------------------------------------------------
# destroy
#----------------------------------------------------------------------------
function Destroy-TwitterStatus() {
    param([string]$id = $null);
    if ( $id ) {
        # Arguments are space-separated; a comma here would pass an array
        # as the first argument instead of two separate arguments.
        Execute-HTTPPostCommand "http://twitter.com/statuses/destroy/$id.xml" "id=$id";
    }
}

You may notice the Process-TwitterStatus function. Since there was a lot of duplicate code in each of these functions, I went ahead and implemented it in its own function below:

function Process-TwitterStatus() {
    param([string]$sxml = $null, [bool]$raw = $false);
    if ( $sxml ) {
        if ( $raw ) {
            $sxml;
        } else {
            [xml]$xml = $sxml;
            if ( $xml.statuses.status ) {
                $stats = $xml.statuses.status;
            } elseif ( $xml.status ) {
                $stats = $xml.status;
            }
            $stats | Foreach-Object -process {
                $info = "by " + $_.user.screen_name + ", " + $_.created_at;
                if ( $_.source ) {
                    $info = $info + " via " + $_.source;
                }
                if ( $_.in_reply_to_screen_name ) {
                    $info = $info + " in reply to " + $_.in_reply_to_screen_name;
                }
                "-------------------------";
                $_.text;
                $info;
            };
            "-------------------------";
        }
    }
}

A few hurdles

Nothing goes without a hitch, and I found myself pounding my head over why my POST commands were all getting HTTP 417 errors back from Twitter. A quick search brought up this post on Phil Haack's website as well as this Google Group discussing an update to Twitter's services in how they react to the Expect 100 HTTP header. A simple setting in the ServicePointManager at the top of the script was all that was needed to get things working again.
[System.Net.ServicePointManager]::Expect100Continue = $false;

PoshTweet in Action

So, now it's time to try it out. First you'll need to dot source the script and then set your Twitter credentials. This can be done in your $profile file if you wish. Then you can access all of the included functions. Below, I'll call Set-TwitterStatus to update my current status, and then Get-TwitterUserTimeline and Get-TwitterFriendsTimeline to get my current timeline as well as that of my friends.

PS> . .\PoshTweet.ps1
PS> Set-TwitterCredentials
PS> Set-TwitterStatus "Hacking away with PoshTweet"
PS> Get-TwitterUserTimeline
-------------------------
Hacking away with PoshTweet
by joepruitt, Tue Dec 30, 12:33:04 +0000 2008 via web
-------------------------
PS> Get-TwitterFriendsTimeline
-------------------------
@astrout Yay, thanks!
by mediaphyter, Tue Dec 30 20:37:15 +0000 2008 via web in reply to astrout
-------------------------
RT @robconery: Headed to a Portland Nerd Dinner tonite - should be fun! http://bit.ly/EUFC
by shanselman, Tue Dec 30 20:37:07 +0000 2008 via TweetDeck
-------------------------
...

Things Left Todo

As I said, this was implemented in an hour or so last night, so it definitely needs some more work, but I believe I've got the Status methods pretty much covered. Next I'll move on to the other services (User, Direct Message, Friendship, Account, Favorite, Notification, Block, and Help) when I've got time. I'd also like to add support for the "source" field. I'll need to set up a public-facing landing page for this library so the folks at Twitter will add it to their system. Once I get all the services implemented, I'll move forward with formalizing this as an application and submit it for consideration.

Collaboration

I've posted the source to this set of functions on the DevCentral wiki under PsTwitterApi. You'll need to create an account to get to it, but I promise it will be worth it! Feel free to contribute and add to it if you have the time.
Everyone is welcome and encouraged to tear my code apart, optimize it, and enhance it. Just as long as it gets better in the process. B-)

Unix To PowerShell - Wc
PowerShell is definitely gaining momentum in the Windows scripting world, but I still hear folks wanting to rely on Unix-based tools to get their job done. In this series of posts I'm going to look at converting some of the more popular Unix-based tools to PowerShell.

wc

The Unix "wc" (word count) command prints the character, word, and newline counts for each file specified, plus a totals line if more than one file is specified. This command is useful for quickly scanning a directory for small and large files, or for quickly looking at a file and determining its relative size. The Get-Content CmdLet will return the number of characters in the file, but not the number of lines or words. The following script will emulate the behavior of the Unix "wc" command, with a few changes in the way parameters are supplied.
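The original script isn't reproduced in this archive, so here is a rough sketch of what such an emulation might look like. The function and parameter names are my own guesses, not the original script's:

```powershell
# Rough wc-style emulation. Function and parameter names are illustrative;
# the original DevCentral script may differ.
function Get-WcCounts {
    param([string[]]$filespec)

    $totalLines = 0; $totalWords = 0; $totalChars = 0;
    foreach ($file in $filespec) {
        $text  = Get-Content -Raw $file;
        # wc counts newline characters, not visual lines
        $lines = ([regex]::Matches($text, "`n")).Count;
        $words = @($text -split '\s+' | Where-Object { $_ }).Count;
        $chars = $text.Length;
        "{0,8} {1,8} {2,8} {3}" -f $lines, $words, $chars, $file;
        $totalLines += $lines; $totalWords += $words; $totalChars += $chars;
    }
    if ($filespec.Count -gt 1) {
        # print a totals line, just like Unix wc does
        "{0,8} {1,8} {2,8} total" -f $totalLines, $totalWords, $totalChars;
    }
}
```

Running `Get-WcCounts -filespec (Get-ChildItem *.txt).FullName` prints one line per file plus a totals row when multiple files are given.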
Unix To PowerShell – Cut

PowerShell is definitely gaining momentum in the Windows scripting world, but I still hear folks wanting to rely on Unix-based tools to get their job done. In this series of posts I'm going to look at converting some of the more popular Unix-based tools to PowerShell.

cut

The Unix "cut" command is used to extract sections from each line of input. Extraction of line segments can be done by bytes, characters, or fields separated by a delimiter. A range must be provided in each case, consisting of one of N, N-M, N- (N to the end of the line), or -M (beginning of the line to M), where N and M are counted from 1 (there is no zeroth value). For PowerShell, I've omitted support for bytes, but the rest of the features are included.

The Parse-Range function is used to parse the above range specification. It takes a range specifier as input and returns an array of the indices that the range contains. Then, the In-Range function is used to determine if a given index is included in the parsed range.

The real work is done in the Do-Cut function. In there, input error conditions are checked. Then, for each file supplied, lines are extracted and processed with the given input specifiers. For character ranges, each character is processed, and if its index in the line is in the given range, it is appended to the output line. For field ranges, the line is split into tokens using the delimiter specifier (the default is a TAB). Each field is processed, and if its index is in the included range, the field is appended to the output with the given output_delimiter specifier (which defaults to the input delimiter).

The options of the Unix cut command map to the following PowerShell arguments:

Unix                 PowerShell          Description
FILE                 -filespec           The files to process.
-c                   -characters         Output only this range of characters.
-f                   -fields             Output only the fields specified by the given range.
-d                   -delimiter          Use DELIM instead of TAB as the input field delimiter.
-s                   -only_delimited     Do not print lines not containing delimiters.
--output-delimiter   -output_delimiter   Use STRING as the output delimiter.
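Since the original source isn't included in this archive, here is one way the range grammar described above could be parsed. The Parse-Range name comes from the article; the body is my own sketch:

```powershell
# Sketch of a parser for cut-style range specs like "1,3-5,7-,-2".
# $Max caps open-ended ranges ("N-"); the original may handle this differently.
function Parse-Range {
    param([string]$Spec, [int]$Max = 100)

    $indices = @()
    foreach ($part in $Spec -split ',') {
        switch -Regex ($part) {
            '^(\d+)-(\d+)$' { $indices += ([int]$Matches[1])..([int]$Matches[2]); break }  # N-M
            '^(\d+)-$'      { $indices += ([int]$Matches[1])..$Max; break }                # N- (to end)
            '^-(\d+)$'      { $indices += 1..([int]$Matches[1]); break }                   # -M (from start)
            '^(\d+)$'       { $indices += [int]$Matches[1]; break }                        # N
        }
    }
    $indices | Sort-Object -Unique
}
```

With the parsed array in hand, an In-Range check reduces to a simple containment test, e.g. `$range -contains $i`.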
GoogleCL Puts Another Tool in the Devops Integration and Automation Toolbox

Google's latest offering is a hint of things to come and indicates a recognition of devops as a real discipline.

Interestingly enough, devops comprises two disciplines: development and operations. The former traditionally solves problems and addresses challenges through development, through coding, through a programmatic solution. The latter, operations, is more administratively focused, and its solutions to the same issues and challenges will also be programmatic, just on a different level: that of scripting. There is no right or wrong answer to this one; in fact, the concept of devops is about bridging the gap that exists between development and operations, and doing so in a way that enables IT to be more agile and able to extract benefits from emerging data center models like virtualization and cloud computing. Thus, it should not be a surprise to view the introduction of a CLI interface to Google as the means by which devops is enabled with another integration and automation option.

Ever wanted to upload a folder full of photos to Picasa from a command prompt? We did, a lot, last summer. It made us want to say:

$ google picasa create --title "My album" ~/Photos/vacation/*.jpg

So we wrote a program to do that, and a whole lot more. GoogleCL is a command-line utility that provides access to various Google services. It streamlines tasks such as posting to a Blogger blog, adding events to Calendar, or editing documents on Google Docs. For example:

$ google blogger post --blog "My blog" --tags "python, googlecl, development" my_post.html
$ google calendar add "Lunch with Jason tomorrow at noon"
$ google docs edit --title "Shopping list" --editor vim

GoogleCL is a pure Python application that uses the Python gdata libraries to make Google Data API calls from the command line. (from "Introducing Google Command Line Tool")

At first glance, GoogleCL is probably not a big deal.
After all, there already existed an API that developers could leverage to take advantage of Google data. But consider that with most APIs, even though they may be REST-based and therefore familiar and even usable to ops-focused admins, the lingua franca of the REST world falls squarely into what most would define as "developer-oriented" territory: JSON and/or XML. While admins are certainly familiar with HTTP and the ways of the Web, they may not be so comfortable with JSON and/or XML. Exposing an API through a scripting language that ops is familiar with, like Python (or TCL, or PERL), is a definite step toward enabling the emerging devops paradigm to flourish and grow. The short example offered by Google in the aforementioned blog provides a glimpse of how documents and tools deployed within Google's environment might be leveraged and scripted from within the enterprise in support of more integrated, streamlined operations.

Imagine, for example, that one of the applications you support wants to leverage one of Google's existing services. Let's say you're going to use a Blogger account as a mirror "just in case" your own blog site gets overloaded because, well, every once in a while you write a really good blog and your existing capacity isn't quite enough to handle the load along with all the other applications being serviced. So you're going to mirror that blog on Blogger if the traffic gets above a threshold. You don't have to actually upload it right away. While in the case of Blogger it's not incurring any costs to mirror the blog as a matter of operational policy, in other cases it might be, and there may be business reasons why you don't want it mirrored unless you're forced to because of capacity constraints. So you don't mirror unless it's necessary. But then it becomes necessary. How do you rapidly mirror the blog to the external service and simultaneously ensure that it can be leveraged as another resource for requests for that blog?
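One way such a mirroring script might look, written here in PowerShell since that's the theme of this archive. The blog title, file name, and connection threshold are all hypothetical; only the `google blogger post` syntax comes from the GoogleCL example quoted above:

```powershell
# Hypothetical mirror-on-demand helper. The threshold, blog title, and
# post file are placeholders, not part of GoogleCL itself.
function Get-MirrorCommand {
    param(
        [int]$CurrentConnections,
        [int]$Threshold = 500,
        [string]$BlogTitle = "My blog",
        [string]$PostFile = "hot_post.html"
    )
    # Only mirror when local capacity is under pressure.
    if ($CurrentConnections -lt $Threshold) { return $null }

    # GoogleCL's blogger syntax, as shown in the quoted example.
    return "google blogger post --blog `"$BlogTitle`" --tags `"mirror`" $PostFile"
}

$cmd = Get-MirrorCommand -CurrentConnections 750
# if ($cmd) { Invoke-Expression $cmd }   # uncomment where GoogleCL is installed
```

The decision logic and the CLI invocation are kept separate so the threshold check can be driven by whatever monitoring data your environment exposes.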
Operationally this becomes simple: execute a script that posts a copy of the blog via GoogleCL to Blogger, and then notify your traffic management device (load balancer, depending on your architecture) that there is a new resource that should be leveraged to handle the load, which should also be available via either an API or a CLI scripting language. Or perhaps it's as simple as enabling a network-side script that redirects to the Blogger-hosted mirror when the connection count on your internal instances is over X, or the network bandwidth being used is over Y, or whatever it is that triggers the need. This method keeps control with the organization but leverages off-premise services in a form of instant cloud-bursting. Instant (on-demand) capacity with a simple CLI script.

F5 Friday: I Found the Infrastructure Beef
Ask and ye shall receive: F5 joins Microsoft's Dynamic Data Center Alliance to bring network automation to a Systems Center Operations Manager near you.

You may recall that last year Microsoft hopped into Infrastructure 2.0 with its Dynamic Datacenter Toolkit (DDTK), with the intention of providing a framework through which data center infrastructure could be easily automated and processes orchestrated as a means to leverage auto-scaling and faster, easier provisioning of virtualized (and in some cases non-virtualized) resources. You may also recall a recent F5 Friday post on F5's Management Pack capabilities regarding monitoring and automatic provisioning based on myriad application-centric statistics. You might have been thinking that if only the two were more tightly integrated, you really could start to execute on that datacenter automation strategy. Good news! The infrastructure beef is finally here.

WHERE DID YOU FIND IT?

I found the infrastructure beef hanging around, waiting to be used. It's been around for a while, hiding under the moniker of "standards-based API" and "SDK", not just here at F5 but at other industry players' sites. The control plane, the API, the SDK, whatever you want to call it, is what enables the kind of integration and collaboration necessary to implement a dynamic infrastructure such as is found in cloud computing and virtualization. But that dynamism requires management and monitoring across compute, network, and storage services, and that means a centralized, extensible management framework into which infrastructure services can be "plugged". That's what F5 is doing with Microsoft. Recently, Microsoft announced that it would open its Dynamic Datacenter Alliance to technology partners, including the company's compute, network, and storage partners. F5 is the first ADN (Application Delivery Network) partner in this alliance.
Through this alliance and other partnership efforts, F5 plans to further collaborate with Microsoft on solutions that promote the companies' shared vision of dynamic IT infrastructure. Microsoft envisions a dynamic datacenter that maximizes the efficiency of IT resource allocation to meet demand for services. In this vision, software and services are delivered through physical or virtualized computing, network, and storage solutions unified under the control of an end-to-end management system. That management system monitors the health of the IT environment as well as the software applications, and constantly reallocates resources as needed. To achieve such a holistic view of the datacenter, solutions must be integrated and collaborative, a la Infrastructure 2.0. The automated decisions made by such a management solution are only as good as the data provided by the managed components. Microsoft's Dynamic Datacenter Toolkit (DDTK) is a dynamic datacenter framework that gives organizations and providers the means not only to automate virtualized resource provisioning but also to manage compute, network, and storage resources. F5 now supports comprehensive integration with Microsoft System Center, Virtual Machine Manager, Windows Hyper-V, and more. Both physical (BIG-IP Local Traffic Manager) and virtual (BIG-IP Local Traffic Manager Virtual Edition) deployment options are fully supported through this integration. The integration with DDTK also provides the management system with the actionable data required to act upon and enforce application scalability and performance policies as determined by administrator-specified thresholds and requirements.
Some of the things you can now do through SCOM include:

- Discover BIG-IP devices
- Automatically translate System Center health states for all managed objects
- Generate alerts and configure thresholds based on any of the shared metrics
- Use the PowerShell API to configure managed BIG-IPs
- Generate and customize reports such as bandwidth utilization and device health
- Automatically synchronize object-level configuration at the device level (across BIG-IP instances/devices)
- Monitor real-time statistics
- Manage both BIG-IP Global Traffic Manager and Local Traffic Manager
- Deepen just-in-time diagnostics through iRules integration by triggering actions in System Center
- Migrate live virtual machines
- Monitor network spikes in the virtual machine monitor console, eliminating bottlenecks before they happen

There's a lot more, but like PowerPoint, it's never a good idea to overuse bullet points. A more detailed and thorough discussion of the integration can be found in the F5 and Microsoft Solution Guide [PDF].

Unix To PowerShell - Tail
PowerShell is definitely gaining momentum in the Windows scripting world, but I still hear folks wanting to rely on Unix-based tools to get their job done. In this series of posts I'm going to look at converting some of the more popular Unix-based tools to PowerShell.

tail

The Unix "tail" command is used to display the last 10 lines of each FILE on standard output. With more than one file, each is preceded by a header giving the file name. There is also a mode where it prints out the last "n" bytes in a file. And for those that want to monitor changes to a file, there is the "follow" option, where tail will monitor the file and print out any additions as they are made. I've implemented these three options, with the follow option working only in line mode. The script could be made to work in byte mode as well, but I'll leave that to the reader to implement if you really want it. The Unix parameters map to the following in my PowerShell script:

Unix   PowerShell   Description
-c     -num_bytes   Output the last N bytes.
-n     -num_lines   Output the last N lines (default 10).
-f     -follow      Output appended data as the file grows.
-s     -sleep       With "-f", sleep for N seconds between iterations (default 1).
-q     -quiet       Never output headers giving file names.

The code loops through the specified files. In "num lines" mode, it gets the contents of the file into a string array and prints out the last "n" lines, with the default being 10. If the "-follow" switch was given, it sits in a loop, waiting the specified number of seconds before rechecking the file, and if any modifications have been made, it prints them to the console. This is repeated indefinitely until the script is interrupted. In byte mode, the content is loaded into a string and the last "n" characters (up to the size of the file) are displayed to the console.
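The script itself isn't reproduced in this archive, but the line-mode behavior described above can be sketched roughly like this (function and parameter names are mine, not the original's):

```powershell
# Bare-bones tail emulation in line mode; names are illustrative.
function Get-Tail {
    param(
        [string]$Path,
        [int]$NumLines = 10,
        [switch]$Follow,
        [int]$Sleep = 1
    )
    $lines = @(Get-Content $Path)
    $lines | Select-Object -Last $NumLines

    if ($Follow) {
        $count = $lines.Count
        while ($true) {
            Start-Sleep -Seconds $Sleep
            $lines = @(Get-Content $Path)
            if ($lines.Count -gt $count) {
                $lines | Select-Object -Skip $count   # print only the new lines
                $count = $lines.Count
            }
        }
    }
}
```

Worth noting: PowerShell 3.0 and later make most of this unnecessary, since `Get-Content -Tail 10 -Wait file.log` covers both the last-N-lines and follow cases natively.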
Listen To Twitter With The PoshMouth PowerShell Service!

A few weeks ago I was scanning through my daily PowerTips from PowerShell.com and came across one that covered Using COM Objects to Say Hi. In this tip, they describe how to use the Microsoft Speech API (SAPI) to convert text to speech played through the Windows audio system. Of course, I started tinkering around to see how SAPI sounded with various text strings. Once the novelty of that wore off, the next thing I thought of was how I could extend my PowerShell Twitter library PoshTweet with the SAPI library. I quickly built a little script that called the PoshTweet Get-TwitterFriendsTimeline function and passed the status values to this call, and it just worked! While that was a great proof of concept, it's not very practical. I quickly wrote up the following set of requirements for how I would "like" this little project to come out:

1. It should work as a centralized service.
2. The audio should be persistent.
3. It should be a controlled system. Running off the public timeline would likely kill my system.

So I bring to you: PoshMouth!

Centralized Service

True to my PowerShell roots, I opted to build the service in PowerShell. The main body of the script is as follows:
The main body of the script is as follows: Set-TwitterCredentials -user $TwitterUser -pass $TwitterPass # Start Application Loop While($true) { # Query the current users rate limit status [xml]$rls = Get-TwitterRateLimitStatus -raw $true; $remaining_hits = $rls.hash.{remaining-hits}.get_InnerText(); [DateTime]$reset_time = $rls.hash.{reset-time}.get_InnerText(); Write-DebugMessage "------------------------------------------------------"; Write-DebugMessage "Remaining Hits: $remaining_hits..."; Write-DebugMessage "Reset Time : $reset_time"; Write-DebugMessage "------------------------------------------------------"; if ( $remaining_hits -gt 0 ) { # First pull up all friends timelines [xml]$x = Get-TwitterFriendsTimeline -raw $true -since_id (Get-LastStatusId); Process-StatusEntries $x; # Sleep until next poll Write-DebugMessage "Sleeping for $DELAY seconds..."; Start-Sleep -Seconds $DELAY; } else { # Determine how many seconds needed $ts = $reset_time - [DateTime]::Now; $wait_time = [int]$ts.TotalSeconds + 1; # Sleep until new rate limit is allocated Write-DebugMessage "Waiting $wait_time seconds for new Rate Limit..."; Start-Sleep -Seconds $wait_time; } } Basically the API Rate Limit is queried to determine if calls can be made. If so, then Get-TwitterFriendsTimeline function is called to request the current timeline. Then each entry is processed with the following code: function Process-StatusEntries() { param($status = $null); if ( $status ) { $status.statuses.status | ForEach-Object -Process { # Ignore tweets from this account. if ( ! $_.user.screen_name.ToLower().Equals($TwitterUser) ) { # If audio already exists, we'll assume it's already been processed. if ( ! 
(Audio-Exists $_.id) ) { Write-DebugMessage "Processing Tweet # $($_.id)..."; $mp3file = Create-AudioFile $_; $url = Publish-AudioFile $mp3file; Tweet-Notification -status $_ -mp3file $url; Set-LastStatusId -id $_.id; } } } } } I had to check the screen_name to make sure it didn't match the twitter account this was running under or an infinite loop could occur where I convert PoshMouth's tweets of tweets of tweets... If the audio already exists in the local repository, then the tweet is skipped. Otherwise, a MP3 is created and published to a web server. Finally an @reply tweet is made to the original posters username with a link to the target mp3 file. Persistent Audio As you noticed above, to make the audio persistent, I needed to save the audio stream to a file on the system. The Speech API allows for saving to an uncompressed .WAV file via the SAPI.SpFileStream object. Since .WAV's are so passe, I figured I'd throw a little LAME action in there and convert that .WAV to a more appropriate .MP3 file. The following code does the dirty work. 
function Convert-TextToWAV() {
    param([string]$text = $null, [string]$wavfile = $null);
    if ( $text -and $wavfile ) {
        $SSMCreateForWrite = 3;
        $DoEvents = 0;
        $SVSFlagsAsync = 1;

        # remove file if it exists
        #Remove-Item -ErrorAction SilentlyContinue $wavfile;

        $spFs = New-Object -ComObject SAPI.SpFileStream;
        $spFs.Open($wavfile, $SSMCreateForWrite, $DoEvents);

        $voice = New-Object -ComObject SAPI.SpVoice;
        $voice.AudioOutputStream = $spFs;
        # Note: the original used the misspelled $SVFlagsAsync here, which
        # silently passed $null as the flags argument.
        $voice.Speak($text, $SVSFlagsAsync);

        $spFs.Close();
    }
}

function Convert-WAVToMP3() {
    param([string]$wavfile = $null, [string]$mp3file = $null);
    if ( $wavfile -and $mp3file ) {
        & $LAME_EXE -S $wavfile $mp3file;
    }
}

function Convert-TextToMP3() {
    param([string]$text = $null, [string]$mp3file = $null);
    if ( $text -and $mp3file ) {
        $wavfile = Get-TempFile "foo.wav";
        Convert-TextToWAV $text $wavfile;
        Convert-WAVToMP3 $wavfile $mp3file;
        Remove-Item $wavfile;
    }
}

function Create-AudioFile() {
    param($status = $null);
    if ( $status ) {
        $created_at  = $status.created_at;
        $id          = $status.id;
        $text        = $status.text;
        $user_id     = $status.user.id;
        $user_name   = $status.user.name;
        $screen_name = $status.user.screen_name;

        $audio_file = Get-AudioFile $id;
        [void](Convert-TextToMP3 $text $audio_file);
    }
    return $audio_file;
}

The Create-AudioFile function takes the status output from Twitter and converts the text to an .MP3 on the local system with the SAPI SpFileStream object, along with the publicly available LAME audio library.

Controlled System

The process for converting a tweet to MP3 and uploading it to our media server takes about 3-5 seconds. If I allowed everyone that followed the Twitter account, this could grow way beyond the ability of my little script to handle it. I'd need to build some sort of asynchronous setup to allow for parallel encoding and uploading. This, in addition to the fact that within 5 minutes of creating the account I already had 3 followers trying to sell me on weight-loss products, meant I knew this wasn't an option.
To limit this, I decided to set it up to only create audio files for accounts that my account was following. This way I could easily control for whom the audio is getting created.

Putting It All Together

So how can you start getting your tweets automagically audioized? Here's the process:

1. Start following the @TweetMouth account on Twitter.
2. Sit back and wait for me to follow you.
3. Watch for @replies from @TweetMouth.
4. Click on the tinyurl link and you can hear your tweet in all its Windows Sound System glory!

Getting the Bits

PoshMouthBot.zip - A zip containing the latest PoshMouthBot.ps1 and PoshTweet.ps1 scripts. If you decide to build and run your own version of PoshMouthBot, you'll need to use your own Twitter account and have access to a distribution server for the MP3 files. Most of this is controlled from global variables at the top of the PoshMouthBot.ps1 script. Enjoy and happy Tweeting! -Joe

A Geek's Guide To Motivating Your Employees - PowerShell Style
I'm sure you've heard stories about folks who work day after day receiving insults from their bosses. You know, in some people's minds, insulting others is a way to keep them "in line". If the common folk get too much encouragement, they might start thinking that they aren't replaceable. I remember reading that it's been shown that positive motivation is no way to provide a healthy work environment for your employees, but unfortunately in today's world there just isn't enough training for managers to spread this knowledge around. So, if you find yourself continuously being praised for your great work by your higher-ups, this is a disservice to your organization. Training managers to properly insult you will take time and cost a lot of money. I've got a solution for you that will cost you nada and will give you all the self-loathing that it takes to bring your company into financial success. There are already several insult generators out there on the Internet, but that requires the user to actively go out and look for that insult. Automatic emails don't work well either, as that requires that the employee actually reads his/her email. What you really need is a background process that runs on all of your employees' computers that will randomly let them know what you think of them. Enter the Sling-Insults PowerShell script.

Sling-Insults.ps1

$rand = New-Object System.Random;
$voice = New-Object -ComObject SAPI.SpVoice;
$insults = @(
    "Hey! Get back to work!",
    "Who do you think you are anyway!",
    "Get off of that stinkin InterWeb",
    "I'll Twitter YOU mister!",
    "Facebook is for dorks, so you must be one!",
    "What's that funny orange icon you keep looking at?",
    "blogs, splogs. Do something productive you idiot!",
    "I don't care how many followers you have"
);
While ($true) {
    Start-Sleep ($rand.next(60, 300));
    [void]$voice.Speak($insults[$rand.next(0, $insults.Count)]);
}

Just run this baby in the background on your employees' machines, and every 1-5 minutes one of the random quotes will be spoken to them, allowing you to sit back and relax with the knowledge that your workers are being kept in their places. Now, I understand that there are those who would argue with the benefits of constantly demoralizing workers. For those of you who are into this kind of backwards thinking, I've got a solution for you as well. With quotes taken from the self-help show "Daily Affirmation With Stuart Smalley", I've got the Give-Praises script.

Give-Praises.ps1

$rand = New-Object System.Random;
$voice = New-Object -ComObject SAPI.SpVoice;
$praises = @(
    "I'm good enough, I'm smart enough, and doggone it, people like me.",
    "I am a worthy human being.",
    "Denial ain't just a river in Egypt!",
    "That's just stinkin' thinkin!",
    "You're should-ing all over yourself.",
    "Hello little child, I'm going to protect you and no one will ever hurt you again.",
    "...and that's...okay.",
    "Whining is anger through a small opening.",
    "It's easier to put on slippers than to carpet the whole world."
);
While ($true) {
    Start-Sleep ($rand.next(60, 300));
    [void]$voice.Speak($praises[$rand.next(0, $praises.Count)]);
}

So, there you have it. Whether you are into insulting or encouraging your employees, this blog post has something for you. I'd love to hear feedback on which of the two scripts provides better results. Disclaimer: For those that didn't get the humor in this post, it in no way represents the opinion of my employer F5 Networks - nor myself for that matter. Just having a little fun!

DevCentral Top5 01/09/2009
Welcome to the first DCTop5 of 2009! It's a new year, and that means you survived the holidays, the in-laws, the traffic jams, the elections, and, if you live in Seattle, record-breaking snowstorms. Congratulations, it's time to do it all over. There's been so much great stuff going on in DevCentral Land in the past 12 months that it's been hard for even me to keep up at times. So when I tell you that there's even more to come, and that we're turning up the dial further still this year, believe me when I say that it's going to be staggering. We've got a killer new team-member, an invigorated crew, and an endless sea of bits and bytes to conquer ahead of us. I'm excited. You should be excited. It's a whole new year, and it's all happening...trust me. Rest assured, dear reader, that you have nothing to fear. The DC Top5 will again serve as your faithful guide to the land of DevCentral awesomeness in the coming year, just as it was in the previous one. That said, the first step in this year's Top5 journey is to pick out what's been going on in the past week (okay, two weeks) that you definitely need to see. As usual, I've picked my favorite gems to pass along, so here's hoping that you like them as much as I do. I give you the year's inaugural Top5:

Investigating the LTM TCP Profile: Delayed & Selective Acknowledgements
https://devcentral.f5.com/s/articles/investigating-the-ltm-tcp-profile-acknowledgements

In his no-nonsense, down-and-dirty technical series detailing the different TCP profile options the LTM offers, Jason continues by looking into D-SACK, ACK on Push and more. This series is absolutely fantastic if you want to really dig into what you can and can't do with your TCP profile, why you might want to, and what you can expect by fiddling with those bits you were curious about. Whether you're a network architect looking to optimize things or you're just curious about what can be done, this one's definitely worth a read.
The geek content is even relatively high, which makes me feel right at home. That shouldn't stop any of you semi- or non-geeks, though; Jason's articles are definitely approachable by all, while deep enough to satisfy the inner propeller-head in all of us.

Introducing PoshTweet - The PowerShell Twitter Script Library
https://devcentral.f5.com/s/articles/introducing-poshtweet-the-powershell-twitter-script-library

If you aren't convinced by now that Joe is a force to be reckoned with in the PowerShell world, let me attempt to convince you once more. While we all know he's a guru of iControl, Joe continues to display his PowerShell fu in this awesome example of using PowerShell as a Twitter client, in response to Jeffrey Snover's boarding of the Twitter train. This isn't just a display of Joe's skills, though. This is a darn cool example of PowerShell and some of the cool things that you can do with APIs, social media, and a little ingenuity (and apparently free time while snowed in over the holidays).

20 Lines or Less #19
https://devcentral.f5.com/s/articles/20-lines-or-less-19

Continuing my quest for killer iRules that fit in the palm of your hand, 20 Lines or Less rolls on. This edition brings a couple of very cool iRules, including one email submission for which I'm quite grateful. More are always welcome (hint, hint). After battling with a slight moral dilemma, I even decided to post the example of WAN simulation via iRule, along with the appropriate warning that it's NOT meant for production. It's a CPU killer and a delay inducer, but then again, that's kind of the point of something written to...inject artificial delays, isn't it? It also happens to be a darn cool example of iRules and what they can do, even if it's not the original intent of the language. Take a look and handle with care. Phew, we made it! That's one down for 2009, and many more to come.
Thanks, as always, for reading, and please don't hesitate to submit/send in any feedback or suggestions you might have. #Colin