Forum Discussion

Filip_Verlaeckt
Historic F5 Account
Mar 31, 2011

Automated secure file transfer

Hello

I am trying to solve the following problem:

What options are there for organising file transfers initiated from a remote client to an internal server behind LTM?

Requirements are as follows:

- file must be encrypted in transit (SCP, FTPS, SFTP, ...)

- user (or script) must authenticate

- destination file server must be selected by LTM based on information in the upload command (e.g. filename or destination host/directory)

I'm not sure how to read the filename or directory name with LTM, e.g. from the command below.

pscp -pw xxxxx file.doc user@ftp_vip_on_ltm:/dir/file.doc

Any help much appreciated.

  • Hamish
    Cirrocumulus
    Ahh... Offloaded scp?

    Hmm... I've never done any offloaded SCP before. You'd have to interpret the SCP protocol yourself, then re-encrypt... I don't think it'll be easy, but it's a worthy challenge nonetheless.

    SFTP is just SSH as well, so you'll probably have the same problems.

    Because they both use port 22, you'd also need to detect non-SCP/SFTP traffic and disable processing for it (otherwise you're just slowing down interactive SSH access for no good reason).

    FTPS probably has more possibilities. It's just FTP over SSL/TLS, so you can decrypt it, and the FTP protocol is well understood. However, you then have to contend with data connections as well as the command connection, so it may actually become more complex than the single-stream SCP option.
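
    For what it's worth, the client side of an FTPS upload is easy enough to script. A rough, untested sketch with curl (reusing the ftp_vip_on_ltm name from the original post; the user, password and directory are placeholders):

    # explicit FTPS (AUTH TLS) upload; -u supplies credentials, -T uploads the file
    curl --ftp-ssl -u someuser:somepass -T file.doc ftp://ftp_vip_on_ltm/dir/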

    H
  • You can forget about SCP or SFTP. There's no real way to offload those on LTM without writing something to specifically handle the protocol, and that's by no means an easy task. SSH (the transport for SCP/SFTP) does not use SSL/TLS -- it negotiates its own secure channel using Diffie-Hellman and a suite of stream ciphers. It may still be _possible_ to do this with an iRule, as the prerequisites are there: AES-128, gzip compression, and HMAC-MD5 are all available in LTM iRules (although the supplied AES is CBC mode and probably not directly usable in a stream protocol).

    Hamish is correct: FTPS may be a better option here since it uses SSL/TLS and can be offloaded -- that which can be offloaded can be selectively load-balanced. But I find myself wondering...

    Why not just use HTTPS? Your requirements are:

    - file must be encrypted in transit (SCP, FTPS, SFTP, ...)

    - user (or script) must authenticate

    - destination file server must be selected by LTM based on information in the upload command (e.g. filename or destination host/directory)

    ... all of which can be accomplished through LTM SSL Offload (item 1), LTM client authentication (item 2), and LTM iRule pool selection (item 3). And you can use things like wget or curl to handle files from batch scripts. It may not work reliably for very _large_ file transfers, but for the bulk of corporate data file moving it's pretty reliable (for me at least). AMEX, for example, uses a scenario very much like this to handle corporate card data upload/download - an authenticated SSL web site with "drop directories" that tie to specific backend servers for specific customers.
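
    From the client side, the pscp command in the original post would turn into something along these lines (a rough, untested sketch -- the certificate/key file names and the /customerA/ path are made up for illustration, and https_vip_on_ltm stands in for the HTTPS virtual server):

    # upload over HTTPS, authenticating with an SSL client certificate;
    # an iRule on the virtual server can pick the pool from the /customerA/ path prefix
    curl --cert client.pem --key client-key.pem -T file.doc https://https_vip_on_ltm/customerA/file.doc

    Plain -u user:password over HTTPS works too if client certificates are more than you need.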
  • Colin_Walker_12
    Historic F5 Account
    Good call, Joel -- I was thinking the exact same thing when I saw this request the first time. I wandered off to knock out a couple of other things and came back to find you'd already answered the question. I wonder if it always works that way. ;)

    This would be exponentially easier with HTTPS.

    Colin
  • The main disadvantage of HTTPS for file uploads/downloads is that it doesn't (easily?) support resumption of broken transfers. FTP/S offloading could potentially be done in an iRule, but when I tested it, it was really hard to get working, and trying to support multiple client types blew it up.

    Aaron
  • Colin_Walker_12
    Historic F5 Account
    Agreed, this is definitely a pros vs. cons case and there may be a requirement that doesn't allow for HTTPS transfers. If they're doable, though, they're definitely easier to implement.

    Colin
  • HTTP/1.1 supports Content-Range on PUT requests. Whether the uploading program supports it is client-specific... but it's possible to do resumable uploads over HTTP using this.

    I've never had cause to use it, but curl (http://curl.haxx.se) supports PUT resumes over HTTP/HTTPS. It does this by first asking the server if the file exists and what size it is (HEAD /filename.ext HTTP/1.1), comparing the local file, and doing PUT with the correct Content-Range offset. This is "curl -C - -T " according to the man page. Curl is, of course, available for pretty much every platform.
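
    For illustration, the whole thing is a one-liner (untested sketch; the file name and URL are made up):

    # -C - tells curl to work out the resume offset from what's already on the server
    curl -C - -T file.doc https://https_vip_on_ltm/dir/file.doc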

    I know it's no ideal solution to require a specific client in order to support resumes, but it'll work if a connecting client needs it.

    Update: Confirmed -- it absolutely does PUT uploads with Content-Range:

    
    PUT /blah%2Etxt HTTP/1.1
    Content-Range: bytes -1-/2352
    User-Agent: curl/7.15.5 (i686-redhat-linux-gnu) libcurl/7.15.5 OpenSSL/0.9.8b zlib/1.2.3 libidn/0.6.5
    Host: localhost:8800 
    Accept: */* 
    Content-Length: 2352 
    Expect: 100-continue 

    I've got no test environment handy to beat on this, but I'd be willing to bet that with an HTTP/1.1 server and Curl, you've got a resumable upload if you want it.

    Update 2: There's no server support in IIS (any version) for the Content-Range header in WebDAV (which is where IIS keeps its support for HTTP PUT). To add it, you'd need a third-party WebDAV module that supports resumable uploads, like the one from ITHit: http://www.webdavsystem.com/server . For UNIX/Linux, there's support in the mod_dav Apache module.

    I tested the IIS way before I tested the Apache way -- bottom line is, the file I transferred partially was maintained on the server side and I was able to automatically resume using Curl.

    So: to do a resumable upload using HTTPS, you'll need to install and configure a server that supports WebDAV/PUT uploads with Content-Range, and you'll need to instruct your users to use a WebDAV or CLI tool that can upload in resumable form (the ITHit WebDAV client or Curl). Hope this helps!
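
    In practice the round trip looks something like this (sketch only -- the host, path and file names are made up, and /dav/ is assumed to be a WebDAV-enabled location, e.g. an Apache mod_dav directory with "Dav On"):

    # initial upload; if it gets interrupted, the partial file stays on the server
    curl -T bigfile.zip https://https_vip_on_ltm/dav/bigfile.zip
    # resume: curl HEADs the partial file, then PUTs the remainder with a Content-Range offset
    curl -C - -T bigfile.zip https://https_vip_on_ltm/dav/bigfile.zip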