Forum Discussion
The Python SDK, or the iControl REST API on which the SDK is based, is not optimal for downloading large files.
This is because there is a 1 MB size limit on downloading or uploading a file. Any file larger than the limit must be chunked manually (split into small pieces and concatenated on the receiving end). See also Demystifying iControl REST Part 5: Transferring Files.
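A minimal sketch of that manual chunking, using only the standard library. The endpoint path, credentials, and the `Content-Range` request-header convention follow the pattern described in the Part 5 article; they are assumptions here, so adjust them for your device and file type, and verify TLS certificates outside a lab.

```python
# Hedged sketch: chunked file download over iControl REST.
# The endpoint path and header convention are illustrative assumptions.
import base64
import ssl
import urllib.request

CHUNK = 1024 * 1024  # stay at or under the 1 MB per-request limit

def chunk_ranges(total_size, chunk=CHUNK):
    """Yield inclusive (start, end) byte offsets covering total_size bytes."""
    start = 0
    while start < total_size:
        end = min(start + chunk, total_size) - 1
        yield start, end
        start = end + 1

def download(host, user, password, remote_name, local_path, total_size):
    """Fetch remote_name chunk by chunk and reassemble it at local_path."""
    uri = (f"https://{host}/mgmt/cm/autodeploy/"
           f"software-image-downloads/{remote_name}")
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    ctx = ssl.create_default_context()
    ctx.check_hostname = False           # lab only; verify certs in production
    ctx.verify_mode = ssl.CERT_NONE
    with open(local_path, "wb") as out:
        for start, end in chunk_ranges(total_size):
            req = urllib.request.Request(uri, headers={
                "Authorization": f"Basic {auth}",
                "Content-Range": f"{start}-{end}/{total_size}",
            })
            with urllib.request.urlopen(req, context=ctx) as resp:
                out.write(resp.read())
```

Each request asks for one byte range, so a large transfer is simply a loop of small GETs, which is exactly where the per-file call overhead comes from.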
The Python SDK does the chunking for you (0.5 MB per chunk). Because of the chunking, it makes numerous calls (including one to get the file size) for a single file retrieval. For example, for a 1 GB file it calls the iControl REST endpoint 1,910 times, which is not a cheap overhead. In my test, downloading a 1 GB file took 12 s with scp and 92 s with the Python SDK.
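The call count is easy to reproduce with quick arithmetic: at a 0.5 MiB chunk size, a 1 GB (10^9 byte) file needs about 1,908 chunked GETs, and the remaining metadata requests bring the total near 1,910.

```python
import math

CHUNK = 512 * 1024         # SDK chunk size: 0.5 MiB
FILE_SIZE = 1_000_000_000  # 1 GB file

data_calls = math.ceil(FILE_SIZE / CHUNK)
print(data_calls)  # 1908 chunk requests; metadata calls bring the total to ~1,910
```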
Another factor that may affect file downloading performance is system load. Since the Python SDK interacts with various components on the box, any resource contention (such as CPU and memory) slows the job down; I suspect that is behind the 20-40 minutes in your case. Avoiding busy periods would be a good option.
If you need speed and are not required to use iControl REST, scp (secure copy) is a good alternative for file downloading.
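For example, a single scp invocation replaces the whole chunking loop. The hostname, user, and UCS path below are illustrative assumptions, not values from this thread; this sketch just builds and runs the command.

```python
import subprocess

def scp_cmd(host, user, remote_path, local_path):
    """Build an scp argument list for pulling one file from the device."""
    return ["scp", f"{user}@{host}:{remote_path}", local_path]

# Example (not run here): fetch a UCS archive in one transfer.
# subprocess.run(
#     scp_cmd("bigip.example.com", "admin", "/var/local/ucs/backup.ucs", "./backup.ucs"),
#     check=True,
# )
```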
- asalamatov, Feb 27, 2020, Nimbostratus
Thanks for your detailed answer.
I noticed that downloading a UCS file with the SDK (https://f5-sdk.readthedocs.io/en/latest/userguide/file_transfers.html) takes a long time (almost an hour for a 270 MB file), but the f5-backup script (https://devcentral.f5.com/s/articles/f5-remote-backup-python-script-1163) is almost instant. Looking into the SDK code, I see it is based on the f5-backup script, but I can't figure out why the download speed varies so much.
I also created an issue on GitHub for this: https://github.com/F5Networks/f5-common-python/issues/1548. I'm just trying to figure out whether there is a bug in the SDK code.