Atlassian Cloud / CLOUD-5975

Large backup downloads time out, causing incomplete files.

      Large backup downloads (e.g. those created by the backup manager) through the WebDAV interface usually stop at around 1 GB.

      Workarounds

      • Download using a download manager, e.g. Firefox's DownThemAll extension.
      • Use the Bash script shared in the comments below.
      • Use the 'wget' command as suggested below; a combined sketch follows this list.
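
      For reference, a minimal sketch combining infinite retries (-t 0) with resuming a partial download (-c). The hostname and filename are placeholders, and --ask-password prompts for the password interactively rather than putting it on the command line:

      $ wget -c -t 0 --user="MYUSER" --ask-password https://example.atlassian.net/webdav/backupmanager/JIRA-backup-20130509.zip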


            Nuwan Ginige (Inactive) added a comment -

            Hi everyone,

            As of December 2016, Atlassian removed WebDAV support from both the Jira and Confluence Cloud applications. As such, I'm closing out this bug, as it is no longer relevant to our current cloud capabilities.

            Please see here for further details.

            Regards,
            Nuwan Ginige
            Atlassian Product Management

            Mustafa ShahanShah added a comment -

            I tried wget, curl, and the DownThemAll Firefox extension. The file downloads, but when I unzip it, it is corrupted and does not extract.
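
            One way to check whether a downloaded archive is truncated or corrupted before trying to extract it is the standard 'unzip' integrity test (a sketch; the filename is a placeholder):

            $ unzip -t JIRA-backup-20130201.zip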

            Michael Knight added a comment -

            Thanks for the tip, Pete – that works for me too. Here's an example close to what I used:

            $ wget -t 0 --user="MYUSER" --password="MYPASS" https://example.atlassian.net/webdav/backupmanager/Application-backup-20130509.zip

            It will automatically retry (and resume) the download if it gets interrupted. This looks like the simplest option we've seen so far, assuming your system has wget on it.
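
            If in doubt whether a download actually completed, one way to check is to compare the local file size against the server's Content-Length header (a sketch; the credentials and URL are the placeholders from the example above):

            $ curl -sI -u "MYUSER:MYPASS" https://example.atlassian.net/webdav/backupmanager/Application-backup-20130509.zip | grep -i '^Content-Length:'
            $ wc -c Application-backup-20130509.zip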

            Peter Drier added a comment -

            wget -t 100 https://user:pass@example.jira.com/webdav/svn.dump.gz worked for me.

            There's also a -c option to continue if you already have a partial file.

            MS added a comment - edited

            I tried Michael's second solution, but it also stopped after 1 GB:

            curl -k -u "xxxxx:xxxxx" https://xxxxx.jira.com/webdav/backupmanager/JIRA-backup-20130201.zip -O --retry 999 --retry-max-time 0 -C -

              % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                             Dload  Upload   Total   Spent    Left  Speed
             25 4093M   25 1050M    0     0  2102k      0  0:33:14  0:08:31  0:24:43     0
            curl: (18) transfer closed with 3191557274 bytes remaining to read

            The bash script worked fine.

            Michael Knight added a comment -

            Others have had success with cURL's retry option, e.g.:

            curl -k -u "user:password" https://example.atlassian.net/webdav/backupmanager/JIRA-backup-20130131.zip -O --retry 999 --retry-max-time 0 -C -

            Adam Toth added a comment -

            The suggested download manager worked fine for me, though my file was only 500 MB. With a normal browser download it timed out twice and worked once.

            Michael Knight added a comment - edited

            One workaround is to use 'curl' on the command line to download the file in a 'resume loop'. Here is a script that does this; it stops once it detects that the file has finished downloading. Note that you will need to edit the script to set the correct host, username, and password.

            #!/bin/bash -u
            
            if [[ $# -lt 1 ]]; then
            	echo "Usage: ${0} FILENAME";
            	echo "Where FILENAME is the name of the file produced by OnDemand's backup manager (e.g. 'JIRA-backup-20121005.zip')."
            	exit 1;
            fi
            
            BACKUP_FILE="${1}"
            MY_HOST="test.atlassian.net"
            USERNAME="myusername"
            PASSWORD="mypassword"
            URL="https://${MY_HOST}/webdav/backupmanager/${BACKUP_FILE}"
            
            # Detect the total file size from the Content-Length response header.
            FILE_SIZE=$(curl -k -s --head -u "${USERNAME}:${PASSWORD}" "${URL}" | grep -i '^Content-Length:' | awk '{ print $2 }' | tr -d '[:space:]')
            
            if [[ -z "${FILE_SIZE}" ]]; then
            	echo "Could not determine the size of ${URL}; check the host and credentials.";
            	exit 1;
            fi
            
            # Keep resuming (-C -) until the local file reaches the expected size.
            while [[ ! -f "${BACKUP_FILE}" || $(wc -c < "${BACKUP_FILE}") -lt ${FILE_SIZE} ]]; do
            	curl -k -u "${USERNAME}:${PASSWORD}" -C - -o "${BACKUP_FILE}" "${URL}";
            	sleep 5;
            done;
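
            A usage sketch, assuming the script above is saved as download-backup.sh (a hypothetical name) and the host, username, and password placeholders in it have been edited:

            $ chmod +x download-backup.sh
            $ ./download-backup.sh JIRA-backup-20121005.zip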

              Assignee: Unassigned
              Reporter: Azwandi Mohd Aris (Inactive)
              Affected customers: 15
              Watchers: 25