-
Using the Chrome browser, I visited a Releases page. I chose the file "free-v10.3.0 Katalon.Studio.dmg" and tried to download it. The download continued for approximately 5 minutes and then terminated with a failure message.

I then tried to download the file using the wget and curl commands on the command line, and failed in both cases. In the wget session I saw the message: "error 618 jwt:expired".

I also tried downloading other files. I failed to download files larger than approximately 300 MB, and I noticed that each session terminated 5 minutes after it started. My connection to the Internet is 8 Mbps --- ordinary speed, not too fast, not too slow, I think. Small files seemed to be OK, and I remember I could successfully download similar-sized files last week.

My guess is that the JSON Web Token for GitHub Releases downloads was changed recently: the expiration was reduced from a larger value to 5 minutes, which caused the download failures. |
Replies: 29 comments 48 replies
-
I tried the GitHub CLI as suggested by Nabil-nl. I wasn't successful.
-
I also tried downloading a release asset using curl with my GitHub Personal Access Token. I wasn't successful.
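For reference, the token-based route goes through the REST API asset endpoint with an `Accept: application/octet-stream` header. A minimal sketch of that invocation — `OWNER/REPO`, the asset id, and the token name are placeholders, and the helper only prints the command instead of running it, so nothing here needs network access:

```shell
# Hypothetical helper: print the curl invocation for fetching a release
# asset via the GitHub REST API. OWNER/REPO, the asset id, and the token
# are placeholders; nothing is actually downloaded here.
build_asset_curl() {
  local asset_id="$1" token="$2" out="$3"
  printf 'curl -L -H "Accept: application/octet-stream" -H "Authorization: Bearer %s" -o %s https://github.com/repos/OWNER/REPO/releases/assets/%s\n' \
    "$token" "$out" "$asset_id"
}
build_asset_curl 12345 '$GITHUB_TOKEN' asset.dmg
```

Whether the authenticated redirect uses a longer-lived token than the anonymous one is exactly what this commenter found did not help, so treat this as one more thing to try.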
-
Nabil-nl wrote:
To be honest, I don't understand what he suggested.
-
I'm also facing this error. I hope there will be a fix.
-
The following issue discusses the same thing:
-
Use wget's continue option: re-run `wget -c <url>` until the file is complete. This is of course a work-around. I hope GitHub will stop this dropping of downloads after just a couple of minutes; it's preposterous.
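Since each retry follows the redirect again and receives a freshly signed URL, wrapping a resumable command in a simple retry loop is often enough. A minimal sketch — the wrapper itself is generic; pairing it with `wget -c` and the placeholder URL in the usage comment is my assumption:

```shell
# Minimal retry wrapper: re-run a resumable command until it succeeds,
# up to a maximum number of attempts. Each retry of a command like
# `wget -c <url>` follows the redirect again and gets a fresh token.
retry_resume() {
  local tries="$1"; shift
  local attempt=0
  until "$@"; do
    attempt=$((attempt + 1))
    [ "$attempt" -ge "$tries" ] && return 1
    sleep 1
  done
}
# Usage (placeholder URL):
# retry_resume 20 wget -c "https://github.com/owner/repo/releases/download/tag/asset.dmg"
```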
-
It is 2025-08-13 21:04:26 in Japan (UTC+9). I retried downloading a large file from a Releases page using wget, and it worked! No "jwt:expired" error occurred. It seems that, just by accident, I got connected to a GitHub access point with a longer time-to-live token.
-
Still having this issue with the website on Firefox.
-
GitHub really needs to get on this. For many projects, Releases is the primary source of downloads; they're depending on it working. I understand large files are a burden on the network no matter how you look at it, but come on: with the amount of traffic you get that's minimal repo browsing, surely you can afford a long-TTL token...
-
Large GitHub release assets now use short-lived signed URLs (JWTs) that expire in about 5 minutes, so on a slow connection the download fails once the token expires.
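That lines up with the numbers in the original post: at 8 Mbps, roughly 300 MB fits inside a 5-minute window, which is exactly the failure threshold observed. A quick back-of-envelope check:

```shell
# Back-of-envelope: megabytes transferable at a given link speed before
# a 5-minute (300-second) token expires. 8 megabits/s = 1 megabyte/s.
link_mbps=8
token_seconds=300
max_mb=$(( link_mbps * token_seconds / 8 ))
echo "${max_mb} MB"   # matches the ~300 MB limit reported in the question
```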
-
When you click a release asset link (or fetch it with a CLI tool), GitHub redirects you to a signed URL containing a short-lived token. Historically, GitHub gave fairly long expirations (10–15 minutes+), but they've recently tightened it to ~5 minutes for some assets.

Use a resumable download. That way, even if the token expires mid-download, you can continue from where it stopped. (You still need to start again from a fresh URL, because the token can't be refreshed mid-download without tool support.)

Use a faster path or a resumable download manager.

Download via the API with an Authorization header. If you use the Releases API with a Personal Access Token, you can fetch assets through an authenticated endpoint instead of the signed browser URL.

But here is a ready-to-run script.

Bash script:
Requirements:
Example usage:
The script will:

Here is the parallel version.

Script:
Requirements:
Usage example:
How it works:
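A minimal sketch of the range-splitting idea such scripts rely on: break the file into byte ranges, then fetch each range against a freshly signed URL so no single request outlives the token. The helper name and chunk size here are illustrative, not from the elided script:

```shell
# Hypothetical helper: emit "start-end" byte ranges covering a file of
# `total` bytes in `chunk`-byte pieces. A downloader would request each
# range with `curl -H "Range: bytes=start-end"` on a fresh signed URL.
chunk_ranges() {
  local total="$1" chunk="$2" offset=0 end
  while [ "$offset" -lt "$total" ]; do
    end=$(( offset + chunk - 1 ))
    [ "$end" -ge "$total" ] && end=$(( total - 1 ))
    echo "${offset}-${end}"
    offset=$(( end + 1 ))
  done
}
chunk_ranges 25 10   # three ranges: 0-9, 10-19, 20-24
```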
-
Wow, this is what we have to do just to download a file... We're going backwards.
-
install.packages("remotes")
Package 'remotes' downloaded successfully; the MD5 checksum also passed. The downloaded binary package is in
Did you spell the repo owner (
-
Turbo Download Manager (3rd edition) worked for me (FF)
-
Ran into the same error, but in a CI/CD build script. Glad to hear it's not just me... Hope this gets fixed really soon; I was confused why downloading Godot export templates would just error at 92%.
-
I found this thread as I was having the same issue, but like many others here (and in other threads), it didn't quite help me. I asked Copilot for help, and it provided the following PowerShell script, which worked great, so I thought I'd share it here. I also asked it to write a bash version of the same script, but I don't have anything to test it with, so let me know if it doesn't work for you.

PowerShell version

<#
.SYNOPSIS
Downloads a GitHub release asset in ranged chunks to avoid expiring signed URLs.
.PARAMETER ReleaseUrl
The GitHub release download URL (e.g. https://github.com/owner/repo/releases/download/tag/asset.zip).
.PARAMETER DownloadDir
Directory to save the file. Defaults to the folder where this script resides.
#>
param(
[Parameter(Mandatory)][string]$ReleaseUrl,
[string]$DownloadDir = $PSScriptRoot
)
# derive output path
$FileName = [System.IO.Path]::GetFileName($ReleaseUrl)
$Destination = Join-Path $DownloadDir $FileName
function Get-SignedUrl {
param([string]$Url)
$req = [Net.HttpWebRequest]::Create($Url)
$req.Method = 'HEAD'
$req.AllowAutoRedirect = $false
$resp = $req.GetResponse()
$loc = $resp.GetResponseHeader('Location')
$resp.Close()
if (-not $loc) { throw "Failed to get signed URL from $Url" }
return $loc
}
# 1) Figure out total size
$signedHead = Get-SignedUrl $ReleaseUrl
$headReq = [Net.HttpWebRequest]::Create($signedHead)
$headReq.Method = 'HEAD'
$headReq.AllowAutoRedirect = $false
$headResp = $headReq.GetResponse()
$totalSize = [int64]$headResp.GetResponseHeader('Content-Length')
$headResp.Close()
# 2) Open output stream
$fs = New-Object System.IO.FileStream(
$Destination,
[System.IO.FileMode]::Create,
[System.IO.FileAccess]::ReadWrite
)
# 3) Download chunks with progress
$chunkSizeMB = 5
$chunkSize = $chunkSizeMB * 1MB
$offset = 0
$startTime = Get-Date
while ($offset -lt $totalSize) {
$endByte = [math]::Min($offset + $chunkSize - 1, $totalSize - 1)
$signed = Get-SignedUrl $ReleaseUrl
$req = [Net.HttpWebRequest]::Create($signed)
$req.Method = 'GET'
$req.AllowAutoRedirect = $false
$req.AddRange($offset, $endByte)
$resp = $req.GetResponse()
$stream = $resp.GetResponseStream()
$buffer = New-Object byte[] 8192
$read = 0
$chunkDone = 0
$chunkTotal= $endByte - $offset + 1
# per‐chunk progress
while (($read = $stream.Read($buffer, 0, $buffer.Length)) -gt 0) {
$fs.Write($buffer, 0, $read)
$chunkDone += $read
$chunkPct = [math]::Round($chunkDone / $chunkTotal * 100, 1)
Write-Progress -Id 2 `
-Activity "Chunk bytes $offset–$endByte" `
-Status "$chunkPct% ($chunkDone/$chunkTotal bytes)" `
-PercentComplete $chunkPct
}
$stream.Close()
$resp.Close()
# overall progress
$offset += $chunkDone
$elapsed = (Get-Date) - $startTime
$speed = [math]::Round(($offset/1MB)/$elapsed.TotalSeconds, 2)
$remainSec = if ($speed -gt 0) { [math]::Round(($totalSize-$offset)/( $speed*1MB ), 0) } else { 0 }
$eta = (New-TimeSpan -Seconds $remainSec).ToString("hh\:mm\:ss")
$overallPct= [math]::Round($offset / $totalSize * 100, 1)
Write-Progress -Id 1 `
-Activity "Downloading $FileName" `
-Status "$overallPct% ($offset/$totalSize bytes) @ $speed MB/s – ETA $eta" `
-PercentComplete $overallPct
}
# 4) Cleanup
$fs.Close()
Write-Progress -Id 1 -Completed
Write-Progress -Id 2 -Completed
Write-Host "Download complete:`n $Destination"

Example PowerShell usage and output:

Bash version

#!/usr/bin/env bash
set -euo pipefail
# parameters
release_url="$1"
download_dir="${2:-$(dirname "$0")}"
# derive output
file_name="${release_url##*/}"
destination="$download_dir/$file_name"
# chunk settings
chunk_mb=5
chunk_bytes=$((chunk_mb * 1024 * 1024))
# progress timers
start_ts=$(date +%s)
# helper: get signed URL from HEAD redirect
get_signed_url() {
# -I already sends a HEAD request; forcing -X HEAD can make curl hang.
# Match the header name case-insensitively (HTTP/2 lowercases headers).
curl -s -I -H "User-Agent: chunked-downloader" "$release_url" \
  | awk 'tolower($1) == "location:" {print $2; exit}' \
  | tr -d $'\r'
}
# 1) get total size
signed=$(get_signed_url)
total_size=$(curl -s -I -H "User-Agent: chunked-downloader" "$signed" \
  | awk 'tolower($1) == "content-length:" {print $2; exit}' \
  | tr -d $'\r')
# ensure output exists
mkdir -p "$download_dir"
: > "$destination"
offset=0
while [ "$offset" -lt "$total_size" ]; do
end_byte=$(( offset + chunk_bytes - 1 ))
[ "$end_byte" -ge "$total_size" ] && end_byte=$(( total_size - 1 ))
# fresh signed URL
signed=$(get_signed_url)
# perform ranged fetch + write at offset
seek=$(( offset / 1048576 )) # in MiB
curl -s \
-H "User-Agent: chunked-downloader" \
-H "Range: bytes=$offset-$end_byte" \
"$signed" \
| dd of="$destination" bs=1M seek="$seek" conv=notrunc status=none
# update counters
bytes_read=$(( end_byte - offset + 1 ))
offset=$(( offset + bytes_read ))
# compute speed, ETA, percent
now_ts=$(date +%s)
elapsed=$(( now_ts - start_ts ))
# note the $ before elapsed/speed: the shell must interpolate them,
# otherwise awk sees its own (unset, zero) variables and divides by zero
speed=$(awk "BEGIN{printf \"%.2f\", ($offset/1024/1024)/($elapsed>0?$elapsed:1)}")
remain_sec=$(awk "BEGIN{printf \"%d\", (($total_size-$offset)/1024/1024)/($speed>0?$speed:1)}")
eta=$(date -u -d @"$remain_sec" +%H:%M:%S)
pct=$(awk "BEGIN{printf \"%.1f\", $offset*100/$total_size}")
printf "\rOverall: %s%% (%d/%d bytes) @ %s MB/s – ETA %s" \
"$pct" "$offset" "$total_size" "$speed" "$eta"
done
echo -e "\nDownload complete: $destination"
-
I had the same issue. I used ADSL, AT&T 4G, and my employer's network (a $90B+ corporation), plus 5 different browsers. A failed browser download could never be restarted, let alone provide a useful error. What worked was several iterations of resuming the download.
-
💬 Your Product Feedback Has Been Submitted 🎉 Thank you for taking the time to share your insights with us! Your feedback is invaluable as we build a better GitHub experience for all our users. Here's what you can expect moving forward ⏩
Where to look to see what's shipping 👀
What you can do in the meantime 💻
As a member of the GitHub community, your participation is essential. While we can't promise that every suggestion will be implemented, we want to emphasize that your feedback is instrumental in guiding our decisions and priorities. Thank you once again for your contribution to making GitHub even better! We're grateful for your ongoing support and collaboration in shaping the future of our platform. ⭐
-
This comment was marked as disruptive content.
-
Various solutions for different CLIs (sourced from comments here). Replace `<url>` with the asset URL.

PowerShell* (majority of Windows users):
Invoke-WebRequest -Uri <url> -OutFile PLACEHOLDER.zip -MaximumRetryCount 20 -Resume

*Requires PowerShell 7. To update, run the command below before running the previous one:
winget install --id Microsoft.Powershell --source winget

curl:
wget:
aria2c:
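The curl/wget/aria2c commands were cut off in the comment above; for what it's worth, here is a sketch of the usual resume-capable invocation for each tool. These flags are my assumption of what the comment listed, not a quote, and `<url>` is a placeholder; the helper only prints the commands:

```shell
# Sketch: map each CLI to a resume-on-retry invocation. `-C -` (curl),
# `-c` (wget), and `-c` (aria2c) resume a partial file on re-run; the
# exact flags the original comment listed were lost, so these are
# assumptions based on each tool's standard options.
resume_cmd() {
  case "$1" in
    curl)   echo 'curl -L -C - -O <url>' ;;
    wget)   echo 'wget -c --tries=0 <url>' ;;
    aria2c) echo 'aria2c -c -x 16 -s 16 <url>' ;;
    *)      return 1 ;;
  esac
}
resume_cmd wget
```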
-
This is still an issue!!!! It needs to get sorted out; not everyone has the luxury of fast internet. I am sitting in Africa with as fast a connection as I can get from my provider (50 Mb) and am unable to download a frikken 1.2 GB file!!!!! Please, think of people sitting in other parts of the world, not just yourselves!!!
-
Yeah, I tried almost all the solutions above; only one of them worked for me. If the release is also available on a different host like SourceForge, downloading from there worked for me as well.
-
What ACTUALLY worked: aria2c -x 16 -s 16 "url-to-file". It sped things up so much that it can download ~7 GB before the 5-minute JWT expires and kills the download. WOOHOO
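Sanity-checking that claim: moving ~7 GB inside the ~5-minute token lifetime needs an effective rate of roughly 190 Mbps, which 16 parallel connections can plausibly reach:

```shell
# Rough arithmetic: megabits per second needed to move `gb` gigabytes
# within `seconds` seconds (taking 1 GB as 1024 MB).
gb=7
seconds=300
need_mbps=$(( gb * 1024 * 8 / seconds ))
echo "${need_mbps} Mbps"
```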
-
On Android it worked with the FDM downloader and the main link from Releases.
-
GitHub support replied to me today:
Can someone test?