Here is an alternative solution.
We could split the x GB dump file into 50 MB chunks.
As long as the database format does not change, the dump only grows at
the end, so all but the last chunk will be unaltered on subsequent runs.
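
The split step could look roughly like this in Python (a minimal
sketch; the file name "dump.xml" and the Chunk_<n> naming are
assumptions based on the example below):

CHUNK_SIZE = 50 * 1024 * 1024  # 50 MB per chunk

def split_dump(path="dump.xml"):
    # Read the dump in fixed-size blocks and write each as Chunk_<n>
    with open(path, "rb") as dump:
        index = 1
        while True:
            block = dump.read(CHUNK_SIZE)
            if not block:
                break
            with open("Chunk_%d" % index, "wb") as chunk:
                chunk.write(block)
            index += 1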
For example:
Dump and split in week 10 produces Chunk_1 up to Chunk_39 (all but the
last are 50 MB in size).
Dump and split in week 11 produces Chunk_1 up to Chunk_40 (only
Chunk_39 has changed; Chunk_40 is new).
So only two chunks need to be downloaded since the previous run.
Then a join operation on all chunks produces an up-to-date dump file.
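
The join itself is just concatenation in chunk order; a minimal sketch
in Python, again assuming the Chunk_<n> names from the example:

import glob

def join_chunks(out_path="dump.xml"):
    # Concatenate Chunk_1, Chunk_2, ... in numeric order
    names = sorted(glob.glob("Chunk_*"),
                   key=lambda name: int(name.split("_")[1]))
    with open(out_path, "wb") as out:
        for name in names:
            with open(name, "rb") as chunk:
                out.write(chunk.read())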
A small script to manage this process would come in handy.
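
The heart of such a script would be deciding which chunks to fetch.
One way, assuming the server publishes a checksum per chunk (the
manifest format "checksum filename" per line is a hypothetical
choice):

import hashlib
import os

def local_md5(path):
    # Hash the local chunk in 1 MB blocks to keep memory use low
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            digest.update(block)
    return digest.hexdigest()

def chunks_to_download(manifest_lines):
    # A chunk is needed if there is no local copy or its checksum differs
    needed = []
    for line in manifest_lines:
        checksum, name = line.split()
        if not os.path.exists(name) or local_md5(name) != checksum:
            needed.append(name)
    return needed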
Erik Zachte