Now that we are reliably generating all but the biggest of the wikis, I'd
like to start a discussion about retention for older database dumps.
If we can reliably stick to a two-week window for each wiki's dump
iteration, how many dumps back would it make sense to keep?
Most clients that I've talked to only need the latest dump and only look at
the older ones when the newest dump failed a step.
If there are other retention cases, I'd love to hear them and figure
out what's feasible to do.
Operations-wise, I'd be thinking of keeping somewhere between one and five of
the previous dumps and then archiving a copy of each dump at six-month
intervals for permanent storage. Doing that for all of the current dumps takes
far more space than we currently have available, but that's also why we're
working on funding for those storage servers.
Is that overkill or simply not enough? Let me know.
--tomasz