<div dir="ltr">Hi!<div><br></div><div>The underlying filesystem (ZFS) uses block-level deduplication, so duplicate 128 KiB blocks (the default record size) are stored only once. However, the 128 KiB blocks that make up successive dumps are mostly unique, because the data is not aligned on block boundaries from one dump to the next, so deduplication will not help as far as I can see.</div><div><br></div><div>Best regards,</div><div><br></div><div>Count Count</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, Jul 28, 2020 at 3:51 AM griffin tucker <<a href="mailto:gtucker4.une@hotmail.com">gtucker4.une@hotmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div lang="EN-AU">
<div class="gmail-m_-4824975385899825970WordSection1">
<p class="MsoNormal">I’ve tried using FreeNAS/TrueNAS with a deduplication-enabled volume to store multiple sequential dumps, but it doesn’t seem to save much space at all. I was hoping someone could point me in the right direction so that I can download multiple dumps without them taking up so much room (uncompressed).</p>
<p class="MsoNormal"> </p>
<p class="MsoNormal">Has anyone tried anything similar and had success with data deduplication?</p>
<p class="MsoNormal"> </p>
<p class="MsoNormal">Is there a guide?</p>
</div>
</div>
_______________________________________________<br>
Xmldatadumps-l mailing list<br>
<a href="mailto:Xmldatadumps-l@lists.wikimedia.org" target="_blank">Xmldatadumps-l@lists.wikimedia.org</a><br>
<a href="https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l" rel="noreferrer" target="_blank">https://lists.wikimedia.org/mailman/listinfo/xmldatadumps-l</a><br>
</blockquote></div>
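<div dir="ltr"><div><br></div><div>A small sketch of the alignment point above, using toy random data and the 128 KiB default record size (the dump contents and the inserted bytes here are hypothetical, just to illustrate): block-level deduplication hashes fixed-size blocks, so inserting even a few bytes at the front of an otherwise identical file shifts every block boundary and leaves essentially no shared blocks.</div></div>

```python
import hashlib
import os

RECORD_SIZE = 128 * 1024  # ZFS default recordsize

def chunk_hashes(data, size=RECORD_SIZE):
    """Hash each fixed-size block, as block-level dedup would see the file."""
    return {hashlib.sha256(data[i:i + size]).digest()
            for i in range(0, len(data), size)}

def shared_blocks(a, b):
    """Number of identical blocks two files would share on a dedup volume."""
    return len(chunk_hashes(a) & chunk_hashes(b))

# Two "dumps": the second is the first with a few bytes inserted at the
# start, like a new revision prepended in the next XML dump (toy example).
dump1 = os.urandom(16 * RECORD_SIZE)
dump2 = b"<new revision/>" + dump1

aligned_copy = bytes(dump1)  # byte-identical copy: dedups completely

print(shared_blocks(dump1, aligned_copy))  # 16: every block shared
print(shared_blocks(dump1, dump2))         # 0: every block boundary shifted
```

<div dir="ltr"><div>This is why content-defined chunking (as used by backup tools such as borg or restic) dedups this kind of data far better than fixed-block filesystem dedup: chunk boundaries are chosen from the data itself, so an insertion only disturbs the chunks around it.</div></div>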