Ruben,
When you and I talked two years ago about my bandwidth concerns (moving the data out to consumers, instead of keeping it siloed in and costing the providers), I did some research into technologies that help with that. BitTorrent was one of them; specifically, some of the interesting DHT proposals that help with data discovery in the first place (relevant to the caching discussed here).
Caching could instead be done a different way, perhaps even through the DHT.
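For context on how DHT discovery works: in a Kademlia-style DHT (BEP 5), the nodes responsible for a piece of content are those whose IDs are "closest" to the content's key by XOR distance. A toy sketch (the node IDs and key below are made-up values, just to illustrate the lookup idea):

```python
# Kademlia-style lookup: the node(s) closest to a key by XOR distance
# are the ones expected to store/serve the associated value.
def xor_distance(a: int, b: int) -> int:
    return a ^ b

node_ids = [0b0001, 0b0100, 0b0110, 0b1011]  # made-up 4-bit node IDs
key = 0b0111                                 # made-up hash of the content we want

# The node "responsible" for the key is the one at minimum XOR distance.
closest = min(node_ids, key=lambda n: xor_distance(n, key))
print(bin(closest))  # -> 0b110
```

Real DHT node IDs are 160-bit SHA-1 values and lookups are iterative across the network, but the closeness metric is the same.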
Two things immediately come to mind that might help later on, using a different architecture built on the DHT:
1. Metadata files (these could be small indexed Wikidata sets of Things): http://www.bittorrent.org/beps/bep_0009.html
2. Merkle trees (pieces of pieces of Things, or indexed sets): http://www.bittorrent.org/beps/bep_0030.html
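The Merkle tree idea is what lets a peer verify any single piece of a large dataset without downloading the whole thing: leaf hashes are combined pairwise up to a single root hash. A simplified sketch (note: BEP 30 pads odd levels with zero "filler" hashes; duplicating the last hash here is a simplification, and the sample pieces are made-up stand-ins for indexed Wikidata sets):

```python
import hashlib

def sha1(data: bytes) -> bytes:
    """SHA-1 digest, as used by BitTorrent piece hashing."""
    return hashlib.sha1(data).digest()

def merkle_root(piece_hashes):
    """Roll leaf hashes up pairwise until a single root hash remains."""
    level = list(piece_hashes)
    while len(level) > 1:
        if len(level) % 2:              # simplification: duplicate the last
            level.append(level[-1])     # hash on odd-sized levels
        level = [sha1(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Hypothetical index sets standing in for torrent pieces.
pieces = [b"All Software Companies", b"All Men", b"All Cities", b"All Films"]
root = merkle_root([sha1(p) for p in pieces])
print(root.hex())
```

A torrent then only needs to carry the root hash; each received piece is checked against it via a short hash path, which keeps the metadata tiny even for very large index sets.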
This might be a good research investment for your teams in 2017: letting the world share small index sets of Wikidata's data, e.g. All Software Companies, All Men, etc.
In fact, there are several BitTorrent Enhancement Proposals that could potentially benefit TPF and LDF clients/servers: http://www.bittorrent.org/beps/bep_0000.html
I wish you and your research teams the best in 2017. -Thad