I know we can’t do this with any copyrighted materials. But a lot of books, music, art, and knowledge are in the Creative Commons. Is it possible to create one massive torrent that includes everything that can legally be included, and then have people download only what they actually want to enjoy?

  • aelwero@lemmy.world · 1 year ago

    I mean… That pretty much describes torrents period… What is the functional difference between hosting a single torrent with everything, and hosting a torrent per item?

    If the expectation is that you only include files you want when downloading the torrent, you’re only going to be seeding that portion.

    Seems like it would just make the search function harder, and make it harder to determine the “health” of individual items…

    I don’t understand the benefit…

    • AnarchistsForDemocracy@lemmy.worldOP · 1 year ago

      For example, Z-Library is 220 TB of books and scientific articles; having that included in the torrent would be great, along with all the art and music.

      Basically it would be a way to combat media vanishing off of the net over time, a Noah’s ark for all of mankind’s knowledge.

      It would be great to have everything in one single spot, to make it easier to contribute and to get stuff. We’d also be better able to combine our forces to maintain and grow the thing.

      • aelwero@lemmy.world · 1 year ago

        That makes a great focal point for what I was saying actually ;)

        It’s 220 TB, so you’ll have incredibly few people who download the whole torrent. Most will open the torrent’s file list and select a small number of items to download. The files selected the most will get seeded frequently; the ones that nobody ever selects will have only the originator seeding them (if they continue to do so).

        It’s functionally no different than if each individual file were its own torrent… Except that the seeding info is going to be wonky on the single 220 TB torrent, because nobody is downloading it intact, only in pieces.
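That skew is easy to see in a toy simulation (this is illustration only, not real BitTorrent code; the file counts and the Zipf-style popularity curve are assumptions made up for the example):

```python
# Toy illustration: users each select a handful of files from one mega-torrent;
# count how many "seeds" each file ends up with. All numbers are invented.
import random

random.seed(1)

num_files = 1000
num_users = 5000
picks_per_user = 5

# Zipf-ish popularity: file i is picked proportionally to 1/(i+1).
weights = [1.0 / (i + 1) for i in range(num_files)]

seeds = [1] * num_files  # every file starts with just the originator seeding
for _ in range(num_users):
    for f in random.choices(range(num_files), weights=weights, k=picks_per_user):
        seeds[f] += 1

popular = sum(seeds[:10]) / 10     # average seeds across the 10 hottest files
obscure = sum(seeds[-100:]) / 100  # average seeds across the 100 coldest files
print(f"top-10 files average {popular:.0f} seeds")
print(f"bottom-100 files average {obscure:.1f} seeds")
```

The hot files end up with hundreds of seeders while the cold tail stays near the single original seeder, which is the “wonky health” problem in a nutshell.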

        It’s also much easier to find a specific file if it’s its own torrent vs. one of a billion files in a single mega torrent.

        Just because you put it on an index in a torrent doesn’t mean the file still exists somewhere. That media can still vanish…

        What would actually do what you’re suggesting this torrent would do (which a torrent cannot) is a yottabyte-capable computer somewhere storing all those files… You’d need that to keep the seeding intact for the whole torrent…

        • AnarchistsForDemocracy@lemmy.worldOP · 1 year ago

          Maybe one could tweak BitTorrent for this one mega torrent so that you have to seed/leech one of the least popular files along with every popular file?
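That tweak isn’t part of the real BitTorrent protocol, but as a client-side policy it could be sketched like this (function name, file names, and seed counts are all hypothetical):

```python
# Hypothetical client policy: whenever a user selects a file, the client
# also makes them fetch and seed whichever file currently has the fewest
# seeds. Not real BitTorrent behavior; a sketch of the idea above.
def files_to_fetch(requested, seed_counts):
    """Return the requested file plus the rarest file in the swarm.

    requested   -- name of the file the user actually wants
    seed_counts -- dict mapping file name -> current number of seeds
    """
    rarest = min(seed_counts, key=seed_counts.get)
    bundle = {requested, rarest}  # a set, in case they coincide
    for f in bundle:              # downloading implies seeding, in this sketch
        seed_counts[f] += 1
    return bundle

counts = {"popular_movie": 900, "obscure_paper": 1, "old_album": 3}
print(files_to_fetch("popular_movie", counts))
```

Every popular download then drags one unpopular file back into circulation, at the cost of extra bandwidth and disk space for the user.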

          • aelwero@lemmy.world · 1 year ago

            Ok… well now we’re getting crazy :)

            A much better approach to what you’re talking about with that one is probably to approach the problem from the other end of the snake.

            Torrents work at keeping files intact communally specifically because they’re popular files, and the more popular a file is, the more “healthy” the torrent is, because it’s transiting more often and being stored in chunks in a bunch of places.

            If you’re trying to keep an archive of everything (and frankly, what I’m about to suggest could literally store the whole ass internet), you need to focus on the obscure crap nobody is likely to ever look for… The stuff that can’t survive over torrent because it’s obscure.

            You could do that with sharing similar to a torrent, but you wouldn’t want a setup that encouraged users to share files; you’d want a setup that encourages users to share storage.

            Like, you provide a hypothetical tnerrot network (made up just now, torrent backwards), and as a condition of using the tnerrot network you allocate, say, 20GB on your hard drive (and as the internet gets bigger, drives get bigger, and games get bigger, this allocation can grow too) that it uses to store the actual files. In exchange, you can pull any file stored in the tnerrot network. Instead of Marvel movies (or whatever legal file has that kind of oomph) having a billion seeds and an obscure science report having one, everything would have 2 or 3 dedicated seeds, because every file would be seeded by whatever computers (2 or 3 separate ones, for redundancy) tnerrot stores it at.
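A rough sketch of how that made-up tnerrot assignment could work, assuming a least-loaded placement rule (every name, size, and quota here is invented for illustration):

```python
# Sketch of the hypothetical "tnerrot" idea: peers donate a fixed storage
# quota, and the network places every file on a small fixed number of
# peers (for redundancy) regardless of the file's popularity.
def assign_replicas(files, peers, quota_gb, replicas=3):
    """Map each file to `replicas` distinct peers, filling the
    least-loaded peers first so donated storage stays balanced.

    files    -- dict of file name -> size in GB
    peers    -- list of peer ids
    quota_gb -- storage each peer donates
    """
    used = {p: 0.0 for p in peers}
    placement = {}
    for name, size in files.items():
        # prefer the peers with the most free space that can still fit the file
        candidates = sorted(peers, key=lambda p: used[p])
        chosen = [p for p in candidates if used[p] + size <= quota_gb][:replicas]
        if len(chosen) < replicas:
            raise RuntimeError(f"not enough donated space for {name}")
        for p in chosen:
            used[p] += size
        placement[name] = chosen
    return placement

files = {"obscure_paper.pdf": 0.01, "indie_album.flac": 0.5, "old_doc.mkv": 4.0}
peers = ["alice", "bob", "carol", "dave"]
plan = assign_replicas(files, peers, quota_gb=20)
print(plan)
```

The key difference from a torrent swarm: replication is set by the network, not by demand, so the obscure paper gets exactly as many copies as the blockbuster.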

            You’d need a few commercial servers, because hosting a file that gets thousands of download requests a day wouldn’t be friendly for random guy in Ohio or wherever, but for the vast vast majority of the files, you shouldn’t have major issues.

            Space sharing, not file sharing, is what you’d need to do what you’re thinking. You’d need to invent the tnerrot…