So, I have about 12 GB and I want to fill it with files that need archival.

My criteria are:

At risk of becoming lost

Being usable files without specific hardware (needing a software emulator is fine)

Willing to break my rules if I'm given better ideas

  • ReversalHatchery@beehaw.org · 10 days ago

    Oh my sweet summer child, 12 GB is not a lot! :)

    Well, for one, you can start saving webpages you found helpful, maybe your collection of useful links or bookmarks, if you have any of those. I would recommend using Firefox with the SingleFile add-on, or the WebScrapBook add-on. Feel free to look into their settings, but don't let it overwhelm you; if you need to, take it in smaller pieces, there's no shame in it.
    Since this is mostly text, it shouldn't take up much space quickly, and it's also very efficiently compressible, for example with 7-Zip.
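    As a rough sketch of that last step (assuming the `7z` command-line tool is installed; `saved-pages/` is just a hypothetical folder of SingleFile .html exports):

```shell
# Compress a folder of saved pages with 7-Zip's strongest preset.
7z a -t7z -mx=9 saved-pages.7z saved-pages/

# List the archive contents and see how much space was saved.
7z l saved-pages.7z
```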

    If you often watch videos, on YouTube or elsewhere, and you find something useful or otherwise worth preserving (entertainment is also a valid reason), you can grab those too. Have a look at yt-dlp: it's very versatile, very configurable, and not only for YouTube.
    But this will easily take up a lot of space; videos are huge and not really compressible losslessly.
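    A minimal yt-dlp invocation might look like this (the URL is a placeholder; capping resolution is one way to stretch limited storage):

```shell
# Download best video+audio up to 720p, with metadata embedded and
# the site's info saved alongside as JSON for future reference.
yt-dlp -f "bv*[height<=720]+ba/b[height<=720]" \
  --embed-metadata \
  --write-info-json \
  "https://www.youtube.com/watch?v=EXAMPLE"
```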

    Other than that, have a look at the DataHoarder community: !datahoarder@lemmy.ml (I hope the link works). For even more, you may check the datahoarder and opendirectories subreddits through libreddit/redlib.

  • ReversalHatchery@beehaw.org · 10 days ago

    Oh, and if you're interested in archiving, definitely check out the Archive Team!

    They are always running archiving projects, they even participate in preserving Reddit content, and they have a connection with archive.org and the Wayback Machine.
    They maintain a virtual machine image (the Warrior) that you can run at home, even on a simpler PC, to help with their projects. It does not consume much storage actively, only some network bandwidth. It's basically a distributed archiving tool: a lot of people running it download all kinds of data for the selected project (good for performance and to avoid restrictions) and upload it to Archive Team for preservation.
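    If you'd rather not run the full VM, the Warrior is also distributed as a Docker image; roughly like this (image name and port per the Archive Team wiki at the time of writing; verify before running):

```shell
# Run the Archive Team Warrior in the background and restart it on reboot.
docker run --detach \
  --name archiveteam-warrior \
  --publish 8001:8001 \
  --restart unless-stopped \
  atdr.meo.ws/archiveteam/warrior-dockerfile

# Then open http://localhost:8001 in a browser to pick a project.
```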

  • ExperimentalGuy@programming.dev · 10 days ago

    The Internet Archive has a lot of torrents that need seeding, if you're up for something like that. I think they're still down, but once they're back up, it's pretty easy to help out with that.
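    As one possible sketch of that, using aria2 (the `<identifier>` is a placeholder for a real archive.org item identifier, not filled in here):

```shell
# Fetch an item's .torrent file from archive.org and seed it.
# --seed-ratio=0.0 tells aria2 to keep seeding regardless of share ratio.
aria2c --seed-ratio=0.0 \
  "https://archive.org/download/<identifier>/<identifier>_archive.torrent"
```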

  • Gamers_mate@beehaw.org · 10 days ago

    12 GB is small, but that just means you might want to opt for links you find useful, like ReversalHatchery said. You could probably store a lot of fanfiction and tutorials if it's just text. I have had an idea of creating a special storage code that can store large amounts of information in a smaller file size, but I am bad at maths and programming and have no idea what I am doing, so it might take a while.

    • Ava@beehaw.org · 10 days ago

      In general, this is definitely an area where the best approach is to find an existing tool for what you need and use that. Especially for text data, compression is a well-studied field, and there are plenty of public (and open-source, if that's a requirement) tools that will do a fantastic job of reducing size. Rolling your own is likely to result in significantly worse compression ratios, and if you make an error, your data could be irreparably destroyed, which you won't discover until you try to access it later.
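      To illustrate how far off-the-shelf compressors already go on text, here is a small Python sketch using only standard-library codecs (the sample string is made up; real text compresses less dramatically but still very well):

```python
import lzma
import zlib

# Repetitive text (fanfiction, tutorials, saved pages) compresses
# extremely well with standard, battle-tested tools.
text = ("It was a dark and stormy night. " * 2000).encode("utf-8")

deflated = zlib.compress(text, level=9)  # DEFLATE, as used in .zip/.gz
xz = lzma.compress(text, preset=9)       # LZMA, as used in .xz/7-Zip

print(f"original: {len(text)} bytes")
print(f"zlib:     {len(deflated)} bytes")
print(f"lzma:     {len(xz)} bytes")

# Crucially, decompression round-trips exactly: no data is lost.
assert lzma.decompress(xz) == text
assert zlib.decompress(deflated) == text
```

      Both codecs shrink this input by well over an order of magnitude, and the round-trip check is the kind of correctness guarantee a homegrown scheme would have to re-earn the hard way.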

      If your data is incredibly specific you might be able to do better, but it’s usually best to ignore that sort of optimization until you actually need it.