Backup

Thinking about this because the raw video from Kiersten's shoots is LARGE. I don't anticipate going back and re-editing these projects, but it would be nice to keep the full quality versions somewhere. To that end I'd like to re-encode the raw videos to a more compressed format to keep on spinning storage, and offload the full quality files to an offline medium. Blu-ray discs could easily hold the raw footage without requiring tons of hardware to keep up with it. Some convention for tracking where the original data lives should be devised.

I think I'll create a JSON document with a common name like COLD_STORAGE.json in directories where some original data has been offloaded to physical media. It would contain the file listing and a reference to the disc(s) where the data lives. Then, if it ever gets to the point of needing a more robust solution, these documents could be scraped from the filesystem into some DB application.
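Something like this, maybe (the field names, disc labels, and sizes here are all made up; the real schema is TBD):

```json
{
  "offloaded": "2019-11-02",
  "discs": ["KB-001", "KB-002"],
  "files": [
    { "name": "clip_0001.mov", "bytes": 10737418240, "disc": "KB-001" },
    { "name": "clip_0002.mov", "bytes": 8589934592, "disc": "KB-002" }
  ]
}
```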

The Master Plan

for already-compressed stuff, e.g. original videos:

  1. figure out how to optimally distribute the input files into directories of a specific capacity - disc capacity minus some percentage reserved for parity data; 5% seems like a huge amount of parity, in a reassuring way
    • fpart can do this?
    • for a 25 GB BD-R: ~23.7 GB of data, ~1.185 GB of parity (5% of the data)
    • need to figure out the actual payload size that fills a disc once mkisofs adds its overhead
    • ISO format: level 3, Rock Ridge extension, Joliet extension (only for Windows, shrug), padding (default)
      • NO LEADING DOTS
  2. store the mapping for each file from source directory to archive directory in a database - sqlite is easiest and very compatible
  3. generate parity for the contents of each directory with par2 to a target size of 5%
  4. mkisofs for each directory of data and parity
  5. burn iso to each disc
  6. store the sqlite db in the place where the original files were - this lives on warm backup
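Steps 1, 2, and 6 could be sketched in Python. A simple first-fit-decreasing pass gets close enough to optimal for files this large (fpart does something more sophisticated); the capacity constant, table name, and disc labels below are assumptions, not settled conventions:

```python
import sqlite3

# Assumed usable capacity: 25 GB BD-R minus ~5% reserved for parity.
DISC_BYTES = 23_700_000_000

def pack(files):
    """First-fit decreasing bin packing.

    files: list of (path, size_bytes) tuples.
    Returns a list of path-lists, one per disc.
    """
    discs = []  # each entry: [remaining_bytes, [paths]]
    for path, size in sorted(files, key=lambda f: f[1], reverse=True):
        for disc in discs:
            if disc[0] >= size:          # fits on an existing disc
                disc[0] -= size
                disc[1].append(path)
                break
        else:                            # open a new disc
            discs.append([DISC_BYTES - size, [path]])
    return [paths for _, paths in discs]

def record(db_path, groups):
    """Store the source-file -> disc mapping in a sqlite db."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS cold_storage "
                "(source TEXT PRIMARY KEY, disc TEXT)")
    for i, paths in enumerate(groups, start=1):
        con.executemany("INSERT INTO cold_storage VALUES (?, ?)",
                        [(p, f"disc{i:03d}") for p in paths])
    con.commit()
    con.close()
```

The db file produced by `record` is the thing that would be left behind on warm backup in place of the originals.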
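Steps 3 through 5 reduce to roughly three commands per staged directory. This is a dry-run sketch: the flags (par2's -r5 for ~5% redundancy, mkisofs's -iso-level 3 -R -J -pad, growisofs writing to /dev/sr0) are from memory and worth checking against the man pages before burning anything:

```shell
#!/bin/sh
# Parity, ISO, and burn for one staged directory.
# With DRY_RUN=1 (the default) the commands are echoed, not executed.
DIR=${1:-disc001}
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" = 1 ]; then echo "$@"; else "$@"; fi
}

# ~5% parity over the directory contents
run par2 create -r5 "$DIR/parity.par2" "$DIR"/*
# level 3, Rock Ridge, Joliet, padded
run mkisofs -iso-level 3 -R -J -pad -V "$DIR" -o "$DIR.iso" "$DIR"
# burn the finished image
run growisofs -dvd-compat -Z /dev/sr0="$DIR.iso"
```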