Trying to build something like a version control system for my music collection, with one crucial difference ... 🤯
I want to have a mirror of my local music collection on my server, and a script that periodically
updates the server to, well, mirror my local collection.
But crucially, I want to convert
all lossless files to lossy, preferably before uploading them.
That's the one reason why I can't just use git - or so I believe.
I also want locally deleted files to be deleted on the server.
Sometimes I even move files around (I believe in directory structure) and again,
git would handle this perfectly, if it weren't for the lossless-to-lossy caveat.
It would be perfect if my script could recognize moves just like git does, instead of deleting and
re-uploading the same file to a different location.
My head is spinning round and round, and before I continue messing around with find
and scp, it's time to ask the community.
I am writing in bash, but if some Python module could help with this, I'm sure I could
find my way around it.
TIA
Additional info:
Not all files in the local collection are lossless - there's a variety of formats.
The purpose of the remote is for listening/streaming with various applications.
The lossy version is for reducing both upload and download (streaming) bandwidth. On mobile broadband, FLAC tends to buffer a lot.
The home of the collection (and its origin) is my local machine.
I think the best way to handle this would be to just encode everything and upload all files. If I wanted some amount of history, I'd use some file system with automatic snapshots, like ZFS.
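For instance, a minimal transcode-then-sync sketch - the paths, the .ogg target, and the quality setting are all placeholder assumptions:

```bash
#!/usr/bin/env bash
# Sketch: transcode lossless files into a staging mirror, copy the rest
# as-is, then push the mirror to the server. SRC, STAGE, REMOTE, the
# .ogg target, and -q:a 6 are assumptions, not a finished tool.
set -euo pipefail

SRC="$HOME/Music"
STAGE="$HOME/.music-mirror"
REMOTE="user@server:/srv/music"

# Re-encode a lossless file only when its lossy mirror is missing or stale.
find "$SRC" -type f \( -iname '*.flac' -o -iname '*.wav' \) -print0 |
while IFS= read -r -d '' f; do
    rel="${f#"$SRC"/}"
    out="$STAGE/${rel%.*}.ogg"
    mkdir -p "$(dirname "$out")"
    if [ ! -e "$out" ] || [ "$f" -nt "$out" ]; then
        ffmpeg -nostdin -y -i "$f" -map_metadata 0 -c:a libvorbis -q:a 6 "$out"
    fi
done
# (Cleaning up stale transcodes for locally deleted lossless files is left out here.)

# Copy the already-lossy files into the staging mirror unchanged.
rsync -a --exclude='*.flac' --exclude='*.wav' "$SRC"/ "$STAGE"/

# Mirror the staging area to the server; --delete removes remote files that
# no longer exist locally. (Moved files get re-uploaded, which is the part
# git would handle better.)
rsync -a --delete "$STAGE"/ "$REMOTE"/
```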
If I wanted to do what you've outlined, I would probably use rclone with filtering for the extension types or something along those lines.
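For example (the remote name is hypothetical, and this variant simply leaves the lossless files off the server rather than transcoding them):

```bash
# Sync the collection to the server, deleting remotely whatever was
# deleted locally; lossless formats are excluded outright.
rclone sync ~/Music server:music --exclude '*.flac' --exclude '*.wav' --progress
```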
If I wanted to do this with Git specifically, though, this is what I would try first:
First, add lossless extensions (*.flac, *.wav) to my repo's .gitignore
Second, schedule a job on my local machine that:
Watches for changes to the local file system (e.g., with inotifywait or fswatch)
For any new lossless file, checks whether there's already an accompanying lossy file, i.e., one collocated with the exact same filename, sans extension, and an accepted extension (e.g., .mp3, .ogg) - possibly also confirming the codec is up to my standards with a call to ffprobe, avprobe, mediainfo, exiftool, or something similar. If there isn't, it encodes the file to my preferred lossy format (see the sketch below).
Then runs git add --all && git commit --message "Automatic commit" && git push
Optionally, crafts a better commit message automatically by checking which files have changed, generating text like Added album: "Satin Panthers - EP" by Hudson Mohawke or Removed album: "Brat" by Charli XCX; Added album: "Brat and it's the same but there's three more songs so it's not" by Charli XCX
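A rough sketch of that job, assuming inotifywait is available, .ogg as the target format, and $HOME/Music as the working copy (all placeholders):

```bash
#!/usr/bin/env bash
# Watch the collection, encode any lossless file that lacks a lossy
# sibling, then commit and push. Lossless extensions are gitignored,
# so only the lossy versions ever reach the remote.
set -euo pipefail

MUSIC="$HOME/Music"

encode_missing() {
    find "$MUSIC" -type f \( -iname '*.flac' -o -iname '*.wav' \) -print0 |
    while IFS= read -r -d '' f; do
        out="${f%.*}.ogg"
        # Only encode when no collocated lossy sibling exists yet.
        if [ ! -e "$out" ]; then
            ffmpeg -nostdin -y -i "$f" -map_metadata 0 -c:a libvorbis -q:a 6 "$out"
        fi
    done
}

cd "$MUSIC"
# Block until anything under the collection changes, then encode and commit.
while inotifywait -r -e create -e moved_to -e delete -e close_write "$MUSIC"; do
    encode_missing
    git add --all
    git commit --message "Automatic commit" || true  # no-op when nothing changed
    git push
done
```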
Third, schedule a job on my remote server that runs git pull at regular intervals.
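On the server, that could be a plain crontab entry (path and interval are, again, placeholders):

```bash
# Crontab entry: pull the latest lossy files every 15 minutes.
*/15 * * * * cd /srv/music && git pull --ff-only --quiet
```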
One issue with this approach is that if you delete a file (as opposed to moving it), the space is not recovered on your local machine or your server. If space on your server is a concern, you could work around that by running something like the answer here, adjusting the depth to an appropriate amount for your use case.
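As I understand it, the gist of that trick is to shallow-fetch and then garbage-collect the history you no longer want (depth 1 here; raise it to keep more):

```bash
# Shrink the local history to just the most recent commit, then drop the
# now-unreachable objects to actually reclaim disk space.
git fetch --depth 1
git reflog expire --expire=all --all
git gc --aggressive --prune=now
```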
Another potential issue is that what I described above involves having an intermediary Git remote to push to and pull from, e.g., running on a hosted Git forge like GitHub, Codeberg, etc. Hosting your music there could result in copyright complaints or something along those lines, though.
Alternatively, you could use your server as the Git server (or check out Forgejo if you want a Git forge as well), but then you can't use the above trick to prune file history and save space from deleted files (on the server, at least - you could on your local machine, I think). If you then check out your working copy in a way such that Git can use hard links, you should at least be able to avoid storing two full copies on your server.
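For what it's worth, a plain local clone already does this: when source and destination are on the same filesystem, Git hard-links the object files by default. Paths here are hypothetical:

```bash
# The bare repo receives the pushes; the working copy shares its objects
# with it via hard links instead of keeping a second full copy.
git clone /srv/git/music.git /srv/music
```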