Add a Media Consolidate feature for Single File Format documents.
I'm currently working on a course with hundreds of 4K videos and I have a remote editor doing the editing.
It is challenging having to zip up these very large files (which can take a while), drag them into Google Drive, and wait for them to upload - and then my editor has to drag them out of Google Drive and wait for them to unzip (which again takes a while) before he can work on them. Then he has to repeat the same zipping and dragging process on the way back, for me to review.
It would be great if there were a Media Consolidate feature for Single File Format documents (one which contains all data & media within it).
For example, in Final Cut Pro X there's a Consolidate feature that pulls the media into the document at that point.
With this feature I could massively improve my workflow. I could just:
- Record a video
- Save it straight to Google Drive
- Once uploaded, my editor can open it straight away in Google Drive
- He can save when finished
- I can review his work right within Google Drive, without any zipping, unzipping, dragging around etc.
This would really remove a lot of hassle and improve the workflow.
I hope you'll consider it.
Danny Connell said:
Save it straight to Google Drive
Danny Connell said:
I can review his work right within Google Drive, without any zipping, unzipping, dragging around etc.
The problem is that these services may corrupt the document even if the media remains intact.
This may be one reason why some higher-end NLEs have created their own custom shared-media workflows, with a server dedicated to that kind of work.
Danny Connell said:
I think it’s highly unlikely though. I’ve never had any single file corrupted by Dropbox or Google Drive in over a decade of regular use.
When using ScreenFlow and saving to a cloud drive, or working from a cloud drive, the file can certainly become corrupted - which is why ScreenFlow throws a warning if you attempt to do that.
Could there be a "manifest" file in every packaged project that records what the current state of all files should be (or this could be rolled into one of the other non-video files in the package)? For instance, the size and last-edit date of every asset. Then, when opening the project, the manifest is checked against the actual files to make sure everything is up to date, and if it isn't, the app throws up a huge red flag that editing may cause corruption. Even simpler: create a checksum for every file and save it in the project. If the checksums don't match at open time, sound the alarm for the user.
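To make the idea concrete, here is a minimal sketch of that manifest-and-checksum check in Python. This is purely illustrative - the file name `manifest.json` and both function names are my own invention, not anything ScreenFlow actually does:

```python
# Hypothetical sketch of the "manifest" idea: store a SHA-256 checksum for
# every asset in the project package, then verify before allowing edits.
import hashlib
import json
from pathlib import Path

def build_manifest(project_dir: Path) -> dict:
    """Map each file's relative path to its SHA-256 checksum."""
    manifest = {}
    for f in sorted(project_dir.rglob("*")):
        if f.is_file() and f.name != "manifest.json":
            rel = str(f.relative_to(project_dir))
            manifest[rel] = hashlib.sha256(f.read_bytes()).hexdigest()
    return manifest

def verify_manifest(project_dir: Path) -> list:
    """Return paths that are missing, extra, or changed since the manifest
    was written - any non-empty result means syncing may be incomplete."""
    stored = json.loads((project_dir / "manifest.json").read_text())
    current = build_manifest(project_dir)
    changed = {p for p in stored if p in current and stored[p] != current[p]}
    return sorted(set(stored) ^ set(current) | changed)
```

On save, the app would write `build_manifest()` to the package; on open, a non-empty `verify_manifest()` result would trigger the red-flag warning before any editing is allowed.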
Another thing that might help prevent mayhem with multiple editors is a lock file that is created whenever someone opens a project. As long as that file exists, anyone opening the project elsewhere would get a huge warning that someone might be editing it on another machine - do you really want to edit, etc.? Properly closing the project would then delete the lock file, and the manifest system would still make sure that syncing is complete before another user opens it up.
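The lock-file idea can be sketched just as briefly. Again, the `.editing.lock` name and these functions are hypothetical, not an existing ScreenFlow mechanism:

```python
# Hypothetical lock-file sketch: create a lock on open, warn if one already
# exists, and remove it on a clean close.
import getpass
import socket
from pathlib import Path

def acquire_lock(project_dir: Path) -> bool:
    """Return True if we got the lock; False if someone may be editing."""
    lock = project_dir / ".editing.lock"
    try:
        # "x" mode fails if the file already exists, so the check-and-create
        # step is atomic on a local disk (a synced cloud folder adds lag,
        # so this is a warning system, not a hard guarantee).
        with open(lock, "x") as f:
            f.write(f"{getpass.getuser()}@{socket.gethostname()}")
        return True
    except FileExistsError:
        print(f"Warning: project may be open elsewhere ({lock.read_text()})")
        return False

def release_lock(project_dir: Path) -> None:
    """Delete the lock when the project is closed properly."""
    (project_dir / ".editing.lock").unlink(missing_ok=True)
```

Because cloud sync is not instantaneous, two people opening within the same sync window could both grab the lock - which is why the manifest check would still be needed as a second line of defense.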
Or is there some other way we are worried about data corruption? My only bad experiences have been the two above scenarios, where syncing isn't complete and I try to edit something, or if someone is editing at the same time.
It seems there has to be some way to make this work, rather than just blaming cloud services for being what they are. I can't even save something to the Desktop anymore without risking iCloud chopping it to bits, it seems.
Danny Connell Obviously I've been thinking about this quite a bit (too much?) lately, and I may have a third-party solution to your problem using Hazel. Hazel watches folders you care about, and when conditions of your choosing are met, it will do things to files in those folders (say, zip all screencast documents that have the word "ready_to_edit" in their titles and move them to another folder... even a Google Drive folder!).
You could create rules that recursively watch your [off-Drive] work folder for changed ScreenFlow packages, zip them up, and then move them to a Drive folder (maybe called "to edit"). You can even make it match your folder structure: if you have some base folder called "ScreenFlow Projects" and everything is in there, it can recursively watch all sub-folders, so you don't have to create new rules for every project.
Then your editor could have a rule on their end that watches "to edit" for new zip files, which moves and expands them into an identical file structure on their computer, but off of Drive, for editing. They could have a similar set of rules watching their off-Drive work folders, which takes newly edited ScreenFlow documents (perhaps with "done_editing" in their name), zips them, and moves them to a different Drive folder, called "edited", which syncs them back to you - perhaps even overwriting your original work location.
It'd be a bit tricky to get your rules just right, but once it's set up, you wouldn't have to worry about zipping or unzipping ever again, and neither of you would need to directly edit files that live on Drive. The only downside is that if you want to leave the zip files on Drive, rather than just using it as a vehicle, they'd take up roughly twice as much space, since you'd have two copies.
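For anyone who'd rather script this than use Hazel, the sending-side rule above can be approximated with a few lines of Python. The folder names and the "ready_to_edit" marker just follow the example in this post - adjust to taste:

```python
# Rough script equivalent of the Hazel rule described above: zip any project
# folder whose name contains a marker and drop the archive into a folder
# that a cloud drive syncs. This is a sketch, not a polished tool - it
# re-archives on every run rather than detecting changes the way Hazel does.
import shutil
from pathlib import Path

def sweep(work_root: Path, drive_folder: Path, marker: str = "ready_to_edit"):
    """Archive every marked project folder under work_root into drive_folder."""
    drive_folder.mkdir(parents=True, exist_ok=True)
    archived = []
    for project in sorted(work_root.rglob(f"*{marker}*")):
        if project.is_dir():
            # Writes e.g. "<drive_folder>/lesson1_ready_to_edit.zip"
            shutil.make_archive(str(drive_folder / project.name), "zip", project)
            archived.append(project.name)
    return archived
```

Run on a schedule (cron, launchd), this covers the "zip and move to Drive" half; the editor's side would be the mirror image, expanding new archives out of the synced folder.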
I'm experimenting with this now, and it seems promising.