
CopySafe™ - Don't risk your files

Gunleik
September 8, 2023

The proprietary CopySafe™ technology has been developed from the ground up by the Quine team since 2016.
The combination of three main ideas sets CopySafe™ apart from anything else out there:

  • Unique advanced, referenced, multi-step file security.
  • Unique automation of workflows based on file recognition and in-file metadata.
  • Unique file categorization and end-user application integration.

Unique advanced, referenced, multi-step file security

Security first. 

Checksummed file transactions have been the industry standard for the initial copying of original media to two or more destinations for the last 15+ years. These checksums, together with the required checksum receipts and MHL files, are of course at the heart of any Quine copy operation.
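For readers who want to see what the basic mechanism looks like, here is a minimal Python sketch of a checksummed copy: hash the source, copy it to each destination, re-hash every copy and keep a small receipt. This is a conceptual illustration only (it uses SHA-256 and a plain dict receipt, where real-world MHL workflows typically use hashes like xxHash or MD5 and an XML manifest); it is not the CopySafe™ implementation.

```python
import hashlib
import shutil
from pathlib import Path

def file_hash(path: Path, algo: str = "sha256", chunk: int = 1 << 20) -> str:
    """Stream the file through a hash function without loading it all into memory."""
    h = hashlib.new(algo)
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def checksummed_copy(source: Path, destinations: list[Path]) -> dict:
    """Copy `source` to each destination and verify every copy against the source hash."""
    original = file_hash(source)
    receipt = {"file": source.name, "hash": original, "copies": []}
    for dest_dir in destinations:
        dest_dir.mkdir(parents=True, exist_ok=True)
        dest = dest_dir / source.name
        shutil.copy2(source, dest)
        verified = file_hash(dest) == original   # re-read the copy, don't trust the write
        receipt["copies"].append({"path": str(dest), "verified": verified})
    return receipt

# Example: one camera file copied and verified to two field disks (paths are illustrative)
# checksummed_copy(Path("A001_C001.mxf"), [Path("/Volumes/DiskA"), Path("/Volumes/DiskB")])
```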

This approach was, and to a large extent still is, optimized for local (non-network) copies, where you copy your material once and it remains stationary through the post process after the initial ingest.

Files in modern productions aren’t moved once but multiple times, and it is often subsets or variants of the files that are moved to collaborators. After the initial copy, users essentially end up doing drag-and-drop or sending files away with tools that have no access to the initial checksum, and the record of which variant of a file points back to which exact original gets lost in the transaction.

Or you copy with one tool, and files are distributed from there in a myriad of ways: chat apps, mail-based apps, FTP and/or professional “agent-based” transfer tools. In all these transactions, safe or unsafe:

  • The relation to the original file is technically lost
  • Manual work needs to be done at both ends
  • Automation becomes impossible

A typical scenario is that producers tediously make field copies to multiple disks with checksumming, but when those same files are assembled (more often than not onto a network-based NAS) so that all the material can be sorted into the right folders, all security is thrown overboard through manual drag-and-drop operations to network disks. And from there, it only gets worse.

There are multiple problems with this; the two biggest:

Copying to any kind of network storage has a whole range of extra pitfalls compared to copying to locally attached storage. Routers, switches, cables and, at worst, Wi-Fi connections you don’t control can introduce packet loss and corruption that operating systems are built to tolerate, but that can be destructive for a media file going through a post process. If you then add the complexity of safe cloud transactions to the problem, the security you initially had from that first copy to a single locally connected disk becomes illusory.

With Quine’s CopySafe™ technology, all file transactions going through the Quine tools are validated and, through our databases, referenced back to the checksum from the initial creation of that specific file, whether the transfer happens locally, over a network or through the cloud.

We built a solution that automatically synchronizes and distributes media and files in a consistent, validated and structured form, from the on-set copy to a master project, and for all sharing of files from that master project to later copies, be it to local disks, to the cloud, or from one user to another, locally or through the cloud.

In all these instances, the copies are always validated against the initial copy of that file.

Taking away the need to transfer files in other tools (whether these are FTP-, UDP- or HTTPS-based doesn’t really matter) makes it possible for us to always validate every transaction against an initial reference point and to alert the user if there is an issue or if a transaction needs to be re-run. More often than not we fix transactional problems through automated re-runs, but there are scenarios, like when Wi-Fi or bad hardware is involved, where the user must intervene to fix the problem, whether this happens on the initial copy or on the tenth copy of some file to a VFX house on the other side of the planet.
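As a rough illustration of the “validate every transaction against the initial reference point” idea, the Python sketch below stores the hash recorded at first ingest (a plain dict stands in for the database), re-hashes each later copy against it, and re-runs a failed transfer a couple of times before alerting the user. The function names, the in-memory registry and the retry policy are assumptions made for illustration, not Quine’s actual mechanism.

```python
import hashlib
import shutil
from pathlib import Path

# Stand-in for the database mapping a file's identity to its original ingest hash.
ORIGINAL_HASHES: dict[str, str] = {}

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def register_original(path: Path) -> None:
    """Record the checksum at the moment the file is first created/ingested."""
    ORIGINAL_HASHES[path.name] = sha256(path)

def transfer_with_validation(source: Path, dest: Path, retries: int = 2) -> bool:
    """Copy and validate against the *original* hash, re-running the transfer on mismatch."""
    reference = ORIGINAL_HASHES[source.name]
    for attempt in range(1, retries + 2):
        shutil.copy2(source, dest)
        if sha256(dest) == reference:
            return True
        print(f"Attempt {attempt} failed validation, re-running transfer...")
    # Retries exhausted (think flaky Wi-Fi or failing hardware): hand the problem to the user.
    print(f"ALERT: {source.name} could not be validated against its original checksum.")
    return False
```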

QuineCopy does all of this for free.

Unique automation of workflows based on file recognition and in-file metadata

Beyond the security aspect, we saw a huge opportunity lost in how dumbly most copy tools treat files.

By looking at each file and extracting its metadata into our database, we enable advanced automation right at ingest: deciding what kind of media it is, which shooting day it belongs to and which LUTs (if any) have been used, running file-type-specific workflows, and sorting all media automatically into the right folder in the project structure.
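To make this concrete, here is a small, hypothetical Python sketch of metadata-driven sorting: a probe step fills in a record per clip (camera, shooting day, LUT), and a routing step derives the destination folder in the project structure from that record alone. The field names and folder layout are invented for illustration; Quine’s actual schema is not described here.

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass
class ClipMetadata:
    path: Path
    camera: str             # e.g. "ARRI S35", "Canon R3", "Blackmagic"
    shooting_day: str       # e.g. "Day_03"
    lut: str | None = None  # LUT referenced by the clip, if any

def route(clip: ClipMetadata, project_root: Path) -> Path:
    """Derive the destination purely from the extracted metadata."""
    return project_root / clip.shooting_day / clip.camera.replace(" ", "_") / clip.path.name

# A record as it might come out of the probe step:
clip = ClipMetadata(Path("A003_C012.arri"), "ARRI S35", "Day_03", "scene12_night.cube")
print(route(clip, Path("/Projects/MyFilm/Media")))
# -> /Projects/MyFilm/Media/Day_03/ARRI_S35/A003_C012.arri
```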

An example:
Let’s say you have three cameras on a production that is to be edited in ProRes:

  • An ARRI S35 shooting ARRIRAW with individual LUTs on each clip
  • A Canon R3 shooting Canon RAW in Canon Log 3
  • A Blackmagic broadcast camera shooting ProRes in Rec. 709 at the correct delivery resolution

The ARRI and Canon cameras will need camera-specific copying, transcoding and LUT settings to yield the expected results for editorial, while the Blackmagic files are “edit-ready” and don’t need to be transcoded.

Based on project settings and individual metadata recognition, QuineCopy can get assets from these three cameras queued up for copying and processing, treat each camera “right” individually, and, if needed for dailies and logging, upload the right variant of the asset (either the original or the transcoded proxy) for remote participants, all in a single drag-and-drop operation per camera magazine.
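A hedged sketch of what such per-camera treatment could look like in code: a small rule table decides, per recognized source format, whether a clip is transcoded to the edit codec with its LUT applied or simply passed through. The table and the names in it are illustrative assumptions about the three-camera example above, not QuineCopy’s internal logic.

```python
from dataclasses import dataclass

@dataclass
class IngestRule:
    transcode_to: str | None   # target editorial codec, or None for passthrough
    apply_clip_lut: bool

# Illustrative rules for the three-camera example.
RULES = {
    "ARRIRAW":   IngestRule(transcode_to="ProRes 422", apply_clip_lut=True),
    "Canon RAW": IngestRule(transcode_to="ProRes 422", apply_clip_lut=True),
    "ProRes":    IngestRule(transcode_to=None,         apply_clip_lut=False),  # already edit-ready
}

def plan(source_format: str) -> str:
    """Describe how a clip of the given format would be treated at ingest."""
    rule = RULES[source_format]
    if rule.transcode_to is None:
        return f"{source_format}: copy only (edit-ready)"
    lut = "with the clip's LUT" if rule.apply_clip_lut else "without a LUT"
    return f"{source_format}: copy + transcode to {rule.transcode_to} {lut}"

for fmt in RULES:
    print(plan(fmt))
```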

The database of file-specific metadata comes into play in downstream operations. Doing things right in the first step allows for automation and a correct display of metadata independently of which variant of a file you are working with, whether you look at the file locally, browse QuineCore on the web, or open the file in your editorial application of choice (Premiere, Resolve and Media Composer are currently supported).

Unique file categorization and end-user application integration
“Everything has become simplified through file-based recordings and collaborative tools…”
…is just not true.

In the old days of film scanning, you would scan/digitize all material to, for example, a 10-bit Cineon DPX sequence for the online, while you made an editorial format like DNxHD 36 or ProRes Proxy for the editorial phase, and everyone worked off the same RAID or SAN to finalize the project.

If you shot to tape, you had an uncompressed SDI signal to take through a similar process, or a simple, unified DV signal to ingest directly into an NLE and finish from there.

“The simplicity of file-based recordings” inadvertently exposes everyone to many more workflow challenges than the “old school” workflows above, which were often handled by film or broadcast engineers. The paradox is that things have become more complex, while the typical user is less technically aware and competent.

We embrace the democratization of tools! We love the diversity of opportunities in today’s market. BUT we see that most everyday users are neither aware of the complexities nor equipped to handle them.

Any workflow plan starts with the end deliverable and then works backwards from there. In practice, we see that people only think through the initial collection of material.

And users shouldn’t need to be masters of informatics! They are creators, not imaging technicians, broadcast engineers or codec specialists.

Because of the diversity of recording formats, color spaces and codecs, we now rely even more on offline/online workflows. This means that you transcode all your recorded material to a common edit codec and resolution to simplify the editorial process and to allow edits to be done on “normal” computers rather than high-end online stations with similarly specified storage throughput.

The challenge is that you then quickly end up with several different variants of the same file, used for different parts of the workflow:

  1. An Original
  2. An Edit-proxy
  3. A Dailies/Web-preview
  4. A VFX variant
  5. An Online variant

To a user, these are essentially the “same” file and may even look identical.
For workflow purposes, they are variants of the same source, each used for a different purpose over the production’s lifespan.

For a modern PAM like QuineCore, we need to handle not only the different categories of data (cameras, audio recorders, still images or documents) and the individual sources within each category (a Canon or a Blackmagic, say), but also to track the different variants of these files and serve the right variant, for the right production purpose, to the right user and role in the production.

CopySafe™ extracts all the necessary metadata and categorizes all the variants of a file so that it can automatically serve the right variant, with consistent metadata, to the right users at any time. And if metadata is updated on one variant of a file, we can automatically propagate that update to all other variants of that file.
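As a simplified model of what “tracking variants and keeping their metadata consistent” can mean in practice, the sketch below keys every variant (original, edit proxy, dailies, VFX, online) to the checksum of the original file, and stores the shared metadata in one place so that an update is immediately visible for every variant. This is an assumed data model for illustration, not the QuineCore database.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    original_hash: str                                       # identity = checksum of the original
    variants: dict[str, str] = field(default_factory=dict)   # role -> file path
    metadata: dict[str, str] = field(default_factory=dict)   # shared by all variants

# One source clip with several variants registered against its original checksum.
asset = Asset(original_hash="9f2c...e1")   # placeholder hash
asset.variants = {
    "original": "/master/A001_C001.arri",
    "edit":     "/proxies/A001_C001.mov",
    "dailies":  "/web/A001_C001_h264.mp4",
    "online":   "/online/A001_C001.exr",
}

# Updating metadata once makes the change visible for every variant of that file.
asset.metadata["scene"] = "12"
asset.metadata["circled_take"] = "yes"
for role, path in asset.variants.items():
    print(role, path, asset.metadata)
```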

Together, these three elements provide the unrivalled robustness in metadata consistency, automation and security that CopySafe™ delivers.

Quine has the PhDs in file and metadata management, so you don’t have to.

(CopySafe™ can be licensed on request for implementation in other tools.)

Book a demo with us.

Experience the power of Quine yourself.
