New technology, research, software for better media archiving, Part Two

In PART TWO of her report, Anna has a change of perspective on parallel transfer, reports on error-detection software and fixity, and other topics.

Anna Frick and David Glasser attended the International AES Conference on Audio Archiving, Preservation & Restoration. The June event was held at the National Audiovisual Conservation Center in Culpeper, VA, operated by the Library of Congress. The conference welcomed 170 participants from around the world for three days of show-and-tell on one of our favorite topics: media archiving.

Here is Part Two of Anna’s report:

A case for parallel transfers: why it’s not a terrible idea in some situations.
At Airshow, we pride ourselves on always doing 1-to-1 transfers to ensure the highest-quality transfer, without errors, surprises, or dropouts. However, there are cases where a parallel transfer workflow (in other words, transferring multiple pieces simultaneously to save time and money) is appropriate, and possibly downright essential. We learned how to adapt our foolproof workflow to accommodate this requirement. Contact us to discuss whether this is an option for your collection.

Pardon me, did you drop that sample? Prism’s new-ish error detection software, Verifile.
Prism Sound’s Ian Dennis talked about Verifile, software released earlier this year that checks your audio files and verifies that no samples have been dropped, added, or altered, and that no clicks, pops, channel swaps, or unintended processing have crept into your digital files. It uses a rolling hash hidden in the dither of the file. And it’s free! Find out more from Prism Sound.
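Verifile’s actual scheme is proprietary, so this is only a loose illustration of the core idea: a rolling hash folds every sample into a running value, so any dropped, added, or altered sample changes the result. A minimal sketch (the hash parameters and sample values here are made up for the example):

```python
# Illustrative rolling hash over an audio sample stream (NOT Verifile's
# actual algorithm, which hides its check data in the dither).

def rolling_hash(samples, base=257, mod=(1 << 61) - 1):
    """Fold each sample into a running hash; a single dropped,
    added, or altered sample changes the final value."""
    h = 0
    for s in samples:
        h = (h * base + (s & 0xFFFF)) % mod  # mask to 16-bit sample width
    return h

original = [12, -340, 5071, 88, -2, 19]      # pretend 16-bit samples
copy_ok = list(original)
copy_dropped = original[:3] + original[4:]   # one sample lost in transfer

assert rolling_hash(copy_ok) == rolling_hash(original)
assert rolling_hash(copy_dropped) != rolling_hash(original)
print("intact copy verified; dropped-sample copy flagged")
```

The same comparison catches added or altered samples, which is why a verifier can flag problems without ever seeing the source tape.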

[Photo: tape recorders at the NAVCC]
[Photo: just one of many shelves full of really old stuff at the NAVCC]
iZotope is still my favorite audio restoration program, but now I think I understand why.
iZotope’s Principal DSP engineer Alexey Lukin gave us an illuminating rundown of how their noise reduction tools work, including the use of psychoacoustic masking to retain the musicality and natural sound when removing unwanted noise. He also talked about how iZotope is utilizing machine learning and neural nets to further enhance their toolkit, giving us all better tools to use in our restoration practices.
I’ve avoided understanding XML completely. But that needs to change.
With all the talk about best practices for digitization and file management, the only thing left to discuss was access. To be able to accurately and reliably access a collection, it needs to be documented and searchable. XML manifests make this possible for many platforms.
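To make the idea concrete, here is a minimal, hypothetical manifest entry built with Python’s standard library. The element and attribute names are illustrative only, not any formal archival schema:

```python
# Hypothetical XML manifest for one digitized item; field names are
# made up for illustration, not taken from a real metadata standard.
import hashlib
import xml.etree.ElementTree as ET

manifest = ET.Element("manifest", collection="Example Oral Histories")
item = ET.SubElement(manifest, "item", id="tape-0042")
ET.SubElement(item, "title").text = "Interview, reel 1 of 2"
ET.SubElement(item, "file").text = "tape-0042.wav"

# Store a checksum alongside the descriptive metadata so later fixity
# checks have something to compare against (digest of stand-in bytes).
digest = hashlib.md5(b"pretend audio payload").hexdigest()
ET.SubElement(item, "checksum", algorithm="MD5").text = digest

xml_text = ET.tostring(manifest, encoding="unicode")
print(xml_text)
```

Because the manifest is plain, structured text, any platform that can parse XML can search it, which is what makes reliable access possible.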
“Trust but verify” is the mantra.
Harvard Libraries says a full fixity check of their archives can take as long as 45 days to complete. Ideally we would verify the fixity of our files daily, but that’s just not practical, so there were many discussions on how to keep our digital files safe and error-free over time. It boils down to this: accurate transfer > quality check (human and/or automated) > duplication of files in multiple locations > routine block-by-block verification of those files, plus file-by-file verification as time permits. The early steps are well served by existing transfer and file-sharing software; it’s the verification processes that need to be sped up while staying accurate.
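The verification step in that pipeline can be sketched in a few lines: record a digest per file at ingest, then recompute and compare on each later pass. This is a generic illustration (using SHA-256 and stand-in bytes), not any particular institution’s tooling:

```python
# Minimal fixity-check sketch: store a digest at ingest, re-verify later.
import hashlib
import os
import tempfile

def sha256_of(path, block_size=1 << 20):
    """Stream the file block by block so huge masters fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            h.update(block)
    return h.hexdigest()

# Demo: write a stand-in "master", record its digest, then detect a bit flip.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "master.wav")
    with open(path, "wb") as f:
        f.write(b"pretend audio payload")
    recorded = sha256_of(path)            # stored at ingest time

    assert sha256_of(path) == recorded    # routine check passes

    with open(path, "r+b") as f:          # simulate silent corruption
        f.seek(3)
        f.write(b"\x00")
    assert sha256_of(path) != recorded    # fixity check catches it

print("fixity mismatch detected after corruption")
```

The expensive part is the repeated re-reading of every byte, which is exactly why a 45-day full pass is plausible at archive scale and why routine checks are often done block by block instead.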


You’ll find Part One of Anna’s report here.

Want to discuss more, share your thoughts or chat about archiving? Email Anna here.