Update Case_study.md
Added network contention info to the introduction section
stephenmcconnachie authored Apr 15, 2024
1 parent 291e0e3 commit 224a7dd
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions Doc/Case_study.md
@@ -2,6 +2,8 @@
**by Joanna White, Knowledge & Collections Developer**

At the [BFI National Archive](https://www.bfi.org.uk/bfi-national-archive) we have been encoding DPX sequences to FFV1 Matroska since late 2019. In that time our RAWcooked workflow has evolved alongside the development of RAWcooked itself, changes in DPX resolutions and flavours, and shifts in our encoding project priorities. Today we have a fairly hands-off automated workflow which handles 2K and 4K image sequences. This workflow is built on some of the flags developed in RAWcooked by Media Area and is written in a mix of Bash shell and Python3 scripts ([BFI Data & Digital Preservation GitHub](https://github.com/bfidatadigipres/dpx_encoding)). In addition to RAWcooked we use other Media Area tools to complete necessary stages of this workflow. Our encoding processes do not include any alpha channel or audio file processing, but RAWcooked is capable of encoding both into the completed FFV1 Matroska.
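
For readers new to the tool, the following is a minimal sketch of the kind of RAWcooked invocation a workflow like this wraps. The paths are hypothetical and the options shown are illustrative rather than the exact flags used in the BFI scripts.

```bash
# Minimal sketch, assuming a DPX image sequence folder on NAS storage
# (paths are hypothetical). RAWcooked encodes the sequence losslessly
# to FFV1 video in a Matroska container.
rawcooked /mnt/nas01/dpx/N_123456_01of01/ -o /mnt/nas01/mkv/N_123456_01of01.mkv

# Running RAWcooked on the resulting Matroska restores the original
# DPX sequence, demonstrating the reversibility of the encoding.
rawcooked /mnt/nas01/mkv/N_123456_01of01.mkv
```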

A note on our RAWcooked performance in the context of the operational network: this case study covers DPX sequence processing within the operational network, where we run dozens of Windows and Mac workstations and a similar number of Linux servers, addressing 20 nodes of NAS storage, with all of these devices connected via a mixture of 10Gbps, 25Gbps and 40Gbps links. Data flows to the network storage from automated off-air television recording (over 20 channels), born-digital acquisition of cinema and streaming platform content, videotape and film digitisation workflows, and media restored from our data tape libraries. Data also flows from the network storage to the data tape libraries as we ingest at high volume. We are in the process of upgrading the network to use only 100Gbps switches, with higher-than-10Gbps cards on all critical devices. In the meantime, the throughput we achieve is constrained by very heavy concurrent use of the network for many high-bitrate audiovisual data workflows, and this network contention impacts the speed of our RAWcooked workflows.

This case study is broken into the following sections:
* [Server configurations](#server_config)
