Formats, Codecs, & Workflow

02.07

The anatomy of a video file

  • A container type: AVI and QuickTime MOV are two examples of container types.
  • The video signal: The actual video data, often compressed.
  • The audio signal: The actual audio data, often compressed.
  • A codec (COmpressor/DECompressor): the program used to encode and decode the video (and audio) signal(s).

The characteristics of a video signal

  • Frame size: This is the pixel dimension of the frame (1920×1080 = full HD, 1280×720, 640×480, etc.)
  • The aspect ratio: This is the ratio of width to height. Typical aspect ratios are 4:3, 16:9 (the most common), 21:9 (cinema), and 1:1 (Instagram).
  • Frame rate: This is the speed at which the frames are played back. It’s best to maintain the video’s native frame rate when compressing your video. If there is an option for keyframes, choose the same value you used for the frame rate. For streaming, and in most cases, choose “constant” frame rate instead of “variable” frame rate.
  • Bitrate: The bitrate (or data rate) is the amount of data per second used to describe the audio or video portion of the file, typically measured in kilobits per second (kbps) or megabits per second (Mbps). In general, the higher the bitrate, the better the quality. Most software that can edit or transcode video lets you set the bitrate for the new file. Sometimes this setting specifies a constant bitrate (this is how video is typically recorded by the camera). Sometimes the bitrate can be variable, with a “target” for the software to aim at and a maximum allowable rate for the stream; this can produce files that are better optimized for delivery via the Internet or optical disc.
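
The bitrate-to-file-size relationship is simple arithmetic: size = bitrate × duration. A quick sketch in plain Python (the function name is just for illustration):

```python
def estimated_file_size_mb(video_kbps, audio_kbps, duration_s):
    """Estimate file size in megabytes from stream bitrates.

    Bitrates are in kilobits per second; 8 bits per byte,
    1000 kilobits per megabyte-worth of kilobytes (decimal units).
    """
    total_kilobits = (video_kbps + audio_kbps) * duration_s
    return total_kilobits / 8 / 1000  # kilobits -> kilobytes -> megabytes

# A 10-minute clip at 5 Mbps video + 192 kbps audio:
print(round(estimated_file_size_mb(5000, 192, 10 * 60), 1))  # 389.4 (MB)
```

This is also why variable bitrate helps: quiet, static scenes can spend fewer bits, leaving room under the same total size for busy scenes.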

The characteristics of an audio signal

  • The audio sample rate: how many times per second the analog signal is sampled when it is digitized (48 kHz is standard for video; 44.1 kHz for CDs)
  • The bit depth: the resolution of each sample (16-bit unless otherwise specified)
  • The bitrate: the amount of data per second in the audio stream
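
For uncompressed (PCM) audio, these three characteristics are directly related: bitrate = sample rate × bit depth × number of channels. A quick sketch:

```python
def pcm_bitrate_kbps(sample_rate_hz, bit_depth, channels):
    """Bitrate of uncompressed PCM audio in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

# CD-quality stereo: 44.1 kHz sample rate, 16-bit depth, 2 channels
print(pcm_bitrate_kbps(44_100, 16, 2))  # 1411.2 kbps
```

Compressed formats like AAC break this relationship: a 192 kbps AAC stream still describes 48 kHz / 16-bit audio, just lossily.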

Lossy vs Lossless Compression

  • Lossless: Lossless compression makes a file smaller, and the decoded file is identical to the original after decompression. The trade-off is a low compression ratio, usually no more than about 2:1. (The Animation codec is an example of a lossless compression algorithm.)
  • Lossy: Lossy compression means you lose some image, video, or audio information. It’s a balancing act between quality, data rate/file size, color depth, and fluid motion. (H.264 is an example of a lossy compression algorithm that can be used for both high-quality and small, heavily compressed renders.)
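
You can see both properties of lossless compression with a general-purpose lossless compressor. Here zlib (from Python’s standard library) stands in for a lossless video codec: redundant “flat” data compresses well, noisy data barely compresses at all, and in both cases decompression returns the exact original bytes:

```python
import os
import zlib

flat = bytes(1000) * 100      # highly redundant data, like a flat-color frame
noisy = os.urandom(100_000)   # random data, like pure noise

for name, data in [("flat", flat), ("noisy", noisy)]:
    packed = zlib.compress(data)
    assert zlib.decompress(packed) == data       # lossless: exact round trip
    print(name, len(data) / len(packed))         # compression ratio
```

The flat data compresses by orders of magnitude; the noisy data ends up about the same size (or slightly larger). Lossy codecs escape this limit by throwing information away.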

For the most part, we will deal with lossy compression algorithms. Lossless compression might be used when outputting something to bring back into your project, but if you manage your files well, don’t bother with a lossless render: it takes up a lot of space, and if you do need to change things down the road, you can just reopen the project file and re-render.

Intraframe vs Interframe compression

  • Intraframe: stores complete information for each frame (better for editing). Examples: ProRes, DNxHD.
  • Interframe: stores complete information only for keyframes, plus the changes between them (better for distribution). Examples: MPEG-2, H.264.
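
A toy sketch of the difference, using short lists of pixel values in place of real frames: intraframe storage keeps every frame whole, while interframe storage keeps a keyframe plus only the pixels that changed since the previous frame:

```python
frames = [
    [10, 10, 10, 10],
    [10, 10, 99, 10],   # one pixel changes
    [10, 10, 99, 11],   # one more pixel changes
]

# Intraframe: every frame stored in full (any frame is instantly available)
intra = [list(f) for f in frames]

# Interframe: a keyframe, then per-frame diffs (much smaller, but a later
# frame can only be rebuilt by applying every diff since the keyframe)
keyframe = list(frames[0])
diffs = []
prev = frames[0]
for frame in frames[1:]:
    diffs.append({i: v for i, (p, v) in enumerate(zip(prev, frame)) if p != v})
    prev = frame

# Rebuild the last frame from the keyframe plus all diffs
rebuilt = list(keyframe)
for diff in diffs:
    for i, v in diff.items():
        rebuilt[i] = v
print(rebuilt)  # [10, 10, 99, 11]
```

That rebuild step is why interframe codecs are awkward for editing: scrubbing to an arbitrary frame means decoding from the nearest keyframe forward.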

Common codecs:

  • DivX: a commercial codec based on MPEG-4 Part 2
  • MPEG-2: the codec used on DVDs
  • H.264: roughly twice as efficient as MPEG-2, and the one we will talk about most

Containers: kinda just folders with some info about their contents

  • MOV (oh hi, QuickTime)
  • Flash (.flv, .swf) is a dinosaur
  • MKV – future-proof, supports nearly all a/v formats, good subtitle support
  • MP4 – recommended for the web and the preferred format for YouTube and Vimeo (H.264 or MPEG-4 video, AAC or AC3 audio)… use this one!
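
As a practical sketch of that recommendation (assuming you have ffmpeg available; the filenames and bitrates are placeholders), a typical MOV-to-web-MP4 transcode uses H.264 video and AAC audio. This snippet just assembles the command rather than running it:

```python
def web_mp4_command(src, dst, video_kbps=5000, audio_kbps=192):
    """Build an ffmpeg command for an H.264/AAC MP4 suitable for the web."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",            # H.264 video
        "-b:v", f"{video_kbps}k",     # target video bitrate
        "-c:a", "aac",                # AAC audio
        "-b:a", f"{audio_kbps}k",     # audio bitrate
        "-movflags", "+faststart",    # move metadata up front for streaming
        dst,
    ]

print(" ".join(web_mp4_command("master.mov", "web.mp4")))
```

The `+faststart` flag matters for the web: it relocates the container’s index to the start of the file so playback can begin before the download finishes.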