Video Bit Rates Explained

Just a short article to clarify something that is often misunderstood: bit rates.

Bit rates are a way of telling you how much video information is actually being generated and stored, much like a compression ratio might for still photography (e.g. JPEG Fine stores more data than JPEG Basic). 

So generally speaking, larger numbers for video bit rates are better. Where the confusion comes in has to do with video formats. You can't directly compare bit rates across 8K, 4K, or 1080P (FullHD) without doing some math. That's because the larger formats have more data associated with them in the first place:

  • 8K = 7680 x 4320, or 33.2mp
  • 4K = 3840 x 2160, or 8.3mp
  • FullHD = 1920 x 1080, or 2.1mp

Let's leave RGB out of it for the moment, and consider that each format is recording Bayer information as raw data. Let's also assume 10-bit data for the moment, as it makes the math clearer:

  • 8K = roughly 332,000,000 bits required per frame
  • 4K = roughly 83,000,000 bits required per frame
  • FullHD = roughly 21,000,000 bits required per frame
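The per-frame figures above can be reproduced with a few lines of arithmetic (a sketch; the dictionary layout and rounding are mine, not anything a camera reports):

```python
# Per-frame data sizes for common video formats, assuming one 10-bit
# Bayer (raw) sample per photosite, as in the text above.
BIT_DEPTH = 10

formats = {
    "8K":     (7680, 4320),
    "4K":     (3840, 2160),
    "FullHD": (1920, 1080),
}

for name, (w, h) in formats.items():
    pixels = w * h
    bits_per_frame = pixels * BIT_DEPTH
    print(f"{name:7s} {pixels / 1e6:4.1f}mp  "
          f"{bits_per_frame / 1e6:6.1f} million bits/frame")
```

Run it and you'll see the 4x steps directly: each jump in format multiplies both the pixel count and the per-frame bits by four.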

For bit rates to be directly compared, you basically need to multiply by 4 each time you step up to the next larger video format. If you had the same bit rate for 4K and FullHD, for example, the 4K system would have to be throwing away three-quarters of the data. In other words, a bit rate of 200Mbps for 4K is about the same level of compression as a bit rate of 50Mbps for FullHD. I don't use those numbers arbitrarily, by the way: 50Mbps FullHD is generally considered the lowest acceptable rate for broadcast use (partly because broadcasting and distributing video tends to involve downstream compression that further reduces the data).
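You can check that 200Mbps 4K and 50Mbps FullHD are equivalent by computing the compression ratio each implies (a sketch; the 30fps frame rate, the 10-bit Bayer assumption, and the helper name are mine):

```python
# Effective compression ratio: raw 10-bit Bayer data rate divided by
# the recorded bit rate. FPS and BIT_DEPTH are assumptions for the sketch.
FPS = 30
BIT_DEPTH = 10

def compression_ratio(width, height, bitrate_mbps):
    """How many raw bits get squeezed into each recorded bit."""
    raw_bits_per_second = width * height * BIT_DEPTH * FPS
    return raw_bits_per_second / (bitrate_mbps * 1e6)

print(f"4K @ 200Mbps:    {compression_ratio(3840, 2160, 200):.1f}:1")
print(f"FullHD @ 50Mbps: {compression_ratio(1920, 1080, 50):.1f}:1")
```

Both work out to the same ratio, because the 4x increase in pixels is exactly matched by the 4x increase in bit rate.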

It gets much more complicated than just that simple number comparison, as the type of compression being used can have drastic impacts on comparing bit rates. For example, H.264 is an older compression method that's not as efficient as H.265 (all else equal). Moreover, things like lens sharpness, AA filters, diffraction, et al., also have impacts on detail, and the amount of detail in the frame affects how much data needs to be recorded, too.

I've seen a lot of new videographers say something like the following: "My old FullHD camera only had a 50Mbps bit rate and my new 4K camera gets 100Mbps, so it must be better." Nope. Something is taking away half the "equivalent" data in the 4K system, even though the number is higher. 
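The divide-by-4 rule makes the new videographer's mistake easy to see in numbers (the helper name is mine, purely for illustration):

```python
# Convert a 4K bit rate to its FullHD "equivalent" by dividing by 4,
# since 4K carries four times the pixels per frame.
def fullhd_equivalent_mbps(bitrate_4k_mbps):
    return bitrate_4k_mbps / 4

print(fullhd_equivalent_mbps(100))  # 25.0 -- half the 50Mbps broadcast floor
```

So the "better" 100Mbps 4K camera is effectively recording at a 25Mbps-FullHD level of compression.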

This, by the way, is one reason why CFExpress is clearly going to gain ground (at least until someone actually produces an SDExpress card and reader that a camera maker decides to use): as video formats go from FullHD to 4K to 8K, the bandwidth needed multiplies 4x each time, and it goes higher still if you want 4:2:2 or better, or uncompressed raw. It's why the Canon R5 has CFExpress and not CFast, which really wouldn't be up to the job of recording 8K raw. Ditto the Nikon Z8/Z9.
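The card-bandwidth point falls out of the same arithmetic. Here's a sketch of the sustained write speed uncompressed 10-bit Bayer raw would need (the 30fps frame rate and bit depth are my assumptions, not any camera's spec):

```python
# Sustained write bandwidth required for uncompressed 10-bit Bayer raw
# at 30fps, in megabytes per second (8 bits per byte).
FPS = 30
BIT_DEPTH = 10

def raw_mb_per_second(width, height):
    return width * height * BIT_DEPTH * FPS / 8 / 1e6

print(f"FullHD raw: {raw_mb_per_second(1920, 1080):6.0f} MB/s")
print(f"4K raw:     {raw_mb_per_second(3840, 2160):6.0f} MB/s")
print(f"8K raw:     {raw_mb_per_second(7680, 4320):6.0f} MB/s")
```

8K raw comes out to well over 1GB/s before any compression, which is beyond what CFast's SATA-based interface can sustain but within CFExpress's PCIe bandwidth.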

sansmirror: all text and original images © 2025 Thom Hogan — portions Copyright 1999-2024 Thom Hogan
All Rights Reserved — the contents of this site, including but not limited to its text, illustrations, and concepts, 
may not be utilized, directly or indirectly, to inform, train, or improve any artificial intelligence program or system.