

To understand the above chart in context, we need to go deeper. What all digital connections - DisplayPort, HDMI and even DVI-D - end up coming down to is the required bandwidth. Every pixel on your display has three components: red, green and blue (RGB) - alternatively, luma, blue chroma difference and red chroma difference (YCbCr/YPbPr) can be used. Whatever your GPU renders internally (typically 16-bit floating point RGBA, where A is the alpha/transparency information) gets converted into a signal for your display. The standard in the past has been 24-bit color, or 8 bits each for the red, green and blue color components. HDR and high color depth displays have bumped that to 10-bit color, with 12-bit and 16-bit options as well, though the latter two are mostly in the professional space right now.
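
As a rough illustration of how resolution, refresh rate and color depth combine, here's a minimal Python sketch (a back-of-the-envelope helper of our own, not something taken from a spec) that estimates the raw data rate a given mode needs. It counts only the active pixels and ignores blanking intervals, so real cable requirements come out somewhat higher.

```python
# Back-of-the-envelope estimate of the bandwidth a display mode needs.
# Note: this counts only active pixels; real signals also carry blanking
# intervals, so actual requirements are somewhat higher than this figure.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_channel=8, channels=3):
    """Raw, uncompressed data rate for the active pixels, in Gbit/s."""
    bits_per_pixel = bits_per_channel * channels  # e.g. 8 bpc x RGB = 24-bit color
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 144 Hz with 10-bit (HDR) color:
print(f"{video_data_rate_gbps(3840, 2160, 144, bits_per_channel=10):.1f} Gbit/s")  # ~35.8
```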

Note that there are two bandwidth columns: transmission rate and data rate. The DisplayPort and HDMI digital signals use some form of bit rate encoding - 8b/10b for most of the older standards, 16b/18b for HDMI 2.1, and 128b/132b for DisplayPort 2.0. 8b/10b encoding, for example, means that for every 8 bits of data, 10 bits are actually transmitted, with the extra bits used to help maintain signal integrity (e.g., by ensuring zero DC bias). That means only 80% of the theoretical bandwidth is actually available for data use with 8b/10b. 16b/18b encoding improves that to 88.9% efficiency, while 128b/132b encoding yields 97% efficiency. There are still other considerations, like the auxiliary channel on HDMI, but that's not a major factor.
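
To make that overhead concrete, here's a short sketch in the same vein that converts a link's transmission rate into the data rate actually available for video. The transmission rates in the example calls are the commonly quoted four-lane maximums for each standard and are meant as illustrations rather than an exhaustive list.

```python
# How encoding overhead turns a link's transmission rate into usable data rate.

ENCODING_EFFICIENCY = {
    "8b/10b": 8 / 10,        # 80%   - DisplayPort 1.x, HDMI 2.0 and earlier
    "16b/18b": 16 / 18,      # 88.9% - HDMI 2.1
    "128b/132b": 128 / 132,  # ~97%  - DisplayPort 2.0
}

def usable_data_rate_gbps(transmission_rate_gbps, encoding):
    """Data rate left over once the encoding overhead is stripped out."""
    return transmission_rate_gbps * ENCODING_EFFICIENCY[encoding]

print(usable_data_rate_gbps(32.4, "8b/10b"))     # DisplayPort 1.4 (HBR3 x4): ~25.9 Gbit/s
print(usable_data_rate_gbps(48.0, "16b/18b"))    # HDMI 2.1: ~42.7 Gbit/s
print(usable_data_rate_gbps(80.0, "128b/132b"))  # DisplayPort 2.0 (UHBR20 x4): ~77.6 Gbit/s
```

Putting the two sketches together: the 4K, 144 Hz, 10-bit mode above needs roughly 35.8 Gbit/s before blanking overhead, which fits within HDMI 2.1's ~42.7 Gbit/s or DisplayPort 2.0's ~77.6 Gbit/s of data rate, but exceeds DisplayPort 1.4's ~25.9 Gbit/s.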
