Advanced Digital Video (Part 1)

Site: QSC
Course: Q-SYS Quantum Level 1 Training (Online)
Book: Advanced Digital Video (Part 1)

Video Transcript

00:07
Welcome everyone to “Introduction to Digital Video Concepts”, as part of our QSC Quantum Training,
00:13
an advanced service and troubleshooting curriculum.
00:16
My name is Patrick Heyn and I’ll be giving you a brief introduction
00:19
to some of our more advanced video topics as they relate to NV Series video endpoints.
00:25
If this is your first experience with Q-SYS Video,
00:27
we highly recommend that you start with our Q-SYS Video 101 online training.
00:33
This should give you a good baseline on the NV Series’ functionality before we dive into deep-geek stuff.
00:41
Speaking of which, let’s get started with a discussion of video color, shall we?
00:46
RGB, or the Red, Green and Blue color model, is the most commonly used today
00:52
because it most closely models the human eye
00:55
and it also best represents the actual pixel data on video display devices.
01:01
The human eye contains rods, which detect light intensity, and cones, which perceive color.
01:08
Each pixel contains subpixels,
01:11
usually RGB elements, which are controlled to display a desired color and light intensity.
01:17
The RGB color model can be mathematically mapped and transformed into something called a color space.
01:23
A color space defines the maximum and all allowable colors on a given device.
01:29
The complete subset of colors within a color space is defined as its color gamut.
01:34
Many of these standards you see on the left here define more than just the mapping function,
01:39
color space and color model, as they can also define dynamic range, resolutions, frame rates, bit depths, etc.
01:47
Speaking of which, the color bit depth is the number of bits that define a single color of a pixel.
01:54
The color precision of a video signal can be defined as bits per pixel or bits per color.
02:00
With 8-, 10- or 12-bit color, the value refers to bits per color, not per pixel.
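As a quick worked illustration of that arithmetic, here is a minimal Python sketch (the loop and variable names are ours, purely for illustration):

```python
# Bits per color vs. bits per pixel for a three-component RGB pixel (no alpha).
for bits_per_color in (8, 10, 12):
    bits_per_pixel = bits_per_color * 3      # one value each for R, G and B
    total_colors = 2 ** bits_per_pixel       # number of representable colors
    print(f"{bits_per_color}-bit color -> {bits_per_pixel} bpp, {total_colors:,} colors")
# 8-bit color -> 24 bpp, 16,777,216 colors
```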
02:06
The complete range (0 – 255 for 8-bit color) is called Full range color, and 16 – 235 is called Limited range color.
02:13
Typically SMPTE/CE rates support a Limited color range and IT/VESA rates support a Full color range.
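For 8-bit video, the mapping between the two ranges is a simple linear scale. Here is a minimal sketch, assuming standard 8-bit quantization (the function name and rounding choice are illustrative, not from the course):

```python
def full_to_limited(value: int) -> int:
    """Map an 8-bit full-range value (0-255) into limited range (16-235).

    Limited range reserves the codes below 16 and above 235 as
    footroom/headroom, so the usable span is 219 steps wide.
    """
    return round(16 + (219 * value) / 255)

assert full_to_limited(0) == 16      # full-range black -> limited-range black
assert full_to_limited(255) == 235   # full-range white -> limited-range white
```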
02:21
Now don’t get color space confused with color encoding.
02:24
Color encoding systems are defined as sets of rules that are used to encode RGB data
02:30
into an easier-to-use form that can be transmitted using less bandwidth.
02:35
Using the YCbCr encoding system, the Luma (Y) is the brightness portion of the image,
02:41
sometimes referred to as the black and white or monochromatic portion of the image.
02:46
Chroma (C) is the color portion of the image.
02:49
Transforming an RGB signal into the luma and chrominance components has many useful properties.
02:56
RGB signals carry redundant data,
02:58
one reason being that luminance is carried in all three RGB components.
03:03
The transformation to YCbCr is performed via a mathematical transform
03:08
that is fully recoverable when transforming back to RGB.
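As an illustration of that round trip, here is a minimal sketch using BT.709 luma coefficients (an assumption; the course does not name a specific standard, and the function names are ours). It works on normalized floating-point values, where the transform is exactly invertible; real integer pixel data would pick up small rounding errors:

```python
KR, KB = 0.2126, 0.0722      # BT.709 red and blue luma weights (assumed)
KG = 1.0 - KR - KB           # green weight follows from the other two

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b     # luma: weighted sum of R, G and B
    cb = (b - y) / (2 * (1 - KB))    # blue-difference chroma
    cr = (r - y) / (2 * (1 - KR))    # red-difference chroma
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG   # recover green from the luma equation
    return r, g, b

# Round-trip check: transforming to YCbCr and back recovers the RGB values.
rgb = (0.25, 0.50, 0.75)
back = ycbcr_to_rgb(*rgb_to_ycbcr(*rgb))
assert all(abs(x - y) < 1e-12 for x, y in zip(rgb, back))
```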
03:13
Digital subsampling is frequently used when transmitting and/or saving digital data.
03:18
The human visual system is much more sensitive to brightness than it is to color.
03:23
When a video signal is separated into the luma and chrominance components,
03:27
the luma portion can be transmitted at full resolution.
03:31
However, the chrominance portion can be subsampled to reduce bandwidth requirements.
03:37
This image shows that the chroma was significantly compressed but the luma remains uncompressed.
03:42
The subsampling process is expressed as a three-part ratio (four parts if an alpha channel is included).
03:48
These subsampling ratios are used to create a conceptual bitmap J pixels wide and 2 pixels high.
03:56
Ratios use arbitrary variables J:a:b which are:
04:02
J = the sample space for the conceptual bitmap
04:06
a = the number of unique samples in the first row
04:10
b = the number of unique samples between the first and second rows
04:15
The resultant bitmap is not an actual representation of how many pixels are displayed on a device.
04:22
Instead, this just gives us a representation of how many color samples are transmitted per pixel
04:28
and can be used to calculate the amount of bandwidth reduction.
04:31
For example, 4:4:4 means 4 samples:
04:36
4 unique samples in the first row and 4 unique samples between the first and second rows.
04:42
4:2:2 means you still have 4 samples:
04:46
2 unique samples in the first row and 2 unique samples between the first and second rows.
04:51
But this does provide a bandwidth reduction of 1/3.
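That reduction falls straight out of the J:a:b arithmetic described earlier. Here is a minimal sketch (the helper is hypothetical; it assumes three components, Y plus Cb and Cr, with no alpha):

```python
def bandwidth_fraction(j: int, a: int, b: int) -> float:
    """Fraction of full 4:4:4 bandwidth used by a J:a:b scheme.

    The conceptual bitmap is J pixels wide and 2 rows high: 2*j luma
    samples, plus (a + b) samples for each of the two chroma components.
    """
    full = 3 * 2 * j                  # three components at full resolution
    used = 2 * j + 2 * (a + b)        # luma plus subsampled Cb and Cr
    return used / full

print(bandwidth_fraction(4, 4, 4))    # 1.0    -> no reduction
print(bandwidth_fraction(4, 2, 2))    # 0.666… -> the 1/3 reduction above
```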
04:55
Here are the supported color formats: RGB, which is always 4:4:4, and YCbCr 4:4:4 and 4:2:2.
05:04
And that's pretty much the baseline vocabulary on color.
05:08
Let’s take a quick break now, and we’ll dive into some ancillary video data when you get back.