The High Speed Video IC Architecture


Analog and Digital Video Data

When video systems moved from grayscale to color, the bandwidth requirement increased threefold to accommodate the red, green and blue signals. Alternative methods of transmitting the color picture within the original bandwidth were therefore explored. This gave rise to the composite video signal, which is still used in the NTSC, PAL and SECAM video standards. Many similar representations of video data have evolved over the years, all of them mathematically related to RGB.
Other standards used in the consumer video industry include S-Video and YPbPr in the analog domain, and YCbCr and RGB as digitized versions of YPbPr and RGB.
Various analog and digital audio/video interfaces exist. The list below gives a few popular digital and analog interfaces in decreasing order of video quality [6]:
• HDMI (digital YCbCr)
• DVI (digital RGB)
• Analog YPbPr
• Analog RGB
• Analog S-Video
• Analog Composite

Color Spaces

A color space is a mathematical representation of a set of colors in terms of three or more coordinates. The three primary colors are the familiar red, green and blue (RGB), which are mixed to form any desired color. R′G′B′ is a nonlinear, mathematically manipulated form of RGB, popularly known as gamma-corrected color, adopted instead of the true linear RGB values. To save bandwidth, cost and processing power, R′G′B′ is manipulated yet further to derive several other forms of video signals, some of which are described below.
Color systems store the image pixels directly in a color intensity format, giving direct access to each pixel and expediting the pixel-refresh process. This is why most video standards use the luma and the two color-difference signals.
Brightness, or luminance (Y), derived from the linear RGB of a pixel, has a nonlinear gamma-corrected variant called luma (Y′), while chroma (C′) consists of the color, hue and saturation information and is used instead of the chrominance signal (C). Fig. 1.3 indicates a very simplified RGB signal flow, showing how the different video interfaces are obtained from the initial RGB signals.
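As a concrete illustration of how luma and the color-difference components are derived from gamma-corrected R′G′B′, the sketch below applies the widely used ITU-R BT.601 conversion. The exact matrix used in this signal chain is not stated here, so BT.601 is an assumption for illustration only.

```python
def rgb_to_ycbcr(r, g, b):
    """Convert 8-bit gamma-corrected R'G'B' (0-255) to Y'CbCr.

    Uses the ITU-R BT.601 coefficients; Y' spans the nominal
    studio range 16-235 and Cb/Cr span 16-240.
    """
    y  =  16 + ( 65.481 * r + 128.553 * g +  24.966 * b) / 255
    cb = 128 + (-37.797 * r -  74.203 * g + 112.0   * b) / 255
    cr = 128 + (112.0   * r -  93.786 * g -  18.214 * b) / 255
    return round(y), round(cb), round(cr)

# White (255, 255, 255) maps to (235, 128, 128); black to (16, 128, 128).
```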

Analog Video Transmission

The original NTSC and PAL systems were single-wire transmission systems, commonly called composite video baseband signal (CVBS). Their bandwidth was limited to less than 6 MHz, with voltage amplitudes between −40 IRE (−286 mV NTSC/−300 mV PAL) and +100 IRE (714 mV NTSC/700 mV PAL), with slight variations between standards.
The IRE, named after the Institute of Radio Engineers, is a unit of measurement for composite analog video waveforms. Since the amplitude of the active video signal at any instant measures a pixel's brightness, IRE values are quantified as a percentage, running from the blanking level up to the reference white level. In NTSC systems the reference white level is 100 IRE, which equals 714 mV.
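The NTSC IRE-to-voltage relation can be captured in a one-line helper, using the 714 mV/100 IRE scaling quoted above:

```python
# NTSC scaling: 100 IRE corresponds to the 714 mV reference white level.
MV_PER_IRE_NTSC = 714 / 100  # 7.14 mV per IRE

def ire_to_mv(ire):
    """Convert an NTSC IRE value to millivolts relative to blanking level."""
    return ire * MV_PER_IRE_NTSC

# The -40 IRE sync tip lands near -286 mV, matching the NTSC figure above.
```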
The CVBS signal combines the luma and chroma signals in the same frequency spectrum, making it difficult to separate them without some picture distortion. That is why S-Video kept them as separate signals while maintaining a bandwidth similar to CVBS; as a consequence, two wires had to be used.
Component video, an improvement upon S-Video, eliminated the need to modulate the chroma signal, reduced errors and introduced the color-difference signals instead. The resulting component video signal uses the luma (Y′), the blue color difference (P′b) and the red color difference (P′r). The two color-difference signals are carried on separate wires, giving rise to a three-wire interface. The digital-domain counterpart is referred to as Y′C′bC′r.
Component analog video further comes in different formats: standard definition (SD), enhanced definition (ED) and high definition (HD).
HD video
High definition video typically includes 720p, 1080i and 1080p. The luma signal in 720p and 1080i is band-limited to 30 MHz and the color-difference signals to 15 MHz, while 1080p allows 60 MHz for luma and 30 MHz for the color differences. The bandwidth and sync widths of course vary with the frame rates and sampling rates of these standards. The luma channel in HD video requires 1 Vpp, while the color-difference channels require 700 mVpp. Due to the tri-level syncs in HD signals, and the faster rates, the sync width can be as short as 0.15 µs in 1080p. Refer to Table 1.2 for a comparison of specifications for various video formats.
Sync information
Video formats present the sync information in varying forms: sometimes as a separate signal, sometimes on only one of the streams (e.g., SONY keeps it on green, hence the name Sync-On-Green (SOG™)). Some interfaces, such as the component video interface, embed it in the luma signal, which is then referred to as Sync-On-Luma (SOY).
Y′C′BC′R Color Space
Y′C′bC′r is a scaled and offset-adjusted version of the YUV color space. Y′ is defined to have a nominal 8-bit range of 16–235, while Cb and Cr have a nominal range of 16–240. There are several YCbCr sampling formats, such as 4:4:4, 4:2:2, 4:1:1 and 4:2:0.
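The nominal-range limits and the idea behind 4:2:2 chroma subsampling can be sketched in a few lines; the function names here are illustrative, not from any particular API:

```python
def clip_ycbcr(y, cb, cr):
    """Clip components to the nominal 8-bit studio ranges:
    Y' in 16-235, Cb/Cr in 16-240."""
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    return clamp(y, 16, 235), clamp(cb, 16, 240), clamp(cr, 16, 240)

def subsample_422(chroma_row):
    """4:2:2 keeps every second chroma sample along a line,
    halving the chroma data rate while luma stays at full rate."""
    return chroma_row[::2]
```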

The Video Signal Composition

The analog video signal is a sub-volt signal containing the timing and intensity information for each horizontal line drawn. The timing pulses ensure that the display device remains synchronized with the video signal; the two fields are synchronized using the horizontal and vertical synchronization signals. The resulting signal is a composite video signal. Each horizontal video line consists of the horizontal sync, back porch, active pixel region and front porch [9, 10]. Although this waveform has kept this generalized form since the early days of broadcast TV, many formats come with differing schemes, and a particular video format may lack an interval, a sync pulse, and so on. Video designers generally take measures to handle all of the various standards, in order to design all-encompassing video AFEs.

Synchronization Pulses

The horizontal sync is the synchronization pulse that indicates the start of a new horizontal video line. It is preceded by the front porch and followed by the back porch interval. The hsync duration is sometimes used by clamp circuits to restore the DC level.
The vertical sync is a series of synchronization pulses that marks the end of one field and signals the screen to perform a vertical retrace.


Sync-Tip

The bottom level of the sync pulse is referred to as the sync-tip, especially in clamped systems, where the sync-tip level of −300 mV is often the level of interest.

Front and Back Porch

The interval of the video signal between the end of the color burst and the start of the active video signal is the back porch. Alternatively, the whole interval between the sync-tip and the start of the active video signal is also referred to as the back porch. After the end of an active video line, the next sync-tip is separated from it by the front porch interval.

Color Burst

The color burst is a high-frequency signal in NTSC that provides a phase and amplitude reference for the color subcarrier, and is mostly located on the back porch. It typically consists of 8 to 10 cycles of the color reference frequency with an amplitude of ±20 IRE.
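A minimal sketch of such a burst, using the NTSC subcarrier frequency of 3.579545 MHz; the 27 MHz sampling rate is an assumption chosen for illustration:

```python
import math

F_SC = 3.579545e6   # NTSC color subcarrier frequency in Hz
FS   = 27e6         # assumed sampling rate for this sketch

def color_burst(cycles=9, amp_ire=20.0):
    """Generate an NTSC-style color burst: 'cycles' periods of the
    subcarrier swinging +/- amp_ire IRE about the back porch level."""
    n = int(cycles * FS / F_SC)
    return [amp_ire * math.sin(2 * math.pi * F_SC * k / FS)
            for k in range(n)]
```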


Breezeway

Since the color burst is usually located on the back porch, the interval between the sync-tip and the color burst is referred to as the breezeway.

Blanking Interval

The whole duration consisting of the sync-tip and the front and back porch intervals constitutes the vertical or horizontal blanking interval. The duration allocated for the retrace of the signal from the rightmost edge of the screen back to the left edge, to start another scan line, is referred to as horizontal blanking. Similarly, the vertical blanking interval is the period allocated for the retrace of the signal from the bottom right back to the top left edge, to start another field or frame. This retrace was highlighted in Fig. 1.2.

Blanking and Black Level

These are the specific voltage levels of the analog waveform during the blanking time, and the on-screen black voltage level during the active video signal. In most systems the sync-tip (≈ −300 mV, −40 IRE) is the only portion where the voltage goes below the blanking level; some systems place the black level slightly above the blanking level (the 7.5 IRE setup in NTSC), while others keep the black level equal to the blanking level.


Clamp

The circuit that forces a portion of the video signal to a specific DC voltage, to restore the DC level, is referred to as a clamp circuit. Clamping is usually done either on the back porch or on the sync-tip of the video waveform. Also called DC restore, a black-level clamp-to-ground circuit forces the back porch voltage to zero volts, whereas a peak-level clamp forces the sync-tip voltage to a specified voltage.
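The back-porch clamp described above can be modeled digitally as follows. This is an illustrative software model only; the actual circuit restores the DC level in the analog domain:

```python
def dc_restore(line_mv, porch_slice, target_mv=0.0):
    """Black-level clamp: measure the average level over the back
    porch samples and shift the whole line so the porch sits at
    target_mv (code-zero for an ADC-referred clamp)."""
    porch = line_mv[porch_slice]
    offset = sum(porch) / len(porch) - target_mv
    return [v - offset for v in line_mv]
```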

Chroma Signal

This is the actual color information of the video signal. It consists of two quadrature components modulated on a carrier at the color burst frequency. The phase and amplitude of this analog waveform determine the color content of each pixel. It is sometimes incorrectly referred to as chrominance, which in fact is the displayed color information.

Luma Signal

The monochrome, or black-and-white, portion of the analog video signal. It is sometimes incorrectly called luminance, which refers to the actual displayed brightness.

Color Saturation

This is the amplitude of the color modulation on a standard video signal. The larger the amplitude of this modulation, the more saturated (intense) the color becomes. This amplitude adjustment is the familiar color (saturation) control on television sets.

Requirements on Popular Video Standards

Tables 1.1 & 1.2 highlight some of the analog requirements that consumer video industry standards impose on video analog front ends, along with their signal specifications. Requirements for both TV and VGA standards are listed, although this thesis primarily addresses the high definition television standards.
Video Analog Front Ends
This thesis is related to the analog front end of a high speed video digitizer IC, designed at the division of Electronic Systems, Department of Electrical Engineering. The project has been carried out as teamwork among graduate students and researchers at the division, under the technical leadership of Dr. J Jacob Wikner. The video IC design task was divided into various mixed-signal and all-digital blocks of the video digitizing and time-reference channel streams.
The video IC to be implemented was a state-of-the-art design targeting the video resolutions defined by the high definition video standards, featuring:
• a high performance 12-bit digitizer AFE
• up to 300 MS/s maximum conversion rate
• low jitter all digital PLL and DLL
• 65 nm CMOS process


The High Speed Video IC Architecture

The architecture of the video IC, with its various building blocks, is outlined in Fig. 2.1. The dashed line marks the boundary between the two separate channels of the IC: the digitizing channel and the time-reference channel.

Time-Reference Channel

The time-reference channel, shown in Fig. 2.1, consists of the clock generation and the current and voltage reference blocks. Specifically, there are an all-digital phase-locked loop (PLL), a delay-locked loop (DLL), an RC wakeup oscillator, a bandgap reference, a slicer and voltage regulator blocks.
The signal chain starts with a multiplexer selecting the proper reference: either from the slicer, which detects timing information in the input video signal, a digitally generated reference, or an external trigger. The PLL takes this signal and generates a higher-frequency clock aligned to its input reference (the hsync of the video signal). The delay-locked loop (DLL) then shifts this high-frequency clock in phase, producing a total of 32 equally spaced phases. The reason for generating 32 phases is to maintain the parametric design for yield at a minimum of three-sigma quality standards (3% ≈ 1/32).
All-digital DLL & PLL
The PLL is responsible for generating the higher frequency, with an output range of 10–300 MHz, whereas the DLL, with a similar input range up to 300 MHz, generates the 32 phases while maintaining a fixed duty cycle of 50% and a long-term jitter of ±2%. Each input in the multiplexer stream has its own DLL and PLL. The PLL is also all-digital, replacing the VCO with a digital DCO block and maintaining a strict 50% duty cycle. The reason for measuring the long-term jitter over 2000 clocks is to capture the effect of slow contributors such as 1/f noise on the overall clocks.
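The long-term jitter figure can be illustrated with a small measurement sketch over recorded clock-edge timestamps; this is a software illustration, not the on-chip measurement method:

```python
def long_term_jitter_pct(edges, period):
    """Long-term (accumulated) jitter: worst-case deviation of each
    edge timestamp from its ideal position n*period, expressed as a
    percentage of one clock period. Measuring over ~2000 clocks
    captures slow contributors such as 1/f noise."""
    worst = max(abs(t - n * period) for n, t in enumerate(edges))
    return 100.0 * worst / period
```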
The oscillator is an ultra-low-power (10 µW) RC-type wakeup oscillator, which maintains a standby mode unless digitally triggered upon the detection of on-screen activity.
Voltage regulator and bandgap reference
A voltage regulator provides the reference voltages required by the on-chip components, derived from the externally available supply. A bandgap reference ensures a fixed reference voltage for the on-chip components, insensitive to process, voltage and temperature variations.

Digitizing Channel

The digitizing channel consists of an input multiplexer, a low-pass filter, the PGA, a 12-bit ADC, bias and clamp circuits and auxiliaries. A substantial part of the channel is also the digital error and gain correction block. Modern video AFEs contain five such digitizing channels to cover most of the different video standards. Some auxiliary blocks generate reference voltages for the ADC and all other on-chip components.

Signal Chain of the Video Digitizing Channel

The AC-coupled analog video signal (YPbPr, RGB) is multiplexed among a range of input devices (VGA, DTV tuner, S-Video, set-top box, DVI, HDMI, etc.), and its DC point is restored with the help of the clamp circuit. The video signal is band-limited by an active antialiasing filter. It is then fed to the programmable gain amplifier, which generates a differential signal, buffers it and scales it in amplitude as required by the system. Finally, this signal is digitized by the 12-bit TI-PSAR ADC.

AC Coupling

AC coupling allows the video designer to set an optimal DC level, independent of the driving signal's DC bias level. For example, an ADC driver circuit can set the clamping or blanking level of the video signal equal to the internal ADC code-zero voltage, regardless of the driving signal's absolute DC level.

Clamping and DC Restoration

To digitize an AC-coupled video waveform, DC restoration is necessary to put the DC component back into the signal: with an AC-coupled signal, the bias voltage varies with the video content and the brightness information would be lost. The clamp circuit adjusts the DC level to the correct picture brightness during the back porch or sync-tip section of the video signal.

Anti Aliasing Filter

Spectral images around the DAC/ADC's sampling frequency can fold down into the baseband, degrading the picture quality; hence an antialiasing filter is necessary. Even if the ADC/DAC has digital filtering, analog pre-filtering is still required. Such a filter also reduces EMI as well as the signal noise floor, by reducing the bandwidth.
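The folding effect can be computed directly: a tone above the Nyquist rate aliases to its distance from the nearest multiple of the sampling frequency. The 250 MHz interferer in the comment is a made-up example:

```python
def folded_frequency(f, fs):
    """Baseband frequency (Hz) to which a tone at f aliases after
    sampling at fs: the distance to the nearest multiple of fs."""
    return abs(f - round(f / fs) * fs)

# Example: a 250 MHz interferer sampled at 300 MS/s folds down to
# 50 MHz, inside an HD luma band, unless filtered off beforehand.
```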

Type of Filter

While designing a video filter, one must consider the specifications on the filter's group delay, passband flatness, corner frequency and roll-off rate. Once a corner frequency is selected, we would like the flattest possible passband with the most attenuation near the ADC/DAC sampling frequency. From this perspective a Chebyshev or Cauer filter seems a good option. On the other hand, the group delay must not cause excessive ringing and overshoot when the picture content changes abruptly (black-to-white and back on every pixel). If left uncontrolled, this ringing appears as fuzzy edges on the display screen, even though the attenuation is good. Chebyshev and Cauer filters exhibit such ringing due to their variations in group delay. A Bessel filter offers the best group delay but has the disadvantage of a slow roll-off. A good balance is hence offered by a Butterworth filter, with its maximally flat amplitude response, reasonable rate of attenuation and acceptable group delay [8].

Split Filter Architecture and Digital Tuning

The video signal specifications preferably require a fifth-order Butterworth filter, but the design complexity increases considerably when moving to higher-order active analog filters. Hence, one may opt for a 'split architecture' instead, comprised of a passive filter and an active Gm-C or OTA-C filter. The passive filter caters for the dominant poles, whereas the higher-frequency poles are handled by the active implementation. Tuning can be done digitally, implemented as a feedback control loop.
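One possible partition of the fifth-order prototype can be sketched as below: the single real pole is assigned to a passive RC section, and the two complex-conjugate pairs are left for active Gm-C biquads. This is an illustrative partition only, not necessarily the split chosen in the actual design:

```python
import cmath, math

def split_butterworth5(fc):
    """Illustrative 'split architecture' partition of a 5th-order
    Butterworth: the single real pole goes to a passive RC section,
    while the two complex-conjugate pairs are realized actively."""
    wc = 2 * math.pi * fc
    # 5th-order Butterworth poles: angles pi*(2k+4)/10, k = 1..5.
    poles = [wc * cmath.exp(1j * math.pi * (2 * k + 4) / 10)
             for k in range(1, 6)]
    passive = [p for p in poles if abs(p.imag) < 1e-6 * wc]  # real pole
    active  = [p for p in poles if abs(p.imag) >= 1e-6 * wc]  # biquads
    return passive, active
```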


ADC

The ADC, which takes the differential signal from the PGA in the video AFE chain, is a 12-bit, 300 MS/s, time-interleaved, parallel successive approximation register (TI-PSAR) architecture. The 12 bits provide high resolution, and the converter is time-interleaved by a factor of 16, significantly multiplying the sample rate of the ADC, up to 300 MS/s. The selection of a SAR ADC is motivated by the faster, short-channel 65 nm process, which favors all-digital components, and the SAR architecture contains the least analog content. The time-interleaved architecture further helps in achieving higher throughput. Sigma-delta converters are better suited to low bandwidths and contain more analog components, as do pipelined architectures, which may also increase the power consumption. The offset and gain errors are kept at a maximum of 1% (which translates to ±32 LSB).
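The SAR conversion principle can be sketched behaviorally as a bit-by-bit binary search from MSB to LSB; this is an illustrative model, not the transistor-level design:

```python
def sar_convert(vin, vref, bits=12):
    """Behavioral successive-approximation loop: from MSB to LSB,
    tentatively set each bit and keep it only if the corresponding
    binary-weighted threshold does not exceed the input."""
    code = 0
    for b in reversed(range(bits)):
        trial = code | (1 << b)
        if vin >= vref * trial / (1 << bits):
            code = trial
    return code

# Time-interleaving by 16 multiplies the throughput: each sub-ADC
# then only needs to run at 300 MS/s / 16 = 18.75 MS/s.
```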

Table of contents :

1 Introduction 
1.1 Evolution of Video Technology
1.1.1 The Continuous Video Picture
1.2 Analog and Digital Video Data
1.2.1 Color Spaces
1.2.2 Analog Video Transmission
1.2.3 Y′C′BC′R Color Space
1.3 The Video Signal Composition
1.3.1 Synchronization Pulses
1.3.2 Sync-Tip
1.3.3 Front and Back Porch
1.3.4 Color Burst
1.3.5 Breezeway
1.3.6 Blanking Interval
1.3.7 Blanking and Black Level
1.3.8 Clamp
1.3.9 Chroma Signal
1.3.10 Luma Signal
1.3.11 Color Saturation
1.4 Requirements on Popular Video Standards
2 Video Analog Front Ends 
2.1 The High Speed Video IC Architecture
2.1.1 Time-Reference Channel
2.1.2 Digitizing Channel
2.1.3 Signal Chain of the Video Digitizing Channel
2.1.4 AC Coupling
2.1.5 Clamping and DC Restoration
2.1.6 Anti Aliasing Filter
2.1.7 Type of Filter
2.1.8 Split Filter Architecture and Digital Tuning
2.2 ADC
2.3 Programmable Gain Amplifier
2.3.1 Revisited PGA Architecture due to CMOS Process Limitations
2.3.2 PGA Specifications
2.3.3 Linearity and Noise
2.3.4 Slew Rate
2.3.5 Bandwidth
2.3.6 Leakage
2.4 On Screen Artifacts due to Errors in Video AFE
2.4.1 Effect of Flicker Noise
2.4.2 ADC Errors
2.4.3 Timing Errors
2.4.4 Leakage
3 OTA Architecture 
3.1 A Pseudo Differential OTA and Common Mode Feedforward Technique
3.2 Common Mode Feedback using Cascaded OTA Structures
3.3 Frequency Response
3.3.1 Cascaded OTA structures and Compensation
3.4 Noise and OTA Nonlinearity
3.4.1 Noise vs. Speed vs. Linearity
4 Behavior Level Video PGA Modeling 
4.1 Resistor Emulation by Switched Capacitor Circuits
4.1.1 Switches in Signal Path
4.2 Analysis of Switch Capacitor PGA Switching Scheme
4.3 Oversampling in Video PGAs
4.4 Noise Considerations and Minimum Size of Capacitor
4.5 Developing a Higher Level OTA Model
4.5.1 Single Pole OTA Model
4.5.2 Enhanced OTA Simulation Model
4.5.3 The Current Limiting Model
4.5.4 Designing Enhanced OTA Model
4.6 Modeling a Switched Capacitor Video PGA
4.7 Sync-Tip Compensation – an SC Level Shifter
4.8 Simulation Testbench of the PGA Switching Scheme
5 Transistor Level Design 
5.0.1 Modern Trends in Low Voltage Analog Design
5.1 The 65 nm CMOS Process
5.2 OTA Design in 65 nm CMOS
5.2.1 Design of OTA with CMFF
5.2.2 Achieved Specifications for OTA (with CMFF only)
5.2.3 Designing the OTA Structure with CMFB Devices
5.3 Cascading OTA Structures
5.4 Nested Miller Compensation
5.4.1 Reducing Output Capacitance
5.4.2 Standard Expressions – Unreliable
5.5 Noise Analysis
5.6 Non-Linearity and Distortion
5.7 Bandwidth vs. Stability
5.8 Final OTA Specifications
6 Conclusions and Future Work 
B Skill Scripts
B.1 Pole and Zero Locations
B.2 Harmonic Distortion

