Why is DVI Better than VGA?

VGA/SVGA is an analog standard designed for CRT displays. The source transmits each horizontal line of the image by varying its output voltage to represent the desired brightness, and that voltage in turn controls the intensity of the scanning beam as it sweeps across the screen. Because the signal is analog, the appearance of each pixel can be affected by adjacent pixels, electrical noise, and other forms of analog distortion. DVI instead uses a digital protocol: the desired illumination of each pixel is transmitted as binary data, and the display reads each number and applies that brightness to the corresponding pixel.
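
As a rough analogy, the difference can be sketched in code. The snippet below is a minimal Python illustration, not a model of the real electronics (DVI actually uses TMDS encoding on the wire, and the noise amount here is invented for the demonstration). The point is that an analog receiver must accept whatever voltage arrives, noise included, while a digital receiver only has to decide whether each bit is a 0 or a 1, so small noise is rejected entirely.

```python
import random

def transmit_analog(brightness, noise=0.03):
    """Send brightness (0.0-1.0) as a voltage; line noise shifts the value the display sees."""
    voltage = brightness * 0.7                       # VGA video levels span roughly 0-0.7 V
    received = voltage + random.uniform(-noise, noise)
    return max(0.0, min(0.7, received)) / 0.7        # the display has no way to undo the shift

def transmit_digital(brightness, noise=0.03):
    """Send brightness as an 8-bit number; the receiver snaps each noisy bit back to 0 or 1."""
    value = round(brightness * 255)                  # quantize to one byte per color channel
    bits = [(value >> i) & 1 for i in range(8)]
    received_bits = []
    for bit in bits:
        level = bit + random.uniform(-noise, noise)  # same noise as the analog path
        received_bits.append(1 if level > 0.5 else 0)
    received = sum(b << i for i, b in enumerate(received_bits))
    return received / 255

original = 0.5
print("analog :", transmit_analog(original))   # drifts slightly on every run
print("digital:", transmit_digital(original))  # comes back the same every run
```

Run repeatedly, the analog value lands somewhere slightly different each time, while the digital value is reconstructed identically, which is why a DVI picture stays pixel-accurate where a VGA picture can look soft or noisy.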
