Thursday, April 30, 2009

Can anyone tell me the difference between the Blue and White input on graphics cards and monitors?

The back of my graphics card has a white port and a blue port for the monitor. Which one is better to use, or does it not matter?

The blue is a VGA connector.

The white is probably DVI-I, possibly DVI-D.

DVI is a much newer standard and comes in three flavors:

DVI-A is analog. This carries exactly the same signals as a VGA connector, just in a DVI connector shell, and signal quality is a bit better. You can get adapters that remap the signals from a DVI-A connector to a VGA connector.

DVI-D is digital. This gives a better image on flat panels because the video signal stays digital from the GPU all the way through to the panel. With VGA, the signal gets converted from digital to analog, pushed up the (lossy) cable, then sampled and converted back to digital.

DVI-I is analog and digital, and is what you usually see on the back of video cards. Since the analog signals and the digital signals use different pins, the connector can be fully populated and support either.

If you have a second VGA monitor you can connect it via a VGA to DVI-A adapter, provided the video card has DVI-I rather than DVI-D.

If your monitor has a DVI-D input, you will get a slightly better picture connecting via a DVI-D cable than via VGA.
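
If you happen to be on a Linux machine and want to check which connector the monitor is actually using, a rough sketch like the one below can help. It assumes an X11 setup with the standard xrandr utility installed; it simply lists each output name (e.g. VGA-1, DVI-D-0) and whether anything is plugged into it.

import subprocess

# Ask the X server which outputs exist and which are in use.
# Assumes Linux/X11 with the xrandr utility installed.
output = subprocess.run(["xrandr", "--query"],
                        capture_output=True, text=True).stdout

for line in output.splitlines():
    # Output lines look like "VGA-1 connected 1920x1080+0+0 ..." or
    # "DVI-D-0 disconnected ..."; the first word is the connector name.
    parts = line.split()
    if len(parts) >= 2 and parts[1] in ("connected", "disconnected"):
        print(f"{parts[0]}: {parts[1]}")

The connector names in the output tell you directly whether the panel is running over the blue VGA port or the white DVI port.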
Reply: The blue one is the old VGA connector and the white one is the new HDMI connector. I would use the HDMI connector if your card and monitor both support it.
Reply: Don't listen to the first answerer... the white output is DVI and the blue one is VGA. DVI gives better graphics. Just make sure your graphics card and your monitor support DVI.

