Which is better, DVI or VGA?
Jun 10, 2006 at 3:44 AM Thread Starter Post #1 of 34

trains are bad

I just bought a new monitor and I'm having some video playback quality issues. I have my monitor hooked up via DVI. Should I use DVI just because I have it? Or is it any better?
 
Jun 10, 2006 at 3:50 AM Post #2 of 34
Are we talking about DVI-D or DVI-I?
 
Jun 10, 2006 at 4:11 AM Post #4 of 34
DVI-D is digital only. DVI-I carries both the digital signal and the analog (VGA) signal on one connector.

As far as I know, nothing has a DVI-I input. Most video cards have a DVI-I output, however, so they can connect either to a DVI screen or, via a DVI-to-VGA dongle that breaks out the analog pins, to a VGA screen. Inputs on monitors, DVD players, and TVs are generally DVI-D.
 
Jun 10, 2006 at 1:19 PM Post #6 of 34
DVI is digital in most scenarios.
DVI-D: digital-only connector (e.g. Philips/JVC LCD TVs)
DVI-A: analog-only connector (never seen one in use)
DVI-I: analog + digital connector (capable of carrying both types of signal)

In theory, the digital output of a video card eliminates the D-A conversion in the card and the A-D conversion in the monitor, so it should provide a cleaner signal. But most monitors that have both D-SUB and DVI connectors have good analog circuits anyway (they are generally higher-end models), so in general you won't see a big improvement from using the DVI connector.
 
Jun 10, 2006 at 1:33 PM Post #7 of 34
For LCDs, it's said that DVI (being digital) is a much better connection, and you should use it if you have it.
 
Jun 10, 2006 at 2:49 PM Post #9 of 34
Nahhh, analogue is better, it's like vinyl vs. CD. DVI-D adds that digital edginess to the picture.

Just kidding...


What exactly do you mean by "video playback quality issues"? Have you checked whether your desktop resolution is the native resolution of the display?
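
If you're on Windows, here's a quick Python sketch to read the current desktop resolution so you can compare it against the native resolution in your monitor's manual (the 1280x1024 value below is just a placeholder; use whatever your panel actually is):

import ctypes

# Read the current desktop resolution via the Win32 API (Windows only).
user32 = ctypes.windll.user32
width = user32.GetSystemMetrics(0)   # SM_CXSCREEN: primary screen width
height = user32.GetSystemMetrics(1)  # SM_CYSCREEN: primary screen height

NATIVE = (1280, 1024)  # placeholder: look up your panel's real native mode
print(f"Desktop: {width}x{height}, native: {NATIVE[0]}x{NATIVE[1]}")
if (width, height) != NATIVE:
    print("Not at native resolution -- the LCD is scaling the image.")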
 
Jun 10, 2006 at 3:27 PM Post #11 of 34
Quote:

Originally Posted by blueworm
Whatever works for you. I have tried both, and I prefer the VGA connection because it allows higher refresh rates (75 Hz instead of 60 Hz).
DVI-D is supposed to be better.



Refresh rates don't mean anything on an LCD, because the screen doesn't refresh the same way a CRT does.

DVI is vastly better, especially on high-res displays. More bandwidth.
 
Jun 10, 2006 at 3:50 PM Post #12 of 34
It all depends on the setup.

If you have a CRT, use VGA for better refresh rates. For an LCD, DVI is supposed to be better, but some of the old AGP 2x cards with DVI output have issues resynchronizing, so you might get better quality with VGA, especially for DVD playback. If you have software issues with DVI, VGA is a safe haven.

With a stable new rig, DVI wins hands down.
 
Jun 10, 2006 at 3:52 PM Post #13 of 34
Correct me if I'm wrong, but TFTs still update the pixels line by line at the given refresh rate. This often leads to tearing in fast-moving games; I often have to turn on vsync in games to get rid of it.

I am not sure whether the refresh rate is fixed by the TFT. DVI-D also supports refresh rates higher than 60 Hz; it depends on the resolution, since ultimately the link is bandwidth-limited.
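
To put rough numbers on that bandwidth limit: single-link DVI tops out at a 165 MHz pixel clock, and the blanking overhead below is an assumed ballpark (real VESA timings vary), so treat this Python sketch as back-of-the-envelope only:

# Rough estimate of the maximum refresh rate single-link DVI can drive.
PIXEL_CLOCK_HZ = 165e6    # single-link DVI TMDS limit
BLANKING_OVERHEAD = 1.25  # total pixels / visible pixels, rough assumption

def max_refresh(width, height):
    total_pixels_per_frame = width * height * BLANKING_OVERHEAD
    return PIXEL_CLOCK_HZ / total_pixels_per_frame

for w, h in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    print(f"{w}x{h}: ~{max_refresh(w, h):.0f} Hz max over single-link DVI")

That comes out to roughly 100 Hz at 1280x1024 but only around 57 Hz at 1920x1200, which is why the highest resolutions end up stuck at 60 Hz (or need reduced blanking) on a single link.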
 
Jun 10, 2006 at 4:42 PM Post #14 of 34
Quote:

Originally Posted by blueworm
Whatever works for you. I have tried both, and I prefer the VGA connection because it allows higher refresh rates (75 Hz instead of 60 Hz).
DVI-D is supposed to be better.



You should always use 60 Hz on an LCD. LCDs don't refresh like a CRT (the whole frame is displayed on the screen at once). If you use a higher refresh rate, it increases the response time, which is bad.
 
Jun 10, 2006 at 4:54 PM Post #15 of 34
Are you sure that all pixels are updated at the same time?
So I guess the tearing is then an effect of "page flipping" or something like that in the buffer of either the graphics card or the LCD itself?
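
To illustrate what I mean by a page flip landing mid-refresh, here's a toy Python simulation (no real graphics API, purely illustrative) of what the display ends up showing:

# The display reads the front buffer line by line. Without vsync, the
# program may swap buffers mid-scan, so one displayed frame mixes
# lines from two different rendered frames.
LINES = 8
frame_a = ["A"] * LINES   # previously rendered frame
frame_b = ["B"] * LINES   # newly rendered frame

front = frame_a
shown = []
for line in range(LINES):
    if line == 5:          # no vsync: swap happens mid-scan
        front = frame_b
    shown.append(front[line])
print("without vsync:", "".join(shown))   # AAAAABBB -> a torn frame

# With vsync the swap is held until the scan finishes, so the display
# shows AAAAAAAA on this refresh and BBBBBBBB on the next one.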
 
