QUOTE(TGD @ Jul 12 2006, 04:12 PM)
No man, that's not possible. There really isn't any difference with HDMI vs. component on any TV that doesn't accept 1080p. I don't know how people say such things lol. It's more like component uses multiple cables and HDMI only has one... that's the REAL difference right now, unless you've got a native 1080p HDTV and you're enjoying 1080p broadcasts... which you're not, and won't be for years.
I know why you saw a difference: your TV probably has memory settings for each input, and your component input is calibrated differently than the HDMI input... you know, brightness, contrast, sharpness, etc. settings lol
Uh, I beg to differ:
QUOTE
The big difference between HDMI/DVI and component/VGA is a truly digital signal vs. an analog signal.
In the Xbox 360, all the GPU's outputs are in a purely digital form; they then get converted to an analog signal in the video encoder. Similarly, an HDTV takes in the analog signal and converts it back to digital to drive the screen itself.
I work at a company that designs audio amplifiers, so I know first hand the effects of converting from analog to digital and back: you lose quality. Unlike digital signals, there is a degree of uncertainty in an analog signal, and it often does not get correctly interpreted on the other end.
However, if you have a straight digital signal from the output device all the way to the TV, you can be certain the signal is getting there properly, without distortion from outside sources like RF interference and other noise that plagues analog signals far worse than digital.
So, for the average joe, the difference is not that noticeable; the data is the same (the only difference is the way it is transmitted), but for an A/V nut/audiophile like me, I can very easily see the difference.
Basically it works like this:
In an analog signal, a certain voltage on the line represents a value. For example, say 0V = 0 and 1V = 255, and all other voltages in between have a corresponding value from 0-255. Now, if the transmitting device wanted to output a value of 173, it would output the corresponding voltage, but it's not 100% accurate. Interference, poor quality cables, etc. can alter that voltage, so it might be interpreted on the other end as 172, 174, ...
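Here's a quick sketch of that analog picture in Python. The 0V-1V range and the noise level are made up for illustration; real video signaling uses different voltages, but the idea is the same: a small voltage error shifts the decoded value.

```python
import random

def encode_analog(value):
    """Map a 0-255 value onto a hypothetical 0V-1V line, linearly."""
    return value / 255.0

def decode_analog(voltage):
    """Map a received voltage back to the nearest 0-255 value."""
    return max(0, min(255, round(voltage * 255)))

value = 173
voltage = encode_analog(value)

# A small voltage error (interference, poor cables, etc.) is enough
# to land on a neighboring value when decoded.
noise = random.gauss(0, 0.004)  # arbitrary illustrative noise level
received = decode_analog(voltage + noise)

print(value, received)  # received may come back as 172, 173, or 174
```

Run it a few times and you'll occasionally see the received value drift off by one or two, which is exactly the uncertainty being described above.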
However, in a digital signal, there are only two possible values for a voltage on the line: high or low (1 or 0)
Many combinations of these 1's and 0's make up the actual value (173 = 10101101). So, for the receiving end to misinterpret the value as 172, it would have to get bit 0 completely wrong, mistaking a high signal for a low. So, there is a much greater degree of accuracy and certainty in a digital signal.
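And here's the digital side of the same sketch (again with made-up 0V/1V levels): each bit is its own high-or-low voltage, and the receiver only has to decide which side of a threshold each one is on, so noise that would have corrupted the analog value changes nothing.

```python
# Hypothetical signaling levels, just for illustration.
V_LOW, V_HIGH = 0.0, 1.0
THRESHOLD = 0.5  # receiver's decision point: above = 1, below = 0

def encode_digital(value, bits=8):
    """Send each bit of an 8-bit value as a separate high/low voltage (LSB first)."""
    return [V_HIGH if (value >> i) & 1 else V_LOW for i in range(bits)]

def decode_digital(voltages):
    """Rebuild the value by thresholding each received voltage."""
    return sum(1 << i for i, v in enumerate(voltages) if v > THRESHOLD)

value = 173  # 0b10101101
line = encode_digital(value)

# Add noise far larger than what broke the analog example: every bit
# still lands on the correct side of the threshold.
noisy = [v + 0.2 if v < THRESHOLD else v - 0.2 for v in line]
print(decode_digital(noisy))  # still 173
```

A 0.2V error was enough to shift the analog value by dozens of steps, but here every bit survives it, which is the "much greater degree of accuracy and certainty" being claimed.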
And that is why HDMI is better than component.