xboxscene.org forums

Xbox360 Forums => Xbox360 Hardware Forums => Xbox360 Audio/Video Technical => Topic started by: Hellbeans on May 08, 2006, 12:34:00 PM

Title: My Vga Saga
Post by: Hellbeans on May 08, 2006, 12:34:00 PM
Hi, I'm new to the forums, so excuse me for sounding noobish.

My box is connected to my PC monitor, a MAG 19" CRT, using the 1st party VGA cable.
But I'm not quite sure if games are running at a smooth enough framerate, and I'm not sure how to get the best out of this setup.

In Oblivion, for example, I have framerate issues at specific locations, which, as I've read, is normal, but I just wanna be sure my box isn't underperforming.

In PDZ there were a couple of spots where the framerate dropped naggingly.

G.R.A.W., the latest game I got, also feels quite weird (I posted this in the respective forum).

Any advice, suggestions, or comments are very welcome.
Title: My Vga Saga
Post by: twistedsymphony on May 09, 2006, 06:09:00 AM
Framerate drops like that have nothing to do with the video adapter you're using.

Oblivion is a big game... the framerate drops sometimes when it's loading in the background. I can't speak for the other games but what you're describing sounds completely normal.
Title: My Vga Saga
Post by: Hellbeans on May 09, 2006, 08:05:00 AM
Thanks, I'm a bit calmer now.

edit: this post has been edited, because the sentence first said "Thanks, im a bit calmer I now".

This post has been edited by Hellbeans: May 9 2006, 03:06 PM
Title: My Vga Saga
Post by: hignaki on May 10, 2006, 09:05:00 PM
QUOTE(twistedsymphony @ May 9 2006, 01:09 PM)

framerate drops like that have nothing to do with the video adapter you're using.

I hate to say this, but that is incorrect. Sites (and many gamers; you can test it yourself) have done tests showing that using a VGA cable at a higher resolution can drop framerates considerably in games. Go play Tomb Raider: Legend with a VGA cable at the highest settings and watch framerates crawl in areas where the same areas run just fine on a regular TV connection (640x480), or even at a slightly lower resolution with VGA/HD component.
Title: My Vga Saga
Post by: VinnySem on May 11, 2006, 06:04:00 AM
QUOTE(hignaki @ May 10 2006, 11:12 PM)

I hate to say this, but that is incorrect. Sites (and many gamers; you can test it yourself) have done tests showing that using a VGA cable at a higher resolution can drop framerates considerably in games. Go play Tomb Raider: Legend with a VGA cable at the highest settings and watch framerates crawl in areas where the same areas run just fine on a regular TV connection (640x480), or even at a slightly lower resolution with VGA/HD component.
 


I hate to say this, but that is incorrect. It has nothing to do with the cable; it has to do with the resolution the game is being rendered at. Compare the same game at the same resolution on different cables, and the framerates should be very close, if not identical. Obviously, as you increase the resolution, the framerate will decrease as a function of the GPU's power.

Title: My Vga Saga
Post by: twistedsymphony on May 11, 2006, 06:14:00 AM
QUOTE(VinnySem @ May 11 2006, 08:11 AM)

I hate to say this, but that is incorrect. It has nothing to do with the cable; it has to do with the resolution the game is being rendered at. Compare the same game at the same resolution on different cables, and the framerates should be very close, if not identical. Obviously, as you increase the resolution, the framerate will decrease as a function of the GPU's power.


yes... perhaps I should have been more specific...

it's a combination of the game and the resolution.

Games like Tomb Raider are programmed to run at multiple resolutions. Using a VGA adapter at 640x480 vs. using composite at 480i should produce the same results, but running the VGA adapter at 1280x720 vs. 480i composite will produce different results, because the console has to work harder to produce the higher resolution.

Alternatively, PGR3 renders at 600p internally and scales from there, so whether your output resolution is 480i or 1080i, the frame rate will be the same, because the console is working at the same difficulty throughout.

Again this has NOTHING to do with the cable and is completely NORMAL when playing the game at a particular resolution.
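To put some rough numbers on "working harder" (just a back-of-the-envelope sketch, not anything console-specific):

```python
# Back-of-the-envelope pixel math (illustrative only): the GPU's per-frame
# fill work scales roughly with the number of pixels it has to render.
def pixels(width, height):
    return width * height

sd = pixels(640, 480)     # 307,200 pixels per frame
hd = pixels(1280, 720)    # 921,600 pixels per frame

print(hd / sd)            # 3.0 -- 720p pushes 3x the pixels of 640x480
```

So a game rendering natively at 720p is filling three times as many pixels every frame as the same game rendering at 640x480, which is why native resolution matters more than the cable.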
Title: My Vga Saga
Post by: hignaki on May 11, 2006, 06:04:00 PM
Sorry, I must have said it wrong. What I meant was that yes, it does depend on the video adapter you're using (VGA vs. component) if you have the VGA at anything besides 640x480. Didn't mean to sound like a dumbass.
Title: My Vga Saga
Post by: Hellbeans on May 16, 2006, 07:20:00 AM
QUOTE(hignaki @ May 12 2006, 02:04 AM)
Sorry, I must have said it wrong. What I meant was that yes, it does depend on the video adapter you're using (VGA vs. component) if you have the VGA at anything besides 640x480. Didn't mean to sound like a dumbass.


...What's your point?
What do you mean at anything besides 640x480?
Title: My Vga Saga
Post by: hignaki on May 16, 2006, 03:09:00 PM
QUOTE(Hellbeans @ May 16 2006, 02:27 PM)

...What's your point?
What do you mean at anything besides 640x480?

The same things that were said more eloquently by the previous posters.  With your VGA cable, you can set the resolution from the system settings menu.  At 640x480, you will get the same performance (framerate) as you would with the standard cables.  Set it to anything higher (like, 1024x768), and performance will drop.

Not that difficult of a concept to understand.
Title: My Vga Saga
Post by: thax on May 16, 2006, 05:03:00 PM
QUOTE(hignaki @ May 16 2006, 10:16 PM)

The same things that were said more eloquently by the previous posters.  With your VGA cable, you can set the resolution from the system settings menu.  At 640x480, you will get the same performance (framerate) as you would with the standard cables.  Set it to anything higher (like, 1024x768), and performance will drop.

Not that difficult of a concept to understand.

From my understanding, all frames are rendered at 720p (with the exception of PGR3), then the WebTV chip scales the output to the configured resolution. That would make performance the same across all settings and cables.
Title: My Vga Saga
Post by: Hellbeans on May 16, 2006, 05:49:00 PM
QUOTE(thax @ May 17 2006, 01:03 AM)

From my understanding, all frames are rendered at 720p (with the exception of PGR3), then the WebTV chip scales the output to the configured resolution. That would make performance the same across all settings and cables.


OK, you'll have to excuse me for being safety-paranoid, but does that mean 1024x768 is the 720p equivalent, or can I only get the best performance (VGA-wise) at 640x480?
Title: My Vga Saga
Post by: hignaki on May 16, 2006, 09:38:00 PM
QUOTE(thax @ May 17 2006, 12:03 AM)

From my understanding, all frames are rendered at 720p (with the exception of PGR3), then the WebTV chip scales the output to the configured resolution. That would make performance the same across all settings and cables.

I know; that's the funny thing. I understood it to be just scaled either way. But have you played Tomb Raider at all? There is a clear difference, and I don't know why. I DO know that not all games are output exactly the same in all of the configurations; for instance, the menu in COD2 at 640x480 has nice, big letters, and the compass is moved and altered, whereas running it at 1280x1024, the compass/menu items are tiny compared to what they were, and would be readable ONLY at the higher resolution. This leads me to believe that the game knows what output setting it is at, and can change at the developer's will.

It's been reported in other places, like the [H]ardForums and such. I might google it later to find out more, if you want.
Title: My Vga Saga
Post by: twistedsymphony on May 17, 2006, 08:30:00 AM
QUOTE(thax @ May 16 2006, 07:03 PM)

From my understanding, all frames are rendered at 720p (with the exception of PGR3), then the WebTV chip scales the output to the configured resolution. That would make performance the same across all settings and cables.

That's not entirely true...

MS requires that all games support 720p (1280x720). The WebTV scaler chip can scale any game's resolution to any other resolution supported by the console.

SOME games are actually built to render at a multitude of resolutions. For instance, Condemned can render at both 640x480 (480i/p) and 1280x720 (720p). So if you have your display set to 480p, the scaler isn't doing anything, because the game is actually rendering at that size.

Kameo, on the other hand, ONLY supports 1280x720, so no matter what your console is set to output, it will always render 720p and scale it to your output.

Interestingly enough, all of the first-party games ONLY support 720p (PGR3 is 600p, but that's another story), whereas most 3rd-party games support that as well as 480p; some even natively support 1080i (it should say on the box).

That's why some games run smoother when playing through a 480-based connection: they're actually rendering at that resolution.

Now, when using a VGA cable, it's kind of a crap-shoot. Some games, even if you're displaying at 1280x1024 (which can fully fit the 720p render), will choose to render at 640x480, because it's a fullscreen format as opposed to a widescreen one. This is based entirely on how the developer decided the game should work in such a situation. It sucks, but that's how it is.
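A hypothetical sketch of the decision described above (the game list and function here are made up for illustration; this is not actual console code):

```python
# Hypothetical sketch of the per-game render-resolution decision; the data
# and function names are illustrative, not real console internals.
SUPPORTED = {
    "Condemned": [(640, 480), (1280, 720)],   # multi-resolution title
    "Kameo":     [(1280, 720)],               # 720p-only title
}

def render_resolution(game, output):
    """Pick the resolution the game renders at; the scaler bridges any gap."""
    modes = SUPPORTED[game]
    # Render natively if the output mode is directly supported...
    if output in modes:
        return output
    # ...otherwise fall back to the game's preferred mode and let the
    # WebTV scaler resize the frame to the output resolution.
    return modes[-1]

print(render_resolution("Condemned", (640, 480)))   # (640, 480) -- no scaling
print(render_resolution("Kameo", (640, 480)))       # (1280, 720) -- scaled down
```

The point of the sketch: with a multi-resolution game the scaler can sit idle at 480, while a 720p-only game always renders the full 921,600 pixels no matter what the cable delivers.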

Title: My Vga Saga
Post by: Hellbeans on May 17, 2006, 08:41:00 AM
So how do you suggest finding the best-matching res, man?
Title: My Vga Saga
Post by: thax on May 17, 2006, 09:57:00 AM
Wow, thanks for the clarification, hignaki and twisted. It's totally surprising that the developers didn't just optimize for a single resolution. This really makes using component 720p an advantage in some cases, as games are likely to be tested heavily at that resolution.
Title: My Vga Saga
Post by: twistedsymphony on May 17, 2006, 11:00:00 AM
QUOTE(Hellbeans @ May 17 2006, 10:41 AM)

So how do you suggest finding the best matching res man?

720p through component video will always be the best resolution. Fullscreen 480i/p is probably the next most RELIABLE resolution; past that, you really just have to try things out and see what works and what doesn't.
Title: My Vga Saga
Post by: oswald on May 17, 2006, 12:01:00 PM
QUOTE
720p through component video will always be the best resolution. Fullscreen 480i/p is probably the next most RELIABLE resolution, past that you really just have to try things out and see what works/what doesn't.


Yep, and from my testing this theory holds on the VGA cable as well.  For Tomb Raider and other games that exhibit this phenomenon, the best setting I could find was 1280x720 (widescreen), in other words, 720p.

fullscreen is for chumps anyway.
Title: My Vga Saga
Post by: Hellbeans on May 18, 2006, 11:59:00 AM
So I have to get a widescreen monitor? Or is there a way to work around the fact that it widescreens your arse?
Note I'm talking about 1024x768 - could it detract from performance? Would games suffer from this?

When I set it to 1280x720 it looks weird; it's only a 19" CRT...
Title: My Vga Saga
Post by: twistedsymphony on May 18, 2006, 02:16:00 PM
See, this is where a DVI or HDMI connection would have come in handy, because the signal has a fixed pixel width... sending a raw 1280x720 image to even a fullscreen monitor would still display in the proper format (with black bars, of course).

Instead, VGA just sends the signal, and the monitor tries to stretch it to fit, since there's really no defined geometry to it.
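To put a number on the stretch (a rough sketch, assuming the monitor has square pixels):

```python
# Rough sketch of the distortion when a 16:9 frame fills a 4:3 screen
# (assumes square pixels on the monitor; purely illustrative).
source_ar = 1280 / 720       # 16:9, about 1.78
monitor_ar = 4 / 3           # 4:3, about 1.33

stretch = source_ar / monitor_ar
print(round(stretch, 2))     # 1.33 -- everything ends up about a third too tall
```

That 1.33x vertical stretch is why circles look like eggs until you shrink the vertical size on the monitor's own controls.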
Title: My Vga Saga
Post by: Hellbeans on May 18, 2006, 10:25:00 PM
That's interesting.

Do you have any general suggestions, or is it "Go get an HDTV, you're screwed"? (The screwed part being that I have no money whatsoever. Nope, I don't.)

*Goes back to drug dealing.
Title: My Vga Saga
Post by: twistedsymphony on May 19, 2006, 06:19:00 AM
While I've never used the official VGA cable, the reports of the over-brightened image and general compatibility problems lead me to believe that for Xbox 360 use you'd be better off using the component cables with an external VGA transcoder.

While theoretically the signal wouldn't be as good as straight VGA, IMO it's a superior solution when using a 360.
Title: My Vga Saga
Post by: oswald on May 19, 2006, 07:55:00 AM
I've got my 360 hooked up to my LCD TV now, but when it was attached to my CRT monitor via VGA, I simply resized the screen (through the monitor controls) to display the video in widescreen. It took a while to get the aspect ratio right, but as long as your monitor has decent vertical size controls, you should be able to play in real widescreen without dropping extra coin on an HDTV.
Title: My Vga Saga
Post by: Hellbeans on May 19, 2006, 11:12:00 AM
QUOTE(oswald @ May 19 2006, 04:02 PM)
I've got my 360 hooked up to my LCD TV now, but when it was attached to my CRT monitor via VGA, I simply resized the screen (through the monitor controls) to display the video in widescreen. It took a while to get the aspect ratio right, but as long as your monitor has decent vertical size controls, you should be able to play in real widescreen without dropping extra coin on an HDTV.


I tried that, but everything looks stretched and weird, or maybe it's trying to get used to it? (I mean my eyes, not the monitor.)