I meant this as a short post on xbox.com, however it grew into slightly more than I intended. I thought someone here might find the information interesting and maybe even helpful.
-------------------------------
The way the "refresh" for a game works is tied to the way a TV refreshes the image on its screen... Let's take the example of a standard NTSC CRT TV running 480i @ 60Hz.
Once every 1/60th of a second (16.6ms) the TV updates the image on the screen from an internal buffer. TVs have only ONE buffer for storing the image, so it is important that the image be changed at the point when the CRT beam is moving from the bottom scanline back up to the top. This is called the vertical sync (vsync), and while it's a very small window of time, swapping the image also takes only a tiny amount of time, so things work out fine.
Now, a game generates the image that replaces the buffer within the TV. In order to hit 60fps the game must present a new image exactly once every 16.6ms, assuming that the first presented image occurred exactly when the TV hit the vsync. If this is done then the game will run buttery smooth to the eye and you will see ZERO tearing.
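To make that concrete, here's a minimal sketch of such a render loop in C++. RenderFrame, WaitForVBlank and SwapBuffers are hypothetical placeholders standing in for whatever the platform actually provides, not a real console API:

void RenderFrame()   { /* draw the next image into the back buffer */ }
void WaitForVBlank() { /* block until the beam finishes the bottom scanline */ }
void SwapBuffers()   { /* hand the finished image to the TV for scan-out */ }

int main() {
    for (int frame = 0; frame < 600; ++frame) {  // ~10 seconds @ 60fps
        RenderFrame();    // must finish within the 16.6ms budget
        WaitForVBlank();  // hold the new image until the beam is off-screen
        SwapBuffers();    // swapping during vsync means the eye never sees a tear
    }
    return 0;
}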
Based upon this, a game has 16.6ms to generate that new image in order to maintain 60fps, OR it could present a new image at half that rate (every 33.3ms) and still see no tearing, as it would hit every second vsync.
If your game frame takes 15ms to render then you easily run @ 60fps; however, if your game frame takes 17ms to render then you don't hit 60fps, and in order to not see tearing you MUST wait until the next vsync, which essentially means you run @ 30fps.
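You can see the effective rate fall right out of the math: under strict vsync the present interval gets rounded UP to the next whole multiple of the refresh. A tiny self-contained C++ example (the render times are just illustrations):

#include <cmath>
#include <cstdio>

int main() {
    const double kVsyncMs = 1000.0 / 60.0;  // one refresh = ~16.67ms
    const double renderTimesMs[] = {15.0, 17.0, 34.0};
    for (double render : renderTimesMs) {
        // How many vsync intervals the frame occupies (always at least one).
        double intervals = std::ceil(render / kVsyncMs);
        std::printf("render %4.1fms -> present every %4.1fms -> %2.0f fps\n",
                    render, intervals * kVsyncMs, 60.0 / intervals);
    }
    return 0;
}

Run it and you get exactly the stepping described here: 15ms -> 60fps, 17ms -> 30fps, 34ms -> 20fps.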
The reason some games seem to run at a variable fps but still show no tearing is that this vsync wait happens regardless: when you look at a wall with nothing but the wall rendering, the frame takes almost no time and presents @ a full 60fps, but when you look over that valley vista and you're rendering 10 bajillion pixels it takes a LONG time to render, and so the game presents @ a lower rate (60, 30, 20, 15...).
The reason some games show a variance of fps outside of 60, 30, 20, 15... is also easily explained... imagine the above situation where one frame renders in 15ms and the next in 17ms. Well, hardware like the 360 has leeway within its sync functions that essentially provides a window of opportunity during which to present a frame. If the image is presented while the TV is drawing, say, the top 40 scanlines, you WILL NOT see the update happen (it's OFF the visible screen), so those 40 scanlines (8.3% of 480) make a perfectly good window of opportunity during which to present. This essentially allows the programmer about 18ms to render his frame before the hardware will wait for the next vsync and drop to 30fps. This method however creates a push effect where the next frame WILL be late because the previous frame took too long, and eventually the frame rate will stutter and re-sync.
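Back-of-the-envelope, that window math looks like this (assuming 480 visible scanlines; the numbers are illustrative, not 360 hardware specs):

#include <cstdio>

int main() {
    const double kVsyncMs = 1000.0 / 60.0;  // one refresh = ~16.67ms
    const int kVisibleLines = 480;
    const int kHiddenLines  = 40;  // top scanlines the viewer never sees

    double windowFraction = static_cast<double>(kHiddenLines) / kVisibleLines;
    double deadlineMs = kVsyncMs * (1.0 + windowFraction);
    std::printf("window: %.1f%% -> render deadline: %.1fms\n",
                windowFraction * 100.0, deadlineMs);  // 8.3% -> ~18.1ms
    return 0;
}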
The reason some games show massive variance and LOTS of tearing is that they set the window of opportunity too wide (say 40%), which allows about 23.3ms to render the frame... again this creates the push issue, and the tear MOVES down the screen until it wraps around, the rate drops to 30fps momentarily, and then the cycle starts again. This is BAD, however it is also the most common form of adjustment when a project is over budget GPU-wise (aka GPU bound) for whatever reason (normally rushed by publishers or badly designed).
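Here's a rough simulation of that drift, under the simplifying assumption that every frame takes a constant 18ms against a 40% window (illustrative numbers, not measurements from any real game):

#include <cstdio>

int main() {
    const double kVsyncMs = 1000.0 / 60.0;   // one refresh = ~16.67ms
    const int    kLines   = 480;             // visible scanlines
    const double kWindow  = 0.40 * kLines;   // a too-wide 40% present window
    const double renderMs = 18.0;            // consistently over the budget

    double tearLine = 0.0;  // beam position at present time (0 = clean vsync)
    for (int frame = 0; frame < 15; ++frame) {
        // Each frame overruns the refresh, so the present point (and the
        // visible tear) creeps down the screen by the overrun fraction.
        tearLine += (renderMs - kVsyncMs) / kVsyncMs * kLines;
        if (tearLine > kWindow) {
            // Past the window: the hardware holds the frame for the next
            // vsync (a momentary 30fps hitch), then the drift starts over.
            std::printf("frame %2d: missed window, wait for vsync (judder)\n",
                        frame);
            tearLine = 0.0;
        } else {
            std::printf("frame %2d: tear at scanline %3.0f\n", frame, tearLine);
        }
    }
    return 0;
}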
NFS:MW exhibits that final behaviour, and THAT is the reason for the slight judder in its frame rate every 10 seconds or so... the push issue wrapping around.