In general, aliasing in digital video causes areas of the moving picture to shimmer or flicker. The affected areas are those with the most fine-grained video detail.
If you stop the motion and look at a still frame of the video, the shimmering will stop, but you will see a spurious pattern overlaying the fine detail in the image. This is often called a moiré pattern. As you step from one still frame to the next, the moiré pattern will writhe. Played at normal speed, the writhing becomes a shimmering.
You can think of any still or moving image as being made up of various visual frequencies. The frequencies, which can be represented as sine waves, are overlaid or superimposed atop one another so as to make up a recognizable image. The finer the detail in any portion of the image, the more prevalent the higher frequencies are in that portion of the image.
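For the code-minded, here is a toy Python/numpy sketch of that idea. The amplitudes and cycle counts are arbitrary, chosen only for illustration:

```python
import numpy as np

# One scan line's brightness built from superimposed sine waves:
# broad shading plus progressively finer detail.
x = np.linspace(0.0, 1.0, 1920)                # positions across the line
luma = (0.5                                    # average brightness
        + 0.30 * np.sin(2 * np.pi * 2 * x)     # coarse shading: 2 cycles
        + 0.15 * np.sin(2 * np.pi * 40 * x)    # mid-scale texture: 40 cycles
        + 0.05 * np.sin(2 * np.pi * 400 * x))  # fine detail: 400 cycles
print(luma.min(), luma.max())                  # the components sum to one signal
```

Smooth regions of a picture are described almost entirely by the low-frequency terms; sharp edges and fine textures need the high-frequency ones.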
But digital video doesn't understand continuous sine waves. It chops the image up into pixels. The pixel grid is 1920 x 1080 pixels for 1080i/p, 1280 x 720 for 720p, and 720 x 480 for 480i/p.
The crux of the aliasing problem is that the pixel grid, however coarse or fine, imposes a limit on the highest visual frequency that grid can accurately render. A pair of adjacent pixels can (crudely) represent the positive and negative swings of one complete sine-wave cycle, so the highest frequency that can be represented without aliasing is just less than one-half the pixel-grid frequency (which is a square-wave frequency, rather than a sine-wave frequency).
If the square-wave frequency puts 1920 pixels across the screen, then fewer than 1920 ÷ 2 = 960 sine wave cycles can be accurately represented in the horizontal direction. A similar logic applies to the vertical dimension, or to visual information that uses both dimensions at once.
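Here is a minimal numpy sketch of that limit. The 900- and 1100-cycle test patterns are made up for illustration:

```python
import numpy as np

PIXELS = 1920
x = np.arange(PIXELS) / PIXELS   # one sample position per pixel, from 0 to 1

def sample_pattern(cycles):
    """Render `cycles` full sine cycles across the row, one sample per pixel."""
    return np.sin(2 * np.pi * cycles * x)

ok    = sample_pattern(900)    # below the 960-cycle limit: rendered faithfully
alias = sample_pattern(1100)   # above the limit: folds back to 1920 - 1100 = 820

# The 1100-cycle pattern produces samples identical (up to a sign flip) to
# those of a genuine 820-cycle pattern, so that is what the display shows.
print(np.allclose(alias, -sample_pattern(820)))   # True
```

The grid simply cannot tell the 1100-cycle pattern from an 820-cycle one; the spurious lower frequency is the alias.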
This is all a fancy way of saying that for any given video resolution there is a maximum visual frequency that can be represented, if aliasing is to be avoided.
Think of using a digital camcorder to make a video of a picket fence as you ride by it in your car. If you are zoomed in on the fence, the visual frequency of the pickets as they move by is relatively low, and you will get no aliasing. But if you gradually zoom out, at some point you will start to see aliasing. This is the point at which the visual frequency of the moving pickets climbs past half that of the pixel grid of the camcorder. Any frequency above that threshold will produce aliasing.
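The picket-fence effect is easy to reproduce numerically. This rough sketch uses a made-up 256-pixel sensor row; the FFT peak reports the frequency the sampled image actually appears to have:

```python
import numpy as np

PIXELS = 256                       # a small, hypothetical sensor row
x = np.arange(PIXELS) / PIXELS

# "Zoom out" by fitting ever more pickets across the row.
for pickets in (40, 100, 127, 129, 180):
    row = np.sin(2 * np.pi * pickets * x)    # the pickets as seen by the grid
    spectrum = np.abs(np.fft.rfft(row))
    apparent = spectrum.argmax()             # strongest frequency actually present
    print(f"{pickets:3d} pickets -> looks like {apparent:3d} cycles")
```

Up to 127 pickets (just under 256 ÷ 2) the grid keeps up. At 129 and beyond, the measured frequency folds back down: 129 pickets read as 127 cycles, 180 as 76. On screen, the pickets appear to coarsen or reverse direction, the classic aliasing effect.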
The aliasing potential in digital video is compounded whenever digital video is re-rendered or scaled.
If the video's native resolution is, say, 720p, and it is scaled to 1080p, aliasing can creep in. After all, each 720p frame is itself an image with various frequencies of visual information in it. Among those frequencies are those representing the square-wave 720p pixel grid itself. The square-wave frequency of the 1080p pixel grid is less than twice that of the 720p grid (1080 ÷ 720 is only 1.5), so the finer grid cannot cleanly capture the coarser grid's structure. Unless some sort of anti-aliasing technique is used, simply scaling up from 720p to 1080p may introduce aliasing.
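A crude numpy illustration: take the finest pattern a 720-pixel row can hold, stripes one pixel wide, and stretch it to 1080 pixels with a deliberately naive nearest-neighbour mapping (real scalers are smarter than this):

```python
import numpy as np

# The finest detail a 720-pixel row can hold: stripes one pixel wide.
src = np.tile([0.0, 255.0], 360)      # 720 alternating dark/light pixels

# Naive nearest-neighbour mapping of 1080 output pixels onto the 720 source.
idx = np.arange(1080) * 720 // 1080
dst = src[idx]

# Measure the widths of the upscaled stripes. Uniformly one pixel wide in
# the source, they now alternate between one and two pixels wide, a beat
# pattern that the eye sees as moiré.
runs = np.diff(np.flatnonzero(np.diff(dst) != 0))
print(np.unique(runs))                # [1 2]: uneven stripe widths
```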
By similar logic, scaling up from 480i/p to 720p can introduce aliasing.
Video scalers accordingly use sophisticated digital filters to offset the potential for aliasing. Video scaling takes place in the PS3 when a game that is nominally in 720p is output at 1080p. Grid is such a game.
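To get a feel for what such a filter buys you, here is a small sketch using Pillow's resize filters as stand-ins for a hardware scaler. I am not claiming the PS3 or any TV uses Lanczos specifically; it simply plays the anti-aliasing-filter role here:

```python
import numpy as np
from PIL import Image

# A 720-pixel-wide test card of one-pixel vertical stripes, 16 rows tall.
stripes = np.tile(np.tile([0, 255], 360).astype(np.uint8), (16, 1))
src = Image.fromarray(stripes)

naive    = src.resize((1080, 16), Image.NEAREST)   # unfiltered: grid beat, moiré
filtered = src.resize((1080, 16), Image.LANCZOS)   # low-pass filtered resample

print(np.asarray(naive)[0, :12])     # abrupt 0/255 runs of uneven width
print(np.asarray(filtered)[0, :12])  # intermediate grays, no beat pattern
```

The filtered result gives up a little sharpness in exchange for suppressing the beat between the two grids.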
If the PS3 is set up to output Grid at its native 720p into a native-1080p TV, the TV itself will scale the video up to 1080p. (This is what I do.) In this example, the user can choose whether the PS3 or the TV does the upscaling, and one of the two devices may introduce less aliasing than the other.
Another possibility is that aliasing can be reduced or eliminated by telling the 1080p TV to use a one-for-one pixel mapping for 1080p input. My new Samsung LN52A650 TV has a picture-size setting called "Just Scan" that does this. Ordinarily, the 16:9 setting for HDMI input on this TV enlarges the picture slightly so that its edges lie outside the frame of the screen; this is called "overscan." It is done because some TV broadcasts have visual "garbage" at the edges of the picture, particularly at the top edge. Overscan hides the garbage.
Hiding the picture's edges requires rescaling the picture slightly, which can in theory introduce aliasing. If your TV has a way to defeat overscan and your PS3 games exhibit aliasing, you might try defeating the overscan ("just scanning") as a way to reduce or eliminate the aliasing.
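Some back-of-the-envelope arithmetic shows why overscan forces a resample. The 2.5% enlargement below is a guess for illustration; actual overscan varies from set to set:

```python
# Hypothetical overscan arithmetic; the 2.5% figure is assumed, not measured.
native = 1920                 # screen pixels across
zoom = 1.025                  # picture enlarged 2.5%, pushing its edges off-screen
visible_source = native / zoom
print(round(visible_source))  # ~1873 source columns stretched over 1920 pixels
```

Since roughly 1873 source pixels do not map onto 1920 screen pixels in any whole-number ratio, every output pixel must be interpolated, and that resampling can, in principle, alias.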
But I gather that most PS3 game aliasing problems lie deeper than this. For example, a post in this GameSpot forum thread reads:
All sony exlcusives offer AA the problem comes with ports. Since ports are done from 360 to ps3 the developers do whatever they can to have similar looking games with decent performance meaning they put all their "hard" work ito the 360 versions and port it over and take out certain details such as AA in order to maintain decent performance. But yes ps3 does offer AA look at Racvthet and clank Uncharted Resistance Heavenly Sword Ninja Gaiden. The only game with aliasing problems that i know of thats sony exclusive is GT5.
My interpretation is that AA ("anti-aliasing") is being done for some PS3 games and not for others. Most or all of the games written (usually by Sony) for the PS3 are "anti-aliased" such that the PS3 can scale them to any of its supported output resolutions without aliasing creeping in. On the other hand, games that are written for other game consoles such as the Xbox 360 and then ported to the PS3 are not necessarily "anti-aliased."
I gather that the Xbox 360 does not internally scale game video. (I don't really know this; anyone who knows more should feel free to correct me.) If the game is 720p, it is output at 720p.
PS3 games can be scaled to higher resolutions than they were written for. I think, but don't really know, that this is something each game chooses to take advantage of or not. For instance, I believe Grid does not scale itself from its native 720p to 1080p.
I believe — but again, I don't really know — that this game-internal upscaling capability may be separate from the PS3's usual method of upscaling video.
Whether the upscaling is game-internal or done outside the game by the PS3, it seems that some PS3 ports from the Xbox 360 and other platforms introduce aliasing during upscaling. The forum poster I quoted seems to think that anti-aliasing could have been included in the ports but, because of its cost to game performance, was left out.
I am going to investigate this subject further and post about what I learn. For now, those who are irritated by aliasing in their PS3 games should be aware that it may be the unavoidable result of how the games were ported to the PS3. There may be nothing "wrong" that they can fix by using different settings on the PS3 or the TV.