Which NES games look better with NTSC artifacts?

Discuss emulation of the Nintendo Entertainment System and Famicom.

Moderator: Moderators

strangenesfreak
Posts: 155
Joined: Thu May 03, 2007 3:07 pm
Contact:

Which NES games look better with NTSC artifacts?

Post by strangenesfreak »

I'm not sure which forum to put this thread in; I chose NESemdev, because I guess this sort of has to do with emulation... :? But I apologize if this would be the wrong forum.

I'm curious as to which NES games look better with NTSC artifacts rather than without. In my experience, Blaster Master and the Castlevania games look better with NTSC artifacts, but most earlier (1983 - 1987) games, the Mario games, and other games with bright, simplistic graphics look slightly worse. To me, it seems that games with detailed, gritty graphics look better with NTSC artifacts, while games with simplistic graphics (and earlier releases) look worse.

But what are other good examples of games that look good with NTSC artifacts - and those that look worse?
User avatar
Bregalad
Posts: 8036
Joined: Fri Nov 12, 2004 2:49 pm
Location: Caen, France

Post by Bregalad »

It's hard to say. I was impressed by how much better Castlevania 3 looks with them, while some other games aren't affected as much. Super C and the Gradius games also look significantly better with them. Batman looks much better, too.

The Final Fantasy games look better without them, I think, but I'm not sure exactly why.
I guess games with dark, gritty graphics look better with them, while games with more fantastic graphics look better without them.
User avatar
blargg
Posts: 3717
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

Games with outlined graphics look less different, because the outlines lessen the impact of NTSC's limitations. I've run into this connecting a SNES to my TV via S-Video versus composite: games like Super Mario World, Zelda: A Link to the Past, and Super Mario Kart look great with S-Video, while things like Donkey Kong Country look poor with a pixel-perfect image but good with composite. Like you say, attempts at gradients and texturing look better with composite, because it adds variation and texture, while S-Video or RGB exposes the system's limited number of colors too much.
strangenesfreak
Posts: 155
Joined: Thu May 03, 2007 3:07 pm
Contact:

Post by strangenesfreak »

I also noticed that the Genesis version of Toy Story looks more pixelated on an emulator than it did when I played it on a TV, something I recall a reviewer commenting on as well. I think Toy Story is another game that looks better with composite than pixel-perfect, since I believe it renders its graphics in a pseudo-3D style similar to Donkey Kong Country.

IMO, the best graphic style for composite-predominant consoles would be styles that often use dithering, but still remain clear to look at - and also colorful depending on the atmosphere. That way, games would look fine whether with or without NTSC artifacts.
User avatar
BMF54123
Posts: 410
Joined: Mon Aug 28, 2006 2:52 am
Contact:

Post by BMF54123 »

A lot of Genesis games use an odd dithering technique consisting of vertical lines, rather than the usual dot pattern, because an NTSC TV will blend the lines together (though I don't know about Toy Story, never played it). Ristar makes use of this in combination with shadow mode on the map screen, and Sonic 2 uses it in Chemical Plant Zone to simulate "transparent" tubes.
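The blending BMF54123 describes can be illustrated with a toy model (my own sketch, not a real NTSC filter): composite's limited horizontal bandwidth acts roughly like a horizontal low-pass, so alternating one-pixel-wide columns of two colors merge toward their average.

```python
# Toy illustration of vertical-line dithering on a composite display:
# averaging each pixel with its horizontal neighbors stands in for the
# TV's limited horizontal bandwidth, blending alternating columns.

def blend_columns(scanline, radius=1):
    """Average each RGB pixel with its horizontal neighbors, per channel."""
    n = len(scanline)
    out = []
    for x in range(n):
        lo, hi = max(0, x - radius), min(n, x + radius + 1)
        window = scanline[lo:hi]
        out.append(tuple(sum(p[c] for p in window) / len(window)
                         for c in range(3)))
    return out

# Alternating blue/white columns, like a "transparent" tube effect:
stripes = [(0, 0, 255), (255, 255, 255)] * 8
blended = blend_columns(stripes)
```

On a sharp display the stripes stay distinct; after the blend, interior pixels settle near a uniform light blue, which is the whole point of the technique.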
User avatar
Bregalad
Posts: 8036
Joined: Fri Nov 12, 2004 2:49 pm
Location: Caen, France

Post by Bregalad »

That way, games would look fine whether with or without NTSC artifacts.
In fact, my opinion is that most games look fine with or without NTSC artifacts. I don't remember any game that looked horrible on the console but good on an emulator, or the other way around.
I'm personally not a big fan of dithering, although I still use the technique sometimes. To get good graphics, I usually avoid having a large surface all in the same color, except in particular cases (a background sky or something). If you have a large area with nothing in it, add details or shadow effects. Do this and you will usually end up with good graphics, both with and without NTSC artifacts.
LocalH
Posts: 180
Joined: Thu Mar 02, 2006 12:30 pm

Post by LocalH »

Better is subjective. Some people feel that simply because the NTSC emulation generates an image you'd also see when capturing a real NES, it's "better". Other people may feel that, although some games have graphics that were designed with this in mind, for the vast majority of games it really doesn't matter. Myself, I'm a wholehearted advocate of the NTSC emulation; I feel it really adds that touch of "authenticity" to the experience. Plus, even if one doesn't care whether the artifacts are emulated, I feel that using the NTSC emulation with the RGB preset is still preferable to not using it at all, because a real RGB signal, when captured, is vertically sharp but slightly soft horizontally, and on the RGB preset Blargg's filters exhibit this same behavior. To me, that just makes it feel more "real" (for example, try running any of the VS. games with the RGB preset; it looks really good IMO).

As far as Genesis games go, yeah, I believe many more games used such dithering; pretty much every main Sonic game uses it, and I've seen it in countless others. The famed "256-color" Eternal Champions CD even used it (because in no way does it really display 256 real colors at once). It's a very effective technique on the Genesis because, at least in NTSC regions, I would guess that maybe 1% of Genesis consoles - maybe 2% at most - are hooked up with anything better than composite/RF video.
User avatar
blargg
Posts: 3717
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

The one aspect of the NTSC composite signal that I've found most important is the limited chroma resolution. Using just this and no luma artifacts still allows crisp pixels that some people like. "RGB" on left (bleed = -1.0), "S-Video" on right (bleed = -0.25), both with resolution = +1, sharpness = +1.

[Attached: comparison screenshots, "RGB" on the left, "S-Video" on the right]
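The effect blargg describes can be sketched in a few lines (a toy model, not his actual filter; the matrices are the standard FCC YIQ conversion): convert each scanline to YIQ, low-pass only the I/Q (chroma) channels, and convert back, so luma keeps full resolution while color bleeds.

```python
# Chroma-only low-pass: luma (Y) stays pixel-sharp while color (I/Q)
# is blurred horizontally, mimicking composite's limited chroma bandwidth.

def rgb_to_yiq(rgb):
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def yiq_to_rgb(yiq):
    y, i, q = yiq
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    return r, g, b

def chroma_lowpass(scanline, radius=2):
    """Box-filter I and Q across the scanline; leave Y untouched."""
    yiq = [rgb_to_yiq(p) for p in scanline]
    n = len(yiq)
    out = []
    for x in range(n):
        lo, hi = max(0, x - radius), min(n, x + radius + 1)
        i_avg = sum(p[1] for p in yiq[lo:hi]) / (hi - lo)
        q_avg = sum(p[2] for p in yiq[lo:hi]) / (hi - lo)
        out.append(yiq_to_rgb((yiq[x][0], i_avg, q_avg)))
    return out
```

Run it on a scanline with a hard red-to-blue edge and the brightness boundary stays crisp while the hues smear across it, which is what the "bleed" parameter controls.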
NewRisingSun
Posts: 1312
Joined: Thu May 19, 2005 11:30 am

Post by NewRisingSun »

Why are the "resolution" settings for the Composite and S-Video presets anything other than maximized? If I understand "resolution" correctly as the N/Y channel bandwidth, there is no channel limit in Composite and S-Video environments. The ONLY environment where there is a limit on N/Y channel bandwidth is a radio-frequency modulated signal; here, there is a 4.2 MHz limit on the N/Y bandwidth, which means an effective resolution of 4.2 MHz * (52 + 59/90) µs = 221.153... pixels, down from the NES' pixel clock of Fsc * 6/4 * (52 + 59/90) µs = 282.724... pixels. That bandwidth limit is precisely the difference between RF and composite, after all.

Right now, the composite setting looks like RF, and the S-Video setting looks like a well-comb-filtered RF signal. Better change those presets. What exactly does the "Sharpness" control do?
User avatar
blargg
Posts: 3717
Joined: Mon Sep 27, 2004 8:33 am
Location: Central Texas, USA
Contact:

Post by blargg »

Why are the "resolution" settings for Composite and S-Video presets anything other than maximized?
Because a PC display has more resolution than a TV. Doesn't composite impose a pseudo-limit due to the chroma carrier's frequency, or can a good comb filter separate the two?
What exactly does the "Sharpness" control do?
Sharpness applies edge enhancement to the resulting image: where there is a delta in luma, it increases it, and compensates the nearby deltas to keep the total delta the same.
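A rough sketch of that kind of edge enhancement (my own illustration, not the filter's actual code): a kernel like [-a, 1+2a, -a] sums to one, so flat regions and the total step across an edge are preserved, while the transition itself is steepened with undershoot and overshoot.

```python
# Toy edge-enhancement filter with kernel [-a, 1+2a, -a]. The taps sum
# to 1, so flat areas pass through unchanged; at a luma step the output
# undershoots on the dark side and overshoots on the bright side.

def sharpen(luma, amount=0.5):
    n = len(luma)
    out = []
    for x in range(n):
        left = luma[max(0, x - 1)]
        right = luma[min(n - 1, x + 1)]
        out.append((1 + 2 * amount) * luma[x] - amount * (left + right))
    return out
```

Because the kernel sums to one, the signal's total energy across the edge is unchanged; only the local deltas are redistributed, as described above.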
NewRisingSun
Posts: 1312
Joined: Thu May 19, 2005 11:30 am

Post by NewRisingSun »

Because a PC display has more resolution than a TV.
But even a TV has more than 282 pixels of horizontal resolution, so this doesn't apply here. You can make this point for high-resolution (512x240) modes of the Super NES.
(I've once taken the time to count the number of RGB phosphors per line on my old trinitron TV, and it happened to be exactly 480 pixels horizontally. :))
Doesn't composite impose a pseudo-limit due to the chroma carrier's frequency
Why would it? If we can produce high-resolution pictures with our algorithm without pseudo-limiting at the subcarrier frequency, why couldn't a TV set do the same? Most TV sets are actually WAY more sophisticated than our little algorithm here. You don't even need a comb filter for that, just a notch filter at 3.58 MHz would be sufficient.

Again, if there WAS a limit at 4.2 MHz, or as you suggest, at 3.58 MHz, there would be no point in using a baseband composite connection over a radio-frequency modulated signal. But there is. ;)
Sharpness applies edge enhancement to the resulting image
Hm. On my TV, there is indeed a sharpness control. Below center, it operates like your "resolution" (Y channel filtering, with no filtering at center), above center, it operates like your "sharpness" (edge enhancement). It might be less confusing if you combined the two in the manner I've described, and clearly indicated the center position with no filtering and no edge enhancement.
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Post by tepples »

NewRisingSun wrote:Why are the "resolution" settings for Composite and S-Video presets anything other than maximized? If I understand "resolution" correctly as the N/Y channel bandwidth, there is no channel limit in Composite and S-Video environments.
We have to distinguish between TVs and computer monitors. The Apple IIe in 80-column text mode and "double hi-res" mode was capable of generating a signal all the way up to 7.2 MHz for use with specialized monitors, but TVs of the time couldn't display it clearly. This is because they used a crossover circuit to separate the luma (0-3.0 MHz) from the chroma (3.0-4.2 MHz). Composite computer monitors, such as the ones that filled school computer labs, appear to have used a (more expensive) notch filter, along with a "monochrome" button to ignore the color burst and disable all chroma processing. The advantage of baseband over RF wasn't so much the ability to handle signals above 4.2 MHz as the smaller number of processing steps, each of which introduces noise and filter roll-off.
The ONLY environment where there is a limit on N/Y channel bandwidth is a radio-frequency modulated signal; here, there is a 4.2 MHz limit on the N/Y bandwidth, which means an effective resolution of 4.2 MHz * 52+59/90 µs = 221.153... pixels
Harry Nyquist wrote:You forgot to double it.

Oh, and the Y is short. It's NIK-wist, not NYE-kwist.
For a signal in monochrome mode (luma up to 4.2), the horizontal resolution is 442 pixels. With the crossover, this drops to 3.0 MHz * 2 * (52 + 2/3) µs = 316 pixels, very close to the "320x240" commonly quoted for LDTV and each field of SDTV.
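The arithmetic in these posts is quick to reproduce (a sketch, using the active-line durations quoted in the thread): one cycle of bandwidth resolves two pixels, one light and one dark, hence Nyquist's factor of two.

```python
# Horizontal resolution from video bandwidth: one cycle of bandwidth can
# encode one light and one dark pixel, so resolution = 2 * BW * active time.

def h_resolution(bandwidth_hz, active_line_s):
    return 2 * bandwidth_hz * active_line_s

ACTIVE = (52 + 2 / 3) * 1e-6  # active line duration, seconds

mono = h_resolution(4.2e6, ACTIVE)       # monochrome luma up to 4.2 MHz, ~442
crossover = h_resolution(3.0e6, ACTIVE)  # luma below a 3.0 MHz crossover, ~316

# For comparison, NES pixels per active line at the dot clock Fsc * 6/4:
FSC = 315e6 / 88                                # NTSC color subcarrier
nes_dots = FSC * 6 / 4 * (52 + 59 / 90) * 1e-6  # ~282.7
```

The 316-pixel figure is indeed close to the "320x240" commonly quoted for a crossover-filtered set, and the NES's ~283 dots per line fit comfortably under the 442-pixel monochrome ceiling.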

Some of the richness of color in games that use dithering, like Blaster Master, comes from aliasing. An isolated pixel looks different depending on where the pixel falls relative to the phase of the color subcarrier.
NewRisingSun
Posts: 1312
Joined: Thu May 19, 2005 11:30 am

Post by NewRisingSun »

So the NES generates a 2.68 MHz signal?
but TVs of the time couldn't display it clearly. This is because they used a crossover circuit to separate the luma (0-3.0 MHz) from the chroma (3.0-4.2 MHz).
Is that a verified or an ad-hoc explanation? I would rather assume that old TV sets, as opposed to computer monitors, didn't have a baseband composite input and thus had to be fed an RF-modulated, and thus bandlimited, signal, and that this is the actual cause of the lack of sharpness. There were certainly TV sets with good notch and even comb filters available in the early-to-mid 1980s.
Some of the richness of color in games that use dithering, like Blaster Master, comes from aliasing.
The richness of color in Blaster Master comes from chroma subsampling horizontally, not from aliasing.
An isolated pixel looks different depending on where the pixel falls relative to the phase of the color subcarrier.
Those are cross-color and cross-luma artifacts. They cannot be used for effect with games, because the absolute phase of the color subcarrier is undefined on the Famicom.
tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)
Contact:

Post by tepples »

NewRisingSun wrote:So the NES generates a 2.68 MHz signal?
A signal with alternating white and black pixels is a square wave, with fundamental frequency 2.68 MHz. The signal has harmonics at 8 MHz and above, but these are filtered out either inside the NES or inside the TV. (I don't have an oscilloscope, so I can't test it myself.)
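Those figures follow directly from the clock rates (a quick check; Fsc is the NTSC color subcarrier):

```python
# An alternating black/white pixel pattern is a square wave at half the
# dot rate; a square wave contains only odd harmonics (3f, 5f, ...).

FSC = 315e6 / 88         # NTSC color subcarrier, ~3.5795 MHz
DOT_CLOCK = FSC * 6 / 4  # NES pixel clock, ~5.3693 MHz

fundamental = DOT_CLOCK / 2       # ~2.68 MHz, the figure quoted above
third_harmonic = 3 * fundamental  # ~8.05 MHz, the first harmonic present
```

So "harmonics at 8 MHz and above" is just the third harmonic and up of the 2.68 MHz fundamental.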
NewRisingSun wrote:Those are cross-color and cross-luma artifacts. They cannot be used for effect with games, because the absolute phase of the color subcarrier is undefined.
They cannot be used predictably, except that they are guaranteed to differ from one 8x8 or 16x16 pixel metatile to the next. The games that use subcarrier crosstalk artifacts rely on the effect when they are used unpredictably.
NewRisingSun
Posts: 1312
Joined: Thu May 19, 2005 11:30 am

Post by NewRisingSun »

but these are filtered out either inside the NES or inside the TV.
And I'm arguing that the NES doesn't filter at all, and that the TV doesn't necessarily filter either.
The games that use subcarrier crosstalk artifacts rely on the effect when they are used unpredictably.
And what games would those be? Again, to make that point, you have to separate crosstalk artifacts from chroma subsampling. I agree that Blaster Master and others use chroma subsampling for effect, but how and where do they use crosstalk artifacts?