Which emulator most accurately represents NTSC colors?
Posted: Sun Nov 02, 2008 6:15 pm
by SecretServiceDude
I've been testing my game on FCEUXD SP 1.07, and I think the title screen looks great on that emulator.
When I run the game on Nestopia 1.40, the title screen looks noticeably different. All the colors seem to have an extra greenish tint to them.
This is what I'm talking about:
The title screen on the left says, "Look at me! I'm vibrant! I'm fun! PRESS START already, willya?"
The title screen on the right says, "Life sucks."
Then it occurred to me: What if the Nestopia colors are actually more accurate? It'd be better to find out sooner rather than later, so I can choose my colors appropriately.
Nestopia has a reputation for being extremely accurate; does that reputation apply to its color output as well?
Posted: Sun Nov 02, 2008 6:39 pm
by Memblers
Don't worry about it, it will look different on various TVs also. Some people say NTSC stands for "Never Twice the Same Color".
Posted: Sun Nov 02, 2008 6:44 pm
by strangenesfreak
There is never one accurate palette, since every TV generates color differently with a different color decoder. That said, Nestopia can accurately emulate any color TV decoder, provided you have its R-Y, G-Y, and B-Y demodulation angles and gains. Nestopia defaults to a Consumer decoder, the Sony CXA2095S/U. I've read that the default Canonical decoder is accurate for PAL games but not NTSC games, and that a canonical decoder for NTSC TVs, which I've read is uncommon, would actually be shifted 15 degrees forward. If you test different decoders, you'll see that hues $2 and $8 are actually unreliable: $2 can be either a greenish or a reddish blue, and $8 can be either a yellowish orange or a yellowish green. I'm not sure whether this inconsistency extends to Japanese NTSC-J TVs; both of the Japanese decoders I know of make $2 a reddish blue and $8 a yellowish green.
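As a rough illustration of how those angles and gains change a color (a sketch only: the decoder presets, the 30-degrees-per-hue phase spacing, and every number below are my own assumptions for demonstration, not Nestopia's actual tables):

```python
import math

# Hypothetical decoder presets: (angle in degrees, gain) for each
# demodulation axis. "canonical" uses textbook axes; "canonical_ntsc"
# rotates every axis 15 degrees forward, like the shift described above.
DECODERS = {
    "canonical":      {"R-Y": (90.0, 0.56),  "G-Y": (236.0, 0.35), "B-Y": (0.0, 1.0)},
    "canonical_ntsc": {"R-Y": (105.0, 0.56), "G-Y": (251.0, 0.35), "B-Y": (15.0, 1.0)},
}

def decode(y, saturation, hue_phase_deg, decoder):
    """Demodulate one color: project the chroma vector onto each axis,
    then add the resulting color differences to luma."""
    diffs = {}
    for axis, (angle, gain) in decoder.items():
        diffs[axis] = saturation * gain * math.cos(math.radians(hue_phase_deg - angle))
    r = y + diffs["R-Y"]
    g = y + diffs["G-Y"]
    b = y + diffs["B-Y"]
    return r, g, b
```

With the axes rotated 15 degrees, the same hue phase projects differently onto each axis, which is exactly why hues like $2 and $8 drift from one decoder to another.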
If you're using a computer monitor (LCDs especially, I believe), or any other monitor at brighter-than-default settings, you'll need to be careful with certain color combinations. This can be a problem if you're mixing colors like $0c and $01, or $07 and $06; on an overly bright or high-gamma monitor, they may either look bad or even have their luminance order reversed. I made a thread about that
here. Set your computer monitor to its default settings to see whether it can roughly reproduce CRT gamma. I've read that LCD computer monitors (though maybe not LCD TVs?) have linear gamma, so you may need to set your video card or palette to a gamma correction of about 0.45.
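A minimal sketch of that 0.45 pre-correction, assuming a simple power-law monitor model (the dark levels below are made up for illustration, not measured NES colors):

```python
def display(channel, monitor_gamma):
    """Light output of an idealized power-law monitor for an input in [0, 1]."""
    return channel ** monitor_gamma

def gamma_correct(channel, correction=1 / 2.2):
    """Pre-correct a channel by ~0.45 so a gamma-2.2 display round-trips to linear."""
    return channel ** correction

# Two hypothetical near-black levels standing in for close dark NES colors:
dark_a, dark_b = 0.05, 0.10

# Fed straight to a gamma-2.2 display, the gap between them collapses:
gap_linear = dark_b - dark_a
gap_displayed = display(dark_b, 2.2) - display(dark_a, 2.2)

# Pre-correcting by ~0.45 first restores the intended level:
restored = display(gamma_correct(dark_b), 2.2)
```

This is why close dark pairs like $0c/$01 are risky: whether they stay distinct depends entirely on the display's effective gamma.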
Posted: Sun Nov 02, 2008 7:41 pm
by Zepper
If you care about NTSC artifacts, look into an emulator with an NTSC filter, like Nestopia; otherwise, each emulator author's palette seems to be empirical. You could try
my emu; its palette has been well received.
<joke>As last instance, try NESticle!</joke>
Posted: Sun Nov 02, 2008 8:41 pm
by SecretServiceDude
Fx3 wrote:If you care about NTSC artifacts, look into an emulator with an NTSC filter, like Nestopia; otherwise, each emulator author's palette seems to be empirical. You could try
my emu; its palette has been well received.
I just experimented with the NTSC filter on Nestopia, and I must say, it's pretty awesome. I think I'll go with that until I'm ready to test on actual hardware.
EDIT: Dude, you're the guy who made RockNES? That's rad! I've been using that emulator for years.
Posted: Sun Nov 02, 2008 10:34 pm
by tokumaru
SecretServiceDude wrote:I just experimented with the NTSC filter on Nestopia, and I must say, it's pretty awesome.
Yeah... That filter is a pretty good preview of what the game will look like on a TV.
Posted: Mon Nov 03, 2008 5:50 am
by tepples
strangenesfreak wrote:I've read that LCD computer monitors (maybe not TV?) have linear gamma
The controller chip in every PC LCD monitor that I've tested has implemented something close to the sRGB curve.
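For reference, the sRGB decoding curve those controllers approximate looks like this (standard sRGB constants; "close to" is the key phrase, since real panels only approximate it):

```python
def srgb_to_linear(c):
    """Decode one sRGB channel in [0, 1] to linear light
    using the standard piecewise sRGB transfer function."""
    if c <= 0.04045:
        return c / 12.92          # linear segment near black
    return ((c + 0.055) / 1.055) ** 2.4

# Despite the 2.4 exponent, the piecewise curve tracks a pure
# gamma-2.2 power law closely over most of the range.
```

So a monitor following this curve is nowhere near linear, which is why the "LCDs have linear gamma" claim above doesn't match what these controllers actually do.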
Posted: Mon Nov 03, 2008 3:56 pm
by strangenesfreak
tepples wrote:The controller chip in every PC LCD monitor that I've tested has implemented something close to the sRGB curve.
Yeah, I've read about that too... it's really confusing. My PC LCD at default settings passes an
LCD gamma test for 2.2, but doesn't work at all with a
CRT gamma test. My PC CRT at default settings reads 3.0 on that CRT gamma test, and on the LCD test it reads 2.85 at 48% and 2.5 at 25% and 10%, but I think CRTs and LCDs convert to 2.2 gamma differently. I have a hunch that CRT PC monitors assume a higher input gamma than LCD PC monitors... does anyone know whether CRTs and LCDs interpret input gamma differently?
EDIT: I think the real problem is that computer monitors just interpret input gamma differently from TVs. After configuring my PC CRT to a gamma of 2.2, it now looks very similar to the PC LCD. So maybe the input gamma of the NES is 2.2, and the input gamma of computers is linear?