Was VHS really that bad?


tepples
Posts: 22345
Joined: Sun Sep 19, 2004 11:12 pm
Location: NE Indiana, USA (NTSC)

Re: Was VHS really that bad?

Post by tepples »

Yeah, ad guys see 1080i and say "1080-I gotta get me some of this! Now that I have even bigger space for disclaimers, I can make more outrageous claims." Then the commercial is shrunk down to 640x360 and blurred vertically so that it doesn't flicker horribly when interlaced, and all the text is too small to make out.
Drew Sebastino
Formerly Espozo
Posts: 3496
Joined: Mon Sep 15, 2014 4:35 pm
Location: Richmond, Virginia

Re: Was VHS really that bad?

Post by Drew Sebastino »

Just wait until 4K broadcasting becomes the big thing. Oh boy...
rainwarrior
Posts: 8062
Joined: Sun Jan 22, 2012 12:03 pm
Location: Canada

Re: Was VHS really that bad?

Post by rainwarrior »

1080i has always seemed the worst idea to me. I really don't understand why we needed an (awful) interlaced mode in HD when we already had 720p if you couldn't handle the bandwidth.
tepples

Re: Was VHS really that bad?

Post by tepples »

rainwarrior wrote:1080i has always seemed the worst idea to me. I really don't understand why we needed an (awful) interlaced mode in HD when we already had 720p if you couldn't handle the bandwidth.
The first consumer HDTVs were CRT HDTVs, and I guess 1080i was a cheaper upgrade to the tube than 720p.
lidnariq
Posts: 10677
Joined: Sun Apr 13, 2008 11:12 am
Location: Seattle

Re: Was VHS really that bad?

Post by lidnariq »

Interlacing was always a design decision for CRTs, since deinterlacing isn't necessary when the TV just does the right thing in the first place. If LCDs were already predominant at the time we'd have chosen to send 1080p30 instead.

1080i is substantially higher resolution than 720p, and even with deinterlacing artifacts I can see the improvement.
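Since the thread keeps coming back to deinterlacing, here's a minimal sketch (my own illustration, not from any post) of the two classic strategies a TV can use: "weave" interleaves the two fields into one full frame (sharp for static content, but it combs on motion), while "bob" line-doubles a single field (no combing, but half the vertical resolution). Frames are modeled as plain lists of scanlines.

```python
def weave(even_field, odd_field):
    """Interleave two fields into one full-height frame."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame += [e, o]
    return frame

def bob(field):
    """Line-double a single field to full height."""
    frame = []
    for line in field:
        frame += [line, line]
    return frame

even = ["E0", "E2"]   # lines 0 and 2 of the source frame
odd  = ["O1", "O3"]   # lines 1 and 3
print(weave(even, odd))   # ['E0', 'O1', 'E2', 'O3']
print(bob(even))          # ['E0', 'E0', 'E2', 'E2']
```

The "deinterlacing artifacts" mentioned above are what you get when a weave pairs two fields that came from different moments in time.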

If you want something to actually complain about in the ATSC standard, it's the required overscan.
TmEE
Posts: 789
Joined: Wed Feb 13, 2008 9:10 am
Location: Estonia, Rapla city (50 and 60Hz compatible :P)

Re: Was VHS really that bad?

Post by TmEE »

Overscan shouldn't exist... waste of bandwidth among other things.
ccovell
Posts: 1041
Joined: Sun Mar 19, 2006 9:44 pm
Location: Japan

Re: Was VHS really that bad?

Post by ccovell »

Customers buying the most expensive TVs would counterargue that if overscan didn't exist, the money spent on the edges of their TV tubes would go to waste.

Besides, early TVs were completely circular, then later ellipsoid with the top & bottom of the image pinched, etc...

Later, overscan got put to good use by carrying digital information, Teletext, telesoftware, etc.
TmEE

Re: Was VHS really that bad?

Post by TmEE »

Blanking areas are something else entirely...
lidnariq

Re: Was VHS really that bad?

Post by lidnariq »

Overscan in NTSC? Fine. Early CRTs just weren't precise enough.
Required overscan in ATSC? Abominable. Taking a 720p input stream, scaling it up to 110%, and then displaying only the center 720 lines of the result? WTF were they thinking?
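To put a number on the complaint above, here's a quick back-of-the-envelope calculation (my own, assuming a uniform 10% upscale followed by a crop back to the original height):

```python
def visible_source_lines(lines: int, overscan: float = 1.10) -> float:
    """Source scanlines still visible after upscaling by `overscan`
    and cropping the result back to the original height."""
    return lines / overscan

print(visible_source_lines(720))   # ~654.5 of 720 source lines survive
print(visible_source_lines(1080))  # ~981.8 of 1080 source lines survive
```

So roughly 65 lines of a 720p picture (and proportionally more of a 1080-line picture) are simply thrown away before display.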
TmEE

Re: Was VHS really that bad?

Post by TmEE »

On top of that, a good chunk of TVs I see sold here still crop out the edges of the image... so you lose even more of the picture when shit like that is done... On some you can fix that up in the service menu, but only on some.
tepples

Re: Was VHS really that bad?

Post by tepples »

lidnariq wrote:Overscan in NTSC? Fine. Early CRTs just weren't precise enough.
Required overscan in ATSC? Abominable.
The first ATSC sets were CRTs, and they still weren't precise enough to go all the way out to the edges.
rainwarrior

Re: Was VHS really that bad?

Post by rainwarrior »

lidnariq wrote:Interlacing was always a design decision for CRTs, since deinterlacing isn't necessary when the TV just does the right thing in the first place. If LCDs were already predominant at the time we'd have chosen to send 1080p30 instead.

1080i is substantially higher resolution than 720p, and even with deinterlacing artifacts I can see the improvement.

If you want something to actually complain about in the ATSC standard, it's the required overscan.
So, it looks like the first round of HD CRTs commonly supported 480i and 1080i. I guess it makes some sense that it was easiest to implement if both modes were interlaced?

1080i is only substantially higher resolution at 30 Hz. At 60 Hz it's 921,600 pixels per 720p frame vs 1,036,800 per 1080i field, not as big a difference.
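The arithmetic behind that comparison can be spelled out. In each 1/60 s interval, 720p delivers a full 1280x720 frame while 1080i delivers one 1920x540 field; only at 30 fps, when the two fields pair into a full frame, does 1080i pull clearly ahead:

```python
p720_frame  = 1280 * 720    # one full 720p frame
i1080_field = 1920 * 540    # one 1080i field (half the lines)
i1080_frame = 1920 * 1080   # two fields woven into a 30 Hz frame

print(p720_frame, i1080_field)      # 921600 1036800
print(i1080_field / p720_frame)     # 1.125 -> only 12.5% more per 1/60 s
print(i1080_frame / p720_frame)     # 2.25  -> 2.25x more at 30 Hz
```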

If 1080i kind of hung on as a bridge for devices that also supported traditional 480i, maybe overscan came with that for a similar reason. i.e. the 480i mode is going to do it, easiest if the HD modes work the same? I dunno.

I've never even seen an HD CRT in real life. I didn't even realize they existed until tepples started referring to them. All the HD TVs I've ever dealt with were LCD or Plasma. For those the overscan seems entirely stupid, and interlacing sucks unless your content is <= 30 fps (and at any rate, interlacing looks worse on an LCD than a CRT). Mostly I'm just annoyed that my PS3 defaults to "1080i" as a "superior" supported resolution on my 720p TV, when I'd never want to use 1080i at all. (Gotta disable 1080i every time my PS3 video settings reset.)
lidnariq

Re: Was VHS really that bad?

Post by lidnariq »

NTSC/480i: H=16kHz, V=60Hz
EDTV/VGA/480p: H=32kHz, V=60Hz
1080i: H=34kHz, V=60Hz
720p: H=45kHz, V=60Hz
1080p: H=67kHz, V=60Hz

In a CRT with a magnetically-deflected tube, the greater the range of horizontal deflection rates it can support, the more expensive it is (more expensive than just the additional glass, anyway).
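These horizontal rates fall out of a simple formula: total scanlines per frame (active plus blanking) times the frame rate. A quick check, assuming the standard total-line counts (525 for the NTSC-derived modes, 750 for 720p, 1125 for the 1080-line modes):

```python
# (total scanlines per frame, frame rate in Hz)
modes = {
    "NTSC/480i": (525, 29.97),   # 262.5 lines per field, 59.94 fields/s
    "EDTV/480p": (525, 59.94),
    "1080i":     (1125, 30.0),
    "720p":      (750, 60.0),
    "1080p":     (1125, 60.0),
}

for name, (lines, fps) in modes.items():
    # Horizontal scan rate = scanlines drawn per second
    print(f"{name}: {lines * fps / 1000:.1f} kHz")
```

This yields roughly 15.7, 31.5, 33.8, 45.0, and 67.5 kHz respectively, so a multi-mode CRT deflection circuit has to span more than a 4:1 range of horizontal frequencies.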
tepples

Re: Was VHS really that bad?

Post by tepples »

rainwarrior wrote:I've never even seen an HD CRT in real life.
NovaSquirrel's mom (my aunt) is married to someone who owns one.
interlacing sucks unless your content is <= 30 fps
With certain developers prioritizing lighting complexity over frame rate, <= 30 fps has become common.
Mostly I'm just annoyed that my PS3 defaults to "1080i" as a "superior" supported resolution on my 720p TV, when I'd never want to use 1080i at all. (Gotta disable 1080i every time my PS3 video settings reset.)
If you have a 1080 TV, a game locked to 30 fps, and a cable that can't do "high speed" HDMI, 1080i is superior. You need an HDMI cable rated for high speed to guarantee a stable 1080p 60 Hz picture. This was important back in the early PS3 days when not all HDMI cables in stores were high speed.

Standard HDMI: 720p, 1080i, or 1080p/30
High speed HDMI: 1080p/120 (enough for 3D) or 4K/30
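The cable distinction comes down to TMDS pixel clock: standard-speed (Category 1) cables are only certified to about 74.25 MHz, high-speed (Category 2) cables to about 340 MHz. A rough check against the usual CEA-861 pixel clocks for these modes (my own sketch, not from the post):

```python
STANDARD_MHZ = 74.25   # standard-speed cable certification limit
HIGH_MHZ = 340.0       # high-speed cable certification limit

pixel_clocks = {       # MHz, per the common CEA-861 timings
    "720p60":  74.25,
    "1080i60": 74.25,  # half the active lines per 1/60 s
    "1080p30": 74.25,
    "1080p60": 148.5,
}

for mode, clk in pixel_clocks.items():
    cable = "standard" if clk <= STANDARD_MHZ else "high speed"
    print(f"{mode}: {clk} MHz -> needs {cable} cable")
```

720p60, 1080i60, and 1080p30 all sit exactly at the standard-speed limit, while 1080p60 needs twice the clock, which is why an uncertified early cable could flake out only on 1080p.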
rainwarrior

Re: Was VHS really that bad?

Post by rainwarrior »

tepples wrote:
rainwarrior wrote:interlacing sucks unless your content is <= 30 fps
With certain developers prioritizing lighting complexity over frame rate, <= 30 fps has become common.
I specifically meant content that is a consistent framerate (e.g. film). Games are not that, in general. It's usually an erratic framerate (often locked to no more than 30fps, though some games target 60fps for typical load). An unstable framerate plays terribly with interlacing. Unless the frames are paired consistently it's absolutely awful. Some TVs at least have some form of detection for telecine/etc. that can help with this, but plenty aren't up to the task of re-synching interlaced pairs for an unstable framerate.
tepples wrote:
Mostly I'm just annoyed that my PS3 defaults to "1080i" as a "superior" supported resolution on my 720p TV, when I'd never want to use 1080i at all. (Gotta disable 1080i every time my PS3 video settings reset.)
If you have a 1080 TV, a game locked to 30 fps, and a cable that can't do "high speed" HDMI, 1080i is superior. You need an HDMI cable rated for high speed to guarantee a stable 1080p 60 Hz picture. This was important back in the early PS3 days when not all HDMI cables in stores were high speed.
1080i is not superior to 720p for an unstable framerate. I'd take upscaled 720p over 1080i for games any day of the week.

This is even assuming that the TV displays 1080i at interlaced 60hz, rather than de-interlacing it to 30hz, which plenty of LCD TVs will do.
Last edited by rainwarrior on Mon Jul 27, 2015 7:21 pm, edited 1 time in total.