
Forum - View topic
Answerman - Why Is Old Anime Still Released In Interlaced Format?



Note: this is the discussion thread for this article

Just Passing Through



Joined: 04 Apr 2011
Posts: 277
PostPosted: Thu Sep 01, 2016 7:20 am Reply with quote
jsevakis wrote:


Anime is definitely guilty of that and it drives me up the wall. The BBC stuff, however, is produced in 25 fps, which is not compatible with North American televisions. And while that converts to 29.97i and looks fairly good (on an interlaced-capable monitor), converting that to 24p is just a cluster-f---. You either end up with ghosting, stuttering, or a 4% (audible) slowdown. And in Europe, all TV shows pretty much have to be 25 fps.

I've made some private dub archival Blu-rays of stuff like the Manga UK releases of Patlabor 2 and Space Adventure Cobra. Those were both dubbed at 25 fps, so speeding up the remastered video to 25 fps and then converting to 29.97i was basically the only way I could get an acceptable result.


The one time since buying a Blu-ray player that I deliberately bought a DVD when a BD was available was for the UK version of Life on Mars. They wanted global reach for the Blu-rays and encoded them at 24p for the US market, but the show was shot native 50Hz for PAL broadcast. The 576i 50Hz DVDs are native, while the Blu-rays slow down the audio by 4%.
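The "4%" figure in both posts falls straight out of the frame-rate ratio. A quick back-of-the-envelope sketch (function names are mine, purely for illustration):

```python
import math

def speed_change_percent(src_fps: float, dst_fps: float) -> float:
    """Percentage change in playback speed when src_fps material is
    simply played back at dst_fps (PAL speedup / 24p slowdown)."""
    return (dst_fps / src_fps - 1.0) * 100.0

# Playing ~24 fps film material at PAL's 25 fps speeds everything up ~4.17%;
# slowing 25 fps audio back down for a 24p Blu-ray is the inverse, -4%.
print(f"PAL speedup:  {speed_change_percent(24, 25):+.2f}%")
print(f"24p slowdown: {speed_change_percent(25, 24):+.2f}%")

# The pitch shift that makes the speedup audible if left uncorrected:
semitones = 12 * math.log2(25 / 24)
print(f"Pitch shift:  {semitones:.2f} semitones")
```

This is why the choices really are ghosting/stuttering (frame blending or repetition) or an audible speed change: the rates just don't divide evenly.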

I am so glad that modern anime DVDs are being encoded progressively. The old interlaced stuff is so often hit-and-miss. I actually keep an old CRT TV to watch that stuff on, as it looks pants on an HD panel. But interlaced NTSC is a small issue. Try living with NTSC-PAL standards conversion on all your anime!
gwdone



Joined: 01 May 2008
Posts: 272
PostPosted: Thu Sep 01, 2016 4:48 pm Reply with quote
A very informative article and discussion!!! Thanks!!!
Polycell



Joined: 16 Jan 2012
Posts: 4623
PostPosted: Thu Sep 01, 2016 6:55 pm Reply with quote
Kalessin wrote:
I would have thought that an HD TV would be more flexible with framerates than was the case with CRT displays and that therefore Blu-ray would be more flexible with what it allowed (progressive displays don't have that whole problem with the rate of the electron gun being derived from the voltage and thus forcing the framerate to a particular value). When dealing with progressive displays, it really shouldn't be necessary to force the video into either 24 fps or 29.97 fps when the show is really 25 fps. A computer certainly wouldn't care, so I would have thought that an American HD TV could handle the same 25 fps progressive video that a British one could, in which case it would make the most sense to just put the video at 1080p and 25 fps on the Blu-ray. But maybe they've done something specialized in the TVs that makes them less flexible.
A computer will take any framerate, but it still has to convert it to whatever framerate your monitor uses (usually 60, sometimes 75). Your player and/or TV might be able to do the same thing, but the quality depends on the processing power available and the algorithm used. The only way to guarantee a perfect image for every framerate is to buy a 600Hz TV, which is specifically made to deal with all this.
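The point about mapping every source rate onto a fixed panel refresh can be made concrete. A toy sketch (not any particular TV's algorithm) of naive frame repetition, showing why 24 fps judders regularly on a 60 Hz panel while 25 fps stutters irregularly:

```python
def source_frames(src_fps: int, dst_fps: int, n_out: int):
    """Map each output frame (at dst_fps) to the source frame visible
    at that instant: index = floor(output_time * src_fps)."""
    return [(i * src_fps) // dst_fps for i in range(n_out)]

def cadence(src_fps: int, dst_fps: int, n_src: int):
    """How many times each of the first n_src source frames is repeated."""
    seq = source_frames(src_fps, dst_fps, n_src * dst_fps // src_fps + dst_fps)
    return [seq.count(k) for k in range(n_src)]

# 24 fps on a 60 Hz panel: a steady 3:2 repeat -- regular, mild judder.
print(cadence(24, 60, 8))

# 25 fps on the same panel: an irregular 3,2,3,2,2 repeat, which is why
# PAL-rate material stutters on 60 Hz displays unless the processor does
# something smarter (motion interpolation, or a true 50 Hz mode).
print(cadence(25, 60, 8))
```

600 happens to be a common multiple of 24, 25, 30, 50, and 60, so a 600Hz-capable display can repeat every source frame a whole number of times, with no irregular cadence at all.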
DangerMouse



Joined: 25 Mar 2009
Posts: 3984
PostPosted: Thu Sep 01, 2016 9:12 pm Reply with quote
That was a great read.
HdE



Joined: 17 Nov 2015
Posts: 50
PostPosted: Fri Sep 02, 2016 10:09 am Reply with quote
I love these kinds of articles! There's nobody in the business of disc authoring I trust more than Justin to offer a qualified and authoritative answer to these kinds of questions.

This, though, from way upthread is of particular interest to me:

jsevakis wrote:
DmonHiro wrote:
OK... wait a second. I know all about interlaced anime and how big of a pain in the butt it is to IVTC sometimes, but what's so hard about viewing it? If you're talking about putting a DVD in your computer and watching the VOB files, then most players do the IVTC automatically. I've never seen an anime DVD that didn't look de-interlaced on my computer.

I think the asker was either using VLC, or didn't like how software/automatic GPU deinterlacing looks. And honestly, most graphics cards (ATI in particular) do a pretty crap job of it.


I use VLC a great deal, and I've found I can usually get a watchable result from discs that throw out weird visual artefacting just by experimenting with the de-interlacing options the program gives you. (If in doubt, folks, select Yadif X. Failing that, the phosphor setting on newer versions of VLC smooths out a lot of nastiness.) Very occasionally, though, I'll find one that just looks horrible whatever I do with it.
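For anyone curious what those deinterlacer settings are choosing between, here is a toy sketch of the two basic strategies (bob and weave) that more sophisticated filters like Yadif build on. Frames are just lists of rows here; nothing VLC-specific:

```python
def split_fields(frame):
    """An interlaced frame carries two fields: even rows (top field)
    captured at one instant, odd rows captured a field-period later."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Weave: re-interleave the fields. Perfect when both fields came
    from the same film frame (telecined material); combs on motion."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

def bob(field):
    """Bob: line-double one field to full height. No combing, but it
    halves vertical resolution and can make fine lines bounce."""
    return [row for r in field for row in (r, r)]

frame = [[10], [20], [30], [40]]           # 4-line toy "frame"
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame         # weave reconstructs the original
assert bob(top) == [[10], [10], [30], [30]]  # bob repeats field lines
```

Adaptive deinterlacers essentially pick between these per pixel, weaving static regions and bobbing (or interpolating) moving ones, which is why the results vary so much with the algorithm.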

Side point: Sentai's Dokkoida!? springs to mind. That one gets a lot of comments for how lovely it looks. But there's so much pixellation on screen whenever anything moves, it's ridiculous.
(Disclaimer - I live in the UK and I import a lot of region 1 discs. No idea how my DVD set up might affect how they display.)

Now, I'm pretty sure that a confirmed videophile will be able to tell me why tinkering with VLC's settings isn't a 100% satisfactory solution, but it's usually enough for my needs. I forget the model of my graphics card, but being a graphic designer, everything in my main work PC is pretty beefy. I might have to take a look and see what it is just to assuage my curiosity.
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Sun Sep 04, 2016 3:17 am Reply with quote
Shiflan wrote:
That being said, I've had some personal experience messing with various options to get older video to look good on modern TVs. I have several friends into retro video gaming and this is a major issue for that hobby: many older consoles look great when hooked up to a CRT TV but look awful on a modern TV. I still own a lot of anime on laserdisc and the same problem exists there: the only video output from the player is analog and the signal is interlaced. Some TVs do a better job handling the old-school signal than others. One thing that has worked well for me (and some of my gaming friends) is to use one of the video conversion units made by DVDO in between the old analog device and a modern HDTV with HMDI. The DVDO unit does the signal processing much better than most TVs do, and they are reasonably priced on the used market. Alas that wouldn't help those who want to view older programming on their PC.


What are the most common problems that happen when you hook up an old system to a modern television? I always had the impression that even modern TVs were equipped to handle interlaced signals and that there shouldn't be a problem, light guns notwithstanding.

(I never had the pleasure of hooking an old system to a modern TV, as all of my old systems and games were stolen in a burglary before I could try it.)

q_3 wrote:
Are there shows that have been released on disc that aren't streaming because they're interlaced? I'd be interested to see a list of them - and I'd be more likely to buy the discs if I knew that's all we're likely to get!


Plenty. Remember that we're talking about Hulu here, not some anime-specific streaming service. The majority of TV's history is interlaced, so the majority of TV shows were aired and archived in interlaced formats. It's problematic if you want to watch an older show and it's not that popular.

TheAncientOne wrote:
Kalessin wrote:

This is the first that I've ever heard of HD TVs being anything other than purely progressive. What do they do, have some mode where they refresh only every other line on the screen to emulate a CRT display?

HDTVs (at least those with a tuner) have to be able to process an interlaced signal, as the 1080 broadcast format is interlaced (only 720 is progressive for OTA signals). How that interlaced input is actually displayed is up to the manufacturer (or at least the manufacturer of the chipset the TV manufacturer uses).


I kind of assumed every HD TV that has a composite port would have to be able to handle an interlaced signal, since when that was the dominant kind of plug, progressive signals weren't very popular yet, and they have to take into account that someone will try to plug a device from the SD era into it using the composite port.

And every HD TV I've handled has at least a composite port. Some even have coaxial, like my dinky 14" Polaroid from 2008. I have never seen an HD TV that only has component and/or HDMI.
Shiflan



Joined: 29 Jul 2015
Posts: 418
PostPosted: Sun Sep 04, 2016 8:42 pm Reply with quote
leafy sea dragon wrote:
What are the most common problems that happen when you hook up an old system to a modern television? I always had the impression that even modern TVs were equipped to handle interlaced signals and that there shouldn't be a problem, light guns notwithstanding.


I've never had an older system simply "not work" on a modern TV. That being said, some TVs (and the rest of the setup) work much better than others. These are the main issues as I know them:

1) Chrominance-luminance separation. On older media like VHS and LD, the analog video signal carried both the chrominance (color) and luminance (brightness) data combined together. If I understand correctly, this dates back decades to the advent of color TV, when the color signal was superimposed on the brightness signal already in use for B&W TV. Before the signal can be displayed on a digital TV, the two must be separated. This is done using a "comb filter". Back in the days before HDMI was the norm, manufacturers spent quite a lot of effort on the quality of the comb filter, as it was critically important to picture quality. These days most hookups are via HDMI, so the comb filter is more of an afterthought than anything else. The best comb filters (that I am aware of) are no longer in production; they were found on certain high-end Japanese LD players that were never sold in the west, and also in the Mitsubishi Diamondscan rear-projection "big screen" TVs. Many modern TVs are simply not very good at this.

2) Digital processing. Justin touched on this a bit in his article. Modern TVs contain what is essentially a computer. It does such things as convert interlaced signals to progressive and upscale video to whatever the native resolution of the LCD panel is. These video processors are actually very powerful these days and also do things like dynamic brightness control, edge enhancement, or even attempting to remove macroblocking from a poorly compressed DVD. The problem is that the software these systems run is optimized for "real life" video rather than animation. A TV that renders a Hollywood movie perfectly might do a horrible job displaying even simple animation.

3) Not really relevant for watching anime, but the aforementioned video processing "computer" in the TV takes a little while to render frames. Thus there is a slight lag between the output signal of the player (or video game console) and the moment that signal is shown on the screen. For video games that require precise timing, that can cause problems because what you're seeing on the TV is a fraction of a second behind what's actually going on in the console.
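The comb-filter idea in point 1 can be sketched with numbers. This is a deliberately simplified model of a basic 1-line comb filter, assuming vertically constant picture content and a chroma subcarrier whose phase inverts from one scanline to the next:

```python
# Toy model of a composite scanline: luma Y plus chroma C, with the
# chroma phase flipping on successive lines:
#   line_n   = Y + C
#   line_n+1 = Y - C   (when picture content doesn't change vertically)
# Averaging adjacent lines cancels C (leaving luma); differencing
# cancels Y (leaving chroma). That's the comb filter in miniature.

def comb_filter(line_a, line_b):
    y = [(a + b) / 2 for a, b in zip(line_a, line_b)]
    c = [(a - b) / 2 for a, b in zip(line_a, line_b)]
    return y, c

Y = [50, 50, 50]
C = [7, -7, 7]
line1 = [y + c for y, c in zip(Y, C)]
line2 = [y - c for y, c in zip(Y, C)]

y_out, c_out = comb_filter(line1, line2)
assert y_out == [50.0, 50.0, 50.0]   # recovered luma
assert c_out == [7.0, -7.0, 7.0]     # recovered chroma
```

The catch is that when the picture *does* change between lines (vertical detail), the cancellation is imperfect, producing dot crawl and rainbow artifacts; that is exactly why the better 2D/3D adaptive comb filters in high-end gear mattered so much.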
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Sun Sep 04, 2016 9:13 pm Reply with quote
Thank you! That actually makes a lot of sense. Regarding comb filters, are they physical devices placed into a TV or a program for the computer inside the TV to do? If it's the latter, couldn't they reuse existing comb filters? If it's the former, is it simply just not worth the money to manufacture good quality ones anymore?
Galap
Moderator


Joined: 07 Apr 2012
Posts: 2354
PostPosted: Mon Sep 05, 2016 3:13 am Reply with quote
God damn interlacing, and doing inverse telecine. I remember capturing obscure anime off of VHS, and it was pretty demanding to figure out how to do that and have it look good.

I remember the solution ended up being using Donald Graft's deinterlacing filter, plus some other manipulations. Ended up working out but figuring it out was an involved process.
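For readers who haven't fought with this: inverse telecine undoes the 2:3 pulldown that turned 24 fps film into 29.97i video. A toy sketch of the idea (real IVTC filters, like the one mentioned above, match fields by image similarity rather than by convenient labels):

```python
def telecine(film):
    """2:3 pulldown: 4 film frames -> 10 fields -> 5 interlaced frames,
    in the classic AA BB BC CD DD field pattern. Fields are labeled
    (frame_id, 't'/'b') so the toy IVTC below can re-pair them."""
    a, b, c, d = film
    return [
        ((a, 't'), (a, 'b')),   # clean: both fields from frame A
        ((b, 't'), (b, 'b')),   # clean: frame B
        ((b, 't'), (c, 'b')),   # dirty: B-top + C-bottom (combs on motion)
        ((c, 't'), (d, 'b')),   # dirty: C-top + D-bottom
        ((d, 't'), (d, 'b')),   # clean: frame D
    ]

def ivtc(interlaced):
    """Rebuild progressive frames by re-pairing top/bottom fields with
    matching frame ids -- the essence of what an IVTC filter searches for."""
    tops = {t[0] for t, _ in interlaced}
    bots = {b[0] for _, b in interlaced}
    return sorted(tops & bots)

video = telecine([0, 1, 2, 3])
assert len(video) == 5              # 5 interlaced frames in
assert ivtc(video) == [0, 1, 2, 3]  # 4 clean film frames back out
```

The hard part in practice is that the cadence breaks at every edit point, and anime often mixes telecined footage with true 30 fps material in the same episode, which is why getting VHS captures to look right was such an involved process.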
Shiroi Hane
Encyclopedia Editor


Joined: 25 Oct 2003
Posts: 7580
Location: Wales
PostPosted: Wed Sep 07, 2016 1:42 pm Reply with quote
"although Crunchyroll will put up with some frame blending if that's all that's available"

This is giving me flashbacks to some of their catalogue shows that looked like they could well have been ripped from the DVDs themselves. Like https://goo.gl/photos/LmaT8i7jV51n2gkNA (could be worse; these screenshots are from some episodes I purchased from Blinkbox, provided by GONG)

DmonHiro wrote:
Now.... if you're trying to encode an old anime DVD to a mkv or mp4, and the DVD is interlaced... have fun in hell.

It's even more fun if the NTSC video has been converted to PAL. I keep a collection of my favourite OP/EDs on a memory card for viewing on long journeys etc and some like Konomimi have really suffered (plus I've never had any joy getting de-interlacing to work on Android).

Kalessin wrote:
I can understand leaving the interlacing in when producing a DVD, because it's SD, and your typical SD TV is a CRT and deals with interlaced video. But I don't see how it makes any sense on Blu-ray, regardless of the original content. HD displays aren't CRTs, and so they're progressive, not interlaced

I'd expect the majority of SD TVs are flat panels these days, and there definitely have been HD CRTs - they showed them off on BBC Tomorrow's World in the 80s!

jsevakis wrote:
Yeaaaah noooooo. That won't work at all, guys. TVs are VERY strict with their supported frame rates. Nearly all TVs sold outside of PAL countries will outright refuse a 25 fps signal.

I remember finding out that PS3s will refuse to play a PAL DVD, even if it is Region 1 or region free. I remember looking into importing a PS3 when I first got one, but in order to play games, DVDs and BDs from the UK, Japan and the US I would have needed three separate consoles (a Japanese console would play BDs from the US and Japan and DVDs from the UK and Japan, but not BDs from the UK or DVDs from the US, etc.).
I should imagine that any TV model sold worldwide would support both 50 and 60Hz rather than having multiple different hardware versions (I guess larger manufacturers might have region-specific firmware, but I wouldn't be surprised if a lot of cheap OEM models didn't have anything that fancy).
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Wed Sep 07, 2016 10:47 pm Reply with quote
Shiroi Hane wrote:
I'd expect the majority of SD TVs are flat panels these days, and there definitely have been HD CRTs - they showed them off on BBC Tomorrow's World in the 80s!


I remember seeing those things in the 90's too. My father liked to stay on top of technological releases, so he'd watch shows, read magazines, and go to conventions about not-quite-consumer-yet machines. The consensus on HD CRTs was "too expensive."

Were they ever released into the general consumer markets? I never saw any HD CRTs for sale, only at those conventions.
Alan45
Village Elder



Joined: 25 Aug 2010
Posts: 9873
Location: Virginia
PostPosted: Thu Sep 08, 2016 7:37 am Reply with quote
leafy sea dragon Wrote:
Quote:
Were they ever released into the general consumer markets? I never saw any HD CRTs for sale, only at those conventions.


It depends on your definition of an HD CRT. I had a 36" standard-aspect Sony CRT TV that was "HD ready". The tuner did not handle HD broadcasts, which didn't matter, as during its lifetime only sports shows were broadcast in HD and the cable box superseded the tuner in any case. However, when used with a Blu-ray player it would show the content in HD. Wide-aspect shows were letterboxed, of course.

The interesting thing is that since picture size is measured on the diagonal, I had to go to a 46 inch wide screen set to maintain the same size when watching standard aspect shows.
dragonrider_cody



Joined: 14 Jun 2008
Posts: 2541
PostPosted: Thu Sep 08, 2016 12:21 pm Reply with quote
leafy sea dragon wrote:
Shiroi Hane wrote:
I'd expect the majority of SD TVs are flat panels these days, and there definitely have been HD CRTs - they showed them off on BBC Tomorrow's World in the 80s!


I remember seeing those things in the 90's too. My father liked to stay on top of technological releases, so he'd watch shows, read magazines, and go to conventions about not-quite-consumer-yet machines. The consensus on HD CRTs was "too expensive."

Were they ever released into the general consumer markets? I never saw any HD CRTs for sale, only at those conventions.


I know at least Sony brought a few HD CRT models to market. We had them for sale at Best Buy. The picture was actually pretty good, and they were considerably cheaper than the LCD and plasma models available at the time.
Shiflan



Joined: 29 Jul 2015
Posts: 418
PostPosted: Sat Sep 10, 2016 8:31 am Reply with quote
leafy sea dragon wrote:
Regarding comb filters, are they physical devices placed into a TV or a program for the computer inside the TV to do? If it's the latter, couldn't they reuse existing comb filters? If it's the former, is it simply just not worth the money to manufacture good quality ones anymore?


Originally they were dedicated IC chips installed somewhere in either the player or the television. A player using a "composite" video cable would send the combined signal to the TV and then the TV's built-in comb filter was used. That setup would have been most common for typical consumer A/V gear. If the connection was done by either S-video or Component cables then the comb filter would be built into the player. Those were found on some higher-end consumer gear.

It is not always obvious which is the better setup. S-video was marketed as an "upgrade" from composite, but if your TV happened to have a better comb filter than your player, then the opposite was true.

Nowadays this is largely moot. DVDs and Blu-rays encode the chroma and luminance signals separately, so there is no need for a comb filter at all. The only reason they still exist is for backwards compatibility. But it seems that feature will likely disappear soon, as analog audio and video are pretty much dead in the mainstream market. Earlier this year I went looking for a Blu-ray player that had an analog audio output on it so I could run the signal through my 2-ch stereo. It was very difficult to find such a thing, even though it used to be so common.

The best comb filters that I am aware of were from the Japanese MUSE HD-LD players (yes, there was such a thing as HD laserdisc!) and the Mitsubishi TVs I mentioned above. Those were different because they had implemented a particular patent which had just become available. But I don't think there was much demand to keep pushing the envelope on filter design since DVD was the new thing and didn't need one.

These days TVs have a powerful computer inside which handles all the video processing. Surely such a chip has the "horsepower" to do a better job than the best tech of 1995. But I don't think the motivation exists for any of the makers to invest in furthering the R&D for such a niche thing. The modern video processing engines are geared around real-life video, not animation.
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Sun Sep 11, 2016 3:48 am Reply with quote
All right, thanks for the answer. So the reason is that there isn't much demand for backwards compatibility with older devices anymore? (I'd guess this will be problematic for retro gamers who want to play on the systems the games were originally on. Certainly, I have a number of older game systems, and their lack of compatibility with newer TVs worries me.)