
Forum - View topic
Answerman - Is It Worth Seeing 35mm Prints of Ghibli Movies?


Goto page Previous  1, 2, 3  Next

Note: this is the discussion thread for this article

Anime News Network Forum Index -> Site-related -> Talkback
View previous topic :: View next topic  
Author Message
configspace



Joined: 16 Aug 2008
Posts: 3717
PostPosted: Thu Dec 14, 2017 3:51 am Reply with quote
Although I don't go to theaters often these days, I still have vivid memories of old analog projected film. And all I can say is thank goodness theaters have moved to digital projection. That old analog "warmth" just meant dull colors, very dim brightness, soft and smudgy details rather than sharpness, and grain and random visible dirt and hairs on screen.

Watching the same movies I saw in theaters back then, now on Blu-ray or in 4K on a nice TV or monitor, just reminds me of how bad film was. I remember watching the X-Files movie (on film) in the theater and thinking to myself, man, this looks like crap.

For something that was supposed to capture real life more closely, film ironically ended up preserving images with less fidelity than modern digital means. Take grain, for instance: what's with the nostalgia for grain? If I ever saw grain in my vision, I'd be off to the doctor quick. It's even more ironic for anything produced in the last few decades, since those films are mastered, edited, or post-produced with digital technologies.

Since the question specifically mentioned titles like Mononoke, and Ghibli has used digital painting from Mononoke on, I would say that no, it's not worth spending any money to see it on film.
Back to top
View user's profile Send private message
fuuma_monou



Joined: 26 Dec 2005
Posts: 1817
Location: Quezon City, Philippines
PostPosted: Thu Dec 14, 2017 4:00 am Reply with quote
configspace wrote:
For something that was supposed to capture real life more closely, film ironically ended up preserving images with less fidelity than modern digital means. Take grain, for instance: what's with the nostalgia for grain? If I ever saw grain in my vision, I'd be off to the doctor quick. It's even more ironic for anything produced in the last few decades, since those films are mastered, edited, or post-produced with digital technologies.


If it was shot on film, it'll have grain. Getting rid of the grain when making HD masters via DNR (digital noise reduction) more often than not makes the picture look worse.
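To make that trade-off concrete, here's a toy sketch in plain Python. A simple moving-average filter stands in for a real DNR pass (actual denoisers are far more sophisticated, and the names here are just illustrative): the filter has no way to tell grain apart from fine picture detail, so knocking down one smears the other.

```python
import math
import random

# Toy 1-D "scanline": fine picture detail (a high-frequency sine) plus film grain (noise).
random.seed(0)
N = 500
detail = [0.2 * math.sin(2 * math.pi * 40 * i / N) for i in range(N)]
grain = [random.gauss(0, 0.1) for _ in range(N)]
scanline = [d + g for d, g in zip(detail, grain)]

def box_filter(signal, taps=9):
    """Moving-average filter: a crude stand-in for digital noise reduction."""
    half = taps // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def rms(values):
    """Root-mean-square amplitude, used here to measure 'how much signal'."""
    return math.sqrt(sum(v * v for v in values) / len(values))

denoised = box_filter(scanline)

print(round(rms(grain), 3))                # grain level before filtering
print(round(rms(box_filter(grain)), 3))    # grain is strongly reduced...
smear = [a - b for a, b in zip(box_filter(detail), detail)]
print(round(rms(smear), 3))                # ...but the fine detail gets smeared too
```

The same thing happens in two dimensions: the more aggressively a transfer chases the grain, the waxier faces and textures end up looking.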
configspace



Joined: 16 Aug 2008
Posts: 3717
PostPosted: Thu Dec 14, 2017 4:10 am Reply with quote
@ fuuma_monou

Oh yeah, I understand that, but that's just a matter of choosing the least bad result. I meant to convey that grain is anti-fidelity, as my argument against such sentimental nostalgia for it. The original source image that film attempts to capture -- the light coming through the lens from the real-life shot, or the animation cels themselves -- doesn't have any grain (notwithstanding any deliberate grain SFX in modern movies).
Frenzie



Joined: 08 Sep 2017
Posts: 11
PostPosted: Thu Dec 14, 2017 6:35 am Reply with quote
configspace wrote:
Oh yeah, I understand that, but that's just a matter of choosing the least bad result. I meant to convey that grain is anti-fidelity as my argument against such sentimental nostalgia for it.

As a corollary to the grain thing, I've also never understood the low-framerate fetish. My memory of watching Jurassic Park in '93 is that it was without a doubt the best and most exciting movie I'd ever seen -- but with one big blemish on my memory of the event! Some of the big moving shots practically made me feel nauseated, or at the very least like something was very, very wrong with me. Yet now people complain about the lack of this unpleasant effect, claiming it makes a movie without it feel less "theatrical" and more "like a soap opera."
Ouran High School Dropout



Joined: 28 Jun 2015
Posts: 440
Location: Somewhere in Massachusetts, USA
PostPosted: Thu Dec 14, 2017 10:05 am Reply with quote
Ouran High School Dropout wrote:
For any 35mm material that old, my concern here would be decay of the film emulsion, sometimes called "vinegar syndrome". After 20 years, I'd expect the image to be decomposed, especially if the print was carelessly stored.

Silly me. Vinegar syndrome and dye fading are two separate issues. Should have made that clear.

Vinegar syndrome refers to the chemical breakdown of the acetate (safety film) base, and is named from the odor it gives off. The film base first becomes brittle, then the emulsion (the layer containing the image) separates from the base as the latter shrinks.

Dye fading happens to the emulsion layer. Since the dyes use organic compounds, they are prone to decay. The yellow and cyan dyes fade quicker, rendering the image on a positive print a garish collection of reds, browns, and oranges.

I've handled film with both these conditions. Not pretty.
Blood-
Bargain Hunter



Joined: 07 Mar 2009
Posts: 23769
PostPosted: Thu Dec 14, 2017 10:31 am Reply with quote
For me, it's not really an issue whether the presentation is 35mm or digital; I just like seeing Ghibli movies (or any other anime film, for that matter) on the big screen, and I take advantage of the opportunity any time I can.
PurpleWarrior13



Joined: 05 Sep 2009
Posts: 2025
PostPosted: Thu Dec 14, 2017 2:09 pm Reply with quote
configspace wrote:
@ fuuma_monou

Oh yeah, I understand that, but that's just a matter of choosing the least bad result. I meant to convey that grain is anti-fidelity, as my argument against such sentimental nostalgia for it. The original source image that film attempts to capture -- the light coming through the lens from the real-life shot, or the animation cels themselves -- doesn't have any grain (notwithstanding any deliberate grain SFX in modern movies).


I always liked the texture that film grain gives a movie, especially on smaller formats like 16mm. For me, it's not just about how something looks. It's how the image makes me feel. It's the same reason some cinematographers choose a softer focus. When I was shooting my short film, I wanted a thick and noticeable grain texture to give the image a hint of darkness. Even though grain texture doesn't exist in real life, it still gives me a connection to reality on an emotional level. If an image doesn't have any trace of grain, it looks too slick and polished to me. There's something artificial about it. The recent movie Lady Bird was (apparently) shot digitally, but the image is still heavily textured with grain because the director, Greta Gerwig, wanted the movie to "feel like a memory." Soft focus is often intentionally used to give an image a "dream-like" quality, and grain can be used for a similar effect. It's all about what you want, artistically.

It’s not even about “nostalgia” for me, because I grew up watching movies on VHS, which never displayed much grain texture. It’s not something I picked up on until maybe 10 years ago.

Animation is a bit different though, and of course I’m talking about photography, not projection.
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Thu Dec 14, 2017 8:01 pm Reply with quote
configspace wrote:
Although I don't go to theaters often these days, I still have vivid memories of old analog projected film. And all I can say is thank goodness theaters have moved to digital projection. That old analog "warmth" just meant dull colors, very dim brightness, soft and smudgy details rather than sharpness, and grain and random visible dirt and hairs on screen.

Watching the same movies I saw in theaters back then, now on Blu-ray or in 4K on a nice TV or monitor, just reminds me of how bad film was. I remember watching the X-Files movie (on film) in the theater and thinking to myself, man, this looks like crap.

For something that was supposed to capture real life more closely, film ironically ended up preserving images with less fidelity than modern digital means. Take grain, for instance: what's with the nostalgia for grain? If I ever saw grain in my vision, I'd be off to the doctor quick. It's even more ironic for anything produced in the last few decades, since those films are mastered, edited, or post-produced with digital technologies.

Since the question specifically mentioned titles like Mononoke, and Ghibli has used digital painting from Mononoke on, I would say that no, it's not worth spending any money to see it on film.


Your description of the "warmth" of film sounds a lot like my description of the "warmth" of incandescent lights. (Minus the grain, of course, but with a much shorter lifespan than LEDs instead.)

Reading arguments from said incandescent fans, it may be that they want to see the grain and blurriness and such because they believe it makes the picture look more natural. A big part of why those incandescent fans dislike LED lighting is that they find it cold, sterile, and mechanical; they'll light their houses with incandescents for as long as they can because they want that old-fashioned, homey feel to their lives.

(I grew up in the age of the iPod though, so I actually prefer the cold, sterile, and mechanical look to things, of which I would prefer the adjectives "sleek, cutting-edge, and stylish.")
Alan45
Village Elder



Joined: 25 Aug 2010
Posts: 9839
Location: Virginia
PostPosted: Thu Dec 14, 2017 8:14 pm Reply with quote
@leafy sea dragon

For home use they make LED lights in warm white as well as the original cool white. While this undercuts the "cold and sterile" argument, I'm sure they will find another excuse not to change, and to damn the government for insisting.
StudioToledo



Joined: 16 Aug 2006
Posts: 847
Location: Toledo, U.S.A.
PostPosted: Thu Dec 14, 2017 9:27 pm Reply with quote
PurpleWarrior13 wrote:
Film projection is a different story. I would still argue that the image on a fresh print looks better than digital, especially in IMAX 70mm, but it’s such a nightmare to deal with, there really is less of a need for film today. I work in a movie theater that converted to full digital back in 2012 (the last 35mm show was Battleship). It definitely has its own unique problems. Every once in a blue moon, the computer will freeze, and show’s over. The audio needs constant resetting, or else it will sound very tinny and distorted. Sometimes the system crashes before a show, and the audience is treated to a desktop background before everything is reset.

I think anything is prone to technical difficulties one way or another.

Quote:
However, these problems are relatively small compared to the issues with film, like Justin mentioned. I’ll never forget watching Finding Nemo when I was 9, and being distracted by the constant dirt and scratches, just because we saw the movie a month after it came out. A manager where I work says that when he worked at a different theater, someone loaded Star Wars: The Phantom Menace into the projector incorrectly on opening night. The first show played just fine, but every show after had a neon green scratch running along the side. Oops. My film professor used to work at a cinema, and says that when Titanic was playing, they ran the prints so much, they would literally fall apart in the projector, and they had to order new ones from Paramount.

I'm sure the stories from "reduced price" cinemas are even more telling of the degraded state of these prints once they left the first-run houses. That was all my mom could afford to take me to, and I learned to enjoy the slightly dingy quality of it -- but then, that's all we knew back in those days. TV kinda conditioned us to this reality...
https://www.youtube.com/playlist?list=PLTnbwiCw-mMQ0Pak82XOKJVchQQ5k4a7z

Quote:
But I love movies, and I love film. I would love to see another show in 35mm (I think the last one I saw was Scream 4) for the novelty alone. The only anime movie I ever saw in 35mm was Yu-Gi-Oh! The Movie, but I’ve seen plenty of Miyazaki movies through digital projection, and they look gorgeous.

This is why I'm glad I got a good two decades out of going to movies in the 20th century.

fuuma_monou wrote:

If it was shot on film, it'll have grain. Getting rid of the grain when making HD masters via DNR (digital noise reduction) more often than not makes the picture look worse.

Thank you.

Frenzie wrote:
configspace wrote:
Oh yeah, I understand that, but that's just a matter of choosing the least bad result. I meant to convey that grain is anti-fidelity as my argument against such sentimental nostalgia for it.

As a corollary to the grain thing, I've also never understood the low-framerate fetish. My memory of watching Jurassic Park in '93 is that it was without a doubt the best and most exciting movie I'd ever seen. But with one big blemish on my memory of the event! Some of the big moving shots practically made me feel nauseated, or at the very least like something was very, very wrong with me. Yet now people complain at the lack of this unpleasant effect claiming it makes a movie without it feel less "theatrical" and more "like a soap opera."

The so-called "low framerate" was basically a limitation film stuck with for a long while, especially once sound became prominent. The real takeaway is in how our eyes work: the trick that made motion pictures function is that the retina retains an image briefly after it leaves our vision. This was the foundation of animation itself, going back to familiar devices like the zoetrope, the praxinoscope, or even flipbooks. The German word for animation has been written as "Trickfilm", and in some ways it's the "trick" that film plays that we accept, even at low frame rates. That "trick" is sort of leaving us, even if gradually.

Ouran High School Dropout wrote:
Ouran High School Dropout wrote:
For any 35mm material that old, my concern here would be decay of the film emulsion, sometimes called "vinegar syndrome". After 20 years, I'd expect the image to be decomposed, especially if the print was carelessly stored.

Silly me. Vinegar syndrome and dye fading are two separate issues. Should have made that clear.

Vinegar syndrome refers to the chemical breakdown of the acetate (safety film) base, and is named from the odor it gives off. The film base first becomes brittle, then the emulsion (the layer containing the image) separates from the base as the latter shrinks.

Dye fading happens to the emulsion layer. Since the dyes use organic compounds, they are prone to decay. The yellow and cyan dyes fade quicker, rendering the image on a positive print a garish collection of reds, browns, and oranges.

I've handled film with both these conditions. Not pretty.

Well, at least we weren't discussing nitrate-based film! Now that's a story all its own.

When it comes to the fading of those days, there were attempts at lowering the rate at which it happens, or making it virtually non-existent. There are several types out there, ranging from the crappy "Eastman" stock all the way to the IB Technicolor process. AGFA and Fuji made great stocks, from what I've collected.

leafy sea dragon wrote:

(I grew up in the age of the iPod though, so I actually prefer the cold, sterile, and mechanical look to things, of which I would prefer the adjectives "sleek, cutting-edge, and stylish.")

For me, this was my childhood! I defy anyone who thinks they had it best.


Alan45 wrote:
@leafy sea dragon

For home use they make LED lights in Warm White as well as the original cool white. While this undercuts the argument about "cold and sterile" I'm sure they will find another excuse to not change and to damn the government for insisting.

For me, I've got those warm LEDs in my room. Using a "cool white" bulb just feels too clinical, and I don't like being in hospitals.
leafy sea dragon



Joined: 27 Oct 2009
Posts: 7163
Location: Another Kingdom
PostPosted: Thu Dec 14, 2017 11:35 pm Reply with quote
Alan45 wrote:
@leafy sea dragon

For home use they make LED lights in Warm White as well as the original cool white. While this undercuts the argument about "cold and sterile" I'm sure they will find another excuse to not change and to damn the government for insisting.


Yeah, they even make some that are designed to fade in and out when turned on or off, rather than the suddenness characteristic of LEDs, as that fade is also a major part of the appeal to incandescent fans.

The fading is done by an additional piece of hardware in the socket that detects when the light has been turned on or off and controls the flow of electricity accordingly. (Hence, you can also control how long the fade takes.) This makes these LEDs a lot more expensive, but yeah, they're selling to a diehard niche audience.

StudioToledo wrote:
For me, this was my childhood! I defy anyone who thinks they had it best.

For me, I got those Warn LED's in my room. Using out a "Cool White" bulb just feels too clinical, I don't like being in hospitals.


Though not QUITE that old, I actually have a Sony Trinitron TV from 1978 in the dining room, a hand-me-down from the previous generation of my family. I've hooked up things as recent as an Xbox 360 to it, and it's more complicated than I'd like, because most electronics dropped coaxial support long ago.

A corollary to using old TVs, especially with video capture devices attached, is that I've picked up a lot of knowledge about installing audio/video equipment. Not as much as someone who does it for a living, of course, but people have repeatedly relied on me to hook things up to their TVs and sound systems, far more than I would have expected. When I worked in a thrift store, I became the go-to person for any questions about connecting these things (I also learned that a LOT of people have trouble with component and composite cables -- I could see their "Why didn't I think of that?" expressions when I told them to plug each connector into the socket of the same color), and I did a lot of demonstrations on request.

As for "warm white" versus "cool white," I'd be part of the group that prefers "cool white." I just like the really clean look it has, and since my hobbies tend to involve precise use of color, I want my lighting as bright white as possible so colors are easiest to tell apart. The yellower a light becomes, the darker blues get, for instance, and the harder it is to tell whites from yellows, especially off-whites from pale yellows. (Someone I know solves Rubik's cubes, for instance. She had to do it at least once in an area with very yellow light, and she ran into a lot of trouble because the white and yellow faces merged into a similar yellow, blue and green merged into a similar vague dark color, and red and orange merged into a vermillion.)
FLCLGainax





PostPosted: Fri Dec 15, 2017 3:50 pm Reply with quote
I noticed on IFC Center's website (NYC), only a couple of their Ghibli film screenings are 35mm while the rest are Digital Cinema Package. Hmmm...
zaphdash



Joined: 14 Aug 2002
Posts: 620
Location: Brooklyn
PostPosted: Sat Dec 16, 2017 12:52 pm Reply with quote
configspace wrote:
Although I don't go to theaters often these days, I still have vivid memories of old analog projected film. And all I can say is thank goodness theaters have moved to digital projection. That old analog "warmth" just meant dull colors, very dim brightness, soft and smudgy details rather than sharpness, and grain and random visible dirt and hairs on screen.

Watching the same movies I saw in theaters back then, now on Blu-ray or in 4K on a nice TV or monitor, just reminds me of how bad film was. I remember watching the X-Files movie (on film) in the theater and thinking to myself, man, this looks like crap.

For something that was supposed to capture real life more closely, film ironically ended up preserving images with less fidelity than modern digital means. Take grain, for instance: what's with the nostalgia for grain? If I ever saw grain in my vision, I'd be off to the doctor quick. It's even more ironic for anything produced in the last few decades, since those films are mastered, edited, or post-produced with digital technologies.

Since the question specifically mentioned titles like Mononoke, and Ghibli has used digital painting from Mononoke on, I would say that no, it's not worth spending any money to see it on film.

You're obviously entitled to your own opinions about how you'd like an image to look, but your fundamental premise that the purpose of photography or cinematography is "to capture real life more closely" is wrong. The fact that photography can more faithfully reproduce a realistic image than, say, painting or animation doesn't make faithful reproduction its sole overriding purpose. Filmmakers very frequently manipulate the image and intentionally diminish its realism (sometimes subtly, sometimes in obvious ways) to achieve whatever it is they're going for. I think this is more or less what PurpleWarrior was trying to get at in describing why people might like grain (also pointing out the use of soft focus, among any number of other techniques), but (s)he stopped short of saying it flat out. And this isn't limited to photography or movies -- musicians, for instance, also often make a deliberate choice to go lo-fi. Fidelity (or lack thereof) is just another attribute among many that the artist plays with to create what they want to create.
Frenzie



Joined: 08 Sep 2017
Posts: 11
PostPosted: Sat Dec 16, 2017 2:39 pm Reply with quote
StudioToledo wrote:

The so-called "low framerate" was basically a limitation film stuck with for a long while, especially once sound became prominent. The real takeaway is in how our eyes work: the trick that made motion pictures function is that the retina retains an image briefly after it leaves our vision. This was the foundation of animation itself, going back to familiar devices like the zoetrope, the praxinoscope, or even flipbooks. The German word for animation has been written as "Trickfilm", and in some ways it's the "trick" that film plays that we accept, even at low frame rates. That "trick" is sort of leaving us, even if gradually.

Correct me if I'm wrong, but hasn't the 24 fps of movies "always" been 72 Hz? By which I mean the framerate of the film is 24 fps but each frame is displayed three times.

It only just now occurred to me that 48 or 60 fps could also feel like a downgrade from 72 Hz if they are displayed at 48 Hz and 60 Hz respectively. Conversely, maybe it just subconsciously brings to mind @#$@#$ ugly interpolation like on some TVs.
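The shutter arithmetic behind that, as a quick sketch (the helper function here is hypothetical, and real projectors vary in blade count -- but 24 fps through a multi-bladed shutter is the usual setup):

```python
def flash_rate(fps: float, flashes_per_frame: int) -> float:
    """Light pulses per second the audience sees: frame rate times shutter blades."""
    return fps * flashes_per_frame

# 24 fps film through a 3-blade shutter flashes at 72 Hz...
print(flash_rate(24, 3))   # 72.0 is printed as 72
# ...a 2-blade shutter gives 48 Hz from the same 24 frames...
print(flash_rate(24, 2))
# ...while 48 fps HFR shown with one flash per frame is also 48 Hz.
print(flash_rate(48, 1))
```

So a 48 fps presentation flashed once per frame can indeed sit below the 72 Hz flash rate of a triple-flashed 24 fps print, even though it carries twice the motion information.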

Anyway, I love old movies and the feel of old movies in new movies, don't get me wrong. I think replicating it (whether simply by using film or as an effect) can be great. What I don't get is the "it's 48 fps / not grainy enough, therefore it's bad" crowd. :) Perhaps it's more the weird washed-out color grading movies have typically had for the past few years than the technology, a bit like how the late '90s and early 2000s were full of oversaturation (which imo was much better and cheerier than all the drab). But I'm pretty sure that's not caused by a lack of grain. :D
Polycell



Joined: 16 Jan 2012
Posts: 4623
PostPosted: Sun Dec 17, 2017 7:52 pm Reply with quote
Frenzie wrote:
As a corollary to the grain thing, I've also never understood the low-framerate fetish. My memory of watching Jurassic Park in '93 is that it was without a doubt the best and most exciting movie I'd ever seen. But with one big blemish on my memory of the event! Some of the big moving shots practically made me feel nauseated, or at the very least like something was very, very wrong with me. Yet now people complain at the lack of this unpleasant effect claiming it makes a movie without it feel less "theatrical" and more "like a soap opera."
People have been toying with higher frame rates since before there was a standard (24 fps is roughly the minimum needed for decent sound-on-film quality); it has never taken off. Various critics have observed that sitting right at the borderline produces a very different effect than being well above the threshold for flicker fusion, which is part of what makes cinema distinct from jumbotrons.

(And yes, there probably is something wrong with you; I've never once heard of somebody getting sick from 24 fps.)