All the settings in the world won't change the story.
Careful what you wish for, or we might get AI-powered "Vibrant Story" filters that reduce 62 minutes of plot-less filler to a 5 minute summary of the only relevant points. Or that try to generate some logic to make the magic in the story make narrative sense.
I would use this for most reality TV shows.
You'd be better off simply not watching those shows.
As opposed to AI-powered "Hyper Vibrant Story" filters that stretch 5 minutes of plot into 62 minutes of slop.
Yes, once you have the 5 minute summary you can then extend it to however long your Uber is going to take to arrive!
Quibi was ahead of the curve!
Much like a chain of email AI filters that turn short directions into full-fledged emails, that in turn get summarized into short directions on the receiving end.
The "soap opera" effect is real, I don't enjoy it.
Christmas Day, walked into a relative’s living room to watch football and the players were literally gliding across the screen. lol
My TV is from around 2017 and some of those settings definitely suck on it. I'm curious if they have improved any of them on newer TVs.
Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.
Thanks for the thought but from what I’ve heard from friends I’ll be keeping the final season unwatched just like I did with the last 2 episodes of GoT.
It’s very bad.
It really isn't. I keep seeing comparisons to the last seasons of Game of Thrones, but while there is a dip in quality this season, it is nowhere near as bad as what happened to GoT.
GoT got so bad that I don't really have any desire to watch any of the seasons ever again. Killed rewatchability.
Implying that makes a bad season better. When you watch trash, settings don't really matter.
I don't think it implies that at all.
It is perfectly understandable that the people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.
Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.
Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.
I hope AI tools allow for better fan edits. There's enough of a foundation and source footage to redo the later episodes of Stranger Things ... The Matrix ... etc.
Yes, I think that this is one place to be very bullish on AI content creation. There are many people with fantastic visions for beautiful stories that they will never be in a position to create the traditional way; oftentimes with better stories than what is actually produced officially.
(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)
Probably a good time to plug Filmmaker mode!
From what I’ve read, you want to make sure that the setting is spelled FILMMAKER MODE (in all caps) with a (TM) symbol, since that means the body that popularized the setting has approved whatever the manufacturer does when you turn it on (so if there’s a setting called “Cinephile Mode”, that could mean anything).
With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.
Typically, “Game” mode on TVs turns off post-processing to avoid the extra frames of lag it causes.
Wow that CGI creature looks bad. I thought it was from the Stranger Things game.
The soap opera effect is only a problem because no one is used to it. Higher FPS is objectively better. These motion interpolation settings are now ubiquitous and pretty much nobody cares about said effect anymore, which is great, because maybe now we can start having movies above 24FPS.
This article seems to imply that the default settings are the manufacturer-recommended ones for streaming movies - is that bad UX? Should Netflix be able to push recommended settings to your TV?
The problem is it can be subjective. Some people really like the “smooth motion” effect, especially if they never got used to watching 24fps films back in the day. Others, like me, think seeing stuff at higher refresh rates just looks off. It may be a generational thing. Same goes for “vivid color” mode and those crazy high contrast colors. People just like it more.
On the other hand, things that are objective, like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera; it’s really nifty. Lots of people comment on how good the picture on my TV looks, and it’s just because it’s calibrated. It makes a big difference.
Anyways, while I am on my soapbox, one reason I don’t have a Netflix account any more is that you need the highest tier to get 4K/HDR content. Other services like Apple TV and Prime give everyone 4K. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get a better picture when many viewers probably can’t even get 4K/HDR.
Game of Thrones Season 8 was lambasted for having an episode that was mostly in darkness...in 2019.
You'd think television production would be calibrated for the median watcher's TV settings by now.
Totally agreed. I read somewhere that the only place these features help is sports. They should not be defaults. They make shows and films look like total crap.
Actually, they do not belong anywhere. If you look at the processing pipeline necessary to, for example, shoot and produce modern sporting events in both standard and high dynamic range, the last thing you want is a television that makes its own decisions based on some random setting that a clueless engineer at the manufacturer thought would be cool to have. Companies spend millions of dollars (hundreds of millions in the case of broadcasters) to deliver technically accurate data to televisions.
These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.
As someone who has built and operated multi-camera live broadcast systems, you are 100% correct. There is color correction, image processing, and all the related bits. Each of these units costs many times more and is far more capable, with much higher quality (in the right hands), than what is included in even the most high-end TV.
They're the equivalent of the pointless DSP audio modes on '90s A/V receivers. Who was ever going to use "Concert Hall", "Jazz Club", or "Rock Concert", with distracting reverb and echo added to ruin the sound?
Yeah, televisions come full of truly destructive settings. I think part of the genesis of this virus is the need for TVs to stand out at the store. Brands and models are displayed side-by-side. The only way to stand out is to push the limits of over-enhancement along every possible axis (resolution, color, motion, etc.).
Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.
And don't get me started on horrible implementations of HDR.
This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.
What about not filming the entire show in darkness? Or, I don't know, filming it in a way that it will look ok on modern televisions without having to turn off settings.
Or especially... stopping at season 2 of this show.
In some ways, Firefly being canceled was the best thing that ever happened to it.
This is the way.
even better
> filming it in a way that it will look ok on modern televisions without having to turn off settings.
That's a lost cause. You never know what sort of random crap and filters a clueless consumer may inflict on the final picture. You cannot possibly make it look good on every possible config.
What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.
The average consumer either never knew these settings existed, or played around with them once when they set up their TV and promptly forgot. As someone who often gets to set up/fix setups for aforementioned people, I'd say this is a good reminder.
Why should I change my style? Modern TVs are the ones that suck.
I'm not even convinced anyone really watches Stranger Things, so I don't see the point. Seems like something people put on as background noise while they are distracted by their phones.
People were clearly watching through at least season 4. That show used songs that nowadays most viewers would consider to be oldies that became hits again after the episodes containing them were released.
For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit after appearing in season 4.
I think people paid attention to at least season 1 back in the day.
Just for the synth intro
Yeah, kiss m'ass. I agree that some of those settings do need to be turned off. When I visit someone and see their TV on soap opera mode, I fight the urge to fix it. Not my house, not my TV, not my problem if they like it that way, and yet, wow, is it ever awful.
But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions? What if it's not perfectly dark in my house, or I want to watch during the day without closing all the blinds?
I tried watching the ending of Game of Thrones without tweaking my TV. I could not physically see what was happening on the screen, other than that a navy blue blob was doing something against a darker grey background, and parts of it seemed to be moving fast if I squinted. I cranked the brightness and contrast for those episodes so that I could actually tell what was going on. It might not have aligned with the director's idea of how I should experience their spectacle, but I can live with that.
Note that I’d also roll my eyes at a musician who told me how to set my equalizer. I’ll set it as I see fit for me, in my living room’s own requirements, thanks.
100% agree. I’ve tried multiple times to use the cinema modes on my TVs, the ones that are supposed to be “as the director intended”, but in the end they’re always too dark and I find things hard to see. It turns out I just subjectively like the look of movies better in normal mode (or, gasp, sometimes vivid if it’s really bright in the room) than in the “proper” cinema mode. I don’t really care what the creator thinks; it looks better to me, so it’s better for me.
The equalizer analogy is perfect.
> What if I need higher contrast to make out what's happening on the screen?
The point you make isn't incorrect at all. I would say that TVs should ship without any such enhancements enabled. The user should then be able to configure it as they wish.
Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.
Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.
Going back to TVs: they should not ship with spyware, log-ware, behavioral tracking and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.
Etc.
> I would say that TVs should ship without any such enhancements enabled.
I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but something like that. They should not be in “stand out among others on the showroom floor” mode, but set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.