
4K Streaming Subscriptions

alaskanseminole

Anyone pay the extra for them? I upgraded my entire home theater setup over the holiday break. I have a Roku Ultra 4K (limited to 60Hz), an LG OLED 4K, a Sony 4K Blu-ray player, and I upgraded all my HDMI cabling to 2.1 (shockingly, it was all v1.3). Noticeable difference with just the OLED, but when I switched my Netflix subscription to 4K, I really don't notice a difference; granted, I realize not all content is 4K.

What about y'all?
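For rough context on why a 4K stream can underwhelm next to a 4K disc: the pixel count is the same, but the compression usually isn't. A quick back-of-the-envelope sketch (the bitrates below are typical published ballpark figures, assumed for illustration, not measurements from this thread):

```python
# Rough bits-per-pixel comparison of 4K sources (ballpark bitrates, assumed):
#   Netflix 4K stream  ~15 Mbps (Netflix's stated requirement for UHD)
#   UHD Blu-ray video  ~50-100 Mbps
# Same 3840x2160 pixel grid, very different amounts of data per frame.

SOURCES_MBPS = {
    "Netflix 4K stream": 15,
    "UHD Blu-ray (low)": 50,
    "UHD Blu-ray (high)": 100,
}

WIDTH, HEIGHT, FPS = 3840, 2160, 24  # typical film frame rate

for name, mbps in SOURCES_MBPS.items():
    bits_per_frame = mbps * 1_000_000 / FPS
    bits_per_pixel = bits_per_frame / (WIDTH * HEIGHT)
    print(f"{name:20s} {mbps:3d} Mbps -> {bits_per_pixel:.3f} bits/pixel/frame")

# A streamed 4K title ends up with roughly 3-7x less data per pixel than a
# UHD disc, which is one reason fine detail and dark scenes can look better
# from the disc even though both are "4K".
```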
 
Is there a difference between using the LG built-in apps versus the Roku (say, using Netflix on both)?
 
Is there a difference between using the LG built-in apps versus the Roku (say, using Netflix on both)?
For me, yes, because I have a Polk Audio surround system running off a Sony receiver. I guess if I use the LG apps, I'd have to run an optical cable out from the TV. I've just never felt like fishing it through the wall, so everything just runs through the receiver: PS4, Blu-ray, and Roku. Also, FWIW, I really like the Roku interface.
 
Does that make a big difference?

I don't know what version of HDMI cables I have.
The internet tells me "HDMI 2.1 supports resolutions up to 10K, while HDMI 1.4 supports 4K." I don't know if there's even such a thing as "10k" at this point.
 
Does that make a big difference?

I don't know what version of HDMI cables I have.
Very much so. Not so much v2.0 to v2.1, but 1.3 to 2.1 is a big difference. From the Googles:

HDMI 2.1 is a significantly more advanced version of HDMI compared to 1.3. It provides much higher bandwidth, supports higher resolutions and refresh rates, improves audio and gaming features, and offers better compatibility with future tech such as 8K video and dynamic HDR. If you’re using modern devices like 4K or 8K TVs, gaming consoles (e.g., PlayStation 5, Xbox Series X), or high-end sound systems, HDMI 2.1 is the better option.
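To put rough numbers on the bandwidth point: an uncompressed 4K60 signal is more than HDMI 1.3/1.4 can carry, which is why the port/cable generation starts to matter at 4K. A simplified sketch (nominal spec limits and the standard 4K60 timing are assumed; real signaling details differ, especially for HDMI 2.1's FRL mode):

```python
# Approximate HDMI link-budget check (simplified, nominal figures assumed).
# TMDS-based HDMI (through 2.0) carries video on 3 channels with 8b/10b
# encoding, so every 8 bits of pixel data cost 10 bits on the wire.

LINK_LIMITS_GBPS = {
    "HDMI 1.3/1.4": 10.2,
    "HDMI 2.0":     18.0,
    "HDMI 2.1":     48.0,   # FRL signaling; treated here as a raw ceiling
}

def tmds_gbps(h_total, v_total, fps, bits_per_component):
    """Raw link rate for an RGB/4:4:4 signal, including blanking and 8b/10b."""
    pixel_clock = h_total * v_total * fps          # pixels per second
    return pixel_clock * 3 * bits_per_component * 10 / 8 / 1e9

# The standard 4K60 timing is 4400 x 2250 total pixels (3840 x 2160 active).
need_8bit  = tmds_gbps(4400, 2250, 60, 8)    # ~17.8 Gbps
need_10bit = tmds_gbps(4400, 2250, 60, 10)   # ~22.3 Gbps (10-bit HDR)

for version, limit in LINK_LIMITS_GBPS.items():
    ok8  = "OK" if need_8bit  <= limit else "no"
    ok10 = "OK" if need_10bit <= limit else "no"
    print(f"{version}: {limit:4.1f} Gbps | 4K60 8-bit: {ok8} | 4K60 10-bit HDR: {ok10}")
```

(In practice, HDMI 2.0 gear squeezes 4K60 HDR through by subsampling chroma to 4:2:2/4:2:0 rather than raising the link rate; 4K120 and full 10-bit 4:4:4 are where 2.1 becomes necessary.)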

 
Very much so. Not so much v2.0 to v2.1, but 1.3 to 2.1 is a big difference. ...
Luckily I bought all the cables on Amazon. It looks like I've got two that are 2.0 and the rest are 1.4. I suppose I'll need to re-buy them and replace them all.
 
Anyone pay the extra for them? ... when I switched my Netflix subscription to 4K, I really don't notice a difference. What about y'all?
We recently upgraded our main TV to a Samsung OLED, and it is amazing. It makes a massive difference in viewing experience over what we had hanging over the fireplace, and it's even surprisingly better compared to our smaller overflow LG OLED (4 times brighter than the LG).

We have 4K for YTTV. I like the higher number of feeds you get and the 4K content that's available. It's not much content, but the Eagles game yesterday was magnificent in 4K. For Netflix, we also pay for 4K as part of some premium package, which I mainly have for extra users who don't live with me (son in college). Looks good to me, but it depends on the source material. I think 4K just kind of comes with the other services.

Is all of this something I would pay for if I were living on the margins? Eff no. I just got all of this stuff at age 50 with discretionary income.
 
For me, yes, because I have a Polk Audio surround system running off a Sony receiver. I guess if I use the LG apps, I'd have to run an optical cable out from the TV. I've just never felt like fishing it through the wall, so everything just runs through the receiver: PS4, Blu-ray, and Roku. Also, FWIW, I really like the Roku interface.

Yep, I see what you're doing.

What I do is let the TV be the "input switcher," mainly because I'm currently running a much older surround sound amp (a Yamaha that was TOTL in 2008) with only a 2.1 audio setup. I then let the receiver decode all surround sound formats and direct the rear/center channels to my main speakers.

Then I run the TV's optical audio output to the receiver.

I do it this way because, the way I figure it, running everything through the receiver adds another "link in the video signal chain." The TV can pass DD and DTS through to the receiver, where it'll decode them per 2008 standards.

Since I only use my Roku for the media on my home network (I have about 12 TB of video stored there), I can use the TV's built-in apps for streaming YTTV/YT/etc. Now, I have compared the TV apps vs. the Roku apps (say, YouTube TV on the TV vs. on the Roku), and I seem to get more consistent quality from the TV apps (a 2018 Vizio Quantum) than from the Roku.

The PQ just seems to be a tad sharper no matter what video rate I'm streaming, from 480i to 4K.


Logic tends to dictate that there should be no difference, but I kinda believe there is. Like I say, using my receiver as the input switcher makes it another device all signals have to pass through. And having been a Sony snob with regard to their ES line back in the 2000s... Sonys are sometimes just plain finicky, which is why I switched to Yamahas (which have been trouble-free for over a decade now).


Your mileage obviously may vary, and every rig is indeed different. But that's been my experience: the fewer links in the A/V chain, the better.
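One trade-off worth noting with the "TV as the switcher, optical out to the receiver" arrangement: optical (S/PDIF) has enough bandwidth for stereo PCM and the compressed Dolby Digital/DTS bitstreams, but not for the lossless formats, which need HDMI (ideally eARC). A tiny lookup sketch, general knowledge rather than anything from this thread:

```python
# What a TOSLINK/optical S/PDIF link can carry vs. what needs HDMI/eARC.
# Simplified general-knowledge summary; some TVs also transcode or downmix.

FITS_ON_OPTICAL = {
    "PCM 2.0 (stereo)":          True,
    "Dolby Digital 5.1":         True,
    "DTS 5.1":                   True,
    "Multichannel LPCM (5.1+)":  False,  # S/PDIF only fits 2-ch uncompressed PCM
    "Dolby TrueHD":              False,  # lossless, too much data for S/PDIF
    "DTS-HD Master Audio":       False,
    "Dolby Atmos (TrueHD core)": False,  # needs HDMI eARC
}

for fmt, ok in FITS_ON_OPTICAL.items():
    verdict = "optical is fine" if ok else "needs HDMI/eARC (or gets downmixed)"
    print(f"{fmt:26s} -> {verdict}")
```

For a 2008-era receiver that's decoding plain DD/DTS anyway, this costs nothing; it only starts to matter if the chain ever needs to pass the lossless tracks from a Blu-ray player.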
 
All right - I just ordered new cables. Amazon says they'll be here between 2-6pm this afternoon: that's not bad!
Don't expect to be blown away (with just cables); just know you'll be getting all the correct signals to your speakers. I played the intro scene to Star Wars Ep. 3 and could really hear the difference in sound. Then, when I finished the upgrades and put a 4K Blu-ray in on the OLED, man, the black levels were amazing!
 
Yep, I see what you're doing. ... But that's been my experience: the fewer links in the A/V chain, the better.
That's a solid setup as well. I had originally ordered an upgraded Denon receiver so I could get 120Hz, but once I realized my Roku would only do a 60Hz refresh, I canceled the order. I mean, dang, if 60 to 120 is that noticeable, I must have really good eyes. LOL. From what I hear, you'll only notice it watching sports. I watched my Bucs last night and it looked fantastic!
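On the 60 vs. 120 question, the arithmetic is simple: a 120Hz panel holds each frame for half as long, which mainly shows up as crisper fast motion, and most streaming and broadcast content tops out at 60fps anyway. A tiny sketch of the frame times (just arithmetic, no claims about any particular set):

```python
# Frame hold times at common refresh rates.
for hz in (24, 30, 60, 120):
    print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms. The shorter hold time mostly matters for
# fast pans (sports, games); for 24/30 fps film and TV the source frame rate
# is the limit, so a 60 Hz streaming box isn't giving much away there.
```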
 
Don't expect to be blown away (with just cables); just know you'll be getting all the correct signals to your speakers. I played the intro scene to Star Wars Ep. 3 and could really hear the difference in sound. Then, when I finished the upgrades and put a 4K Blu-ray in on the OLED, man, the black levels were amazing!
I think we've reached the point where it's smaller, incremental improvements. Even then, if I've got "better," more modern equipment, then I should at least have the right cables to get the most out of it. I just didn't know enough to know there was a difference; I figured HDMI was HDMI...
 
I pay for the 4K through YTTV.

There are only a handful of games each week that are broadcast in 4K, but the picture is outstanding.

For both my upstairs and downstairs living rooms, I use a 4K Fire Stick and a Sony receiver with HDMI pass-through to handle my surround sound.
 
I'm too cheap and old. Netflix tried to get me to upgrade last night, but I had no interest. Ten-years-ago me would have insisted on it. Getting old is hell.
Netflix forced me to upgrade by eliminating their lowest-cost ad-free plan. A-holes. But I saw no point in upgrading to the most expensive 4K plan. Sure, sometimes you can tell, but mostly I don't notice.
 
What I do is let the TV be the "input switcher," mainly because I'm currently running a much older surround sound amp (a Yamaha that was TOTL in 2008) with only a 2.1 audio setup. I then let the receiver decode all surround sound formats and direct the rear/center channels to my main speakers.

Then I run the TV's optical audio output to the receiver.
That's my arrangement, too.

I used to run devices to receiver to TV and that's worked fine, too. But then my cat broke my receiver and I decided to do it your way.

Can't tell a difference in quality of either audio or video.

If there were a difference, I'd expect the video to be better this way and the audio better the other way, but I can't tell.
 
I think we've reached the point where it's smaller, incremental improvements. Even then, if I've got "better," more modern equipment, then I should at least have the right cables to get the most out of it. I just didn't know enough to know there was a difference; I figured HDMI was HDMI...
That was me two months ago. I knew there were differences in the versions but never bothered to look into the impact. I built this house in 2014, wired in all the surround myself, bought the latest stuff at the time, and haven't touched it since. Boy, was I surprised how much had changed in 10 years.

*I did have to replace my receiver in 2020. My old one went kaput.
 
I think we've reached the point where it's smaller, incremental improvements. Even then, if I've got "better," more modern equipment, then I should at least have the right cables to get the most out of it. I just didn't know enough to know there was a difference; I figured HDMI was HDMI...
And that's what the "experts" were telling us a decade ago. Copper is copper as long as you don't buy poorly made cables. But apparently that changed with 4K and 8K.
 
I have the YouTube TV 4K package, but that is just because it comes with unlimited logins. My kids and my parents use my account.
I don't even bother switching over to the 4K versions of games on it because it's not that huge a difference.
 
Yeah... I also kinda believe we're a bit stagnant from a provider aspect.

My thinking is that until we get widespread acceptance and conversion of local TV to 4K (ATSC 3.0), which right now seems to be slowing to a crawl, streaming providers have no real incentive to go all in on 4K.

The last time I read something about ATSC 3.0, TV makers were pulling back on even putting a 3.0 tuner in their high-end sets. I've also read there are a lot of DRM gauntlet issues with some tuners (local stations encrypting their signals)... that tells me all I need to know about how that's going.

The same thing happened with HDTV. It seemed like the cable and satellite providers of the day gave us a taste in the early to mid 2000s, but once the changeover for locals happened in 2009, THAT got providers off their asses and they finally went all in on HD.

All this is why I haven't bought a new TV such as an OLED, nor a new 4K receiver. To me, there just doesn't seem to be a reason to justify it (yet). My 4K TV still looks pretty damn good, and constant software updates don't seem to be hindering its processing speed, so why "upgrade" now when the upgrade isn't substantial (like SD to HD was)?

I haven't even upgraded my secondary-room TVs to 4K yet. Hell, the bedroom has a 2007 Panasonic plasma (still has an awesome picture), my computer room here has a 2006 720p LCD, and my cabin has a 2012 (?) edge-lit 1080p LED.

They work perfectly fine, and I use them so seldom compared to the living room set, so why bother when the 4K content simply isn't widely there yet?
 
Yeah... I also kinda believe we're a bit stagnant from a provider aspect. ... All this is why I haven't bought a new TV such as an OLED, nor a new 4K receiver.
I would say that the jump from a standard LCD 4K to an OLED 4K is pretty dramatic in terms of picture quality, brightness, true blacks, etc. But you're the only one that matters. If you're happy with it, stay the course.
 
I would say that the jump from a standard LCD 4K to an OLED 4K is pretty dramatic in terms of picture quality, brightness, true blacks, etc. But you're the only one that matters. If you're happy with it, stay the course.

Oh yeah, best friend has a couple OLED sets... awesome sauce indeed.

I'll get there eventually. Just not a priority at this moment and the Quantum has been good to me, damn good set for the money.
 
Oh yeah, best friend has a couple OLED sets... awesome sauce indeed.

I'll get there eventually. Just not a priority at this moment and the Quantum has been good to me, damn good set for the money.
If you have a QLED, those look to be about the same price as OLED, so I'm sure it looks pretty damned good!
 
Yep, I see what you're doing. ... What I do is let the TV be the "input switcher," mainly because I'm currently running a much older surround sound amp (a Yamaha that was TOTL in 2008) with only a 2.1 audio setup.
Similar setup here with the OLED and a Yamaha 5.1 Dolby Digital/DTS receiver with a powered sub pre-out. I have a cable box and a Series X hooked up to it.

Thanks to the poster for reminding me I need another upgraded HDMI cable. Years ago I saw a distortion test comparing Monster cables and an off-brand for car audio, and the difference in distortion between the two was obvious back then. Ever since, I pay attention to cables and make the effort to get good ones for my shit.

The average novice may not hear a difference, but if you're using good equipment, you already know it matters to ya.
 