4k is enough, 60fps is enough, no smart or AI stuff is perfectly fine…
What about reducing the energy consumption? That’s an innovation I want.
I hope you mean 60hz is enough for TVs. Because I certainly don’t want that regression on my monitor 😄
The only real innovations after 1080p for TVs were HDR, sound stuff, 60-120Hz, and upscaling to 4K.
Fuck.
Now instead of each new generation of TVs being slightly higher in resolution some god damn business tech executive is going to focus on some god damn bullshit to try and change or add which is going to be absolutely fucking ridiculous or pointless and annoying like AI television shit or TV gaming or fuck my life. Smart TVs are bad enough but they can’t help themselves they need to change or add shit all the fucking time to “INNOVATE!!!” and show stockholder value.
Resolution was something easy and time-consuming to chase, but we can’t rely on that to keep them from fucking TVs up any more.
Real innovation:

Want us to put your show back on? Yes? Then read the following aloud 5 times while slapping your face continuously: “I am sofa king, stupid”.
What about Dairy Queen?
I want a dumb TV with the price and specs of a smart TV. Less is more.
What you’re looking for are commercial screens. They’re a bit more expensive for a comparable panel, as they are intended for 24/7 use, but are about as dumb as they get nowadays.
A bit more expensive? I was about to get a smart TV for like 800 bucks. The equivalent dumb TV would have been a few thousand dollars, and Best Buy said they would only sell it to business accounts, which was infuriating to read.
You would likely have to pay more since they aren’t getting to sell your information.
*You would have to pay more because major companies know they can charge more. There isn’t some fixed amount of profit a company wants to make from which they then pick a price; they price it as high as the market will bear.
I second this.
The TV industry can crash and burn otherwise.
Realistically, not only do I not want an 8k tv, but I might not get a tv at all, if I had to do it today. We rarely watch tv anymore. The brainrot is the same but we’re much more likely to use individual screens
What about 12K?
Silly goose, after 8K is 16K!
Most developing countries have cheap 1080p TVs right now, but others are still using CRTs, and still others are watching on their phones (like some of my poorer relatives who do get their entertainment fix through their phones while the big TVs in their living rooms rarely get turned on).
I think my TV is like 32" and 720p from 2012 or so. It’s fine.
Before that I had a projector which was fun sometimes but mostly too big for the room. Cool to take psychedelics and run a visualizer while it was pointed along the floor at you. You could feel the fractals on your skin. I don’t do that anymore, so a 32" TV is fine.
There’s a ton of other things I want my TV to do before more pixels.
Actual functional software would be nice, better tracking on high-speed shots (in particular sweeping landscapes or reticles in video games), higher frame rates and variable-frame-rate content, making the actual use of the TV faster (things like changing inputs or channels), oh man, so much more.
Anything but more pixels.
I still probably watch 90% 1080p and 720p stuff lol. As long as the bitrate is good it still looks really good.
Actual functional software would be nice
you do not want software on your TV.
Yes I do. I want an actual smart TV with a practical, open-source, TV-optimized Linux OS. It’s not that software on a TV is a bad idea in itself; it’s how it’s ruined by for-profit companies.
Nah, honestly, I think stuffing an entire computer inside a monitor and relying on it to generate/show content is a bad idea no matter what software it runs. A dumb TV + a small computing dongle requires only a tiny fraction more labor to produce than a smart TV, but it’s so much easier to upgrade in the future if you decide you need faster boot times or wanna game on the TV, etc. And if the TV breaks before the dongle does, you can also buy a new TV and keep all your settings/media without transferring anything.
Also, to add to this: the life-cycle of a TV display is mismatched with the life-cycle of media-playing hardware (or hardware for general computing). You need to update the latter more often to keep up with things like new video codecs (which, for performance, are actually implemented in hardware) and, more generally, to be able to run newer software with decent performance.
I’ve actually had a separate media box for my TV for over a decade, and in my experience you go through 3 or 4 media boxes for every time you change TVs, partly because of new video codecs coming out and partly because the computing hardware in those things is usually low-end, so newer software won’t run as well. In fact I eventually settled on a generic Mini-PC with Linux and Kodi as my media box (which is pretty much the same to use in your living room as a dedicated media box, since you can get a wireless remote for it, so no need for a keyboard or mouse to use it as a media player), and it doubles as a server in the background (remotely managed via ssh), something which wouldn’t be possible at all with computing hardware integrated in the TV.
In summary, having the computing stuff separate from the TV is cheaper and less frustrating (you don’t need to endure slow software after a few years because the hardware is part of an expensive TV that you don’t want to throw out), and it gives you far more options to do whatever you want (let’s just say that if your network-connected media box is enshittified, it’s pretty cheap to replace it, or even go the way I went and replace it with a system you fully control).
I mean, yes and no. I like e-arc, and I like being able to adjust settings other than v-hold. But I don’t want this slow crud fest that keeps telling me when my neighbour turns on Bluetooth on their iphone.
I like e-arc,
… the audio hdmi thing?
Yeah, all my inputs go to the TV, then I run a wire to the receiver. This makes it so my PS5 and PC are plugged directly into the TV, so I can get different resolutions and variable refresh rate, and the TV can control the receiver. So when I turn something on, the TV/receiver turn on and set themselves to matching settings: Dolby, stereo, whatever. It’s not huge but it’s a nice convenience over the older optical connection.
I want software on my TV.
Steam Link specifically. I like streaming to my TV via Ethernet.
You can do that with a Raspberry Pi for <$100 and without the need to have Amazon/Google/Roku/whoever tf else collecting your data.
Who says I let Amazon/Google/Roku/whoeverTfElse collect my data?
I have my TV isolated to its own network and allow inputs from LAN, so i can use Steam Link and Jellyfin just fine.
So get a device that can do that. You didn’t need a piece of software that will never see an update to do this.
It’s funny that you think smart TVs don’t receive updates. It’s got a wifi chip for a reason.
TVs can last 20 years; you’ll be lucky to get 2 years of updates.
My TV isn’t going to update because it’s been lobotomized.
I have a Samsung Frame because I wanted a TV that didn’t look so much like I had one, but the software is so goddamn bad. The only way to switch sources quickly is to set them as a favorite, which isn’t always that straightforward if you didn’t do it right away. Regardless, you have to let the home page fully render before you can even worry about that. Even the Samsung TV app, which you would think would be perfectly optimized for the hardware since the same company makes the software, is barely functional and loads like a web page on AOL in 1998.
I like my Frame because it faces a set of windows, and with all my other TVs … I would have to close the blinds to see the TV in the daytime.
However, the software is straight garbage. I didn’t even know about the favourite thing … every time I change the source it spends like a minute or two trying to figure out whether it can connect to it, for no reason.
Couldn’t any TV be a “frame tv”
Would you just need some trim?
No, not really
They showed Skyfall on 70ft IMAX screens, and that film was shot at 2880 x 1200. It’s not all about the pixel count.
Where are those numbers from? They disagree with https://shotonwhat.com/skyfall-2012 . The IMAX release was done as a film print, which likely wasn’t made from the regular HD cut.
DP Roger Deakins has a forum on his site where he answers these sorts of questions. The Arri Alexa Studio was essentially developed for him, as he prefers an optical VF. It has an academy gate, and Deakins favours spherical lenses rather than anamorphic so we simply take the width of the sensor and crop down to 2.40:1 aspect ratio to get the quoted dimensions. Your link quotes the capture resolution, which will have some excess compared to the finished material.
This was put through the DI process and at a later stage IMAX DNR’d a blow-up to 15/70mm.
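For anyone wondering where 2880 × 1200 comes from, here is the arithmetic implied above, taking the roughly 2880-pixel sensor width as stated by the poster rather than a verified spec:

```latex
% 2.40:1 extraction from a 2880-pixel-wide frame
% (width figure as stated in the comment above, not a verified spec sheet)
\frac{2880}{2.40} = 1200 \quad\Rightarrow\quad 2880 \times 1200
```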
Even if it was, the streaming services everyone’s using crush the bitrate down so badly it’d barely look better than 4K anyway.
Make a TV that can accurately reproduce really high contrast ratios and don’t put any pointless software on it. Display the image from the source with as much fidelity as possible, supporting all modern display technology like VRR.
That’s all I want from a display.
Gaming was supposed to be one of the best drivers for 8K adoption.
Whu? Where 4k still struggles with GPU power? And for next to no benefit?
8k would be cool to display lots of windows at the same time but fairly useless for gaming.
Introducing the new DLSS 9, where we upscale 720p to 8K. Looks better than native, pinky swear.
What’s dumb is that 3D failed because of lack of resolution and brightness, and now we have more pixels than we can handle and screens so bright they can hurt to look at. PS3 had a couple games that showed different screens to two players wearing 3D glasses. I’d love to see full screen couch coop games with modern tech. 8K isn’t solving any problems.
3D failed for the exact same reason VR is failing now. Nobody wants to wear headsets at home.
But VR can do a lot of cool things, 3d… not so much.
Nobody wants to wear headsets at home.
Or anywhere, really. AR/XR glasses are a big improvement, but can still be clunky.
Or bring on the neural shunt. Not so fast, Elon. It’s not for your purposes. It’s for mine. And I can’t see who I could trust with direct connections to my brain… But once I’m plugged in: I love Amazon, and you should subscribe!
Screen dimming is technically possible over HDMI/DisplayPort; no idea why it’s not properly supported and integrated into monitors, graphics drivers, Windows and Linux. KDE shows dimming for monitors sometimes? Don’t know if that is software or real hardware dimming though.
It’s real hardware dimming.
Nice, I thought so but wasn’t sure.
I’m talking about brightness in the context of 3D. KDE uses DDC/CI with USB-connected monitors, which is the same thing as using a button to lower brightness.
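For anyone who wants to poke at the hardware route on Linux, here is a minimal sketch of driving the monitor’s own brightness over DDC/CI by shelling out to the ddcutil CLI. It assumes ddcutil is installed and that your monitor and cable actually expose DDC/CI (plenty don’t); the set_brightness helper is just an illustrative name.

```python
# Minimal sketch: hardware dimming over DDC/CI via the ddcutil CLI.
# Assumes ddcutil is installed and the monitor/cable exposes DDC/CI.
# VCP feature 0x10 is the standard brightness control.
import subprocess

def set_brightness(percent: int, display: int = 1) -> None:
    """Ask the monitor itself to change its backlight (real hardware dimming)."""
    percent = max(0, min(100, percent))
    subprocess.run(
        ["ddcutil", "--display", str(display), "setvcp", "10", str(percent)],
        check=True,
    )

if __name__ == "__main__":
    set_brightness(40)  # dim the first detected display to 40%
```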
I dunno. Oxygen Not Included looks crisp on a 4K monitor. And it makes my job easier, being able to have an absolute tonne of code on-screen and readable. I reckon I could probably use an 8K monitor for those things.
Yeah, I generally have FSR running on any 3D game made in about the last decade - even if I can run it at 4K at a reasonable framerate, my computer fans start to sound like a hoover and the whole room starts warming up. But upscaling seems a better solution than having separate monitors for work and play.
Yeah, can’t run it, and my internet is too slow to stream it.
Pretty sure my GPU could run 4k Rimworld, just play good games instead of AAA games.
But we are talking TV. RPG with a controller is more suitable here. And there are some great RPG games.
FR. What distance and size do I need to be able to actually see the difference between 1080p and 4K? Cuz my current setup does not allow me to notice anything but the massive reduction in performance, unless it’s a 2D game and then everything becomes too tiny to effectively play the game.
4K is noticeable on a standard PC.
I recently bought a 1440p screen (for productivity, not gaming) and I can fit so much more UI with the same visual fidelity compared to 1080p. Of course, the screen needs to be physically bigger in order for the text to be the same size.
So if 1080p->1440p is noticeable, 1080p->4k must be too.
Like I said, with 2D things it is noticeable only because it makes everything smaller (there is more space because the elements inside that space are smaller). However, movies and 3D games? No difference.
Even going from 640x480 to 1024x768 makes a noticeable size difference with the 2D elements of a UI.
It’s about 60”. That’s the point where we noticed pixels at 1080P. HDR is more important.
I’m using a 60 inch tv as a monitor to my desktop - I sit in front of it at a distance of about 2m. It feels really nice to have stuff in 4k, so it’s always 4k except the games that are too tough for my 2060 super to give me 60p.
It’s very noticeable at the DPI of a 27" screen from arm’s length. Or maybe not if you can’t see very well. But on a TV from 10 feet away, I dunno if I could differentiate 1440p from 4K personally.
I have 27 inch 2k monitors at work, and it’s already enough
27" 2K is a good DPI. I personally only went up to 4K 27" because I also wanted OLED, and the 2K OLED panel I was using had some noticeable text fringing because of the subpixel layout. At 4K it’s not noticeable anymore.
For PC you can actually see the benefit between 4K and lower resolutions.
It’s for TV where it’s likely useless to have 8K
I am on a PC… You mean a smaller monitor closer to my face?
Yes, that’s what I am saying: for a monitor you can easily see the difference between 4K and lower. While for a TV, if I remember correctly, on a 55" set from two metres and something away you can’t tell the difference between 4K and 1080p.
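That “two metres and something” figure roughly matches the usual 1-arcminute (20/20 vision) rule of thumb. Here is a back-of-the-envelope sketch of that calculation; the acuity threshold is a simplification and resolution_distance_m is just an illustrative helper name, not a vision-science result.

```python
# Back-of-the-envelope check of the "can't tell from a couple of metres" claim,
# using the common 1-arcminute (20/20 vision) rule of thumb.
import math

def resolution_distance_m(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (metres) beyond which one pixel subtends less than 1 arcminute."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # width of a 16:9 panel
    pixel_pitch_m = width_m / horizontal_px
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_m / one_arcmin_rad

for label, px in [("1080p", 1920), ("4K", 3840)]:
    print(f'55" {label}: pixels blur together beyond ~{resolution_distance_m(55, px):.1f} m')
# Prints roughly 2.2 m for 1080p and 1.1 m for 4K on a 55-inch panel.
```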
Yeh, do: 60fps, 30 bit color… and I guess HDR?
Do things that people can actually appreciate.
And do them in the way that utilises the new tech. 60fps looks completely different from 24fps… Work with that, it’s a new media format. Express your talent.

Sorry, the best I can do is install a camera and microphone on our next model, to spy on you and force interaction with advertisements.
I mean video conferencing from your living room. How neat is that?
It’s about time the electronics industry as a whole realises that innovation for the sake of innovation is rarely a good thing
Look, we can’t have TVs that last 15 years anymore!
We need to keep people buying every year or two. Otherwise line not go up! Don’t you understand that this is about protecting The Economy?!
It’s not even innovation, per se. It’s just Big Number Go Up.
Nobody seems to want to make a TV that makes watching TV more pleasant. They just want to turn these things into giant bespoke advertising billboards in your living room.
Show me the TV manufacturer who includes an onboard ad blocker. That’s some fucking innovation.
The galaxy brain move is buying an old dumb tv for a pittance and use it for watching Jellyfin/Plex/stream from a browser with uBlock Origin/DNS filtering – all running on some relative’s “obsolete” smart toaster from last year that they happily gift you because “the new version’s bagel mode IS LIT – pun intended – but it needs the 128 gb DDR7 ram of the new model, can barely toast on the old one any more”.
I think this just comes down to human nature. Give people (engineers, execs) a metric that looks like a good proxy for performance and they will overcommit on that metric as it is a safer bet than thinking outside the box. I think the incremental improvements in deep learning with all those benchmarks are a similar situation.
You can’t really find a dumb TV anymore. I might see how big of a monitor I can find when I’m ready to upgrade, but I doubt I’ll find one big enough and cheap enough.
I hooked my computer up to the HDMI and have used that as my primary interface.
It’s not perfect, but it screens out 95% of bullshit
That doesn’t help, unless you’ve blocked your TV from network access, because they use ACR (Automatic Content Recognition), which literally scans what is being displayed over your HDMI port and then sells it off to advertisers.
I think you missed the part where the TV doesn’t have access to the network.
Just don’t give the TV your wifi password, boom dumb TV.
That won’t save you anymore. My boss bought a smallish smart TV in contravention of my explicit instructions for use as a CCTV monitor because it was “cheap.” It nags you on power up with a popup whining about not being able to access the internet, and if you don’t feed it your Wifi password it will subsequently display that same popup every 30 minutes or so requiring you to dismiss it again. And again. And again. Apparently the play is to just annoy you into caving and letting it access your network.
Instead I packed it up and returned it. Fuck that.
If you are at a business you should have an access point or router that is capable of blocking specific devices from WAN access. But I would create a new segmented network, block that network from WAN access entirely, put it on its own VLAN, and then connect the TV to that network.
I’d assume it nags whenever it can’t reach its phone-home server, and just says “network”.
So when they go out of business any remaining units will nag forever.
You can use your router or access point tools to check what address it’s trying to resolve and then set up a redirect to a device that can respond with a fake response.
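As a rough sketch of what that “device with a fake response” could look like: a tiny Python stdlib HTTP server that answers everything with an empty 200 OK, which you would point the nag hostname at via your router’s DNS override. The port and the assumption that the TV accepts a plain-HTTP answer (rather than pinned HTTPS) depend entirely on the specific model.

```python
# Minimal sketch of the "fake response" box: answer every HTTP request with an
# empty 200 OK. Point the TV's phone-home hostname at this machine via your
# router's DNS override. (Assumes the TV accepts plain HTTP; a set that insists
# on HTTPS with a pinned certificate won't be satisfied by this.)
from http.server import BaseHTTPRequestHandler, HTTPServer

class AlwaysOKHandler(BaseHTTPRequestHandler):
    def _reply(self):
        self.send_response(200)
        self.send_header("Content-Length", "0")
        self.end_headers()

    do_GET = do_POST = do_HEAD = _reply

    def log_message(self, fmt, *args):
        # Handy side effect: see what the TV is actually asking for.
        print(f"TV requested: {self.path}")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 80), AlwaysOKHandler).serve_forever()  # port 80 needs root
```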
FireWire got killed too soon.
Too right, brother
Wha?
AFAIK FireWire is part of the Thunderbolt protocol, you can get FireWire 800 to Thunderbolt adapters. Apple even used to sell one.
Apple killed Firewire 4 years before Lightning came along.
??? My 2012 MacBook Pro had FireWire, that’s the year Lightning came out…
Apple announced it was moving away from Firewire in 2008
it was way too far ahead of its time.
We traded 3D TVs, which are amazing if you watch the right stuff, for 8K…
8k is great, but we need media in 8k to go with it.
Working in entertainment and broadcasting, you learn barely half of Americans have a 4K tv and it’s under half worldwide. Marketing makes you think that “everyone is doing it”
I think it is PC gamers who get the most out of TVs, funnily enough. Particularly when it comes to framerate, since I was enjoying 60 fps while two console generations passed with 30 fps being the standard. And when 120 Hz came out I was enjoying 120 Hz on some titles with FreeSync before the PS5 came out.
But for average TV/movie watchers streaming stuff, as well as console gamers prior to the PS5, a lot of the features of 4K TVs weren’t fully utilized.
Most enterprise places default to 1080p and require people to go above and beyond to justify 4K screens.
Then we get into laptops still barely going above 720p, so a lot of people have no idea what 4K would even bring them, and most streaming content is still only 1080p, so it’s not really noticeable even for those who have 4K screens.
Wait, half have 4k‽
Broadcast towers put out 1080p and satellite broadcasts 1080i. The filmmaker/streaming standard is 1080p. As a filmmaker (producer/cinematographer) and broadcast operator, you’re always encoding to scale down, due to bandwidth and/or the majority of incapable receivers, mostly due to FCC rules/cost.
Even the bargain basement ones with horrible pictures have 4k resolution these days
Think it’s an economy-of-scale issue.
I don’t own a single 4K set. I have four 1080p dumb TVs. I will get a new one when one of these dies.
Out of all the displays in the house, there’s only one 4k computer monitor. The dumb tv is a 1080p, my other monitor is a 720p, then my partner has a DVI and a VGA monitor for his computer.
I’m still using a 1080p TV and I only plan to replace it if it breaks.
If you do, smart TVs are dumb and forgot how to TV