Decades ago, the TV took five minutes to warm the tubes up before one could watch the news.
Today, the TV takes five minutes to boot, install updates, and mangle some configuration before one (eventually) can watch the news - if the TV hasn't lost its list of stations again.
I was once gifted a TV by a nice elderly guy. The TV had been the cutting edge of technology when it was built: it had a wireless remote! Although the remote worked with ultrasound instead of infrared...
This beast took several minutes before it actually showed a picture.
Feels like everything is much more of a faff to set up, then one bit updates & something or other is no longer compatible.
Don't even want to think about the waste it must generate, both of devices & of the hours trying to get things to work whether at the development end or in the home.
At this point I don't understand why people bother with TVs rather than just hooking up an actual normal computer to a big screen and watching YouTube or torrenting media.
I tell my laptop to put the video in the vga port. It does. That’s it. There’s nothing plugged in, but it’s there.
I plug a vga cable in. There’s video in there now. With enough paperclips, I could get it out the other end. My laptop does not care. It wiggles the electrons regardless.
I plug the other end of the cable in. The shielding was eaten by mice and two pins are dead. But alas, lo and behold, purple tho it may be - the video comes out and is displayed.
Meanwhile, hdmi protocol negotiation wants to know if you’d like to set your screen as the default sound device. Not that teams would use it anyway. Actually nevermind, the receiving end doesn’t support the correct copyright protection suite. Get fucked, no video for you.
Even old flat screens are ridiculously heavy compared to new ones. I replaced an old Sony 720p screen that weighed probably 20 pounds with a 1080p smart TV of the same size that I could lift one-handed. And the new one cost less than $200.
I grew up with CRTs and VCRs, hard pass. There's a certain nostalgia to it all: the bum-DOOON sound as the electron gun warmed up, the smell of ozone and the tingly sensation that got exponentially stronger the closer you were, crusty visuals... But they were objectively orders of magnitude worse than what we have now, if for no other reason than that modern TVs don't weigh 150 pounds or make you wonder if watching Rugrats in Paris for the 30th time on this monster is giving you cancer. Maybe it's because I'm a techie, but I've never really had much issue with "smart" TVs. Sure, apps will slow down or crash because of memory leaks and it's not as customizable as I'd like, but I might be satiated just knowing that if push comes to shove I can plug in a spare computer and use it like a monitor for a media system.
I'm rooting it if it starts serving me out-of-band ads, though.
Oh, spare me this lecture again! Being techie doesn't save you from having to go through a cringe firmware update or lack of service through the "smart" OS because their invasive ad service is offline or whatever.
And I'm not saying there aren't alternative ways to view content -- I'm pointing out how fucking dumb the whole setup is. I'm referring to the unit itself, as an average user experiences it.
If I were interested in solutions, I'd be posting in a tech forum, not a joke community where I'm trying to use humor to cope with a frustrating experience, so go rain on someone else's parade.
Also, you don't know what "objectively" means. There's no such thing from a user perspective. It depends entirely on personal preference, and I've made clear what my preference is.
Let me be a grumpy old man in peace. I don't need fixing.
They don't seem to have a lecturing tone in their comment. The only part which you might have a point about is where they say "objectively", but throughout the whole comment they're really just expressing their opinion and showing their experience with smart TVs, which they're entitled to have and might be different from yours.
No aggressiveness intended. Just trying to keep the niceness around.
Tbh I have a setup on our upstairs TV with a Linux PC and one of those $10 rechargeable wireless keyboards to stream anything, but I refuse to invest any time or money into the cheap TV I bought myself to use in the gaming room while my husband sleeps during the day. If it doesn't start instantly, then I'm out -- which means it's basically useless.
All of those issues are covered by other devices that most people already have. Any reasonably recent gaming console can output 4K and make the smart features in the TV unnecessary. The same is true of a cable box, Roku, Fire Stick, or any other streaming device. All the TV really needs to do is display the 4K signal it receives. TVs don't even really need receivers anymore: just a USB hub, a processor for video and audio output, and a screen.
I was happy with the quality, and don't get more enjoyment from all the advancements since, but only ever remember plugging it into the wall, plugging an aerial into the back of it & pressing one button to get the tuner to pick up channels. Batteries into the remote once that became a thing. Plug in a VCR or DVD player once they appeared.
And no thank you, I'm not going to do all that. I don't care enough about any shows to go through all that hassle. I just want my TV to work without extra expense, and I will complain when it doesn't because I hate big corporations and I want them to fail.
The comparison between CRT and digital is not as simple as "625 vs 4k". Those analog signals were intended for a triangular subpixel shadow mask with no concept of horizontal pixel count, making them effectively into ∞×625i@50fps signals (1), compared to the digital fixed 3840×2160@60fps square pixels regardless of subpixel arrangement.
It takes about a 6x6 square pixel block to correctly represent a triangular subpixel mask, making 4K LCDs about the minimum required to properly view those 625i@50fps signals.
(1) I'm aware of optics limitations, CoC, quantum effects, and ground TV carrier band limitations, but still.
I hate smart stuff so much. It's fucking impossible to find a good TV that doesn't have all of this shit thrown in now. I just want a nice display. And that's it. But what's worse is when, not only does it come with awful software, but they also take "the Apple route" for their features/services. So you have an issue like this and you can't do anything about it.
What am I talking about, you say? "The Apple way", but I also like to call it the "fucking magic" syndrome. The "fucking magic" syndrome is when something is supposed to be "magic", to "just work", BUT WHEN IT DOESN'T... you're shit out of luck. :)
Because you see, it's supposed to just work. It's absolutely inconceivable for it not to just work. So the people who made it never even for a second considered that it might fail, and never took the time to implement some kind of failsafe in the UI to let you actually force the thing to do its thing on the off chance that the rabbit just refuses to come out of its hat.
Anyone who's had to update their AirPods knows exactly what I'm talking about. They're supposed to update themselves, without you doing anything. But every now and then... THEY FUCKING DON'T AND THERE IS ABSOLUTELY ZERO WAY TO FORCE THEM TO DO SO! You just have to wait with your AirPods in their case, open, for the moon to be in the correct position or something.
I feel like I've missed something. I don't dispute any of the horrible experiences people have had, however I've had nothing but good luck. The only thing about our current television that bothers me is the promotional wallpapers that get applied every-fucking-time a new Disney property needs advertising. We buy relatively modestly priced units in the $300-$500 range, so maybe we just have different expectations than someone buying a much more high-end unit. It is also possible that it has been pure luck and I'll reply to this message one day soon to recant everything.
Some people don't base their entire personality around hating the existence of ads and jumping through outrageous amounts of steps to avoid them.
So yeah, count me in for a TV that always works how I want it to but has a background ad that I can completely ignore and has no actual bearing on my life.
There are so many more important things for me to spend my time and energy worrying about.
This is less an issue of "smartness" and more that analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place. HDMI hits kind of a weird spot because it's a digital protocol based on analog scanlines; if the signal gets disrupted for 0.02 ms, it might only affect the upper half of a frame and maybe shift the bits for the lower half. But digital decoding is contextual and resynchronizes at least every frame, so even this kind of degradation is unstable.
analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place
Not really. Digital signals come over analog mediums, and it's up to the receiver to decide how much degradation is too much. Mitigations like error correction are intended to reduce the final errors to zero, but it's up to the device to decide whether it shows/plays something with some errors, and how many of them, or if it switches to a "signal lost" mode.
For example, compressed digital video has a relatively high level of graceful degradation. Full frames (keyframes) come every Nth frame and are further subdivided into blocks; each block can fail to decode on its own without impacting the rest. The intermediate frames only encode block-level changes, so as long as the decoder manages to locate the header of a keyframe, it can show a partial image that gets progressively more garbled until the next keyframe. Even if it misses a keyframe, it can freeze the output until it manages to locate another one.
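To make that concrete, here's a toy sketch of block-based graceful degradation -- not any real codec, just the idea: keyframes carry every block, delta frames carry only changed blocks, and a corrupted block is skipped so the last good content lingers on screen until a keyframe overwrites it.

```python
# Toy model: the "screen" is a flat list of blocks. None marks a corrupted
# block in the stream; it is skipped, so stale content persists until the
# next clean keyframe heals it.

def decode_stream(frames, blocks_per_frame):
    screen = ["?"] * blocks_per_frame   # "?" = nothing decoded yet
    history = []
    for kind, blocks in frames:
        indices = range(blocks_per_frame) if kind == "key" else blocks
        for i in indices:
            data = blocks.get(i)
            if data is not None:        # corrupted blocks are skipped
                screen[i] = data
        history.append(list(screen))
    return history

frames = [
    ("key",   {0: "A0", 1: "B0", 2: None, 3: "D0"}),  # block 2 corrupted
    ("delta", {1: "B1"}),                              # only block 1 changed
    ("delta", {2: "C2", 3: None}),                     # block 3 corrupted
    ("key",   {0: "A3", 1: "B3", 2: "C3", 3: "D3"}),  # clean keyframe heals all
]

out = decode_stream(frames, 4)
for frame in out:
    print(frame)
```

Each printed frame stays mostly watchable even with corrupted blocks; only the final clean keyframe restores every block at once.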
Digital audio is more sensitive to uncorrected errors, which can cause high-frequency, high-volume screeches. Those need more mitigations, like filtering to a normalized volume and frequency distribution based on the preceding blocks, but they still allow a level of graceful degradation.
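A minimal sketch of that volume-normalization idea (an illustration, not any real standard): when a block arrives flagged as bad, scale it so its RMS level matches the average RMS of the last few good blocks, which tames a full-scale screech down to the preceding loudness.

```python
import math

def rms(block):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in block) / len(block))

def conceal(blocks, ok_flags, history=3):
    """Attenuate corrupted blocks toward the recent good-block RMS level."""
    good_levels = []
    out = []
    for block, ok in zip(blocks, ok_flags):
        if ok:
            good_levels.append(rms(block))
            out.append(block)
        else:
            recent = good_levels[-history:]
            target = sum(recent) / max(len(recent), 1)
            level = rms(block)
            gain = target / level if level > 0 else 0.0
            gain = min(gain, 1.0)   # only ever attenuate, never boost
            out.append([s * gain for s in block])
    return out

quiet   = [0.1, -0.1, 0.1, -0.1]     # normal programme audio
screech = [1.0, -1.0, 1.0, -1.0]     # corrupted full-scale block
fixed = conceal([quiet, quiet, screech], [True, True, False])
print(round(rms(fixed[2]), 3))       # held near the preceding quiet level
```

A real decoder would also smooth the spectrum, not just the level, but even this crude gain clamp turns an ear-splitting burst into a brief muffled glitch.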
You don't have to let your TV do the streaming though, it just needs to play it. Lots of other devices can connect to streaming services, e.g. gaming consoles and Blu-ray players. Me, I have a media PC that does all the fun stuff (and lets me stream my library while I'm away), but you could easily use an old laptop.