Why Your TV Will Probably Never Be Better Than It Is Now

We’ve reached the end of the television era. Since the technology’s invention in the 1920s, TV screens have steadily grown larger, sharper, and cheaper, and now, essentially, we’re at the end of that road. This unremarkable 65-inch Samsung TV is, for most people, exactly what a TV should be. It displays images with more detail than most viewers can see from their couches, its screen is as large as the average American living room can comfortably accommodate, and it costs less than $500. For 100 years, manufacturers and consumers have chased screen size and image clarity. What happens now that the dog has caught up with the mail truck and virtually everyone has a near-perfect TV?
A Brief History of Big Screen Televisions
Television has come a long way. Take $500 (roughly what the 65-inch Samsung costs today) back to 1986, and after adjusting for inflation it would only cover the cheapest set in that year’s Sears catalog: a 13-inch 4:3 TV with a resolution of around 480i, priced at $159.99. (CRT TVs don’t have pixels, but their screens displayed approximately 330–480 lines of useful detail, depending on the signal.) By comparison, the Samsung has a 65-inch 16:9 screen with a resolution of 3840×2160.
Originally, these CRTs displayed images by firing electrons at a phosphorescent screen inside a sealed glass tube. The cathode ray tube (hence the name CRT) had to be deep enough to accelerate the electron beams, and the glass had to be thick enough to withstand the vacuum inside. The result: heavy, deep, fragile machines that were practically incapable of supporting screens much larger than 40 inches without becoming prohibitively expensive and unwieldy. The 1981 Sony KV-3000R, a 30-inch model that cost $10,000 ($36,500 today) and weighed over 500 pounds, sat at the pinnacle of the large consumer CRT market. Technically, it was possible to go bigger (Sony’s Trinitron line included a 45-inch CRT in 1989 that sold in Japan for $40,000), but these weren’t the kind of screens you’d find in anyone’s living room.
Later, projection televisions achieved unprecedented screen sizes by using internal projectors and mirrors to throw a cathode-ray image onto a translucent screen, but the approach had significant drawbacks. The sets were bulky, weighing up to 225 kg (nearly 500 pounds), and the projected image was blurrier and dimmer than that of a conventional CRT, which was already the standard. Viewing angles were limited (you had to sit nearly dead-center to see anything), and the projector lamps had short lifespans and were expensive to replace.
The limitations and high cost of projection televisions didn’t stop people from adopting the technology, especially once prices dropped. By the 1990s, improvements in projection optics, CRT projectors, and manufacturing efficiency had turned large-screen projection televisions into a status symbol, and 50-, 60-, and even 70-inch giants began appearing in suburban living rooms. They were still heavy, blurry, and insanely expensive (a 61-inch Magnavox projection television cost $2,999.99 in 1993), but everything changed in the late 1990s with the release of the first plasma televisions.
The Flat Screen Revolution
Plasma and LCD TVs not only displayed better images, they operated on completely different principles. In a plasma TV, each pixel is a tiny gas-filled cell that, when energized, emits ultraviolet light, which in turn excites phosphors on the display, creating the visible colors of your favorite episode of Friends. LCD TVs instead use liquid crystals to control the passage of light from a backlight: each pixel contains a layer of liquid crystal that can twist to pass or block light, enabling precise control of color and brightness (and a much more detailed view of Rachel’s hair). Both technologies produce far brighter, sharper images than rear-projection TVs, all without weighing 400 pounds, and they made large, high-definition displays accessible to the average consumer.
LCD and plasma TVs each had their advantages and disadvantages. Plasma sets had faster response times (how quickly pixels change state) and deeper blacks than LCDs, but LCDs lasted longer (about 50,000 hours versus 30,000 hours), consumed less energy, performed better in well-lit rooms, and weren’t as susceptible to burn-in as plasma screens and older CRTs. Ultimately, LCD won out, and by 2014, plasma TVs were a thing of the past.
In 2004, Sony introduced the first LED-backlit TVs. While older LCD TVs used cold cathode fluorescent lamps for backlighting, LED TVs use light-emitting diodes (LEDs), which are far more energy efficient and deliver brighter images, more accurate colors, and greater contrast than fluorescent-backlit LCDs or plasma panels. LEDs and other technical improvements also solved problems such as narrow viewing angles, motion blur, and uneven backlighting that were common in earlier generations of flat screens.
Flat-panel displays were initially expensive, but prices dropped quickly. A 42-inch plasma TV cost around $20,000 in 1997; ten years later, it cost less than $1,000. As prices fell, resolution climbed from 720p (1280 pixels wide by 720 pixels high) to 1080p (1920 by 1080) and then 4K (3840 by 2160), making it possible for anyone to hang a large TV on their living room wall and enjoy the realism and picture quality previously reserved for movie theaters.
Fine-tuning Your TV: All About Backlighting
As screen size and resolution increased, other aspects of picture quality improved too: contrast, color accuracy, and brightness. Older LCD TVs shone fluorescent light through liquid crystals, but the crystals couldn’t block all of it, so no pixel was ever truly black. This is why you can tell an older LCD TV is on even when there’s no image on it. LED displays introduced local dimming: a backlight that can brighten or darken regions of the screen as needed, so less stray light leaks through and blacks get darker. Mini-LED displays have many more backlight zones, sometimes thousands, which gives even finer control over dark areas. QLED displays add a “quantum dot” film between the LED backlight and the LCD panel that converts the backlight into purer colors, improving color saturation and brightness.
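To make local dimming concrete, here’s a minimal sketch of the core idea (written in Python with NumPy purely for illustration; real TVs run far more sophisticated algorithms in dedicated hardware, with temporal smoothing and halo compensation): divide the frame into a grid of backlight zones and drive each zone only as bright as its brightest pixel demands.

```python
import numpy as np

def local_dimming_backlight(frame, zone_rows, zone_cols):
    """Toy local dimming: one backlight level per zone.

    frame: 2D array of pixel luminance values in [0, 1].
    Each zone is driven only as bright as its brightest pixel
    requires, so mostly-dark zones stay dim and blacks get deeper.
    """
    h, w = frame.shape
    backlight = np.zeros((zone_rows, zone_cols))
    for i in range(zone_rows):
        for j in range(zone_cols):
            zone = frame[i * h // zone_rows:(i + 1) * h // zone_rows,
                         j * w // zone_cols:(j + 1) * w // zone_cols]
            backlight[i, j] = zone.max()  # brightest pixel sets the zone level
    return backlight

# A 4K frame that is black except for one bright patch:
frame = np.zeros((2160, 3840))
frame[100:300, 100:300] = 1.0
# With a 12x16 grid (192 zones), only the zones under the patch light up;
# the rest of the backlight stays fully off.
print(local_dimming_backlight(frame, 12, 16))
```

The trade-off this toy version ignores is “blooming”: a single bright pixel forces its whole zone on, leaking a halo of light into neighboring dark pixels, which is exactly why more zones (as in mini-LED) means better blacks.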
Organic light-emitting diode (OLED) TVs go a step further. OLED TVs have no backlight at all; instead, each pixel contains an organic material that emits its own light when power is applied. So when a pixel is black, it’s off, which is to say completely black. OLED TVs aren’t perfect (they’re generally dimmer than LED or mini-LED displays), and the microLED technology now in development promises to solve that problem, but current six-figure prices make microLED sets prohibitively expensive.
We may have reached peak television
The difference between color and black-and-white images was apparent the moment the first color televisions appeared in the 1950s, as was the difference between high-definition and standard-definition displays in the late 1990s. But the difference between OLED and QLED displays is so subtle that it’s practically unnoticeable to the average consumer. I’m sure some people are passionately devoted to OLED over mini-LED, or feel they haven’t really seen Breaking Bad unless they watched it on a $100,000 microLED TV. But for most of us, mid-range TVs are so close to the “ultimate” that further incremental improvements are irrelevant.
No technology is perfect for everyone. CRT TVs, for example, are better than the best LED TVs for old-school gaming, and a 4K TV may not offer enough detail for some technical tasks. But when it comes to the needs and desires of everyday viewers in their living rooms, modern TV technology is practically perfect. Here are a few reasons why:
Limits of vision
A standard 65-inch 4K TV has a resolution of 3840 × 2160 pixels. That’s dense enough that individual pixels are invisible to the average viewer sitting at a reasonable distance. You can buy an 8K TV (7680 pixels wide by 4320 pixels high), but those extra pixels won’t make the image look sharper or clearer in practice; they just add detail you physically can’t see from your couch. Even 4K is arguably more resolution than comfortable viewing requires.
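You can sanity-check this with a back-of-the-envelope calculation. The sketch below (Python; the one-arcminute figure is the standard rule of thumb for 20/20 visual acuity, not a number from any TV spec sheet) estimates the distance beyond which individual pixels blur together:

```python
import math

def pixel_blend_distance_ft(diagonal_in, h_pixels, aspect=16 / 9):
    """Distance (feet) beyond which a viewer with 20/20 vision
    (~1 arcminute of acuity) can no longer resolve single pixels."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width
    pixel_pitch = width_in / h_pixels                        # width of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute) / 12

print(f"65-inch 4K: {pixel_blend_distance_ft(65, 3840):.1f} ft")  # ~4.2 ft
print(f"65-inch 8K: {pixel_blend_distance_ft(65, 7680):.1f} ft")  # ~2.1 ft
```

In other words, past roughly four feet a 65-inch 4K panel already looks pixel-perfect, and 8K only pays off if you sit closer than about two feet, which nobody does.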
Then there’s the issue of size. TVs can always get bigger, but there comes a point where bigger no longer improves the viewing experience. The Society of Motion Picture and Television Engineers has found that the best viewing experience for most people comes when the screen occupies about 30 degrees of the visual field. For a 65-inch TV, that works out to roughly 8.5 feet (2.6 meters) of viewing distance, a comfortable fit for most living rooms. And even if your room is bigger than that, commercially available TVs come in sizes up to 115 inches (292 cm), which is plenty for any room short of a cathedral.
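The 30-degree guideline reduces to simple trigonometry. Here’s a small sketch (Python again; the function name and framing are mine, and SMPTE’s exact recommendation varies slightly by source) that computes the distance at which a 16:9 screen fills 30 degrees of your horizontal field of view:

```python
import math

def smpte_viewing_distance_ft(diagonal_in, aspect=16 / 9, fov_deg=30):
    """Distance (feet) at which the screen spans fov_deg degrees
    of the viewer's horizontal field of view."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    return (width_in / 2) / math.tan(math.radians(fov_deg / 2)) / 12

for size in (55, 65, 75, 115):
    print(f"{size}-inch: sit about {smpte_viewing_distance_ft(size):.1f} ft away")
# 55-inch: ~7.5 ft, 65-inch: ~8.8 ft, 75-inch: ~10.2 ft, 115-inch: ~15.6 ft
```

The sketch gives about 8.8 feet for a 65-inch set, in the same ballpark as the figure above; the exact number shifts a little depending on how the field of view is measured and rounded.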
Limits of light, color, and comfort
Contrast, the difference in brightness between the darkest blacks and the brightest whites a screen can display, helps determine how vibrant and detailed an image appears. OLED TVs effectively have an infinite contrast ratio: each pixel is its own light source, so when a pixel is told to be black, it literally turns off, and nothing is blacker than off. As for color, modern OLED TVs can reproduce 98 to 100% of the colors used in movies and TV shows, so what you see onscreen is all the color present in the source material. While other display types can’t match OLED’s infinite contrast, they come remarkably close: some mini-LED TVs boast contrast ratios as high as 10,000,000:1.
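Since contrast ratio is just peak white luminance divided by black-level luminance, it’s easy to see where both numbers come from. A quick sketch (Python; the nit values are illustrative stand-ins, not measurements of any particular model):

```python
def contrast_ratio(peak_white_nits, black_level_nits):
    """Contrast ratio = peak white luminance / black-level luminance."""
    if black_level_nits == 0:
        return float("inf")  # an emissive pixel that is off emits nothing
    return peak_white_nits / black_level_nits

# Illustrative figures only:
print(contrast_ratio(1000, 0.0001))  # mini-LED with deep dimming -> 10,000,000
print(contrast_ratio(800, 0))        # OLED: black pixels are simply off -> inf
```

Dividing by a black level of zero is precisely why OLED contrast gets quoted as “infinite” rather than as a finite ratio.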
TVs have also become brighter than ever. Displays designed for outdoor use can be watched comfortably even in direct sunlight, with peak HDR brightness around 1,400 nits. That’s far beyond the roughly 250 nits of a typical indoor screen, which is itself more than enough for comfortable viewing in the living room.
Limits of content
When it comes to what we watch, if you define the ideal television as “being able to watch whatever I want, whenever I want,” then we’re practically there. Viewers used to have a problem of scarcity: you watched whatever came your way on one of three channels, and you loved it. Now our problem is abundance. We’re overwhelmed with content, with millions of things instantly available to stream, from shows to movies to YouTube videos. And while the programming spread across thousands of channels and dozens of paid and free streaming services can be confusing, nearly every movie or TV show ever made is available somewhere, even if it takes a little effort (and a monthly subscription fee) to find it.
What does the future hold for television?
Consumer demand for larger, higher-quality TVs has essentially driven the industry’s growth for the past 80 years. So what happens now that the race is almost over and we can all watch anything on a near-perfect TV? A marketer might answer that TV makers are creating reasons for people to buy new TVs by expanding the very definition of television. You can see this in Samsung’s The Wall and Sony’s Crystal LED: systems that let you cover an entire wall with seamless TV panels (if you have a spare $100,000).
But even once TV walls become more affordable, will people want one badly enough to buy it? Some will, sure, but a wall-sized screen is unlikely to make watching TV on the couch more enjoyable for most of us. A more down-to-earth future is represented by the Samsung Frame, a “TV for life” designed to turn your screen into a digital art gallery when you’re not watching Netflix. It’s cool, but unless it improves the Pluribus viewing experience, I won’t be rushing to replace my TV.
When Big TV tries to create demand for sets that do more than just function as TVs, the results aren’t always successful. Back in 2010, perhaps sensing the need for a must-have feature, the industry released the first 3D TVs. Despite years of hype positioning the technology as the next big thing, consumers didn’t bite, and by 2017, 3D TV was dead. It was cool, but not cool enough to justify buying a new TV when people just wanted to watch Game of Thrones. Another example is the “screenless screen” of augmented and virtual reality devices like the Apple Vision Pro and Meta Quest 3. While it’s too early to say for sure, these highly touted devices also seem to be meeting a lukewarm consumer response.
The only drawback of your TV
But don’t get too attached to your perfect TV: it’s likely to break sooner than you’d like. The TV industry’s profitability depends on people buying new sets every few years, so your 65-inch Samsung isn’t designed to last like the clunky CRTs of the past. Old TVs were fairly simple machines that could run for decades (if Elvis didn’t shoot them), but today’s flat panels have LED backlights that dim and LCD panels that develop flicker. Perhaps more importantly, almost all new TVs are smart TVs, which introduces new paths to obsolescence: the manufacturer may stop updating your TV’s operating system, and streaming services may drop support for it. Even if the display still works, you may find navigating your TV so slow, cumbersome, and unhelpful that you go out and buy a new one much sooner than you otherwise would.
There’s also the issue of privacy: when connected to the internet, these TVs constantly monitor our activity and collect our data. This is one reason TVs are so cheap up front: you subsidize the price with your data. Disconnecting the TV from the internet helps, but many streaming devices aren’t much better, so choose wisely. A carefully chosen streaming box can also extend the life of an older, otherwise functional TV, at least until its hardware gives out.
Today’s TVs are gorgeous, inexpensive, and huge, but they’re also built for a world where replacing a screen every five to seven years is the norm, even when there’s nothing meaningfully “better” to replace it with.