People did care, which is why people who played games competitively continued to use CRT monitors well into the crappy LCD days.
Heck, some people still use CRTs. There’s not too much wrong with them other than being big, heavy, not being able to display 4k, and typically being only 4:3.
Idk if it’s just me, but I have pretty good hearing, so I can hear the high-pitched tone CRTs make and it drives me crazy.
This only happens with TVs or very low-quality monitors. The flyback transformer vibrates at a frequency of ~15.7 kHz, which is audible to the human ear. However, most PC CRT monitors have a flyback transformer that vibrates at ~32 kHz, which is beyond the human hearing range. So if you are hearing the high-frequency noise some CRTs make, it is most likely not coming from a PC monitor.
It’s a sound that’s part of the experience, and your brain tunes it out pretty quickly after repeated exposure. If the TV is playing sound such as game audio or music, it becomes almost undetectable. Unless there is a problem with the flyback transformer circuit, which causes the volume to be higher than it’s supposed to be.
There is not one CRT I ever encountered that I couldn’t hear. So I’m having trouble believing your information.
I could tune it out most of the time, but it was always there.
https://en.m.wikipedia.org/wiki/Flyback_transformer
Under “Operation and Usage”:
In television sets, this high frequency is about 15 kilohertz (15.625 kHz for PAL, 15.734 kHz for NTSC), and vibrations from the transformer core caused by magnetostriction can often be heard as a high-pitched whine. In CRT-based computer displays, the frequency can vary over a wide range, from about 30 kHz to 150 kHz.
If you are hearing the sound, it’s either a TV or a very low-quality monitor. Human hearing in perfect lab conditions only goes up to about 28 kHz, and anything higher cannot be heard by the human ear.
Either that or you’re a mutant with super ears and the US military will definitely be looking for you to experiment on.
I’ll defend this guy: there can easily be a subharmonic at half the flyback frequency that is audible. It’s lower amplitude, so less loud, but I could believe someone being able to hear that.
Yes, as I previously stated, if there is a problem with the flyback transformer circuit, the frequency or volume of the noise it generates can increase or change.
Though again, PC monitors never made an audible noise unless they were low quality and used the cheaper 15.7 kHz transformer in their construction.
Other noises associated with CRTs are the degaussing noise, which only happens once usually after turning on the CRT or after pressing the degauss button, or the sound of old IDE hard disks spinning, which also make a constant high frequency noise.
Not sure you follow: even if the primary frequency is out of range, a subharmonic (half the frequency, quarter the frequency, etc.) can exist alongside the primary.
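A quick back-of-the-envelope sketch of that argument, if it helps; the ~20 kHz audibility ceiling and the example flyback frequencies below are assumptions for illustration, not measurements:

```python
# Toy sketch: which components of a flyback whine would land in the
# commonly cited ~20 kHz audible range? All figures here are
# illustrative assumptions, not measurements.
AUDIBLE_LIMIT_HZ = 20_000  # rough upper bound of adult human hearing

flybacks = {"NTSC TV": 15_734, "PAL TV": 15_625, "PC monitor": 32_000}

for name, fundamental in flybacks.items():
    # the fundamental plus the first couple of subharmonics (1/2, 1/4)
    components = [fundamental, fundamental / 2, fundamental / 4]
    audible = sorted(f for f in components if f <= AUDIBLE_LIMIT_HZ)
    print(f"{name}: audible components (Hz): {audible}")
```

Point being, even a monitor whose fundamental sits at 32 kHz could have a 16 kHz component well inside audible range.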
I could hear them too, when I was younger. I lost that frequency range of my hearing in my mid-to-late 20s, which I’ve read is normal.
I remember CRTs being washed out, heavy, power hungry, loud, hot, susceptible to burn-in and magnetic fields… The screen has to have a curve, so over ~16" you get weird distortions. You needed a real heavy and sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember the LCD adoption being very quick and near universal, as far as tech advancements go.
As someone who still uses a CRT for specific uses, I feel that you’re misremembering the switch-over from CRT to LCD. At the time, LCDs were blurry and less vibrant than CRTs. Technical advancements have solved this over time.
Late model CRTs were even flat to eliminate the distortion you’re describing.
I had a flat CRT. It was even heavier than a regular one.
Yeah, my parents had a Trinitron; that thing weighed a whole cattle herd. The magnetic field started failing in the later years, so one corner was forever distorted. It was an issue playing Halo because I couldn’t read the motion tracker (lower left).
Resolution took a step back as well, IIRC. The last CRT I had could do 1200 vertical pixels, but I feel like it was years before we saw greater than 768 or 1080 on flat screen displays.
Sure, but they were thin, flat, and good enough. The desk space savings alone was worth it.
I remember massive projection screens that took up half of a room. People flocked to wall mounted screens even though the picture was worse.
I miss the <thunk> sound of the degaussing function.
Schdoing !
There was always pushback in esports
Smash uses CRTs today because of how much pushback there was/is
I think most people who were gaming held onto their CRTs as long as possible. The main reason being, the first generation of LCD panels took an analogue RGB input and had to map that onto the digital panel. They were generally ONLY 60 Hz, and you often had to reset their settings when you changed resolution. Even then, the picture was generally worse than a comparable, good-quality CRT.
People upgraded mainly because of the reduced space usage and because they looked aesthetically better. Where I worked, we only had an LCD panel on the reception desk, for example. Everyone else kept using CRTs for some years.
CRTs, on the other hand, often had much higher refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar. The 800x600 resolution would often allow a 120 or 150 Hz refresh.
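Rough numbers, if it helps: a CRT’s refresh ceiling at a given resolution is set mostly by its horizontal scan rate. A quick sketch with a hypothetical ~96 kHz monitor (the scan rate and the blanking overhead are assumptions, not specs of any particular model):

```python
# Rough estimate: a CRT's max vertical refresh at a given resolution is
# limited by its horizontal scan rate. The 96 kHz scan rate and the 5%
# vertical-blanking overhead below are illustrative assumptions.
H_SCAN_HZ = 96_000        # hypothetical horizontal scan rate of a good CRT
BLANKING_OVERHEAD = 1.05  # extra scanlines spent in vertical blanking

for visible_lines in (600, 768, 960, 1200):
    total_lines = visible_lines * BLANKING_OVERHEAD
    max_refresh_hz = H_SCAN_HZ / total_lines
    print(f"{visible_lines:>4} visible lines -> ~{max_refresh_hz:.0f} Hz max refresh")
```

Which is roughly why dropping from 960 to 600 visible lines bought you ~150 Hz instead of ~95 Hz on the same tube.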
When LCD screens with a fully digital interface became common, even though they were pretty much all locked to 60 Hz, they started to offer higher resolutions and, in general, comparable or better picture quality in a smaller form factor. So people moved over to the LCD screens.
Fast-forward to today, and now we have LCD (LED/OLED/Whatever) screens that are capable of 120/144/240/360/Whatever refresh rates. And all the age-old discussions about our eyes/brain not being able to use more than x refresh rate have resurfaced.
It’s all just a little bit of history repeating.
plasma TV?
Plasma TVs had burn-in problems, especially with cropped content with black bars.
Can someone please explain why CRT is 0 blur and 0 latency when it literally draws each pixel one-by-one using the electron beam running across the screen line-by-line?
Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog form through VGA, pixel by pixel. These pixels are projected instantly in the requested color on the screen.
And no motion blur because the image is not persistent. LCDs have to change their current image to the new one; the old image stays until it’s replaced. CRTs draw their image line by line, and only the last few lines are actually on screen at any time. It just happens so fast that, to the human eye, the image looks complete. Although CRTs usually do have noticeable flicker, while LCDs usually do not.
Of course there are buffers. Once RAM got cheap enough to hold a buffer for the whole screen, everyone did that. That was in the late 80s/early 90s.
There’s some really bad misconceptions about how latency works on screens.
No such thing as instant. There is always some latency.
Ok fine, at the speed of light then.
They don’t have zero latency. It’s a misconception.
The industry standard way to measure screen lag is from the middle of the screen. Let’s say you have a 60 Hz display and hit the mouse button to shoot the very moment it’s about to draw the next frame, and the game manages to process the data before the draw starts. The beam would start to draw, and when it gets to the middle of the screen, we take our measurement. That will take 1 / 60 / 2 = 8.3 ms.
Some CRTs could do 90 Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.
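For anyone who wants that arithmetic spelled out, here’s the mid-screen measurement as a minimal sketch; the refresh rates listed are just examples, and this only covers the scan-out term, not game, GPU, or input-device delays:

```python
# Mid-screen lag for a display scanning out at a given refresh rate:
# half a frame period, i.e. 1 / refresh / 2. This is only the scan-out
# term; game, GPU, and input-device delays are not included.
for refresh_hz in (60, 90, 144, 240):
    frame_ms = 1000 / refresh_hz
    mid_screen_ms = frame_ms / 2
    print(f"{refresh_hz:>3} Hz: frame {frame_ms:5.1f} ms, mid-screen lag {mid_screen_ms:4.1f} ms")
```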
The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.
0 input lag lmao
Just goes to show many gamers do not in fact know what “input” lag is. I’ve seen the response time a monitor adds called input lag way too many times. And that mostly doesn’t include the delay a (wireless) input device might add, or the GPU (with multiple frames in flight), for that matter.
It was a dark day for gamers when the competitive things crawled out of their sports holes.
It’s ok, if anyone wants them back, the Smash Bros. Melee community has them all in the back of their car
First rule at our LAN parties: You carry your own monitor.
We’d help each other out with carrying equipment and snacks and setting everything up. But that big ass bulky CRT, carry it yourself!
Not necessarily if you’re the one walking in with the DC++ server. Getting that thing up and running was suddenly priority #1 for the entire floor.
I like how no one mentions that CRT pixels bleed into each other.
And it worked: AA wasn’t as important on that “fuzzier” screen, back when graphics weren’t as good as they are today.
Are you sure it was CRT technology? Because bear in mind, colour CRTs had to focus the beam so accurately that it only hit the specific “pixel” for the colour being lit at that time. What there was, was blur from bad focus settings, age and phosphor persistence (which is still a thing in LCD to an extent).
What DID cause blur was the act of merging the image, the colour and the synchronisation into a composite signal. All the mainstream systems (PAL, SECAM and NTSC) would cause a blurring effect. Games on 80s/90s consoles generally used this to their advantage, and you can see the dithering effects clearly on emulators of systems from that period. Very specifically, the colour signal sharing spectrum with the luminance signal would lead to a softening of the image which would appear like blurring. Most consoles from the time only output either an RF signal for a TV or, if you were lucky, a composite output.
Good computer monitors (not TVs) of the time were extremely crisp when fed a suitable RGB signal.
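If you want to see why dithering worked so well over composite, here’s a toy 1-D version; the simple neighbour-averaging filter below is a crude stand-in for composite’s limited bandwidth, not an actual PAL/NTSC model:

```python
# Toy illustration: a hard alternating dither pattern, once horizontally
# low-passed (a crude stand-in for composite's limited bandwidth),
# blends into an intermediate shade instead of distinct stripes.
dither_row = [0, 255] * 8  # alternating dark/bright pixels on one scanline

def low_pass(row, radius=1):
    """Average each pixel with its neighbours within `radius`."""
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

print("sharp RGB :", dither_row)
print("composite :", [round(v) for v in low_pass(dither_row)])
```

Over RGB the stripes stay as stripes, which is exactly why the same dither patterns look so obvious in emulators today.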
Fr, if you’re worried about 2 ms of input lag, then it isn’t the lag, you’re just bad