• RightHandOfIkaros@lemmy.world · 21 days ago

    People did care, which is why people who played games competitively continued to use CRT monitors well into the crappy LCD days.

    Heck, some people still use CRTs. There’s not much wrong with them beyond being big and heavy, not being able to display 4K, and typically being only 4:3.

    • Julian@lemm.ee · 21 days ago

      Idk if it’s just me, but I have pretty good hearing, so I can hear the high-pitched tone CRTs make and it drives me crazy.

      • RightHandOfIkaros@lemmy.world · 21 days ago

        This only happens with TVs or very low-quality monitors. The flyback transformer vibrates at a frequency of ~15.7 kHz, which is audible to the human ear. However, most PC CRT monitors have a flyback transformer that vibrates at ~32 kHz, which is beyond the human hearing range. So if you are hearing the high-frequency noise some CRTs make, it is most likely not coming from a PC monitor.
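        (For anyone wondering where those numbers come from, here’s a rough back-of-the-envelope sketch in Python. The whine tracks the horizontal scan rate, which is roughly total scan lines per frame times frames per second; the VGA line count and refresh rate below are standard timing figures assumed for illustration, not measurements.)

        ```python
        # The flyback whine tracks the horizontal scan rate:
        # roughly (total scan lines per frame) * (frames per second).

        def h_scan_khz(total_lines, frames_per_second):
            """Horizontal scan rate in kHz."""
            return total_lines * frames_per_second / 1000.0

        # NTSC TV: 525 lines/frame at ~29.97 frames/s -> ~15.734 kHz (audible)
        print(f"NTSC TV:     {h_scan_khz(525, 30000 / 1001):.3f} kHz")

        # PAL TV: 625 lines/frame at 25 frames/s -> 15.625 kHz (audible)
        print(f"PAL TV:      {h_scan_khz(625, 25):.3f} kHz")

        # Standard VGA 640x480 @ 60 Hz: ~525 total lines at ~59.94 frames/s
        # -> ~31.5 kHz, the "~32 kHz" figure above, past normal hearing
        print(f"VGA 640x480: {h_scan_khz(525, 59.94):.3f} kHz")
        ```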

        It’s a sound that’s part of the experience, and your brain tunes it out pretty quickly after repeated exposure. If the TV is playing sound such as game audio or music, it becomes almost undetectable, unless there is a problem with the flyback transformer circuit that causes the volume to be higher than it’s supposed to be.

        • systemglitch@lemmy.world · 21 days ago

          There is not one CRT I ever encountered that I couldn’t hear, so I’m having trouble believing your information.

          I could tune it out most of the time, but it was always there.

          • RightHandOfIkaros@lemmy.world · 21 days ago

            https://en.m.wikipedia.org/wiki/Flyback_transformer

            Under “Operation and Usage”:

            In television sets, this high frequency is about 15 kilohertz (15.625 kHz for PAL, 15.734 kHz for NTSC), and vibrations from the transformer core caused by magnetostriction can often be heard as a high-pitched whine. In CRT-based computer displays, the frequency can vary over a wide range, from about 30 kHz to 150 kHz.

            If you are hearing the sound, it’s either a TV or a very low-quality monitor. Human hearing in perfect lab conditions only goes up to about 28 kHz, and anything higher can’t be heard by the human ear.

            Either that or you’re a mutant with super ears and the US military will definitely be looking for you to experiment on.

            • errer@lemmy.world · 20 days ago

              I’ll defend this guy: there can easily be a subharmonic at half the flyback frequency that is audible. It’s lower amplitude, so less loud, but I could believe someone being able to hear that.

              • RightHandOfIkaros@lemmy.world · 20 days ago

                Yes, as I previously stated, if there is a problem with the flyback transformer circuit, the frequency or volume of the noise it generates can change.

                Though again, PC monitors never made an audible noise unless they were low quality and used the cheaper 15.7 kHz transformer in their construction.

                Other noises associated with CRT setups are the degaussing noise, which usually happens only once, after turning on the CRT or after pressing the degauss button, and the sound of old IDE hard disks spinning, which also make a constant high-frequency noise.

                • errer@lemmy.world · 20 days ago

                  Not sure you follow: even if the primary frequency is out of range, a subharmonic (half the frequency, a quarter of the frequency, etc.) can exist simultaneously with the primary.

          • deltapi@lemmy.world · 10 days ago

            I could hear them too when I was younger. I lost that frequency range of my hearing in my mid-to-late 20s, which I’ve read is normal.

  • Bytemeister@lemmy.world · 20 days ago

    I remember CRTs being washed out, heavy, power-hungry, loud, hot, and susceptible to burn-in and magnetic fields… The screen has to have a curve, so over ~16″ you get weird distortions. You needed a really heavy, sturdy desk to keep them from wobbling. Someone is romanticizing an era that no one liked. I remember LCD adoption being very quick and near-universal as far as tech advancements go.

    • Kit@lemmy.blahaj.zone · 20 days ago

      As someone who still uses a CRT for specific purposes, I feel that you’re misremembering the switchover from CRT to LCD. At the time, LCDs were blurrier and less vibrant than CRTs. Technical advancements have solved this over time.

      Late-model CRTs were even flat, to eliminate the distortion you’re describing.

        • Hadriscus@lemm.ee · 20 days ago

          Yeah, my parents had a Trinitron; that thing weighed a whole cattle herd. The magnetic field started failing in the later years, so one corner was forever distorted. It was an issue playing Halo because I couldn’t read the motion tracker (lower left).

      • sugar_in_your_tea@sh.itjust.works · 20 days ago

        Sure, but they were thin, flat, and good enough. The desk space savings alone were worth it.

        I remember massive projection screens that took up half of a room. People flocked to wall-mounted screens even though the picture was worse.

      • rothaine@lemm.ee · 20 days ago

        Resolution took a step back as well, IIRC. The last CRT I had could do 1200 vertical pixels, but I feel like it was years before we saw more than 768 or 1080 vertical pixels on flat-screen displays.

    • ILikeBoobies@lemmy.ca · 20 days ago

      There was always pushback in esports.

      Smash uses CRTs today because of how much pushback there was (and is).

  • figaro@lemdro.id · 19 days ago

    It’s OK; if anyone wants them back, the Smash Bros. Melee community has them all in the back of their cars.

  • r00ty@kbin.life · 19 days ago

    I think most people that were gaming held onto their CRTs as long as possible. The main reason was that the first generation of LCD panels took the analogue RGB input and had to map it onto the digital panel. They were generally ONLY 60 Hz, and you often had to readjust their settings when you changed resolution. Even then, the picture was generally worse than a comparable good-quality CRT.

    People upgraded mainly because of the reduced space usage and because LCDs looked aesthetically better. Where I worked, for example, we only had an LCD panel on the reception desk. Everyone else kept using CRTs for some years.

    CRTs, on the other hand, often had much better refresh rates available, especially at lower resolutions. This is why it was very common for competitive FPS players to use resolutions like 800x600 when their monitor supported up to 1280x960 or similar: the 800x600 resolution would often allow a 120 or 150 Hz refresh.
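    (Rough sketch of that trade-off, with assumed numbers rather than anything from a specific monitor: a CRT’s maximum horizontal scan rate is fixed, so the fewer lines per frame you ask for, the higher the refresh rate it can manage.)

    ```python
    # Approximate ceiling on a CRT's vertical refresh at a given resolution:
    #   max_refresh ~= max_horizontal_scan_rate / total_lines_per_frame
    # Total lines = visible lines + vertical blanking (assumed ~5% here).

    def max_refresh_hz(max_horizontal_khz, visible_lines, blanking_factor=1.05):
        total_lines = visible_lines * blanking_factor
        return max_horizontal_khz * 1000 / total_lines

    # Hypothetical late-90s monitor with a 96 kHz horizontal limit (assumed spec):
    for lines in (600, 960, 1200):
        print(f"{lines:4d} visible lines -> max ~{max_refresh_hz(96, lines):.0f} Hz")
    # 600  -> ~152 Hz (why 800x600 could run 120-150 Hz)
    # 960  -> ~95 Hz
    # 1200 -> ~76 Hz
    ```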

    When LCD screens with a fully digital interface became common, they started to offer higher resolutions and, in general, comparable or better picture quality in a smaller form factor, even though they were pretty much all locked to 60 Hz. So people moved over to LCD screens.

    Fast-forward to today, and now we have LCD (LED/OLED/whatever) screens that are capable of 120/144/240/360/whatever refresh rates. And all the age-old discussions about our eyes/brains not being able to make use of more than x refresh rate have resurfaced.

    It’s all just a little bit of history repeating.

  • cordlesslamp@lemmy.today · 21 days ago

    Can someone please explain why CRTs are said to have zero blur and zero latency when they literally draw each pixel one by one, with the electron beam running across the screen line by line?

    • frezik@midwest.social · 20 days ago

      They don’t have zero latency. It’s a misconception.

      The industry-standard way to measure screen lag is from the middle of the screen. Let’s say you have a 60 Hz display and hit the mouse button to shoot at the very moment it’s about to draw the next frame, and the game manages to process the data before the draw starts. The beam would start to draw, and when it gets to the middle of the screen, we take our measurement. That will take 1 / 60 / 2 = 8.3 ms.
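      (That 8.3 ms is just half of one refresh period; the same mid-screen convention gives you, roughly:)

      ```python
      # Mid-screen lag under the measurement convention described above:
      # half of one refresh period.
      def mid_screen_lag_ms(refresh_hz):
          return 1000 / refresh_hz / 2

      for hz in (60, 90, 120, 240):
          print(f"{hz:3d} Hz -> {mid_screen_lag_ms(hz):.2f} ms")
      # 60 Hz -> 8.33 ms, 90 Hz -> 5.56 ms, 120 Hz -> 4.17 ms, 240 Hz -> 2.08 ms
      ```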

      Some CRTs could do 90 Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.

    • TexasDrunk@lemmy.world · 21 days ago

      The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.

    • B0rax@feddit.de · 21 days ago

      Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog form through VGA, pixel by pixel, and those pixels are drawn instantly, in the requested color, on the screen.

      • accideath@lemmy.world · 21 days ago

        And there’s no motion blur because the image is not persistent. LCDs have to change their current image to the new one; the old image stays until it’s replaced. CRTs draw their image line by line, and only the last few lines are actually on screen at any time. It just happens so fast that, to the human eye, the image looks complete. Although CRTs usually do have noticeable flicker, while LCDs usually do not.
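        (A rough way to put numbers on that, using the usual sample-and-hold blur approximation; the object speed and phosphor decay time below are assumptions for illustration, not measurements: the smear your eye sees while tracking motion is roughly the object’s speed times how long each frame is held on screen.)

        ```python
        # Sample-and-hold blur estimate: perceived smear while eye-tracking
        # is roughly speed (px/s) * time the image stays lit/held (s).
        def blur_px(speed_px_per_s, hold_time_s):
            return speed_px_per_s * hold_time_s

        speed = 1000  # object moving 1000 px/s across the screen (assumed)
        print(f"60 Hz LCD, full-frame hold (~16.7 ms): ~{blur_px(speed, 1 / 60):.0f} px of smear")
        print(f"CRT phosphor glow (~1.5 ms, assumed):  ~{blur_px(speed, 0.0015):.0f} px of smear")
        # ~17 px vs ~2 px of smear at the same 60 Hz refresh.
        ```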

      • frezik@midwest.social · 21 days ago

        Of course there are buffers. Once RAM got cheap enough to have a buffer representing the whole screen, everyone did that. That was in the late 80s/early 90s.

        There are some really bad misconceptions about how latency works on screens.