• cordlesslamp@lemmy.today · 4 months ago

    Can someone please explain why a CRT is 0 blur and 0 latency when it literally draws each pixel one by one, using an electron beam running across the screen line by line?

    • TexasDrunk@lemmy.world · 4 months ago

      The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.

    • frezik@midwest.social · edited 4 months ago

      They don’t have zero latency. It’s a misconception.

      The industry-standard way to measure screen lag is at the middle of the screen. Say you have a 60 Hz display, you hit the mouse button to shoot at the very moment the next frame is about to be drawn, and the game manages to process the input before the draw starts. The beam starts to draw, and when it reaches the middle of the screen, we take our measurement. That works out to 1 / 60 / 2 ≈ 8.3 ms.

      Some CRTs could do 90Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.
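
      A quick sketch of that arithmetic, assuming the mid-screen measurement convention described above (nothing here is specific to CRT vs LCD, it's just the scanout geometry):

      ```python
      # Mid-screen scanout latency: half of one refresh period,
      # per the measurement convention described above.
      def mid_screen_latency_ms(refresh_hz: float) -> float:
          return 1.0 / refresh_hz / 2.0 * 1000.0

      for hz in (60, 90, 144, 240):
          print(f"{hz:>3} Hz -> {mid_screen_latency_ms(hz):.2f} ms to mid-screen")
      # 60 Hz -> 8.33 ms, matching the ~8.3 ms figure above.
      ```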

    • B0rax@feddit.de · 4 months ago

      Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog form through VGA, pixel by pixel, and each pixel is drawn on the screen in the requested color the instant it arrives.
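
      To get a feel for how fast that pixel-by-pixel stream is, here is a rough sketch using the commonly quoted nominal timings for 640×480 at 60 Hz (25.175 MHz pixel clock, 800×525 total including blanking); treat these as textbook VGA figures rather than measurements of any particular monitor:

      ```python
      # Nominal VGA timing for 640x480 @ 60 Hz, including blanking intervals.
      pixel_clock_hz = 25_175_000   # standard VGA dot clock
      h_total = 800                 # clocks per scanline (visible + blanking)
      v_total = 525                 # lines per frame (visible + blanking)

      pixel_time_ns = 1e9 / pixel_clock_hz
      line_time_us = h_total * 1e6 / pixel_clock_hz
      frame_time_ms = h_total * v_total * 1e3 / pixel_clock_hz

      print(f"per pixel: {pixel_time_ns:5.1f} ns")   # ~39.7 ns
      print(f"per line:  {line_time_us:5.1f} us")    # ~31.8 us
      print(f"per frame: {frame_time_ms:5.2f} ms")   # ~16.7 ms (~59.94 Hz)
      ```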

      • accideath@lemmy.world · 4 months ago

        And no motion blur, because the image is not persistent. An LCD has to change its current image to the new one; the old image stays on screen until it's replaced. A CRT draws its image line by line, and only the last few lines are actually lit at any given moment. It just happens so fast that, to the human eye, the image looks complete. The trade-off is that CRTs usually have noticeable flicker, while LCDs usually do not.
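
        A toy model of why that matters for motion blur, assuming the eye tracks a moving object: the perceived smear is roughly how far the object moves while any one image is being held on screen. The hold times below are illustrative assumptions (a 60 Hz sample-and-hold LCD holds each frame for the full ~16.7 ms, while a CRT phosphor decays within a millisecond or two), not measurements:

        ```python
        # Toy eye-tracking blur model: smear ~ scroll speed x image hold time.
        def smear_px(speed_px_per_s: float, hold_time_s: float) -> float:
            return speed_px_per_s * hold_time_s

        speed = 960.0  # object moving 960 px/s across the screen
        print(f"LCD, sample-and-hold (16.7 ms): ~{smear_px(speed, 0.0167):.0f} px smear")
        print(f"CRT, short persistence (1 ms):  ~{smear_px(speed, 0.001):.0f} px smear")
        ```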

      • frezik@midwest.social · edited 4 months ago

        Of course there are buffers. Once RAM got cheap enough to hold a buffer for the whole screen, everyone did that. That was in the late ’80s/early ’90s.

        There are some really bad misconceptions about how latency works on screens.