• cordlesslamp

    Can someone please explain why CRT is 0 blur and 0 latency when it literally draws each pixel one-by-one using an electron beam running across the screen line-by-line?

    • TexasDrunk@lemmy.world

      The guy inside it drawing them is insanely fast at his job. That’s also why they were so bulky, to fit the guy who does the drawing.

    • B0rax@feddit.de

      Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog through VGA, pixel by pixel. These pixels are projected instantly, in the requested color, on the screen.
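
      To give a feel for how directly that timing maps onto the wire, here is a minimal sketch in Python of how a pixel clock turns into line and frame times. It assumes the standard 640x480@60 VGA mode; the numbers are illustrative and not taken from this thread.

        # Rough scanout timing for a classic 640x480@60 VGA mode.
        PIXEL_CLOCK_HZ = 25.175e6   # pixels sent per second on the analog link
        TOTAL_H = 800               # 640 visible pixels + horizontal blanking
        TOTAL_V = 525               # 480 visible lines + vertical blanking

        pixel_time = 1 / PIXEL_CLOCK_HZ      # ~40 ns: how long one pixel is "on the wire"
        line_time = TOTAL_H * pixel_time     # ~31.8 us per scanline
        frame_time = TOTAL_V * line_time     # ~16.7 ms per frame (~59.94 Hz)

        print(f"pixel: {pixel_time*1e9:.1f} ns, line: {line_time*1e6:.1f} us, "
              f"frame: {frame_time*1e3:.2f} ms ({1/frame_time:.2f} Hz)")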

      • accideath@lemmy.world

        And no motion blur because the image is not persistent. LCDs have to change their current image to the new one; the old image stays until it’s replaced. CRTs draw their image line by line, and only the last few lines are actually on screen at any time. It just happens so fast that, to the human eye, the image looks complete. Although CRTs usually do have noticeable flicker, while LCDs usually do not.
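
        As a rough sketch of the “only the last few lines are on screen” point: assuming a 60 Hz mode with about 525 total lines and a ballpark ~1 ms phosphor persistence (both numbers are assumptions for illustration, not from this thread), only a narrow band of lines is glowing at any instant.

          # Back-of-the-envelope: how many scanlines are still glowing at once on a CRT.
          REFRESH_HZ = 60
          TOTAL_LINES = 525        # visible lines + vertical blanking, VGA-like 60 Hz mode
          PERSISTENCE_S = 1e-3     # assumed time a line stays visibly bright after being drawn

          line_time = 1 / (REFRESH_HZ * TOTAL_LINES)   # ~31.7 us to draw one line
          glowing_lines = PERSISTENCE_S / line_time    # ~30 lines bright at once

          print(f"line time: {line_time*1e6:.1f} us, "
                f"~{glowing_lines:.0f} of {TOTAL_LINES} lines glowing at once")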

        • ByteJunk@lemmy.world

          Thanks for the explanation.

          OP’s point is a weird flex though, like pointing out that a bicycle never runs out of gas…

      • frezik@midwest.social

        Of course there’s buffers. Once RAM got cheap enough to have a buffer to represent the whole screen, everyone did that. That was in the late 80s/early 90s.

        There’s some really bad misconceptions about how latency works on screens.

        • HackerJoe@sh.itjust.works

          Those are on the graphics adapter. Not in the CRT.
          You can update the framebuffer faster than the CRT can draw. That’s when you get tearing. Same VSync then as now.
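
          A toy model of that tearing, as a sketch (the line count and swap point are arbitrary; this is not any real graphics API): the “CRT” scans the framebuffer top to bottom while the program overwrites it partway through.

            SCANLINES = 525
            SWAP_AT = 200                       # line at which the new frame lands in the buffer

            framebuffer = ["frame_0"] * SCANLINES

            displayed = []
            for line in range(SCANLINES):
                if line == SWAP_AT:             # unsynced buffer swap, mid-scanout
                    framebuffer = ["frame_1"] * SCANLINES
                displayed.append(framebuffer[line])

            # The visible image is the top of frame_0 stitched to the bottom of frame_1: a tear.
            print(displayed[SWAP_AT - 1], "->", displayed[SWAP_AT])
            # With VSync, the swap waits for the vertical blank, so the whole image is one frame.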

        • __dev@lemmy.world

          CRTs (apart from some exceptions) did not have a display buffer. The analog display signal is used to directly control the output of each electron gun in the CRT, without any digital processing happening in between. The computer on the other end does have display buffers, just like computers do now; however, eliminating extra buffers (like those used by modern monitors) does reduce latency.

          • frezik@midwest.social

            Doesn’t matter. Having a buffer means either the buffer must be full before drawing, or you get screen tearing. It wasn’t like racing the beam.

      • Hagdos@lemmy.world

        That makes 0 latency in the monitor, but how much latency is there in the drivers that convert a digital image to analogue signals? Isn’t the latency just moved to the PC side?

        • fmstrat@lemmy.nowsci.com

          I warn you before you dive in: this is a rabbit hole. Some key points (not exact, but to keep things layman-friendly): You don’t see in digital; digital is “code”. You see in analog, even on an LCD (think of sound vs video, it’s the same thing). Digital-only lacked contrast, brightness, color, basically all adjustments. So the signal went back and forth, adding even more latency.

          Maybe think of it like a TV’s game mode, where all the adjustments are turned off to speed up the digital-to-analog conversions.

          Or like compressed video (digital) vs uncompressed video (analog), where the compression means you can send more data, but latency is added because the video is compressed and decompressed at each end.

        • cynar@lemmy.world

          When one of your times is in milliseconds, while the other requires awareness of relativistic effects, you might as well call it instant.

          The propagation speed in copper is about 2/3 c. With analogue monitors, that was effectively amplified and thrown at the screen. The phosphor coating is the slowest part; it takes 0.25-0.5 ms to respond fully.

          By comparison, at the time, “gaming” LCD screens were advertising 23ms response rates.
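
          To put numbers on that comparison (the 2 m cable length is an assumption; the 2/3 c and phosphor figures are the ones above):

            C = 299_792_458             # speed of light, m/s
            CABLE_M = 2.0               # assumed VGA cable length
            prop_speed = (2 / 3) * C    # ~2/3 c in copper

            cable_delay = CABLE_M / prop_speed             # ~10 ns
            phosphor_min, phosphor_max = 0.25e-3, 0.5e-3   # 0.25-0.5 ms

            print(f"cable: {cable_delay*1e9:.0f} ns, phosphor: "
                  f"{phosphor_min*1e3:.2f}-{phosphor_max*1e3:.2f} ms "
                  f"(~{phosphor_min/cable_delay:,.0f}x slower)")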

    • frezik@midwest.social

      They don’t have zero latency. It’s a misconception.

      The industry standard way to measure screen lag is from the middle of the screen. Let’s say you have a 60Hz display and hit the mouse button to shoot the very moment it’s about to draw the next frame, and the game manages to process the data before the draw starts. The beam would start to draw, and when it gets to the middle of the screen, we take our measurement. That will take 1 / 60 / 2 = 8.3ms.

      Some CRTs could do 90Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.
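
      The 1 / 60 / 2 figure generalizes to any refresh rate; a quick sketch (the refresh rates are picked arbitrarily for illustration):

        # Mid-screen lag: half a refresh period, per the industry-standard measurement above.
        for hz in (60, 75, 90, 120, 144, 240):
            mid_screen_ms = 1 / hz / 2 * 1000
            print(f"{hz:>3} Hz -> {mid_screen_ms:.2f} ms to reach the middle of the screen")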

      • Björn Tantau@swg-empire.de

        Actually, 60 Hz was too low to comfortably use a CRT. I think it started to work well at 75 Hz, better yet at 80 or 85. I don’t know if I ever had a 90 Hz one, especially at a resolution above 1280x960. But if you valued your eyes, you never went down to 60.

        No idea why 60 Hz on an LCD works better, though.

        • DefederateLemmyMl@feddit.nl

          No idea why 60 Hz on an LCD works better, though.

          Because LCD pixels are constantly lit by a backlight. They don’t start to dim in between refresh cycles. They may take some time to change from one state to another, but that is perceived as ghosting, not flickering.

          On a CRT, the phosphor dots are periodically lit up (or “refreshed”) by an electron beam, and then start to dim afterwards. So the lower the refresh rate, the more time they have to dim in between strobes. At low refresh rates this is perceived as flickering. At higher refresh rates the dots don’t have enough time to noticeably dim, so the image is perceived as more stable. 60 Hz happens to be the refresh rate where this flicker effect becomes quite noticeable to the human eye.

        • frezik@midwest.social

          60Hz is what any NTSC TV would have had for consoles. Plenty of older computers, too. Lots of people gamed that way well into the 2000s.

          Incidentally, if you do the same calculation above for PAL (50Hz), you end up at 10ms, or about 2ms more lag than NTSC. Many modern LCDs can have response times <2ms (which is on top of the console’s internal framerate being matched to NTSC or PAL). The implication for retro consoles is that the lag difference between NTSC CRTs and modern LCDs is about the same as the difference between NTSC and PAL CRTs.
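
          Worked out with the same mid-screen method as above (nothing new assumed here):

            ntsc_ms = 1 / 60 / 2 * 1000    # ~8.33 ms
            pal_ms = 1 / 50 / 2 * 1000     # 10.00 ms
            print(f"NTSC: {ntsc_ms:.2f} ms, PAL: {pal_ms:.2f} ms, "
                  f"difference: {pal_ms - ntsc_ms:.2f} ms")
            # A modern LCD adding <2 ms of response time on top of NTSC timing lands in
            # roughly the same ballpark as the NTSC-vs-PAL gap, which is the point above.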

    • myplacedk@lemmy.world

      Because it draws those “pixels” as the signal reaches the monitor. When half of a frame has been transmitted to a CRT monitor, it’s basically halfway done making it visible.

      An LCD monitor needs to wait for the entire frame to arrive before it can be processed and then made visible.

      Sometimes the monitor will wait for several frames to arrive before it processes them. This enables some temporal processing. When you put a monitor in gaming mode, it disables (some of) this.
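
      As a rough sketch of what that buffering costs (the buffered-frame counts are illustrative, not measurements of any particular monitor):

        REFRESH_HZ = 60
        frame_ms = 1000 / REFRESH_HZ
        for buffered_frames in (0, 1, 2):
            print(f"{buffered_frames} buffered frame(s) -> "
                  f"+{buffered_frames * frame_ms:.1f} ms before the image is even processed")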

        • Lojcs@lemm.ee

          No? Afaik vsync prevents the GPU from sending half-drawn frames to the monitor, not the monitor from displaying them. The tearing happens in the GPU buffer. Edit: read the edit below.

          Though I’m not sure how valid the part about latency is. In the worst case scenario (the transfer of a frame taking the whole previous frame time), the latency of an LCD can only be double that of a CRT at the same refresh rate, which 120+ Hz already compensates for. And for the inherent latency of the screen, most gaming LCD monitors have less than 5 ms of input lag, while a CRT on average takes half the frame time to display a pixel, so 8 ms.

          Edit: thought this over again. On a CRT those two happen simultaneously, so the total latency is 8 ms + pixel response time (which I don’t know the value of). On LCDs, the transfer time should be (video stream bandwidth / cable bandwidth) * frame time. And that runs consecutively with the time to display it, which is frame time / 2 + pixel response time. Which could exceed the CRT’s latency.

          BUT I took the input lag number from my monitor’s rtings page, and looking into how they measure it, it seems to include both the transfer time and frame time / 2, and it’s somehow still below 5 ms? That’s weird to me, since for that the transfer either needs to happen within <1 ms (impossible) or the entire premise was wrong and LCDs do start drawing before the entire frame reaches them.

          Although I’m pretty sure that’s still not the cause of tearing, which happens because a frame is progressively rendered and written to the buffer, not because it’s progressively transferred or displayed.
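
          A sketch of the worst-case model described in the edit above, at 60 Hz (the 1 ms pixel response is a placeholder assumption; the comment doesn’t give a value):

            REFRESH_HZ = 60
            frame_ms = 1000 / REFRESH_HZ
            pixel_response_ms = 1.0     # assumed, for illustration only

            crt_ms = frame_ms / 2 + pixel_response_ms                    # scanout happens while the signal arrives
            lcd_worst_ms = frame_ms + frame_ms / 2 + pixel_response_ms   # full transfer, then mid-screen scanout

            print(f"CRT-style: {crt_ms:.2f} ms, LCD worst case under this model: {lcd_worst_ms:.2f} ms")
            # Measured input lag below 5 ms suggests, as the edit concludes, that real LCDs
            # don't actually wait for the whole frame before they start drawing.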

    • ShortFuse@lemmy.world

      The transmission is still the same, with the exception of things like VRR and DSC. We still send a VBLANK signal, which is the electronic signal that tells a CRT to move back up to the top of the screen. We don’t change the way things are sent. It’s still top down, left to right. VSync and HSync are still used, but they make less obvious sense on LCDs. Digital displays translate this.

      Because LCDs convert these signals, we call the time it takes to do the conversion “draw time”, but this isn’t as important today. What matters now is the time it takes for a pixel to change from one color to another (response time). Because a CRT fires electrons at phosphor that fades quickly, each frame essentially vanishes almost immediately. LCDs don’t do this.

      Conversely OLEDs are plenty fast, but can’t reproduce the same pixel response without inserting a blank frame with Black Frame Insertion which sacrifices brightness and is being slowly removed.

      Still, most “lag” comes from transmission time. It takes 1/60 of a second to transmit a full frame at 60hz. Divide that by 2 to get the “average” lag, and CRTs would measure at 8.3333ms. LCDs were happy to get to 10ms.

      Now we can do 120hz, which is way more important: even if CRTs are faster, at 120hz you can get the whole image out in half the time, which “averages” to 4.1666ms, making even a “4ms” slow LCD on a PC better than a console running at 60hz on a CRT.
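
      Making that comparison explicit (the 4 ms response time is the “slow LCD” figure from the paragraph above):

        crt_60hz_ms = 1000 / 60 / 2            # ~8.33 ms average transmission lag, 60 Hz CRT
        lcd_120hz_ms = 1000 / 120 / 2 + 4.0    # ~4.17 ms transmission + 4 ms response, 120 Hz LCD

        print(f"60 Hz CRT: {crt_60hz_ms:.2f} ms, "
              f"120 Hz LCD with 4 ms response: {lcd_120hz_ms:.2f} ms")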

      And while CRTs could reach high resolutions, they were limited by their HSync speed, which usually meant lower resolution, because a CRT can only sweep so fast horizontally.

      Today that translates to: an OLED is best for emulating any console that ran at 60hz, with pixel response as good or better if you are willing to do BFI. The main reason the competitive Melee community still uses CRTs is mostly pricing, second to FUD.

    • mindbleach@sh.itjust.works

      Those pixels appear basically as soon as the signal arrives at the back of the monitor, and they’re gone within a dozen scanlines. Watch slow-motion video of a CRT and you’ll see there’s only a narrow band that’s bright at any given moment.

    • Socsa@sh.itjust.works

      The motion blur thing is complete nonsense. It’s never been a benefit of CRT and reveals this greentext to be fake and gay.