• sugar_in_your_tea@sh.itjust.works · 11 days ago (edited)

        Really? I’ve had more trouble with DisplayPort since I keep forgetting about the locking feature and bust the jacket off.

        I honestly prefer HDMI cables, but I hate the closed nature of the HDMI standard.

          • sugar_in_your_tea@sh.itjust.works · 11 days ago

            I think I’ve had one HDMI connector get bent out of shape, and I’ve broken 2 or 3 DP cables due to the locking bit. And I’ve only been using DP for 5 years and have used HDMI for >10 (15?). The DP lock is solving a problem I’ve never had, while creating one I never considered to be an issue.

            So I honestly prefer HDMI in general, and I’d have no reason to care about DP if the HDMI standards people weren’t such thugs. But alas, DP has basically won on the PC, so I guess I’ll get used to the clip.

        • a_fancy_kiwi@lemmy.world · 11 days ago

          They make DisplayPort cables without the locking feature. I don’t have a link, unfortunately, but I have a few in my box of cables that I’m definitely going to use one day.

          • sugar_in_your_tea@sh.itjust.works · 11 days ago (edited)

            Cool! Maybe I’ll get some next time I break one.

            I don’t remember the last time I had a cable come out by accident, but I have definitely broken multiple DP jackets, and getting those unlatched without looking is way too frustrating (way more annoying than guessing the orientation of an HDMI cable wrong).

          • Valmond@lemmy.world · 11 days ago

            It “clicks” when inserted, and then you need really small fingers, magic, and a simultaneous wiggle to get the damn cable out without ripping some hardware out.

            It’s a big, sturdy connector anyway; no need for these click teeth, IMO.

            • Pieisawesome@lemmy.world · 10 days ago

              There is!

              I purchased my wife a brand new expensive oled monitor for Christmas.

              She opened it and set it up, but the stand hung over the edge a little bit.

              Unfortunately, I have a sit/stand desk which is back to back with her desk.

              I raised my desk the next day and my desk caught her monitor stand and flipped the monitor.

              The only reason it didn’t crash into the floor is the DP cable teeth held onto the monitor.

              The cable was ruined, but I’ll always defend those teeth!

  • mlg@lemmy.world · 11 days ago

    Not only is DisplayPort better, it also has eDP, which has been used as a beefed-up MIPI for laptops and tablets for years.

    Plus it supports HDMI and DVI output for backward compatibility (dual-mode DP++), so HDMI really is just the last corporate media standard standing that hasn’t fallen to its superior open counterpart.

  • flemtone@lemmy.world · 11 days ago

    And how much are they charging to support the new standard? DisplayPort is at least free and open.

    • dual_sport_dork 🐧🗡️@lemmy.world · 11 days ago

      I wish manufacturers would bother to mark the capabilities of their otherwise identical-looking ports and cables, so we could figure out what the hell we’re looking at when holding a Device That Does Not Work in one hand and a cable in the other.

      • Ilovethebomb@lemm.ee · 11 days ago

        I think this is the reason a number of standards are going to active cables, so the device will know the cable isn’t up to standard.

        Or in the case of USB-C, so it doesn’t catch fire after having five amps cranked through it.

        • dual_sport_dork 🐧🗡️@lemmy.world · 11 days ago

          Bully for the device if it knows, but that doesn’t help the user who has just pulled one identical-looking cable out of many from the drawer and has no idea, until they plug it in, whether they’ll get a picture, nothing, near-undiagnosable partial functionality, or smoke.

          • Ilovethebomb@lemm.ee · 11 days ago

            I’m thinking that the user will get a notification that the cable they’re using isn’t the correct one.

              • Ilovethebomb@lemm.ee · 11 days ago

                HDMI is backward compatible a long way back; almost every device will fall back to a standard that doesn’t require such an expensive cable.

                Come on man, this is simple stuff.

    • Pasta Dental@sh.itjust.works · 11 days ago

      I can definitely see the difference between a 1440p 27-inch display and a 5K 27-inch display. Add in a high refresh rate and HDR, and you’re already close to exceeding the DP 2.0 maximum bandwidth (without Display Stream Compression). I wish we could finally get decent high-DPI monitors on desktops that aren’t made by or for Apple Macs.
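
A quick back-of-the-envelope check of that bandwidth claim (a sketch: raw pixel payload only, ignoring blanking intervals and link overhead, which typically add another 10–25%):

```python
def raw_video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed pixel payload in Gbit/s, ignoring blanking and link overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 5K (5120x2880) at 120 Hz with 10-bit-per-channel HDR (30 bits per pixel)
payload = raw_video_gbps(5120, 2880, 120, 30)
print(round(payload, 1))  # 53.1 -- DP 2.0's top UHBR20 mode carries roughly
# 77.4 Gbit/s of usable payload, so with blanking and overhead on top of the
# raw pixels, 5K/120/HDR really is getting close to the ceiling.
```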

      • LaggyKar@programming.dev · 11 days ago

        Though that’s not where you would use HDMI. I’d argue that for TVs, 4K is generally enough, and HDMI 2.1 already has enough bandwidth for 4K 120 Hz at 12 bits per color, uncompressed.

        But DisplayPort, yeah, that could use a bit more.
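
The 4K/120/12-bit figure checks out on a napkin (a sketch, assuming 36 bits per pixel and HDMI 2.1's roughly 42.7 Gbit/s of usable FRL payload after 16b/18b encoding):

```python
# Raw pixel payload for 3840x2160 at 120 Hz, 12 bits per channel (36 bpp)
raw = 3840 * 2160 * 120 * 36 / 1e9
print(round(raw, 1))  # 35.8 -- under HDMI 2.1's ~42.7 Gbit/s usable payload,
# so 4K 120 Hz 12-bit fits uncompressed, with some headroom left for blanking.
```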

        • Pasta Dental@sh.itjust.works · 11 days ago

          The point is that resolution alone doesn’t matter; it’s viewing distance plus pixel density that counts. This is what Apple calls Retina: when we stop seeing the individual pixels (jagged edges) at a normal viewing distance. That means a phone needs a much higher pixel density than a desktop or a TV. But the low-DPI displays we still have are unacceptable in 2024; the icons and text look so ugly…
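
The viewing-distance point can be made concrete with the usual angular-density rule of thumb (a sketch; about 60 pixels per degree, i.e. one pixel per arcminute, is the conventional "can't see the pixels" threshold):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Linear pixel density of a display in pixels per inch."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_in):
    """Angular pixel density at a viewing distance (small-angle approximation)."""
    return ppi_val * distance_in * math.pi / 180

# Same 27-inch diagonal, same 24-inch viewing distance, different resolutions:
lo = pixels_per_degree(ppi(2560, 1440, 27), 24)  # ~46 ppd: pixels visible
hi = pixels_per_degree(ppi(5120, 2880, 27), 24)  # ~91 ppd: past the threshold
```

The same 60-ppd threshold is why a phone held at 12 inches needs far more pixels per inch than a TV viewed from across the room.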

          • Cocodapuf@lemmy.world · 10 days ago (edited)

            The thing is, I prefer actually owning my media; I don’t use streaming services for the most part. But even with my 40 TB of media storage, I just don’t have the space for 5K content. If it’s worthwhile, it gets 1080p; if it matters less (kid shows or anything that came from a DVD), it gets 720p at best.

        • Mac@mander.xyz · 10 days ago (edited)

          1080p gang 🤙

          Higher resolution costs more money and requires better hardware to drive it (more money). Overpowering a lower resolution only means your hardware stays relevant for longer. It’s just tech creep.

  • Lost_My_Mind@lemmy.world · 11 days ago

    No thank you. I’d like to pass on this.

    We can already do 8K resolution. We still haven’t gotten to a point where the average broadcast is anything more than 720p or 1080p.

    It’s the reason I never bought a 4K or 8K TV. Sure, I have a new TV, but the only thing that’s 8K is the ads forced into the TV’s OS.

    And that’s why I don’t see the benefit of a new HDMI. It’s just going to support more protocols and make TVs do more things that we don’t want. It’s going to put DRM in the cable. It’s going to enable unskippable ads; it’s going to bring all this shit that nobody wants or needs, but ooooOOOOooooo!!! Look! It’s new tech, so everybody’s gotta have it!!!

    But how’s it going to improve your visual experience if the content isn’t any higher resolution to begin with?

    • UntitledQuitting@reddthat.com · 11 days ago

      Do you really think the only use for a screen is television? Boy, let me introduce you to the world of personal computing; it’s all the rage.

      • BirdObserver@lemmy.world · 11 days ago

        No, but it’s the only reason for HDMI (along with AV receivers), because it’s all the manufacturers support, so those of us home theater nerds who do care about this stuff have no real choice but to keep up with the HDMI world. Yes, you can set up a media server that streams 4K video, but you’re not going to find a DisplayPort 4K UHD player, or a 7.2 AVR that plugs into your 77” OLED and supports all of your game consoles. HDMI is just the unfortunate reality there.

        That said, the tech that actually takes advantage of new cable specs tends to lag behind significantly, and new gaming consoles that support HDMI 2.2 will likely do so in a limited (ultimately disappointing) capacity for years, just like with previous versions.

        (Also the top comment in this thread doesn’t really seem to reflect modern reality for most people I know. Most people are using their TV to stream at 1080p - 4K, not watching broadcast TV - in which case, yeah, get a $60 720p LCD or whatever, HDMI specs won’t matter to that kind of viewer. Still, subscription streaming quality definitely doesn’t take full advantage of your expensive shiny new TV the way physical media - or a media server - might, but that’s another conversation).

    • WalrusDragonOnABike [they/them]@reddthat.com · 11 days ago

      I’d like to be able to run my three 4K monitors without compression via a single port and still have the option for one of them to be high refresh rate. Things like spreadsheets benefit far more from increased resolution than shows do anyway, IMO.
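
For a rough sense of why that setup needs DP 2.x bandwidth (a sketch assuming 8-bit color over a single MST/daisy-chained link, ignoring blanking):

```python
def raw_gbps(w, h, hz, bpp=24):
    """Uncompressed pixel payload in Gbit/s (no blanking or link overhead)."""
    return w * h * hz * bpp / 1e9

# Two 4K60 panels plus one 4K240 panel sharing one DP connection
total = 2 * raw_gbps(3840, 2160, 60) + raw_gbps(3840, 2160, 240)
print(round(total, 1))  # 71.7 -- only DP 2.x UHBR20 (~77.4 Gbit/s usable)
# has the headroom, and blanking overhead makes it tight even then.
```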

    • patatahooligan@lemmy.world · 11 days ago

      On HDMI 2.1, you can do 8K at 30 fps before you have to compress the stream with DSC, which is “visually lossless”, so actually lossy. We don’t even have 5K 120 fps or 4K 240 fps without compression, and those are common refresh rates for gaming. So you could say the highest resolution that supports all use cases without compromise is 1440p, which is definitely not enough even by today’s standards. I think you’re underestimating the time it takes for standards to reach widespread adoption. The average viewer won’t have access to it until the technology is cheap and enough time has passed for at least a few hundred million units to reach the market. If you think you’ll be using it in 2030 or later for your “average broadcast”, then it needs to be designed today.
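
Those mode limits are easy to sanity-check (a sketch: raw payload plus roughly 20% for blanking, compared against HDMI 2.1's ~42.7 Gbit/s usable after 16b/18b encoding; the exact blanking factor varies by timing):

```python
HDMI21_USABLE_GBPS = 42.67  # 48 Gbit/s FRL minus 16b/18b encoding overhead

def fits_uncompressed(w, h, hz, bpp=24, blanking=1.2):
    """Rough check whether a video mode fits HDMI 2.1 without DSC."""
    return w * h * hz * bpp * blanking / 1e9 <= HDMI21_USABLE_GBPS

print(fits_uncompressed(7680, 4320, 30))   # True:  8K30 fits without DSC
print(fits_uncompressed(3840, 2160, 240))  # False: 4K240 needs compression
print(fits_uncompressed(5120, 2880, 120))  # False: 5K120 needs compression
```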

      Of course HDMI is shit for the reasons you mention. DisplayPort is better, but it’s not a fully open standard either, and it supports DRM as well. But designing for higher bandwidth is necessary, and it has nothing to do with HDMI’s problems.

    • rottingleaf@lemmy.world · 11 days ago

      Yes. It’s nice to have crisp fonts, but how realistic do the pictures on my screen really need to be?

      Life is finite. I have my dog, family members, some need for rest and food and cleaning and music, chores forgotten, and a girl I’d like to talk to (likely it’s either my imagination and she doesn’t like me that way, or the ship has sailed, but still, to hope is to live).

      Pictures on a typical 1920×1080 display are already as good as anything people in the 1950s could have dreamed of seeing on paper. I’m not judging those who want more, but it seems like pressure for the sake of it. More GHz, more pixels, bigger something.

      I hope that pressure for “more” will change to pressure for “better” some day.

  • Ilovethebomb@lemm.ee · 11 days ago

    80 gigabit is a crazy amount of bandwidth, and it seems like HDMI will leapfrog that?

  • Altima NEO@lemmy.zip · 10 days ago (edited)

    Now I wonder how strict they’ll actually be with the standard. HDMI 2.1 is a mess because you can call just about anything 2.1 now.

    And then it takes forever for companies to adopt it.