28 comments

  • piker 12 hours ago
    The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off. For example, it's clear that a lot of the Rust UI framework developers have been working on Macs for the last few years. The font rendering on many of those looks bad once you plug them into a more normal DPI monitor. If they hadn't been using Macs with Retina displays, they would have noticed.
    • guhcampos 8 hours ago
      This is more widespread than we like to admit.

      Developers writing software on 64GB M4 Macs often don't realize the performance bottlenecks of the software they write.

      Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

      Developers writing services with unlimited cloud budgets often don't realize the resource waste their software incurs.

      And to extend this to society in general:

      Rich people with nice things often alienate themselves from the reality of the majority of people in the world.

      • zerkten 6 hours ago
        You can nerf network performance in the browser devtools or underprovision a VM relatively easily on these machines. Some people choose not to, and others don't know they can. Most of the time, though, they're just dealing with too many vaguely defined things at once, which makes it difficult to prioritize the seemingly less important ones.

        A number of times, around some customer complaint, I've had to have a framing discussion with a dev that eventually gets to me asking "what kind of computer do your (grand)parents use? How might X perform there?" Other times, I've heard devs comment negatively after the holidays when they've tried their product on a family computer.
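
        To make the throttling point concrete: a rough sketch of scripting it with Puppeteer and raw Chrome DevTools Protocol commands (the throttle numbers are made-up "bad connection" values, not any standard profile):

          import puppeteer from 'puppeteer';

          (async () => {
            const browser = await puppeteer.launch();
            const page = await browser.newPage();

            // Both CDP commands below are standard DevTools Protocol methods.
            const cdp = await page.createCDPSession();
            await cdp.send('Network.enable');
            await cdp.send('Network.emulateNetworkConditions', {
              offline: false,
              latency: 150,                              // added round-trip latency, ms
              downloadThroughput: 1.5 * 1024 * 1024 / 8, // ~1.5 Mbit/s, in bytes/s
              uploadThroughput: 750 * 1024 / 8,          // ~750 kbit/s
            });
            await cdp.send('Emulation.setCPUThrottlingRate', { rate: 4 }); // ~4x slower CPU

            await page.goto('https://example.com');
            // ...run the usual checks / capture metrics here...
            await browser.close();
          })();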

        • alsetmusic 6 hours ago
          > Other times, I've heard devs comment negatively after the holidays when they've tried their product on a family computer.

          I worked for a popular company and went to visit family during the winter holidays. I couldn't believe how many commercials there were for said company's hot consumer product (I haven't had cable or over-the-air television since well before streaming was a thing, so this was a new experience for me after five-plus years without it).

          I concluded that if I had cable and didn't work for the company, I'd hate them due to the bajillion loud ads. My family didn't seem to notice. They tuned out all the commercials, as did a friend when I was at his place around a similar time.

          All it takes is a change in perspective to see something in an entirely new light.

          • chrismorgan 2 hours ago
            I’ve never had TV, and have used ad blockers as long as they’ve been a thing. (Until 1⅓ years ago I even lived in a rural area where the closest billboard of any description was 40km away, and the second-closest 100km away.) On very odd occasions, I would get exposed to a television, and what I find uncomfortable at the best of times (notably: how do they cut so frequently!?) becomes a wretched experience as soon as it gets to ads, which it does with mindboggling frequency. I’m confident that if I tried actually watching that frenetic, oversaturated, noisy mess on someone’s gigantic, far-too-bright TV, I would be sick to the stomach and head within a few minutes.
        • Normal_gaussian 2 hours ago
          More to the point: colour and font rendering are typically "perception" questions and very hard to measure in a deployed system without introducing a significant out-of-band element.

          Network performance, on the other hand, can be trivially measured across your users, and most latency/performance/bandwidth issues can be identified clearly.

        • kijin 4 hours ago
          Chrome devtools allow you to simulate low network and CPU performance, but I'm not aware of any setting that gives you pixelated text and washed-out colors. Maybe that will make a useful plugin, if you can accurately reproduce what Microsoft ClearType does at 96dpi!
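
          The closest approximation I can think of (very rough, and it only fakes the resolution loss, not ClearType's subpixel antialiasing) is to rasterize the page at 1x with something like html2canvas and stretch the bitmap back over the viewport:

            import html2canvas from 'html2canvas';

            // Snapshot the page at 1 CSS pixel per device pixel (~96dpi worth of detail),
            // then overlay the upscaled bitmap so a Retina screen shows the low-DPI version.
            async function previewLowDpi(): Promise<void> {
              const shot = await html2canvas(document.body, { scale: 1 }); // force 1x rasterization
              shot.style.position = 'fixed';
              shot.style.top = '0';
              shot.style.left = '0';
              shot.style.width = '100vw';
              shot.style.zIndex = '99999';
              shot.style.setProperty('image-rendering', 'pixelated'); // blocky upscale, no smoothing
              document.body.appendChild(shot);
            }

            previewLowDpi();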
      • Moto7451 6 hours ago
        > Developers working over 1gbps Internet connections often don't realize the data gluttony of the software they write.

        As a developer and AirBnB owner, what I’ve also noticed is the gluttony of the toolchain. I’ve had complaints about a 500/30 connection from remote-working devs (very clear from the details they give), which is the fastest you can get in much of the metro I’m in.

        At home I can get up to 5/5 on fiber because we’re in a special permitting corridor and AT&T can basically do whatever they want with their fiber, using an old discontinued sewer run as their conduit.

        I stick to the 1/1 and get 1.25 for “free” since we’re so over-provisioned. The fastest Xfinity provides in the same area as my AirBnB is an unreliable 230/20 which means my “free” excess bandwidth is higher than what many people near me can pay for.

        I expect that, as a result of all this, developers on very fast connections end up with enough layers of corporate VPN, poorly optimized pipelines, dependencies on external servers, etc. that by the time you’re connected to work, your 1/1 connection behaves like about 300/300 (at least mine does), so the expectation is silently set that very fast internet is what’s needed to survive on-corp, and that the off-corp experience is what others have.

        • ericd 5 hours ago
          OT, but leaving the zeros on those gigabit numbers would make this a lot less work to understand; at first I thought maybe you were in Mbps throughout.
        • kijin 4 hours ago
          Not only bandwidth but also latency can vary dramatically depending on where you are. Some of your guests might have been trying to connect to a VPN that tunnels all their traffic halfway around the world. That's much, much worse than getting a few hundred Mbps less bandwidth.
          • Moto7451 1 hour ago
            Yup. That isn’t helping them either. My corporate VPN, along with being rather bandwidth limited, is super laggy.
      • guerrilla 6 hours ago
        I wish we could have this as a permanent sticky for this website. It's out of control, especially with web stuff.

        Spotify's webapp, for example, won't even work on my old computer, whereas YouTube and other things that you'd think would be more resource intensive work without any issue whatsoever.

      • jonhohle 7 hours ago
        I tend to use older hardware and feel like I’m constantly fighting this battle. It’s amazing the hardware we have, and yet I have to wait dozens of seconds to start an app or load a web page.
        • bombcar 6 hours ago
          Sometimes I run "old software" on the latest hardware it could support (think Windows 2000 on 2010s machines) and it is amazing how much it flies.
        • dangus 6 hours ago
          I would like to ask why even fight the battle?

          Philosophically I am with you, e-waste and consumerism are bad, but pragmatically it is not worth punishing yourself from a dollars and cents standpoint.

          You can jump on Amazon and buy a mini PC in the $300 range that’s got an 8 core 16 thread AMD 6800H CPU, 16GB RAM, 500GB SSD, basically a well above-average machine, with upgradable RAM and storage. $240 if you buy it on AliExpress.

          You can grab a MacBook Air M2 for around $500.

          Why suffer with slow hardware? Assuming that using a computer is at least somewhat important to your workflow.

      • rangestransform 2 hours ago
        At a "rich world" company that wants to make money, it's completely rational to not give a shit about "poor world" people that won't make you much money (relatively speaking) anyways. It basically only makes sense to milk the top leg of the K-shaped economy.

        Conversely, it opens up a niche for "poor world" people to develop local solutions for local challenges, like mobile payments in India and some of Africa.

      • npteljes 4 hours ago
        I agree, but developers don't have freedom over the product. Product managers are the ones who have a say, and even then, they are in a strict hierarchy, often ending at "shareholders". So, many of the wrongs come from the system itself. It's either systemic change (at least an upgrade), or no meaningful change.
      • bombcar 6 hours ago
        You need two "classes" of developers - which may be the exact same people: some on the fastest, biggest hardware money can buy, but also some time spent running on nearly the worst hardware you can find.
      • pmbanugo 1 hour ago
        +1000
    • dragonwriter 4 hours ago
      > The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

      That’s not a problem of using this monitor for creating the work; it’s a problem of not also using a more typical monitor (or, better, an array covering the common use cases, though what’s practical depends on whether you are talking about a solo creator or a bigger team) for validating the work.

      Just as with software, developers benefit from a more powerful machine for developing, but the product benefits from also being tested on machines more like the typical end-user setup.

    • SeasonalEnnui 9 hours ago
      Yes! I’m glad to see this pointed out - when working on UIs, I regularly move them between 3 monitors with varying resolution & DPI. 4k @ 200%, 2K at 125%, and 2K at 100%. This reveals not only design issues but application stack issues with DPI support.
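
      For web UIs specifically, a cheap trick (just a sketch, using the standard matchMedia pattern) is to log devicePixelRatio changes as you drag the window between those monitors, so you can see which code paths need to react:

        // Re-registers itself each time, since the media query is bound to the old ratio.
        function watchDevicePixelRatio(): void {
          const mq = window.matchMedia(`(resolution: ${window.devicePixelRatio}dppx)`);
          mq.addEventListener(
            'change',
            () => {
              console.log(`devicePixelRatio is now ${window.devicePixelRatio}`);
              watchDevicePixelRatio();
            },
            { once: true },
          );
        }

        watchDevicePixelRatio();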
    • nine_k 11 hours ago
      As a designer, one should keep a couple of cheap, low-res monitors reset to the factory defaults for proofing what many users are going to see.
      • zerkten 6 hours ago
        This is probably one of the few things I think works better in an office environment. There was older equipment hanging around, with space to set it up in a corner so people could sit down and just go. When mobile came along, there would be a sustainable lending program for devices.

        With more people being remote, this either doesn't happen, or is much more limited. Support teams have to repro issues or walk through scenarios across web, iOS, and Android. Sometimes they only have their own device. Better places will have some kind of program to get them refurb devices. Most times though people have to move the customer to someone who has an iPhone or whatever.

      • eb0la 11 hours ago
        I must confess I felt a lot of lust looking at the self color calibration feature.

        It is extremely useful if your work ends up on paper. For photography (edit: film and broadcast, too) it would be great.

        My use cases are comics and illustration, so a self-color-correcting Cintiq or tablet would be great for me.

        • BolexNOLA 8 hours ago
          I like having a color calibrated monitor but at the end of the day it’s about trusting my scopes too. Audio unfortunately has this perception element that for some reason doesn’t seem as big of an issue with video. We have dB/loudness standards for a reason, but different stuff just sounds louder or softer no matter what.

          If it looks good on a Mac laptop screen/iMac and the scopes look right, it’s good for 99%+ of viewers. You can basically just edit visually off any Mac laptop from the last 10 years and you’ll probably be happy tbh.

      • sim7c00 11 hours ago
        This exactly. Same as people do for sound: listen in the car, over shitty headphones, etc. That's just quality control, not the fault of any piece of equipment.
        • mikepurvis 10 hours ago
          Yes this is universal in pro mixing setups, having filters or even actual physical hardware to provide the sound of stock earbuds, a crappy Bluetooth speaker, sound system in a minivan, etc.
        • edb_123 7 hours ago
          Well, of course it's a good idea to double check with various output methods. But if a mix sounds good on studio monitors with the flattest possible frequency response (preferably even calibrated with an internal DSP) in an acoustically treated room, there's a very high probability it will sound good on almost anything out there. At least that's my experience.
          • spookie 2 hours ago
            I would recommend taking a look at the usual frequency response of cheap drivers, or the inherent flaws of consumer tech over time, and comparing it with the evolution of pop music.

            Audio engineers are for sure taking all this into account, and more (:

    • cosmic_cheese 3 hours ago
      I make a point of keeping my secondary monitor a "normal" DPI 2560x1440 display precisely to avoid this kind of problem. The loss of legibility has little impact on secondary monitor use cases, and I can easily spot-check my UI and UI graphics work by simply dragging the window over.

      High quality normal DPI monitors are so cheap these days that even if multi-monitor isn't one's cup of tea there's not really a good reason to not have one (except maybe space restrictions, in which case a cheap ~16" 1080p/1200p portable monitor from Amazon will serve the purpose nicely).

    • bn-l 6 hours ago
      This was (is?) the issue with Zed, and maybe the cause as well.
    • ponector 9 hours ago
      Also, it's not only about the screen resolution. Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

      I've reported many issues where, to reproduce them, they needed to enable 10x throttling in the browser. Or use a Windows machine.

      • throw0101d 9 hours ago
        > Developers use powerful Macs and users have old Windows machines - the usability is different, but devs usually don't care. Works fine on my machine!

        Part of what QA testing should be about: performance regressions.

        • ponector 4 hours ago
          Usually it is not a priority, especially for enterprise software. It's OK if the UI is lagging, a page finishes loading in 10 seconds, etc. Simply because there is usually no other choice; people have to use whatever is bought/developed.
        • branko_d 7 hours ago
          This is not even a regression - it has always been there, unnoticed!
          • throw0101d 4 hours ago
            If it has always been there, then your QA process has always sucked and missed it.
    • sz4kerto 8 hours ago
      This is exactly how sound studios do mixing. They don't just use top-end monitors -- they generally also listen on low-end speakers that color sound in a way that's representative of what people have at home (hello, Yamaha NS-10).
      • Intermernet 8 hours ago
        People used to buy NS-10s because they knew professional studios used them. They were then underwhelmed when they sounded worse than the hifi speakers they had at home.

        Many audio engineers live by the mantra "if it sounds good on NS-10s, it'll sound good on anything".

        We need such a touchstone for software engineers.

        • noir_lord 1 hour ago
          The problem is that it'd be a moving touchstone; speakers in the consumer space don't evolve as fast as computing tech in the user space.

          You could get somewhat close by looking at what was a middle of the road consumer laptop from Dell/HP/Lenovo 5 years ago and buying one of those though.

    • stephenr 11 hours ago
      Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

      The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.

      • mrbungie 11 hours ago
        Yeah sure, as long as you have a lot of resources for testing widely.

        Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.

        • stephenr 11 hours ago
          > if you were to make an analogy, you should target a few devices that represent the "average"

          For Macs, 220DPI absolutely is the average.

          • toast0 4 hours ago
            Sure, but Macs are around 10% of general desktop computing. To a first approximation, they don't count. User communities vary widely. If you target Macs, then a high DPI screen is a must for testing. Otherwise, I dunno; ~100 DPI screens are way less expensive than ~200 DPI screens, so I'd expect the installed base to be significantly higher for standard DPI. But there are probably enough high DPI users that it's worth giving it a look.

            To address a question elsewhere, personally, I don't see the benefit to pushing 4x the pixels when ~ 100 DPI works fine for me. My eyes aren't what they were 20 years ago, and it's just extra expense at every level.

          • swiftcoder 9 hours ago
            I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.

            One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.

            • gr4vityWall 9 hours ago
              I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.

              > One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world

              I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.

              > I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

              If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)

              • leguminous 7 hours ago
                If you look at the Steam hardware survey, most users (as in, > 50%) are still using 1080p or below.

                https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...

                • noir_lord 1 hour ago
                  In part, though, that's not because all those users can't afford >1080p - some of them can. It's that esports players often use insanely high refresh rate monitors, 1080p at >300Hz, and even the ones without such monitors still use 1080p because driving up the frame rate drives down the input latency.

                  Whether it matters is a bigger issue: from 30 to 60Hz I notice a huge difference; from 60 to 144Hz@4K I can't tell, but I'm old and don't play esports games.

                • swiftcoder 7 hours ago
                  I don't think this is contra to my original point. Nearly 50% of all users are running at greater-than 1080p resolutions, and presumably power users are overrepresented in the latter category (and certainly, it's not just the ~2.5% of Mac users pushing the average up)
                  • gr4vityWall 4 hours ago
                    FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:

                    > I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

                    I still see 1080p fairly often on new setups/laptops, basically, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high dpi.

              • wlesieutre 7 hours ago
                Writing this on 1280x1024 because it still works fine

                The 5:4 aspect ratio is weird, especially in this era of everything 16:9, but it's a second monitor so usually only has one thing open

            • eertami 5 hours ago
              If you have 20/20 vision, a 27" display at 1440p (~110 DPI) has a visual acuity distance of 79cm - i.e., if you are sitting 79cm or further away from the screen, you are not capable of resolving any extra detail from a higher resolution. High refresh rate 1440p IPS screens are very widely available at good prices, so it isn't that crazy that people choose them.

              Phone and laptop have higher DPI screens of course, but I'm not close enough to my desktop monitor for a higher DPI to matter.
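
              For anyone who wants to check the arithmetic, a quick sketch (assuming the classic 1-arcminute-per-pixel threshold for 20/20 vision):

                function ppi(widthPx: number, heightPx: number, diagonalInches: number): number {
                  return Math.hypot(widthPx, heightPx) / diagonalInches;
                }

                // Distance at which one pixel subtends exactly 1 arcminute.
                function acuityDistanceCm(pixelsPerInch: number): number {
                  const pixelPitchMm = 25.4 / pixelsPerInch;       // physical size of one pixel
                  const oneArcminRad = (1 / 60) * (Math.PI / 180); // 1 arcminute in radians
                  return pixelPitchMm / Math.tan(oneArcminRad) / 10;
                }

                const dpi = ppi(2560, 1440, 27);    // ≈ 109 PPI
                console.log(acuityDistanceCm(dpi)); // ≈ 80 cm, i.e. the figure above give or take rounding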

              • Dylan16807 56 minutes ago
                That's a common misconception. That acuity calculation is based on knowing nothing about the image; imagine applying it to arbitrary noise. When you have things like lines and edges your eyes can pick out differences an order of magnitude finer.

                https://en.wikipedia.org/wiki/Hyperacuity

              • swiftcoder 4 hours ago
                I'm running a 32" display at 4k, which works out to about the same at 79cm. Apparently a bunch of people sit really close to their monitors :)
            • MBCook 2 hours ago
              Absolutely everyone in my company uses 1080p monitors unless they got their own. That’s just “normal“.

              It’s horrible.

            • wmf 4 hours ago
              Retina still isn't available for large monitors like 38" and above.
      • MBCook 2 hours ago
        I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

        Stands out like a sore thumb.

    • mschuster91 8 hours ago
      This is just as valid for mobile app and website development.

      When all you use for testing is Browserstack, local emulators and whatnot, and only the latest iPhone and Samsung S-series flagship, your Thing will be unusable for large parts of the population.

      Always, always use at the very least the oldest iPhone Apple still supports, the cheapest and oldest (!) Samsung A-series models still being sold in retail stores as "new", and at least one Huawei and Xiaomi device. And then, don't test your Thing only on wifi backed by your Gbit Wifi 7 router and uplink. Disable wifi and limit mobile data to 2G or whatever is the lowest your phone provider supports.

      And then, have someone from QA visit the countryside with long stretches of no service at all or serious degradation (think packet loss rates of 60% or more, latencies of 2 seconds+). If your app survives this with minimal loss of functionality, you did good.

      A bunch of issues will only crop up in real-world testing. Stuff like opening fresh SSL connections for each interaction instead of keeping a single socket to the mothership open is the main bummer... latency really, really eats such bottlenecks alive. Forgotten async handling leading to non-responsiveness of the main application. You won't catch that, not even with Chrome's network inspector - you won't feel the sheer rage of the end user who has a pressing need and is let down by your Thing. Even if you're not responsible for their shitty phone service, they will associate the bad service with your app.

      Oh, and also test out getting interrupted while using your Thing on the cheap-ass phones. WhatsApp and FB Messenger calls, for example - these gobble so much RAM that your app or browser will get killed by OOM or the battery saver, and when the user is finished with their interruption, if you didn't do it right, your Thing's local state will have gotten corrupted or removed, leaving the user to start from scratch!

    • freejazz 3 hours ago
      >The problem with using this kind of monitor for any work that others will view on their own monitors is that your perception of what looks good will be way off.

      Really? It's not a problem for photo retouchers, for whom a monitor like this is basically designed.

  • ec109685 15 hours ago
    I don’t get marketing people. The only link in the press release is to Adobe’s Creative Cloud. Why isn’t it two taps to buy the monitor with Apple Pay and have it shipped when it’s available?

    > The redemption period ends August 31, 2026. For full details, visit https://www.asus.com/content/asus-offers-adobe-creative-clou....

    Well, the monitor is €8,999, so maybe it’d be more than two taps for me:

    > The monitor is scheduled to be available by October 2025 and will cost €8,999 in Europe (including VAT)

    • pjerem 12 hours ago
      Buy a 9k€ monitor and get 3 months of a cloud subscription for free. What a deal!
      • ryanjshaw 10 hours ago
        If you’re not careful, that adobe creative cloud sub will cost you more than the monitor when you try to cancel
    • gigatexal 14 hours ago
      Too rich for me. Also I don’t need a creative cloud sub. But I’m the wrong customer for such a monitor.

      I’ll wait till 8k becomes more of the norm for say 1-1.5k

      • nine_k 11 hours ago
        Human eye resolution is about 1 arcminute. The comfortable field of view is about 60°, or 3600 arcminutes. A 4K display should mostly suffice %)
        • refulgentis 3 hours ago
          Naw, not at 32".

          32" 4K isn't much better than a pre-Retina iPhone, even after accounting for the difference in viewing distance between, say, ~18" for a phone and 2-3 feet for a desktop.

          I agree that for many people - I might even go so far as to say "normies" - this sort of thing doesn't matter. But after many years of poking at this, I strongly believe it's not because their eyes can't see the difference; it's because they don't understand the question we're asking (i.e. about overall quality rather than detail density, and when you try explaining detail density, they think you're asking if a monitor "looks real", which sounds ~impossible).

          This whole thing is disappointing because all I've wanted for a decade is for 27" 5K to be mainstream and ubiquitous; that hits a sweet spot - surprisingly, only slightly less PPI than this, but at a much more reasonable price. They of course exist, it's just consistently fringe and Mac-focused, presumably due to the vagaries of HDMI.

          • nine_k 16 minutes ago
            All good points!

            5K requires a lot of bandwidth; some early 5K monitors required two DisplayPort connections, and Thunderbolt was Mac-specific and generally costly.

        • gigatexal 10 hours ago
          But I run 2x scaling on my Mac, so an 8K panel gives me an effective 4K workspace.
  • martinald 11 hours ago
    A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

    Why is this? 5k/6k at 27" would be the sweet spot for me, and potentially 8k at 32". However, I'm not willing to drop $2k per monitor to go from a very nice 27" 4k to 27" 5k.

    You can get 8K TVs for <$1000 now. And an Quest 3 headset has 2 displays at far higher PPI for $600.

    • throw0101d 9 hours ago
      > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K.

      There's been a bit of a 'renaissance' of 5K@27" in the last ~year:

      > In just the past few months, we've taken a look at the ASUS ProArt Display 5K, the BenQ PD2730S, and the Alogic Clarity 5K Touch with its unique touchscreen capabilities, and most recently I've been testing out another new option, the $950 ViewSonic VP2788-5K, to see how it stacks up.

      * https://www.macrumors.com/review/viewsonic-vp2788-5k-display...

      There are 15 monitors discussed in this video:

      * https://www.youtube.com/watch?v=EINM4EysdbI

      The ASUS ProArt PA27JCV is USD 800 (a lot less than $2k):

      * https://www.youtube.com/watch?v=ojwowaY3Ccw

    • Aurornis 7 hours ago
      > You can get 8K TVs for <$1000 now.

      8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.

      > And an Quest 3 headset has 2 displays at far higher PPI for $600

      Those displays are physically tiny. It's easier to deal with lower yields when each panel only takes a few square inches.

      Ultra high resolution desktop monitors would exist in the middle: Very small pixel sizes but also relatively large unit area.

      However, the demand side is also not there. There are already a number of 5K, 6K, and 8K monitors on the market. They’re just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high resolution monitors don’t sell well. That doesn’t incentivize more.

      If we get to a place where we could reliably plug a 6K monitor into any medium to high end laptop or desktop and it just works, there might be more. Until then, making a high res monitor is just asking for an extremely high return rate.

      • Kon5ole 5 hours ago
        >> You can get 8K TVs for <$1000 now.

        >8K at jumbo TV size has relatively large pixels compared to an 8K desktop monitor. It’s easier to manufacture.

        I don't think that's true.

        I've been using an 8K 55" TV as my main monitor for years now. It was available for sub-800 USD before all such TVs vanished from the market. Smaller pixels were not more expensive even then; the 55" models were the cheapest.

        4K monitors can be had for sub-200 USD; selling 4x the area of the same panel should cost at most 4x that price. And it was, years ago.

        So they were clearly not complicated or expensive to manufacture - but there was no compelling reason for having 8k on a TV so they didn't sell. However, there IS a compelling reason to have 8K on a desktop monitor!

        That such monitors sell for 8000 usd+ is IMO a very unfortunate situation caused by a weird incompetence in market segmentation by the monitor makers.

        I firmly believe that they could sell 100x as many if they cut the price to 1/10th, which they clearly could do. The market that never appeared for tv's is present among the world's knowledge workers, for sure.

        • dmayle 4 hours ago
          I've been using an 8k 65" TV as a monitor for four years now. When I bought it, you could buy the Samsung QN700B 55" 8k, but at the time it was 50% more than the 65" I bought (TCL).

          I wish the 55" 8k TVs still existed (or that the announced 55" 8k monitors were ever shipped). I make do with 65", but it's just a tad too large. I would never switch back to 4k, however.

          • murkt 2 hours ago
            What standard reliably works to drive 8K at 60 Hz, and how expensive are the cables?

            How far away do you sit from it? Does it sit on top of your desk? What do you put on all this space, and how do you handle it?

            I don't think you're maximizing one browser window over all 33 million pixels.

          • r0b05 3 hours ago
            What is the model number and how has the experience been?

            I've mostly read that TVs don't make great monitors. I have a TCL Mini LED TV which is great as a TV, though.

          • mrguyorama 2 hours ago
            What do you watch on an 8K TV?

            There's no content

            The average bitrate from anything that isn't a Blu-ray, even for HD, is not good, so you do not benefit from more pixels anyway. Sure, you are decompressing and displaying 8K worth of pixels, but the actual resolution of your content is more like 1080p anyway, especially in the color channels.

            Normally, games are the place where arbitrarily high pixel counts could shine, because you could literally ensure that every pixel is calculated and make real use of it, but that's actually stupidly hard at 4k and above, so nvidia just told people to eat smeary and AI garbage instead, throwing away the entire point of having a beefy GPU.

            I was even skeptical of 1440p at higher refresh rates, but bought a nice monitor with those specs anyway and was happily surprised by the improvement - though it's obviously diminishing returns.

      • nicoburns 3 hours ago
        > There are already a number of 5K, 6K, and 8K monitors on the market. They’re just not selling well. Between difficult software support for scaling legacy apps, compatibility issues with different graphics cards and cables, and the fact that normal monitors are good enough, the really high resolution monitors don’t sell well.

        They're available, but they never seem to have become a mass-market product at mass-market prices. The cheapest 5k monitor is at least double the price of the cheapest 4k monitor. And it was more like 4x until recently.

        You're probably right that we're starting to hit the point where people don't care though.

    • rickdeckard 9 hours ago
      Because the vast majority of monitor sales volume comes from (public) tenders by companies buying in huge volume, and those companies still mostly look for monitors below 4K (without fancy specs and without, e.g., USB-C).

      If 4K reaches mass market for those, the specs will shift down and there will be room in the (much smaller) premium-tier monitor segment.

      Heck, even if you just want USB-C and an integrated webcam on an average display, the price hike compared to one without them is crazy, because everything except those basic office monitors is still niche production...

    • ebbi 1 hour ago
      One of the best things I've done for my setup is convert old 5K iMacs to work as external displays.

      The only downside is the massive borders by today's standards, but it still has the Apple aesthetics, the 5K resolution is beautiful for my use cases (spreadsheets, documents, photo editing), and it has HDMI inputs so I can play PS5 on it.

    • 4ggr0 11 hours ago
      as a gamer 8k makes me sweat because i can't imagine what kind of hardware you'd need to run a game :O probably great for text-based work, though!
      • Aurornis 7 hours ago
        Once you get into the high pixel densities you stop running everything at native resolution. You have enough pixel density that scaling the output doesn’t produce significant visible artifacts.

        With 8K small pixels you could pick a number of resolutions up to 4K or higher and you wouldn’t even notice that the final product was scaled on your monitor.

        People with Macs with retina displays have been doing this for years. It’s really nice once you realize how flexible it is.

        • 4ggr0 7 hours ago
          i'm actually going to do the reverse move, was gaming on a 4K display, but going to downgrade to 3440x1440 to get more performance. but of course the gaming displays i find apparently aren't ideal for working, because text looks worse. add to that that the internet seems to be split if wide-monitors are the best thing ever or actually horrible. why is it all so complicated, man.
          • zamadatix 6 hours ago
            My only gripe is nearly all common "ultrawide" models should really be thought of as "ultrashort" in that they don't offer more width, just less height.

            E.g. a 21:9 ultrawide variant of 4k should really be 5040x2160. Instead they are nearly all 3840x1600. That may well be cost/price optimal for certain folks, I'm not saying it's bad for the product itself to exist, but nobody was looking at a 1600p monitor thinking "man, I wish they'd make a wider variant!" they started with 4k and decided it would be nice if it were permanently shortened.

            • 4ggr0 3 hours ago
              yeah, that really confused me as well. the whole 4K, 2K, 2.5K, ultrawide, ultrahigh, microwide, 8K shit just gets confusing, especially because it's neither accurate nor standardized.
            • tstrimple 1 hour ago
              I think they are calling those 5k2k monitors. I'm quite happy with my 45" LG 5k2k OLED monitor. Much more usable than the 32:9 monitors after seeing both in person.
          • simoncion 5 hours ago
            If the game offers it [0], set the output resolution to 4K, and the render resolution to something smaller. A multiplier of ~0.66 is roughly 1440p output, and 0.5 is 1080p.

            If the game doesn't offer that, then I've found that the HUD/UI uglification isn't too bad when one sets the output resolution to 1440p.

            If Windows is getting in the way of doing this, and most or all of your games have been purchased through Steam, give Linux a try. I've heard good things about the Fedora variant known as Bazzite, but have never, ever tried it myself. [1]

            [0] And shockingly few do! There's all sorts of automagic "AI" upscaling shit with mystery-meat knobs to turn, but precious few bog-standard "render everything but the HUD and UI with this many fewer pixels" options.

            [1] I've been using Gentoo for decades and (sadly) see no reason to switch. I strongly disrecommend Gentoo as a first Linux distro for most folks, and especially for folks who primarily want to try out Linux for video gaming.

            • 4ggr0 3 hours ago
              > set the output resolution to 4K, and the render resolution to something smaller

              doesn't that make everything blurry? that's the gripe i have with circa post-2020 PC gaming, barely any pc can run a AAA or AA game in native resolution and instead has to use artificial upscaling and whatnot. haven't specifically tried it.

              also can't test it anymore, as my gaming monitor is now our TV (48 inch OLED gaming TV, what a blast it was). now using my "old" 32in 2560x1440 IPS display, really miss OLED :( which is why i want to buy a new monitor. but i can't decide if i should take a 27in one (seems to be the 16:9 standard right now, but seems so small to me) or a ultrawide one. i switch games very frequently and also sometimes like to play old(er) games, so a bit scared of the "ultrawides are cool if your game supports it"-vibe...

              > I've heard good things about the Fedora variant known as Bazzite

              haha, this message was written on Bazzite, so i got that part covered :D switched about a month ago, funny to get the recommendation now.

              • simoncion 2 hours ago
                > doesn't that make everything blurry?

                My experience for the 3D parts of a great many games that my 5700 XT can't reasonably run at panel-native resolution is that the game's art style is to blur up the picture with all sorts of postprocessing (and sometimes (especially with UE5 games) with the ever-more-popular "it looks so bad it makes you wonder if the renderer is totally busted unless you leave TAA on" rendering technique). Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful.

                So, not that I notice? For the games that permit it, the HUD and menus stay nice and sharp, and the 3D stuff that's going to be all smudged up no matter what you do just renders at a higher frame rate.

                For games that don't have independent 2D and 3D render resolutions, I find 1440p to be quite tolerable, and (weirdly) 1080p to be much less tolerable... despite the fact that you'd expect it to fit nicely on the panel. I guess I'm expecting a much more crisp picture with integer scaling than I get? Or maybe this is like what some games did way back when where they totally change the font and render style of the UI once they get past some specific breakpoint. [0] I haven't looked closely at what's going on, so I don't have any even vaguely-reasonable explanation for it.

                > [ultrawide monitors]

                I like the comment someone else made somewhere that described them as "ultrashort" monitors. Personally, even if I was willing to move my head around enough to scan the whole damn monitor, I'm unwilling to lose so much vertical resolution. But as always, one should definitely choose what one likes.

                Personally, I find a 32" 3840 pixel wide monitor to be good. It's really great for doing work on, and perfectly acceptable for playing video games.

                > [Linux]

                Gratz on moving to Linux for gaming. Hope you don't have much trouble with it, and any trouble you have is either caused by super-invasive kernel-level anticheat that will never, ever work on Linux, or is trouble that's easy and/or fun to resolve.

                [0] One such game that sticks out in my memory is the OG Deus Ex game. At 1024x768, the font for the in-game UI switched from what -at lower resolutions- seemed a lot like a bitmapped font to what seemed a lot like a proper vector font. The difference was dramatic.

                • 4ggr0 54 minutes ago
                  > Sometimes this blurring ends up looking absolutely great, and other times it's just lazy, obnoxious, and awful

                  yeah, maybe i should give this way of setting the graphics a try. should try to find a game which looks great with it.

                  > I find a 32" 3840 pixel wide monitor to be good

                  just looked them up, surprised that they're quite a bit more expensive (800+ vs 600-750 for an ultrawide), but i guess the panels are more expensive due to the higher resolution. but your comment now makes me think what path i want to go. gotta read up on some opinions :D

                  > Hope you don't have much trouble with it

                  luckily i work on and with unix systems, so the new things are just the those related to gaming. but bazzite really has been very nice so far :) and as you say, the only times i had to boot up the windows on a separate disk are when i wanted to play games which don't run on linux at all, especially the kernel-level anticheat slopware.

                  but enough is enough. i've kept using windows at home just because of gaming, but i'm sick of M$. can't spend the whole day making fun of windows and then go home and game on it, feels dirty.

      • pornel 10 hours ago
        You don't really need 8K for gaming, but upscaling and frame generation have made game rendering resolution and display resolution almost independent.
        • jsheard 9 hours ago
          And if all else fails, 8K means you can fall back to 4K, 1440p or 1080p with perfect integer scaling.
          • layer8 9 hours ago
            Except that the hardware doesn’t necessarily offer perfect integer scaling. Oftentimes, it only provides blurry interpolation that looks less sharp than a corresponding native-resolution display.
            • jsheard 9 hours ago
              The monitor may or may not offer perfect scaling, but at least on Windows the GPU drivers can do it on their side so the monitor receives a native resolution signal that's already pixel doubled correctly.
            • Aurornis 7 hours ago
              Most modern games already have built-in scaling options. You can set the game to run at your screen’s native resolution but have it do the rendering at a different scale factor. Good games can even render the HUD at native resolution and the graphics at a scaled resolution.

              Modern OSes also scale fine.

              It’s really not an issue.

              • layer8 5 hours ago
                Games are not what I had in mind. Last time I checked, most graphics drivers didn’t support true integer scaling (i.e. nearest-neighbor, no interpolation).
                • jsheard 5 hours ago
                  > most graphics drivers didn’t support true integer scaling

                  https://www.nvidia.com/content/Control-Panel-Help/vLatest/en...

                  https://www.amd.com/en/resources/support-articles/faqs/DH3-0...

                  https://www.intel.com/content/www/us/en/support/articles/000...

                  I don't know what the situation is on Mac and Linux, but all of the Windows drivers offer it.

                • Aurornis 4 hours ago
                  With very high PPI displays the gamma corrected interpolation scaling is far better than nearest neighbor scaling.

                  The idea is to make the pixels so small that your eyes aren’t resolving individual pixels anyway. Interpolation appears correct to your eyes because you’re viewing it through a low-pass filter (the physical limit of your eyes) anyway.

                  Reverting to nearest neighbor at high PPI would introduce new artifacts because the aliasing effects would create unpleasant and unnatural frequencies in the image.

                  Most modern GPU drivers (nVidia in particular) will do fixed multiple scaling if that’s what you want. Nearest neighbor is not good though.
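
                  If you want to eyeball the difference, a minimal canvas sketch (the browser's smoothing filter is bilinear-ish, not the gamma-corrected filtering GPU drivers use, so treat it as an illustration of the trade-off rather than of driver behavior; "testPattern" is whatever canvas you drew a sample image into):

                    // Upscale a small source canvas two ways: nearest neighbor vs. smoothed.
                    function upscale(src: HTMLCanvasElement, factor: number, smooth: boolean): HTMLCanvasElement {
                      const dst = document.createElement('canvas');
                      dst.width = src.width * factor;
                      dst.height = src.height * factor;
                      const ctx = dst.getContext('2d')!;
                      ctx.imageSmoothingEnabled = smooth; // false = blocky nearest-neighbor scaling
                      ctx.imageSmoothingQuality = 'high'; // only matters when smoothing is enabled
                      ctx.drawImage(src, 0, 0, dst.width, dst.height);
                      return dst;
                    }

                    // Side by side: document.body.append(upscale(testPattern, 4, false), upscale(testPattern, 4, true));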

    • swiftcoder 9 hours ago
      > and potentially 8k at 32"

      What's your actual use-case for this? I run a 32" 4K, and I have to stick my nose within a foot (~30cm) of the display to actually spot individual pixels. Maybe my eyesight isn't what it used to be

      I'd kill for a 40" 5k or 6k to be available - that's significantly more usable desktop real estate, and I still wouldn't be able to see the pixels.

      • ak217 6 hours ago
        Pixels are very noticeable at 32" 4K. If you don't notice them, your eyes still do - they try to focus on blurry lines, causing eye strain. You might not notice, but it adds up over the years.

        It's simple math. A 32" 4K monitor is about 130 PPI. Retina displays (where you could reasonably say the pixels are not noticeable, and the text is sharp enough to not strain the eyes) start at 210 PPI.

        Subjectively, the other problem with 32" 4K (a very popular and affordable size now) is that the optimal scaling is a fractional multiple of the underlying resolution (on macOS - bizarrely, I think Windows and Linux both handle this better than macOS), which again causes blur and a small performance hit.

        I myself still use an old 43" 4K monitor as my main one, but I know it's not great for my eyes and I'd like to upgrade. My ideal would be a 40" or 42" 8K. A 6K at that size would not be enough.

        I am very excited about this 32" 6K Asus ProArt that came out earlier this year: https://www.asus.com/displays-desktops/monitors/proart/proar... - it finally gets Retina-grade resolution at a more reasonable price point. I will probably switch to two of these side-by-side once I can get them below $1K.

        • swiftcoder 6 hours ago
          > It's simple math. A 32" 4K monitor is about 130 PPI. Retina displays (where you could reasonably say the pixels are not noticeable, and the text is sharp enough to not strain the eyes) start at 210 PPI.

          It's also incorrectly applied math. You need to take into account the viewing distance - the 210 PPI figure often quoted is for smartphone displays (at the distance one typically holds a smartphone).

          For a 32" monitor, if your eyeballs are 36" away from the monitor's surface, you are well beyond the limit of normal visual acuity (and the monitor still fills a massive 42 degrees of your field of view).
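
          The viewing-distance math is easy to sanity-check; a rough sketch (flat-panel approximation, ppd averaged over the horizontal field of view):

            // Pixels per degree for a monitor of given diagonal, resolution and viewing distance.
            function pixelsPerDegree(diagonalIn: number, horizPx: number, vertPx: number, viewDistIn: number): number {
              const widthIn = diagonalIn * horizPx / Math.hypot(horizPx, vertPx); // physical width
              const fovDeg = 2 * Math.atan(widthIn / 2 / viewDistIn) * (180 / Math.PI);
              return horizPx / fovDeg;
            }

            console.log(pixelsPerDegree(32, 3840, 2160, 36)); // ≈ 91 ppd (and ~42° of horizontal FoV)
            console.log(pixelsPerDegree(32, 3840, 2160, 18)); // ≈ 51 ppd if you sit twice as close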

          • ak217 5 hours ago
            Take a look at this article: https://www.nature.com/articles/s41467-025-64679-2 - the limits at "normal visual acuity" (18 observers ~25 years old) are far beyond what you imply. You need over 95 ppd to exhaust normal visual acuity.

            > For a 32" monitor, if your eyeballs are 36" away from the monitor's surface

            Why are you assuming 36"? Nobody I know uses 32" monitors at 36" away. Most people use less than half that distance for their laptops, and just over half for desktops.

            > the 210 PPI figure often quoted is for smartphone displays

            The 210 PPI figure is a minimum; it was used as marketing when Apple first started offering Retina displays. Apple's modern iPhone displays have far higher PPI. Apple's own marketing was challenged by critics who noted that visual acuity may top out closer to 200 ppd.

            Perhaps Retina doesn't matter to you - that's OK. But for most of us, 32" 4K is nowhere near the limit of our vision, and by staring at these monitors all day, we are slowly degrading it.

            • eertami 4 hours ago
              > and by staring at these monitors all day, we are slowly degrading it

              Yes, but that is probably accelerated more by sitting closer to screens than is healthy for too long, than it is by the resolution of the screen. It's anecdata so maybe truly everyone you know does sit 45cm away from a desktop monitor - but I can't say I've ever experienced that.

              Of course, if you do sit that close then higher resolution is resolvable. Perhaps what your statement actually should be is: "Perhaps Retina doesn't matter if you sit at a (perfectly comfortable and healthy) further distance from the screen - that's OK"; otherwise a reader may think you are trying to imply the OP is somehow inferior, when really the only thing that differs is your viewing distance.

            • swiftcoder 5 hours ago
              > You need over 95 ppd to exhaust normal visual acuity

              32" 4K at 36" is 91 ppd. Which I guess is good enough, seeing as I'm well on the far side of 25 years old.

              > Why are you assuming 36"? Nobody I know uses 32" monitors at 36" away.

              36" is the point where I can see all 4 corners of the monitor at the same time (and it's still significantly too close to focus on one corner and have the other 3 corners in view at the same time).

              40 degrees of FoV is massive for a single monitor! I'm sitting here wondering how much you have to turn your head to use this size monitor up close

              • ak217 5 hours ago
                I actually have two more monitors, one on each side of my main one, in portrait mode :) And yes, I turn my head when I want to see them.

                I'm glad the low resolution monitors work for you. I just don't want people to proclaim that everything about displays is solved - it's not. There are meaningful, physiologically relevant improvements to be made. It's been over a decade since 4k60 became the standard. A lot of younger people would really benefit from mass produced 6k120 monitors.

              • Aurornis 5 hours ago
                > 40 degrees of FoV is massive for a single monitor! I'm sitting here wondering how much you have to turn your head to use this size monitor up close

                You move your eyes, not your head. Plus or minus 20 degrees is a trivial amount of eye movement.

                Most people are fine with this. Your requirement to comfortably see everything with minimal eye/head movement is atypical.

                Even if you do have to move your head, that’s not a bad thing. A little head movement during long computing sessions is helpful.

                • swiftcoder 3 hours ago
                  > You move your eyes, not your head. Plus or minus 20 degrees is a trivial amount of eye movement.

                  Maybe this varies a lot between humans, because I'm trying the experiment, and any closer than 24 inches requires physically moving my head to comfortably read text in the corner of the 32" display.

                  Even at 36" it's fatiguing to focus on a corner of the display solely through eye-movement for more than a few seconds.

                  > Your requirement to comfortably see everything with minimal eye/head movement is atypical

                  I don't think it's by any means an uncommon requirement. Movie-watchers want to be able to see the whole screen at once (with the exception of some intentionally-over-the-top IMAX theatres), gamers want to be able to see their radar/health/ammo/etc in the corners of the screen. I'd like to be able to notice notifications arriving in the corner of the screen.

            • simoncion 5 hours ago
              > Nobody I know uses 32" monitors at 36" away.

              I suppose it's still true that nobody you know uses monitors of that size three feet away, but I'm very definitely one of those people.

              Why on earth would you put the monitor so close to your face that you have to turn your head to see all of it? That'd be obnoxious as all hell.

              > ...by staring at these monitors all day, we are slowly degrading it.

              No, that's age. As you age, the tissues that make up your eye and the muscles that control it fail more and more to get rebuilt correctly. I think the colloquial term for this is that they "wear out". It sucks shit, but we're currently too bad at bioengineering to really stop it.

      • FuriouslyAdrift 6 hours ago
        This is the only large true monitor I know of. It used to be branded by Acer, but now it is branded through Viewsonic. We have a bunch at work and everyone loves them. $570 for 43" 4K

        https://www.viewsonic.com/us/vx4381-4k-43-4k-uhd-monitor-wit...

      • Aurornis 7 hours ago
        > I'd kill for a 40" 5k or 6k to be available

        There are a number of 40” 5K wide monitors on the market. They have the same vertical resolution as a 4K but with more horizontal pixels.

        • swiftcoder 6 hours ago
          Yeah. I guess that's the way. I'm not wild about such a wide aspect ratio, and all the head-turning or chair-swivelling it implies.
    • layer8 9 hours ago
      The likelihood of dead pixels increases with the pixel count, i.e. quadratically with linear resolution, so panel yield drops correspondingly. In addition, the target audience that has hardware (GPUs) capable of driving those resolutions is smaller.
    • Paianni 10 hours ago
      The Asus PA27JCV is rather less than $2k...
    • mschuster91 8 hours ago
      > A friend and I were just chatting about how annoying it is that monitors stalled out at 4K. I think I got my first set of 4K monitors ~15 years ago (!) and there have been no improvements since then, resolution-wise, apart from high-end pro monitors.

      Multiple reasons.

      The first one being yield - yes you can get 8K screens, but the larger they get, the more difficult it is to cut a panel with an acceptably low rate of dead/stuck pixels out of a giant piece of glass. Dead pixels are one thing and bad enough, but stuck-bright pixels ruin the entire panel because they will be noticeable in any dark-ish movie or game scene. That makes them really darn expensive.

      The second reason is the processing power required to render the video signal to the screen, aka display controllers. Even if you "just" take regular 8-bit RGB, each frame is about 33 million pixels, so 796,262,400 bits per frame. Even at just 30 FPS, you're talking about 23,887,872,000 bits per second - roughly 24 gigabits/s. It takes an awful, awful lot of processing power just to shuffle that data from the link SerDes around to all the control lines and to make sure they all switch their individual pixels at the very same time.

      The third is transferring all the data. Even if you use compression and sub-sampling, you still need to compress and sub-sample the framebuffer on the GPU side, transfer up to 48 GBit/s (HDMI 2.1) or 77 GBit/s (DP 2.1) of data, and then decompress it on the display side. If it's HDCP-encrypted, you need to account for that as well - encrypting and decrypting at such line speeds used to be unthinkable even two decades ago. The fact that the physical transfer layer is capable of delivering such data rates over many meters of copper cable of varying quality is nothing short of amazing anyway.
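
      To put rough numbers on that (a back-of-the-envelope sketch; raw pixel data only, ignoring blanking intervals and link-encoding overhead):

        // Uncompressed RGB pixel data rate in Gbit/s.
        function rawGbps(width: number, height: number, hz: number, bitsPerChannel = 8): number {
          return (width * height * 3 * bitsPerChannel * hz) / 1e9;
        }

        console.log(rawGbps(3840, 2160, 60));     // 4K60, 8-bit  ≈ 11.9 Gbit/s
        console.log(rawGbps(7680, 4320, 30));     // 8K30, 8-bit  ≈ 23.9 Gbit/s (the figure above)
        console.log(rawGbps(7680, 4320, 60, 10)); // 8K60, 10-bit ≈ 59.7 Gbit/s - beyond HDMI 2.1's 48 Gbit/s link, hence DSC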

      And the fourth is generating all the data. You need absurdly high definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at-rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p).

      What's stopping further progress? Other than yield and simple physics (similar to microchips, the finer the structures get the more difficult and expensive it is to make them), the most pressing issue is human visual acuity - even a human with very good vision can only make useful sense of about 74 of the theoretical 576 megapixels [1]. As we already established, 8K is at 33-ish megapixels, so the next quadrupling (16K, at roughly 133 megapixels) would already be far too detailed for 99.999% of humans to perceive.

      Yes, you could go for intermediate sizes. 5K, 6K, weird aspect ratios, whatever - but as soon as you go there, you'll run into issues with video content because it can't be up- or downscaled to such intermediates without a perceptible loss in quality and, again, a lot of processing power.

      [1] https://clarkvision.com/articles/eye-resolution.html

      • Aurornis 7 hours ago
        > And the fourth is generating all the data. You need absurdly high definition textures, which requires lots of VRAM, lots of regular RAM, lots of disk I/O, lots of disk storage (your average AAA game is well beyond 100GB of data at-rest for a reason!), and then render power to actually render the scene. 8K has 16x (!) the pixels of regular FullHD (1080p).

        You don’t need to scale everything up to match the monitor. There are already benefits to higher resolution with the same textures for any object that isn’t directly next to the player.

        This isn’t a problem at all. We wouldn’t have to run games at 4K.

      • zamadatix 7 hours ago
        About half of these reasons imply that sub-$2000 8K TVs shouldn't exist, but they do.
        • Aurornis 7 hours ago
          The individual pixels on a 60 inch 8K TV are the same size as the pixels on a 30 inch 4K computer monitor (quick math below). Most 8K TVs are even bigger than that, so their individual pixels are already easier to manufacture than those of your average 4K monitor or laptop screen.

          You can’t compare large TVs to medium size computer monitors.
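
          Quick math, for anyone who wants to verify the pixel-pitch claim (diagonal in pixels divided by diagonal in inches):

              from math import hypot

              # Pixels per inch for a panel, given its resolution and diagonal in inches.
              def ppi(width, height, diagonal_inches):
                  return hypot(width, height) / diagonal_inches

              print(f'60" 8K TV:      {ppi(7680, 4320, 60):.0f} ppi')  # ~147 ppi
              print(f'30" 4K monitor: {ppi(3840, 2160, 30):.0f} ppi')  # ~147 ppi
              print(f'32" 8K monitor: {ppi(7680, 4320, 32):.0f} ppi')  # ~275 ppi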

          • simoncion 5 hours ago
            > You can’t compare large TVs to medium size computer monitors.

            When half of those four reasons don't require having a PC attached to the display, and three of the four have nothing to do with the panel manufacturing process, you totally can.

    • littlestymaar 10 hours ago
      > Me and a friend were just chatting how annoying it is monitors stalled out at 4K. I think I got my first set of 4k monitors ~15 years ago (!) and there's been no improvements since then apart from high end pro monitors resolution wise.

      It's mostly because the improvement over 4k is marginal. In fact, even from 1920x1080 it's not so big of a deal, which is why people keep buying such monitors in 2025.

      And what's worse is that the highest-spending consumer segment of PC parts, the gamers, can't really use high-resolution displays at their full potential because it puts such a burden on the GPU (DLSS helps, but the result is even less of an improvement over 1920x1080 than native 4K is).

    • znpy 8 hours ago
      Ah yes. It’s the same with memory… 8GB/16GB is incredibly common, even though 16GB of memory was a thing in like 2008 already. It’s only with high-end machines that you get 64/128GB of memory, which should be much more common in my opinion.
  • qaq 14 hours ago
    The 6K 32" ProArt model PA32QCV might be more practical for the HN crowd at 1,299 USD, vs. the 8-9K USD the PA32KCX will run you.
    • retrac98 13 hours ago
      An aside - this monitor is proving surprisingly difficult to buy in the UK. Everywhere I look it seems to be unavailable or out of stock, and I’ve been checking regularly.

      Relatedly, I also don’t understand why a multi-billion-dollar company makes it so hard to give them my money. There’s no option to order from ASUS directly on the UK site. I’m forced to check lots of smaller resellers or Amazon.

      • brzz 9 hours ago
        Struggling with the exact same issue myself. If you do find a place to buy it, please let me know
      • qaq 13 hours ago
        Was same in US till maybe 2-3 weeks ago. Maybe they are slowly rolling out to various markets
      • ErneX 8 hours ago
        Same in Spain, I got tired of looking for it.
    • tom_alexander 9 hours ago
      I'm not buying a new monitor with a decade-old version of DisplayPort. Non-OLED monitors are products that last a long time (at least a decade), so if I bought this monitor, I'd still be using DisplayPort 1.4 from 2016 in 2036. I need UHBR20 on a new monitor so I can rest assured that I will have some lanes available for my other peripherals (rough lane budgets sketched below). I've already lived the hell of needing to dedicate all 4 lanes to DisplayPort, leaving only a single USB 2.0 connection remaining for all my other peripherals to share[0][1].

      [0] https://media.startech.com/cms/products/gallery_large/dk30c2...

      [1] https://i.imgur.com/iGs0LbH.jpeg
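
      Rough lane budgets for that trade-off, assuming the commonly quoted per-lane rates (HBR3: 8.1 Gbit/s with 8b/10b coding; UHBR20: 20 Gbit/s with 128b/132b coding); treat the exact figures as approximations:

          # USB-C DP alt mode can dedicate all 4 high-speed lanes to DisplayPort
          # (leaving only USB 2.0) or split 2 lanes to DP and 2 lanes to USB 3.x.
          HBR3 = 8.1 * 8 / 10       # ~6.48 Gbit/s usable per lane (8b/10b)
          UHBR20 = 20 * 128 / 132   # ~19.39 Gbit/s usable per lane (128b/132b)

          for name, per_lane in (("HBR3", HBR3), ("UHBR20", UHBR20)):
              print(f"{name}: 4 lanes = {4 * per_lane:.1f} Gbit/s, 2 lanes = {2 * per_lane:.1f} Gbit/s")
          # Two UHBR20 lanes (~38.8 Gbit/s) beat four HBR3 lanes (~25.9 Gbit/s),
          # which is what would leave half the lanes free for USB 3.x peripherals.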

      • Aurornis 7 hours ago
        I also wish it had something newer, but for that price I’d gladly deal with a second cable for high speed USB devices or the purchase of a dock to handle breakout duties.
      • simoncion 5 hours ago
        > I'm not buying a new monitor with a decade-old version of DisplayPort.

        With the greatest of respect, this is a deeply silly way to think of it.

        The way you should be thinking of it is:

        > I'm not buying a new monitor that requires DSC to run at native resolution. That's fucking garbage.

        Since DP 1.4, the only thing the DisplayPort version indicates that an end-user gives a shit about is the maximum supported link speed. So, if all you need is HBR3 to drive a display at its native resolution, refresh rate, and maximum bit depth without fucking DSC, then DisplayPort 1.4 will be just fine. And if DSC doesn't bother you, then your range of acceptable displays is magically widened!

    • lwhsiao 5 hours ago
      I second this, I recently switched [1] and have been delighted by the crisp fonts.

      [1]: https://luke.hsiao.dev/blog/pa32qcv/

    • zokier 12 hours ago
      I'd imagine for most people the HDR perf difference is more noticeable than the resolution. This new monitor can do 1200 nits peak with local dimming, PA32QCV can only do 600 nits peak with no local dimming. Also Dolby Vision.
      • qaq 8 hours ago
        I'd imagine most people can't spend 9,000 USD on a monitor
    • WilcoKruijer 10 hours ago
      I've been enjoying the PA32QCV in the last couple months. It's definitely not perfect, but the 220 PPI at 32 inch is just amazing to code on.
  • fleventynine 17 hours ago
    No mention of 120Hz; I'm waiting for a 6k or higher-density display that can do higher refresh rates.
    • dietr1ch 17 hours ago
      I was going to joke about 8k@120Hz needing like 4 video cables, but it seems we are not too far from it.

      [8k@120Hz Gaming on HDMI 2.1 with compression](https://wccftech.com/8k-120hz-gaming-world-first-powered-by-...)

      > With the HDMI 2.2 spec announced at CES 2025 and its official release scheduled for later this year, 8K displays will likely become more common thanks to the doubled (96 Gbps) bandwidth.

      • FootballMuse 13 hours ago
        Uncompressed, absolutely: we'd need another generation bump to over 128Gbps for 8K@120Hz with HDR. But with DSC it's possible on HDMI 2.1 and the more recent DisplayPort 2.0 standards, though support isn't quite there yet.

        Nvidia quotes 8K@165Hz over DP for their latest generation. AMD has demoed 8K@120hz over HDMI but not on a consumer display yet.

        https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_...

        https://en.wikipedia.org/wiki/HDMI#Refresh_frequency_limits_...

        https://www.nvidia.com/en-gb/geforce/graphics-cards/compare/

      • ternus 15 hours ago
        My primary monitor is the Samsung 57" 8Kx2K 240Hz ultrawide. That's the same amount of bandwidth, running over DisplayPort 2. It mostly works!
        • sfjailbird 11 hours ago
          I have three 4K 27" which yield a bit more screen real estate. Otherwise I'd love to go to a single ultrawide.
          • dietr1ch 6 hours ago
            I prefer 3 monitors because it eases window management while being cheaper. For gaming I only need one 240Hz+ monitor, and for LAN parties I only take that one.

            Although for sim racing I've been thinking about getting a single ultra wide and high refresh rate monitor, but I'd probably go for a dedicated setup with a seat, monitor and speakers. It gets pricey, but cheaper than crashing IRL.

            • sfjailbird 3 hours ago
              Yeah, window management is certainly better with separate monitors; hopefully this will get better with time.

              On the flip side, I would love to get rid of physically managing three individual pieces of hardware that weren't made to work together as one setup.

        • _zoltan_ 14 hours ago
          I use the same monitor and I love it. Couldn't recommend it more.
        • wellthisisgreat 12 hours ago
          Is it actually good for productivity? The curve isn’t too aggressive? Could you, e.g., stack 3 independent windows and use all 3? Or do you kind of give up on the leftmost / rightmost edges?
          • dietr1ch 4 hours ago
            I think window managers these days do a better job on 3 monitors than on a single one that could have the same area.

            With an ultrawide you lose the screen as a unit for managing area, and it gets awkward: you lose grouping windows on different screens, picking per-monitor workspaces, and moving windows across screens.

            Either monitors need to present themselves as multiple screens, or window managers need to come up with virtual screens to regain the much needed screen abstraction.

        • mohsen1 14 hours ago
          Fifty seven inches??
          • DrProtic 13 hours ago
            Just two 4k monitors slapped together, it’s 8k wide but 2k tall.
      • Dylan16807 17 hours ago
        Also, as far as 6K goes, that's only about 60% of the bandwidth of 8K.
      • kondro 15 hours ago
        Thunderbolt 5 supports up to 120Gbps one-way.
        • usrusr 7 hours ago
          Just don't try putting something convenient in between, at least that's what my adventures in TB4 taught me: DisplayPort from a TB port works fine, even when DP goes to a multiscreen daisychain and the TB does PD to the laptop on the side, but try multiscreen through a hub and all bets are off. I think it's the hubs overheating, and I've seen that even on just 2x FHD (OK, that one was on a cheap non-TB hub, but I also got two certified TB4 hubs to fail serving 2x "2.5K" (2560x1600)). And those hubs are expensive; I believe they all run the same Intel chipset.
        • throw0101d 6 hours ago
          > Thunderbolt 5 supports up to 120Gbps one-way.

          To clarify, there are two options for allocating bandwidth:

          * 80Gbps both up- and downstream

          * 120Gbps in one direction (up or down), and 40 in the opposite

          See:

          * https://en.wikipedia.org/wiki/Thunderbolt_(interface)#Thunde...

      • ranger_danger 12 hours ago
        > 4 video cables

        The IBM T220 4k monitor required 4 DVI cables.

        https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors

    • ryukoposting 17 hours ago
      I wouldn't hold my breath. Competing models seem to top out around 120 Hz but at lower resolutions. I don't imagine there's a universal push for higher refresh rates in this segment anyway. My calibrated displays run at 60 Hz, and I'm happy with that. Photos don't really move much, y'know.
      • eviks 16 hours ago
        > Photos don't really move much, y'know.

        They do when you move them (scroll)

        • justsomehnguy 16 hours ago
          And?

          Can you provide a ROI point for scrolling photos at 120Hz+ ?

          • klausa 15 hours ago
            It looks and feels much better to many (but not all) people.

            I don't really know how you expect that to translate into a ROI point.

          • eviks 13 hours ago
            Sure, give me your ROI point for an extra pixel and I can fit refresh rate in there.
      • klausa 16 hours ago
        I imagine your mouse still moves plenty though.
        • ryukoposting 4 hours ago
          If "the mouse looks nice" is the selling point, I'm not sold.
  • tombert 16 hours ago
    I swore a blood oath that I would never buy an Asus product ever again, after three terrible laptops from them in a row, but holy hell do I kind of want this monitor.

    My main "monitor" right now is an 85" 8K TV, that I absolutely love, but it would be nice to have something smaller for my upstairs desk.

    • mnw21cam 10 hours ago
      I have a fantastic Asus laptop that is 8 years old now and (after an easy battery replacement) easily does everything I want from it and feels nice and solid. I was so impressed that I recommended Asus to someone else, and what they got was pretty awful.

      So basically, YMMV. They make good stuff, and they make awful stuff.

    • ssivark 16 hours ago
      What are the cons of having a large TV as a monitor? I've been considering something like this recently, and I wonder why is this not more common.
      • bee_rider 15 hours ago
        Someone mentioned the latencies for gaming, but also I had a 4K TV as a monitor briefly that had horrible latency for typing, even. Enough of a delay between hitting a key and the terminal printing to throw off my cadence.

        Only electronic device I’ve ever returned.

        Also they tend to have stronger than necessary backlights. It might be possible to calibrate around this issue, but the thing is designed to be viewed from the other side of a room. You are at the mercy of however low they decided to let it go.

        • ycombinete 12 hours ago
          You could probably circumvent this by putting the display into Gaming Mode, which most TVs have. It removes all the extra processing that TVs add to make the image "nicer". These processes add a hell of a lot of latency, which is obviously just fine for watching TV, but horrible for gaming or using as a pc monitor.
          • bee_rider 10 hours ago
            It was a while ago (5 years?), so I can’t say for certain, but I’m pretty sure I was aware of game mode at the time and played with the options enough to convince myself that it wasn’t there.
        • xnx 11 hours ago
          > horrible latency for typing

          Was this the case even after enabling the TV's "game mode" that disables a lot of the latency-inducing image processing (e.g. frame interpolation)?

          • sim7c00 10 hours ago
            Game mode is a scam. It breaks display quality on most TVs, and still doesn't respond as fast as a PC monitor with <1ms latencies... it might drop itself to 2 or 3 ms, which is still at least 2x or 3x slower.

            You can think "but that's inhumanly fast, you won't notice it", but in reality this is _very_ noticeable in games like Counter-Strike, where hand-eye coordination, speed and pinpoint accuracy are key. If you play such games a lot, you will feel it if the latency goes above 1ms.

            • eurleif 10 hours ago
              Where are you finding monitors with <1ms input lag? The lowest measured here is 1.7ms: https://www.rtings.com/monitor/tests/inputs/input-lag
              • Dylan16807 22 minutes ago
                They measure in a particular way that includes half a frame of unavoidable lag. There are reasons to do it that way, but it's not objectively the "right" way to do it.

                Rtings basically gives you a number that represents average lag without screen tearing. If you measure at the top of your screen and/or tolerate tearing then the numbers get significantly smaller, and a lot of screens can in fact beat 1ms.

              • theshackleford 10 hours ago
                Most people lack an understanding of displays and of what they are quoting, and are in fact quoting the vendor's claimed pixel response time as the input lag.

                It’s gotta be one of the most commonly mixed-up things I’ve seen in the last twenty years as an enthusiast.

                • sim7c00 9 hours ago
                  Well, at least I didn't misunderstand my own lack of understanding :D

                  The part about feeling the difference in response times is true though, but I must say, my experience is a bit dated ^^ I see more high-resolution monitors generally have quite slow response times.

                  <1ms was from CRT times :D which were my main Counter-Strike days. I do still find noticeable 'lag' on a TV vs. a monitor, though I've only tested at HD (1080p) - I own only one 4K monitor, and my own age-induced latency by now far exceeds my display's latency :D

              • sim7c00 10 hours ago
                false advertisements :D
      • swiftcoder 9 hours ago
        Depending on the specific TV, small details like text rendering can be god-awful.

        A bunch of TVs don't actually support 4:4:4 chroma (i.e. no subsampling), and at 4:2:2 or 4:2:0 text is bordering on unreadable.
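
        A rough illustration of why that hurts text in particular: subsampling stores the colour (chroma) channels at reduced resolution, so single-pixel coloured edges around glyphs get smeared. A small sketch of the effective chroma grid:

            # Effective chroma-plane resolution for common subsampling schemes on a 4K signal.
            # (sh, sv) = horizontal and vertical chroma subsampling factors.
            schemes = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2)}

            w, h = 3840, 2160
            for name, (sh, sv) in schemes.items():
                print(f"{name}: luma {w}x{h}, chroma {w // sh}x{h // sv}")
            # 4:2:0 leaves only a 1920x1080 chroma grid on a 4K panel: fine for video,
            # rough on thin coloured text edges.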

        And a bunch of OLEDs have weird sub-pixel layouts that break ClearType. This isn't the end of the world, but you end up needing to tweak the OS text rendering to clean up the result.

      • tombert 15 hours ago
        I'm sure there are reasons with regards to games and stuff, but I don't really use this TV for anything but writing code and Slack and Google Meet. Latency doesn't matter that much for just writing code.

        I really don't know why it's not more common. If you get a Samsung TV it even has a dedicated "PC Mode".

        • zf00002 3 hours ago
          A bunch of the mechanical engineers at my work have switched from 2 monitors to big TVs for doing their CAD stuff.
        • baq 11 hours ago
          "PC Mode" or "Gaming mode" or whatever is necessary - I can tell any other mode easily just by moving the mouse, the few frames of lag kill me inside. Fortunately all tvs made in this decade should have one.
        • mrguyorama 2 hours ago
          Lots of us HAVE tried using a TV as a primary monitor, I did for years.

          Then I bought a real display and realized oh my god there's a reason they cost so much more.

          "Game mode" has no set meaning or standard, and in lots of cases can make things worse. On my TV, it made the display blurry in a way I never even noticed until I fixed it. It's like it was doing N64 style anti-aliasing. I actually had to use a different mode, and that may have had significant latency that I never realized.

          Displays are tricky, because it can be hard to notice how good or bad one is without a comparison, which you can't do in the store because they cheat display modes and display content, and nobody is willing to buy six displays and run tests every time they want to buy a new display.

      • terribleperson 15 hours ago
        If you play video games, display latency. Most modern TVs offer a way to reduce display latency, but it usually comes at the cost of various features or some impact to visual quality. Gaming monitors offer much better display latencies without compromising their listed capabilities.

        Televisions are also more prone to updates that can break things and often have user hostile 'smart' software.

        Still, televisions can make a decent monitor and are definitely cheaper per inch.

      • jmarcher 15 hours ago
        For me, on macOS, the main thing is that the subpixel layout is rarely the classic RGB (side by side) that macOS only supports for text antialiasing.

        If I were to use a TV, it would be an OLED. That being said, the subpixel layout is not great: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...

        • bestham 15 hours ago
          IIRC Apple dropped subpixel antialiasing in Mojave (I hate these names). It makes no sense when Macs are meant to be used with Retina-class displays.
          • ahoka 12 hours ago
            A.K.A. workaround for a software limitation with hardware. Mac font rendering just sucks.
      • sim7c00 11 hours ago
        High latency on TVs makes them bad for games etc., as anything that's sensitive to IO timing can feel a bit off. Even 5ms compared to 1 or 2ms response time is quite noticeable in hand-eye coordination across IO -> monitor.
        • puzzlingcaptcha 10 hours ago
          It sort of depends on what you perceive as 'high'. Many TVs have a special low-latency "game" display mode. My LG OLED does, and it's a 2021 model. But OLED in general (in a PC monitor as well) is going to have higher latency than IPS for example, regardless of input delay.
          • Dylan16807 14 minutes ago
            > But OLED in general (in a PC monitor as well) is going to have higher latency than IPS for example, regardless of input delay.

            I hope you mean lower? An OLED pixel updates roughly instantly while liquid crystals take time to shift, with IPS in particular trading away speed for quality.

          • tombert 2 hours ago
            I have a MiSTer Laggy thing to measure TV latency. My bedroom Vizio LCD thing, in Game Mode, is between 18-24ms, a bit more than a frame of latency (assuming 60fps).

            I don’t play a lot of fast paced games and I am not good enough at any of them to where a frame of latency would drastically affect my performance in any game, and I don’t think two frames of latency is really noticeable when typing in Vim or something.

          • TheOtherHobbes 10 hours ago
            OLED suffers from burn-in, so you'll start seeing your IDE or desktop after a while, all the time.

            I have a couple of budget vertical Samsung TVs in my monitor stacks.

            The quality isn't good enough for photo work, but they're more than fine for text.

        • dahauns 9 hours ago
          In the context of this thread that's a non-issue. Good TVs have been in the ~5ms@120Hz/<10ms@60Hz world for some time now. If you're in the market for a 4K-or-higher display, you won't find much better, even among specialized monitors (as those usually won't be able to drive higher Hz with lower lag with full 4k+ resolution anyway).
      • xeonax 15 hours ago
        I have been using a 43 inch TV as a monitor for the last 10 years, currently an LG. You get a lot of screen space, and you can sit away from the desk and still use it. Just increase the zoom.
      • 112233 14 hours ago
        For me it's eye fatigue. When you put a large 4K TV far enough away that it covers the same viewing angle as a 27" desk monitor, you're almost 1.5m away from it.
      • monkpit 15 hours ago
        Usually refresh rate and sometimes feature set. And it’s meant to be viewed from further away. I’m sure someone else could elaborate but that’s the gist.
    • 8cvor6j844qw_d6 14 hours ago
      What would you pick for your next laptop if you had to buy one?

      I had an Asus laptop, but the frequent security firmware updates for one of the Dell laptops that I had make me think Dell might be a good candidate in terms of keeping up with security updates.

      I'm not sure about the current latest models from Asus/Dell/HP/etc., but I liked the fact that disassembly manuals are provided for older Dell and HP machines. I can hardly find disassembly manuals for Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.

      • speedgoose 13 hours ago
        I’m only one data point, but I also swear that I would never buy an Asus laptop again. If you are fine with the operating system, a MacBook Pro is the best in my opinion. It’s not even close.

        Otherwise I had okay Dell or Lenovo laptops. Avoid HP, even the high-end ZBook ones. A Framework might be worth a try if you have a lot of money.

        • sspiff 13 hours ago
          I have used a ZBook G1a for the past few months because it is the only laptop with AMD's Ryzen AI Max+ 395, and while not ThinkPad or XPS/Precision tier, the laptop has been perfectly fine.
          • xarope 12 hours ago
            I've been toying with getting one of these with 128GB of RAM. What's your opinion (especially since you have compared it to thinkpad/xps)?
        • simulator5g 13 hours ago
          You can also run Asahi Linux or Windows for ARM on Macs
          • sspiff 13 hours ago
            I run Asahi Linux as a daily. Support is imperfect and for a daily driver you can probably forget about using anything newer than an M2 at the moment. On my M2, missing features include USB-C video out and microphone support. Windows on ARM is worse and has zero drivers for Mac hardware as far as I know.
      • tombert 5 hours ago
        I am a pretty huge fan of Thinkpads. I bought mine a year ago and love it.
      • mrguyorama 2 hours ago
        My girlfriend's 2 year old Asus Zenbook had easy to find repair manuals and was pretty repairable. Though consumer laptop naming conventions make googling for it error prone.

        The main problem was parts. She had a fan that was defective and noisy, and the Asus parts store didn't have it in stock, and there was one on ebay for $30.

        But the replacement was easy, the construction was solid, and there have been no issues since.

        >Asus when I have to do maintenance such as swapping out thermal paste/pads and clearing out the heatsink fins.

        If you have to do this more than once or twice over a ten-year lifespan of a laptop, you probably should invest in air cleaning systems. Mid-range consumer laptops are way less thermally constrained than they used to be. Ryzen CPUs are essential for that, though I think Intel now has usable cool laptop CPUs.

  • omnibrain 7 hours ago
    Sadly it's just 16:9. Not even 16:10.

    I now run two 3:2 displays (BenQ RD280U) at home (more in the office, but I never go there) and love my vertical real estate. (No, portrait mode won't work out.)

    • Flockster 5 hours ago
      Thank you for this suggestion!

      For laptops, 16:10 (and with the Framework and Surface even 15:10/3:2) is already quite common, while in the desktop market 16:9 and ultrawides are dominant.

      With 16:9, the whitespace on websites, even with tabs on the side, is simply too much ;)

  • qqxufo 6 hours ago
    8K doesn’t create new design problems; it removes the blur that used to hide them. A quick checklist that keeps UIs sane across DPI:

    Design at 1× first and preview at 100% OS scaling on a non-retina display.

    Prefer SVG/CSS for icons/illustrations; avoid shipping @2× PNG backgrounds.

    Use the system font stack; if you need web fonts, cap to two weights, subset with unicode-range, and font-display: swap.

    Keep layer/effect counts low to avoid GPU over-compositing; rasterize heavy shadows.

    Make images responsive (srcset/sizes) and lock aspect ratios to prevent reflow.

    Budget third-party JS; measure on WebPageTest/Lighthouse, not just a dev box.

    Do these and 8K stops being scary; most “looks bad on non-retina” is self-inflicted.

    • alsetmusic 5 hours ago
      > preview at 100% OS scaling on a non-retina display.

      Yeah, I feel like anyone who is serious enough to purchase a display like this ought to have a secondary display connected just for previewing their work on a "cheap" consumer device. It should be always available so that it only takes a few seconds to preview changes in the "worst-case" scenario by dragging a window over. It's why (good) phone devs test their apps on multi-year old devices.

      This is why people working in pro-audio test mixes in lots of environments. Sure, it sounds great in a treated studio. How does it sound in the car? How about on headphones? What about on the built-in speaker on your phone? You want to hear the mix in the widest and most compromising scenarios to truly understand how well it holds up. Very few people have top-of-the-line hardware and most of your audience presumably don't.

  • e40 8 hours ago
    I’ve had a few ProArt monitors and they aren’t very high quality, IME. I had high-pitched whine and blinking off/on issues, on several Mac models, from iMac to Air to Studio. Yes, I tried a variety of cables. The Apple Studio monitor, while insanely priced, has been flawless for me, sitting next to a ProArt.
    • JKCalhoun 8 hours ago
      I've often gone into an expensive display purchase with hesitation, but then never regretted it: years later, when machines have moved in and out of my workspace, the display is still there.
      • e40 6 hours ago
        And something I forgot to mention, the color response of the ProArt is very odd and off. I didn't realize it when I had 2 ProArts, but when it was sitting next to the Studio display, it was obvious.

        I suspect the ProArt can be calibrated, but when I do photo editing, I just use the Studio.

  • cheema33 17 hours ago
    There is a lot of marketing material at the linked page. But there is no mention of price or available sizes. Also, there is no link to purchase one. This is November. I can look these things up, but why link to a PR fluff piece if there's something more substantial available?
  • ChrisMarshallNY 10 hours ago
    Nice monitor, but its target demographic is pretty small, and its price makes Eizo look cheap.

    I’ve done a lot of color-calibrated work, and, for the most part, don’t like working in a calibrated system. I prefer good ol’ sRGB.

    A calibrated system is a “least common denominator” system, where the least capable element dictates what all the others do. So you could have one of these monitors, but, if your printer has a limited gamut, all your images will look like mud, anyway, and printer technology is still pretty stagnant. There was a big burst of improvement in inkjets, but there hasn’t been much progress in a long time. Same with scanners. I have a 12-year-old HP flatbed that is still quite valid.

    A lot of folks get twisted over a 60Hz refresh rate, but that’s not something I worry about. I’m old, and don’t game much. I also watch entertainment on my TV; not my monitor. 60Hz is fine, for me. Lots of room is my priority.

    • Scene_Cast2 9 hours ago
      Where do you go for wide gamut prints? How do commercial printers compare to consumer printers in this regard?

      I'm working on a few wide gamut art pieces, and so far the test prints have been less than stellar. Disclaimer - I'm an amateur in this field.

      • ChrisMarshallNY 9 hours ago
        Inkjets are the best bang for the buck. I had good luck with higher-end Epson printers (with good gloss/matte photo paper). The ink is much better at remaining viable for a long time, and no longer freaks out, whenever the relative humidity goes up.

        With inkjets, though, you need to keep using them. Otherwise, the ink clogs.

        Expensive process printers have wide gamuts. Laser printers basically suck. Xerox used to make decent color laser printers, but they had an odd “waxy” ink. Not sure if they still do it.

        I don’t think anyone does dye-sub printers, anymore. They used to be good.

  • evadne 1 hour ago
    Is it not November already? :p
  • metaphor 16 hours ago
    8K HDR implies that DSC becomes unavoidable...but DSC's "visually lossless" criterion relies on the human eye and is statistically subjective at face value.

    Any domain experts know how that actually squares in practice against automated colorimeter calibration?

    • amarshall 16 hours ago
      DisplayPort 2.1 (which the monitor supports) provides sufficient bandwidth for 7680x4320@60 Hz 10-bit without DSC when using UHBR20. The press release unfortunately doesn’t clarify whether the monitor supports UHBR20 or only the lower UHBR10 or UHBR13.5 speeds. Of course, the GPU must also support that (Nvidia RTX 5000 only at the moment, as I believe AMD RX 9000 is only UHBR13.5).
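
      A rough sanity check of that claim (active pixel data only; blanking adds a few percent on top, and the usable DP80 payload is often quoted slightly lower, around 77.4 Gbit/s, once extra protocol overhead is counted):

          # Does 7680x4320 @ 60 Hz, 10-bit RGB fit in a 4-lane UHBR20 link without DSC?
          link_payload = 4 * 20 * 128 / 132     # ~77.6 Gbit/s after 128b/132b coding
          active = 7680 * 4320 * 60 * 30 / 1e9  # ~59.7 Gbit/s of active pixel data
          print(f"link ~{link_payload:.1f} Gbit/s vs. pixels ~{active:.1f} Gbit/s")
          # Even with a few percent added for blanking (CVT-RB timings) it fits comfortably;
          # at HBR3 (4 x 8.1 Gbit/s x 8/10 ~= 25.9 Gbit/s) it clearly does not.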
      • a-french-anon 13 hours ago
        I believe you're right regarding AMD's lack of UHBR20 on its cards. Fingers crossed for their next gen!
        • jsheard 9 hours ago
          AMDs current workstation cards do support UHBR20, just not their consumer cards, even though it's the same silicon. Artificial segmentation on GPUs is nothing new but segmenting on display bandwidth is a strange move, especially when the market leader isn't doing that.
    • altairprime 16 hours ago
      8K 60fps 4:4:4 8-bit uncompressed requires a 96Gbit HDMI cable, which is labeled Ultra96 in HDMI 2.2 afaik: https://www.hdmi.org/download/savefile?filekey=Marketing/HDM...

      DisplayPort over USB4@4x2/TB5 at 120Gbps would be required for uncompressed 12-bit.

      • metaphor 15 hours ago
        Apologies for not tracking; the monitor in question is spec'd with HDMI 2.1 and TB4 I/O.
        • altairprime 13 hours ago
          Apologies never expected when it comes to USB and HDMI naming and spec stuff. I have to look them up every time.

          But that’s 8K with DSC, or... 24fps maybe, then. A weird oversight/compromise for such a pro color-focused monitor; perhaps Asus reused their legacy monitor platform. “8K HDR” at 24fps could be a niche for theater movie mastering, perhaps?

  • cmgriffing 17 hours ago
    I shudder to think how small the macOS ui text will be on this but I’m willing to find out.
    • Kerrick 15 hours ago
      For macOS, 8K should have a larger screen. This 8K monitor is 32 inches, which leaves us with a very awkward 275ppi. 42" would be 209ppi, which is great for 16.5" from your face. 48" would be 183ppi, which is great for 18.8" from your face (my preference). But at 32" and 275dpi, that would be a 12.5" viewing distance, which is far too close for a 32" monitor. You'd be constantly moving your neck to see much of the screen--or wasting visual acuity by having it further.
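
      Those distance figures come from simple geometry: the distance at which one pixel subtends about one arcminute (roughly the limit of 20/20 vision) is 1 / (ppi * tan(1/60 degree)), i.e. about 3438 / ppi inches. A small sketch:

          from math import hypot, radians, tan

          def ppi(width, height, diagonal_inches):
              return hypot(width, height) / diagonal_inches

          def arcminute_distance_inches(pixels_per_inch):
              # Distance at which one pixel spans ~1 arcminute of visual angle.
              return 1 / (pixels_per_inch * tan(radians(1 / 60)))

          for size in (32, 42, 48):
              density = ppi(7680, 4320, size)
              print(f'{size}" 8K: {density:.0f} ppi -> ~{arcminute_distance_inches(density):.1f}" viewing distance')
          # Roughly 275 ppi / 12.5", 210 ppi / 16.4" and 184 ppi / 18.7", matching the numbers above.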

      macOS is optimized for PPIs at the sweet spot in which Asus's 5K 27" (PA27JCV) and 6K 32" (PA32QCV) monitors sit. Asus seemed to be one of the few manufacturers that understand a 27" monitor should be 5K (217ppi), not 4K (163ppi). 4K will show you pixels at most common distances. But if you follow that same 217ppi up to 8K, that leads to 40.5" not 32".

      My wife has a triple vertical PA27JCV setup and it's amazing. I've been able to borrow it for short stints, and it's nearly everything I've ever wanted from a productivity monitor setup.

      • numpy-thagoras 15 hours ago
        Yeah I currently daily drive a 43" monitor and it has been a life changer since I got it in 2022.

        I'm still happy with it, would kill for an 8K 43" 120hz monitor but that's still a ways away.

      • zakki 14 hours ago
        What is the right size for a 4K monitor, and the distance from our eyes? I already have a Skyworth monitor at 27". If I set the macOS resolution at 4K, the default font is too small. My distance from the monitor is around 16.5".
        • Kerrick 7 hours ago
          My personal preference is 24" nominal for 4K (a 23.8" 4K display has 185ppi), running at @2x (1080p equivalent), because I keep my monitors a bit further back on my desk than others do. I run a triple Dell P2415Q setup. It was great when I started, but as more websites assume 1080 CSS pixels of width means I want tablet mode, its utility has decayed. Because of that I've gone from 3 vertical to 1 horizontal flanked by 2 vertical. These aren't made anymore, and for years nobody else was making monitors at this density. It seems they've become popular again since COVID, and ViewSonic even released the VP2488-4K this year, complete with Thunderbolt 4 and DCI-P3. Asus's offering at this size & resolution is the PA24US.

          Another great option is 22" nominal (a 21.5" 4K display has 204ppi), also running at @2x. This is better if you keep the monitor closer to the keyboard side of your desk, or if your desk is not very deep. These are hard to find, and even when you do they're likely to be a portable monitor. Asus has a page for a PQ22UC but I can't tell if it is no longer available, or no longer sold in the US.

    • pugz 15 hours ago
      I recently (a couple of weeks ago) got the 6K version of this screen, the Asus PA32QCV. It has the same pixel density as my MacBook Pro, so the UI looks great. To be honest, it's enough screen real estate that I now operate with my laptop in clam shell mode.

      My only complaint is that the KVM leaves a bit to be desired. One input can be Thunderbolt, but the other has to be HDMI/DisplayPort. That means I need to use a USB-C cable for real KVM when switching between my two laptops. I'd like two cables, but four cables isn't the end of the world.

    • Cyphus 14 hours ago
      You can scale the UI according to your preferences, but the real problem is that if your monitor’s ppi is not close to the macOS sweet spot of 220ppi (or an integer multiple thereof) you’re going to have aliasing issues with text and other high contrast elements.

      https://griffindavidson.com/blog/mac-displays.html has a good rundown.

    • SamuelAdams 17 hours ago
      You can run it natively, but it is better to downscale to 4k or 1080p. I run three 5k versions of this monitor and they are all downscaled to 1440p. I get 1:1 pixel mapping so text looks crisp in every app except Microsoft Teams.
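
      A likely reason the 5K panels stay crisp at a 1440p-like setting: macOS renders HiDPI modes into a 2x backing store and then resamples that to the panel, so crispness depends on how clean that resample is. A rough sketch of the ratios involved:

          # macOS renders a "looks like" resolution at 2x into a backing store,
          # then resamples the backing store to the physical panel.
          def resample_ratio(panel, looks_like):
              backing = (looks_like[0] * 2, looks_like[1] * 2)
              return panel[0] / backing[0]

          looks_like_1440p = (2560, 1440)
          print("5K panel:", resample_ratio((5120, 2880), looks_like_1440p))  # 1.0  -> 1:1, pixel-perfect
          print("4K panel:", resample_ratio((3840, 2160), looks_like_1440p))  # 0.75 -> fractional resample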
      • Tepix 14 hours ago
        Isn‘t downscaling the wrong term? You‘re still taking advantage of its native resolution.
    • BoorishBears 17 hours ago
      It'll look normal, maybe even a little big by default if the XDR is anything to go by

      macOS does great at scaling UIs for high resolutions.

  • hoangtrannn 9 hours ago
    Is there a good 5K monitor at 27" that does not burn the wallet? It's worth mentioning that it should also be very reliable, because these monitors seem to have issues after a while, especially burn-in.
    • ebbi 1 hour ago
      If you're not too worried about warranty etc, you could buy an old 5k iMac and convert that to work as an external display. I've converted a dozen now - highly recommend! If you aim for a 2017+ year model, you'll have a pretty reliable display. The 2015 models tend to have issues with red banding on the edges which gets super annoying when using a text editor with white UI.
    • def- 9 hours ago
      ASUS ProArt PA27JCV
  • BloodOrb2 6 hours ago
    ASUS warranties are useless now. Don't buy. See Gamers Nexus vids for more details.
  • jeffbee 3 hours ago
    I once bought an Asus ProArt Display and I boxed it up and sent it back inside of 20 minutes. They have fans, a fact that is not mentioned anywhere in their sales materials.
  • bob1029 12 hours ago
    I tried a 32" 4k for a while but the form factor never worked for me. 8k seems absurd after working with that monitor.

    27" 1440p is much easier to drive and live with day to day. I can still edit 4k+ content on this display. It's not like I'm missing critical detail going from 4k=>qhd. I can spot check areas by zooming in. There's a lot of arguments for not having to run 4k/8k displays all day every day. The power savings can be substantial. I am still gaming on a 5700xt because I don't need to push that many pixels. As long as I stay away from 4K I can probably use this GPU for another 5 years.

    • zokier 12 hours ago
      32" 4K is pretty much the worst-of-all-worlds configuration. It is just dense enough that traditional 100% scale is not great, but not dense enough to get that super smooth hidpi effect either. I'd argue that for desktop monitors around 200 ppi is the sweet spot, so 5K for 27" or 6K for 32".

      This 8K is a bit overkill, but I suppose it makes some sense to use a standard resolution instead of some random number.

      • jon-wood 11 hours ago
        These things aren't for use in an office setting where you're fiddling with a web browser, Excel, or writing software. They're for situations where colour calibration matters, so either designing for print, or working on video.

        Particularly for the people doing video an 8k display is great - that means you can have full resolution 4k video on screen with space around it for a user interface, or you can have a display with the 8k source material on it if the film was shot at that resolution.

      • stephenr 11 hours ago
        Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90º) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.
    • bartvk 11 hours ago
      There are two cases where 32" is helpful. First, Xcode and Android Studio, where you write UI code with the phone/tablet preview on the right, in both horizontal and vertical orientation.

      And second, writing and research: recently I had to get a certificate for which I had to write a portfolio of old-fashioned essays. 32" or even 40" is extremely helpful for this. Basically I kept my screen organized in three columns, with the word processor on the left and two PDFs in the middle and on the right.

    • ThatMedicIsASpy 11 hours ago
      42" 4k 100%

      I don't want to ever go back but I got this 2020 Dell for 200. I don't want to pay 800-1400 if I ever have to replace it

    • baq 11 hours ago
      I HATE (yes, all caps) Apple for very actively discouraging 1440p as a useful resolution (as in, it is literally, not figuratively, painful to use out of the box). I'm a happy customer of BetterDisplay just to make it bearable, but it's still not as sharp as any other OS.
  • polaris421 15 hours ago
    This looks amazing for creators — 8K, HDR, and auto calibration in one screen!
  • veridianCrest 15 hours ago
    The specs look impressive, especially the 8K HDR and built-in color calibration. It’ll be interesting to see how it performs compared to Apple’s Pro Display XDR in real workflows.
  • cubefox 2 hours ago
    I wonder why local dimming zones remain so limited in LCDs. This one has 4032, which corresponds to a backlight LED resolution of only 84×48. That's about 90×90 (over 8000) colored LCD pixels per white backlight LED (quick check below).

    This is far too coarse to accurately resolve fine differences in HDR brightness, e.g. from lamps, car lights, street lights, specular highlights etc.

    Perhaps the backlight LEDs are still relatively costly? Or customers just don't care enough to justify putting in significantly more? Which is unfortunate, since although OLED monitors have perfect fine HDR contrast, the overall achievable screen brightness is quite low compared to LED LCDs.

    Which makes both technologies suboptimal for HDR content, for different reasons.
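
    A quick check of the zone arithmetic above (assuming the 4032 zones form a roughly 16:9 grid):

        # 4032 local dimming zones arranged as an 84x48 grid (84 * 48 = 4032).
        zones_x, zones_y = 84, 48
        px_x, px_y = 7680 / zones_x, 4320 / zones_y
        print(f"{px_x:.0f} x {px_y:.0f} pixels per zone, ~{px_x * px_y:.0f} pixels sharing one zone")
        # Roughly 91 x 90, i.e. over 8000 LCD pixels behind each backlight zone.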

  • guerrilla 16 hours ago
    Why does it have blinders?
    • andrewstuart2 16 hours ago
      To prevent glare and reflections usually. Similar to how a lens hood functions.
      • jerjerjer 4 hours ago
        Seems to be a strange setup for a marketing photo.
  • inatreecrown2 11 hours ago
    I long for an Eizo-like quality monitor, 15 or 17 inch, with a “retina” PPI count.

    Am I the only one who thinks this would make sense?

    • jeroenhd 8 hours ago
      There are a few displays like that, although quality will differ on your criteria of course. Many of them are marketed as "portable monitors", some specced for gaming, others for artists, some built to be cheap.

      ASUS ProArt PA169CDV, UPerfect 184T01, Lipa AX-60 (and AX-60T), UPerfect UFilm A17, UPerfect UGame J5, and two portable screens by Verbatim, just to name a few.

    • precompute 10 hours ago
      Hey, I think that's a great idea, too. 4K panels on phones (tiny!) exist for some absurd reason. But somehow there are no 22" 4K monitors. I think they probably don't sell well. Probably the same reason why all monitors are 16:9.
      • jeroenhd 8 hours ago
        At one point there was the ASUS ProArt PQ22UC, but I don't think that panel was produced after that stopped selling.

        If you go slightly bigger, there are the ASUS ProArt PA24US, Japannext JN-IPS2380UHDR-C65W-HSP, ViewSonic VP2488-4K, AG Neovo EM2451, and the UPerfect UColor T3.

  • jbellis 17 hours ago
    About twice the price of the Dell 8k.
  • jiggawatts 16 hours ago
    This is a direct competitor to the Apple Pro Display XDR.

    I wouldn’t be surprised if it comes in at a similar price point.

    The sustained 1,000 nit HDR and Dolby Vision support suggest their target market is very specifically film color grading.

    • rainbaby 15 hours ago
      It’s already on sale in the Chinese market for about 70,000 CNY, so the price is likely around 9,000–10,000 USD.
  • byyoung3 16 hours ago
    How much
  • efficax 16 hours ago
    Realistically, what’s the point of all those pixels at 32 inches? 5K at 27 inches seems more than enough.
    • jeswin 15 hours ago
      If you need 5K at 27 inches, you need more at 32". But if you're saying that 32" is excessive, I think it's a personal preference. I would never go back to a smaller monitor (from 32) personally - especially as you grow older.
    • metaphor 16 hours ago
      Apparently, ASUS believes there's an addressable market willing to pay a premium for +26.5% color-calibrated ppi in a larger form factor.
      • efficax 5 hours ago
        I'm not suggesting there's no market, I'm just trying to understand why it exists.