Any general 4k monitor advice?

I'm currently using a 24 inch 1920x1200 monitor which is about 8 years old and, I suspect, beginning to fail (uneven lighting and an annoying audible buzzing), so I'm thinking about replacements. My first thought was a 27" 1440 so that I'd have a little more screen area, and I've got my eye on a few with IPS panels and G-sync, the latter to hopefully make up a little for my 960's relative inadequacy, though I can't really justify the cost compared to a more basic model, even if I can technically afford it.

For just about the same price, it seems that there are now some 27-28" 4k monitors with similar specs and features, albeit with lower refresh rates (and while my GPU's obviously never going to hit the higher frame rates in games, I guess I might see a difference on the desktop), so I'm wondering about the long-term benefits of a little future-proofing. However, I'm also wondering about the practicalities of 4k resolution on a 27" monitor, and I don't think I'd really get along with a 32" for desktop work. I know that there are a few people on here who work/play on 4k, so I thought I'd gather some impressions.

My first thought: are Windows 10 and third-party software generally able to cope with high-DPI screens yet? I know Windows has scaling options, but do they work in practice?

My second thought: is 27-28" too small for 4k? My work laptop had much higher DPI than a typical 1080 monitor and with my eyesight I found it uncomfortable to work with: if scaling isn't that good in practice, then I think I might have similar problems with a small 4k, and I think a 32" would be too big for comfort at my typical desktop range of a couple of feet.

Any advice relating to the above, or about any other notable experiences with 1440/4k monitors, would be appreciated. Thanks in advance.

I have two Dell 27" 4K monitors and they're fine for me. I also have a Dell 32" 5K but unless they've come down in price a lot I'd say 4K is the sweet spot.

Most things in Windows 10 work well enough although mixed DPI still has a lot of issues.

It depends what you want 4k for.

If you're tired of seeing giant pixels on your computer, while your phone has had much finer details for years, then it's not too small.

OTOH, if you want to run things without any DPI scaling and want more space, not more detail, then it is too small.

I run two 24" 4k screens at 200% scaling (so it is like two 24" 1080p screens but with 4x the pixels/detail), and that works great.

As Jon says, DPI scaling in Windows and most apps is pretty good now. There are some annoying bugs in the OS still (like window borders having weird gaps in them if you look closely, and the odd icon in some UIs being tiny) but they are mostly minor and cosmetic.

Also as Jon says, mixed DPI in Windows is still something to be avoided. The OS still has a ton of bugs if you have two monitors set to different scaling factors. Avoid that like the plague.

I went with 24" 4K as I wanted an integer scaling factor. 200% scaling means things designed for standard pixel sizes can be "pixel doubled" (really pixel quadrupled) so there is no blurred scaling on them. They look exactly as they would have looked on a 1080p screen, other than the minor issue of ClearType antialiasing looking a bit weird (or more weird).

If I had a 27" 4K screen I'd probably run that at 200% as well, and just spread things out over a larger space. But once you get above 27" or 28" or so, 200% scaling is probably too much (assuming the screen is at a normal desk/chair distance) and you'd maybe use 150% or 175%, at which point apps designed for 100% are going to get blurry, which is horrible.

If you do any gaming, and want 60fps rather than 30fps, you can forget 4k native. But you can run games at 1080p or 1440p and they scale well to the 4k screen. The issues with things getting blurry don't seem to matter when scaling 1440p to 4k for a game, although a native resolution is always a bit nicer (and allows you to run things windowed, too). The next-gen cards might finally do 4k gaming at a good framerate. Right now the 1080ti / Titans barely do it and can't maintain 60fps in many new titles.

Remember that HDR is coming out and refresh rates keep going up. A new VESA standard for FreeSync is coming out (although who knows if NVidia will support it, or continue to make people pay extra for GSync). There are a couple of 4K + HDR + G/FreeSync monitors on the market now but they are very expensive. Whatever you buy, I'd say don't spend too much as you might want to replace it in a couple of years.

Personally, I have a pair of pretty cheap ViewSonic 24" VX2475. They cost about as much as a good 1080p screen but are 4K, and they're similar to IPS (but using an alternative; I forget the name).

Going back to my first paragraph: The main question to ask is what you want 4K for. And I have to say that when I went back to 1080p for a month recently, I did not really miss 4K. The main difference was that text was chunkier when reading the web, and the borders of things were thicker than I'd become used to, but otherwise things were about the same, and everything runs a bit faster at 1080p than 4K. I still like 4K but I'm also still not sure if it's worth it yet. Pretty soon there won't be any real cost to it, so it will be worth it by default, aside from a few annoying cosmetic bugs that Microsoft don't seem to notice or want to fix because their QA isn't what it used to be.

One caveat: If you use VMware a lot, DPI scaling is a problem. VMware's DPI support is still terrible and wrongheaded. You can get by if you only use it briefly (like we do to test code on different OSes), but not if you use it all day. There will be problems with other particular apps that I/we don't use and don't know about, too, which might be a dealbreaker for you if you do use them. Also, if you use Photoshop, you will need to move to their rental versions, as the last retail versions of Photoshop are straight-up broken under DPI scaling, unfortunately.

Other than avoiding imminent failure, the main thing I want is more screen area. I do a lot of work using two windows docked side by side and would like the extra space to avoid some horizontal squishing that I currently put up with. I don't want to lose any pixels vertically and would ideally gain a few, hence looking at 1440 where I'd be able to get a sheet of A4 fully in view without toolbars getting in the way like they do at the moment. I don't do any video work or Photoshop, and what little graphical stuff I do tends to be in a really old version of Paint Shop Pro (7, which I think dates back to 2000...). I do use VMware, but pretty infrequently right now.

I do game a bit, but I tend not to get around to games when they're new, so peak performance isn't as much of an issue as it could be: I expect that I'll choose to downscale to 1080 with higher quality settings rather than run native with low quality, but I'd like GSync so I have a few more options (and I wish there wasn't a premium for GSync over Freesync, or that everyone was using a common standard already). Most of the games I play these days are old ones I've had in my library for years and never got around to because I was spending most of my time in WoW/SWTOR/EVE/whatever (example: just finished Dragon Age Origins, currently playing DA II, got the first two Witchers on my list for the future).

I'm currently using an old 17" 1280x1024 monitor alongside the widescreen. The two have almost the same pixel size (0.26mm vs 0.27) and even that's enough to annoy me when I split a window across two screens, but my feelings about multiple monitors and aspect ratios would take another long post to get across (short version: it's so dependent on context that I'll never be happy until I can add/remove monitors and/or additional computers within seconds to suit my whims). If I got a 1440, I'd probably keep the old 4:3 to hand: with a 4k, I'd chuck it and probably revert to sticking my laptop next to me when I want to have a movie running alongside.

I want a little future-proofing because I'd ideally hold onto the new monitor for about ten years. I don't see the point in waiting for a "perfect" combination of features, because the target would just move, so I'm not holding out for HDR/superfast refresh rates/whatever and I doubt I'll feel the need to replace a functional monitor just to get them. I would like higher DPI so that text is crisper, ideally without ClearType if that was practical (my eyes make things fuzzy enough already, thanks) - Leo: what do you mean by "more weird" on 200% scaling?

ClearType works the same if the app supports high DPI, except it works better as the pixels and sub-pixels are smaller. That part is fine.

But if an app doesn't support high DPI, and renders at standard DPI with the OS then scaling up what it renders, the ClearType effect causes some minor issues. If ClearType aimed to put a red line down the left edge of an 'l' character, to target one column of sub-pixels on the left, but the result is then pixel-doubled to 200% scaling, that red line is going to hit two columns of pixels instead of one. Things get even worse if it's non-integer scaling and the ClearType colors are blurred.

But it doesn't look that bad; it is just something to keep in mind. You might logically think that a 4K screen could display everything a 1080p screen could at exactly the same quality, but it can't. (The other issue is, you can't always control how things are scaled. Desktop apps are fine, the OS will double pixels if it's integer scaling. But anything full-screen or rendered via 3D or video APIs is less under your control. It may be scaled in a way that makes things blurry instead. It's generally fine, but if you think you might still do a lot of work at standard DPI then I'd still recommend a standard DPI screen).
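
If it helps to see why integer scaling survives and non-integer scaling doesn't, here's a toy sketch (my own illustration, not how the OS scaler is actually implemented):

```python
# One row of pixels containing a crisp 1-pixel-wide white line, scaled
# up by nearest-neighbour (what pixel-doubling amounts to) and by
# bilinear interpolation (roughly what non-integer scaling does).

def nearest(row, factor):
    n = int(len(row) * factor)
    return [row[int(i / factor)] for i in range(n)]

def bilinear(row, factor):
    n = int(len(row) * factor)
    out = []
    for i in range(n):
        x = i / factor
        x0 = int(x)
        x1 = min(x0 + 1, len(row) - 1)
        t = x - x0
        out.append(round(row[x0] * (1 - t) + row[x1] * t))
    return out

row = [0, 0, 255, 0, 0]     # hard-edged white line on black
print(nearest(row, 2))      # [0, 0, 0, 0, 255, 255, 0, 0, 0, 0] - still crisp
print(bilinear(row, 1.75))  # [0, 0, 36, 182, 182, 36, 0, 0] - smeared into greys
```

The same doubling is what widens ClearType's one-sub-pixel colour fringes into two columns, which is the "more weird" part.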

High DPI obviously won't increase the amount of usable space either, unless you want to make things really small. You do get a little bit more space, as not everything is scaled (or at least not scaled linearly): window and control borders are a bit thinner, for example, leaving a bit more space for content.

Would this be an example of "weirdness"? http://imgur.com/frKQ4vz

The text in the System Information window on the left seems to be better than that in the Generic PnP monitor's properties window on the right (though the title bar seems clearer to me). Zoomed in, it looks like a lot more antialiasing than normal, like on the letter "I", where the two-pixel-wide vertical bar seems to be smeared across five or six.

No, that's worse than what I was describing.

It looks like the Generic PnP Monitor window on the right has been scaled by a non-integer amount, resulting in everything going blurry.

As a guess, it looks like scaling is set to 175% rather than 200%.

Anything other than 100%, 200%, 300%, etc. will make non-DPI-aware apps look terrible.

It's also possible it looks like that because the scaling mode has been changed but you haven't rebooted yet. A lot of things will look wrong or blurry until you reboot after changing scaling size, including large parts of the OS itself. You'll also see the same bugs that you see when mixing monitors with different DPIs until the reboot, as it puts Windows into the same mode (System DPI is different to one or more monitor's DPI, which is when the major Windows bugs start happening).

(I've noticed you have to reboot twice to update a few aspects of things after a DPI scaling change, which is odd. The difference between the first and second reboot is pretty small, but a couple of things in different programs are the wrong size until that second reboot. DPI scaling in Windows isn't completely mature yet, but as long as you use integer scaling, don't change it often, and don't mix DPIs, it's good enough to use.)
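
If you want to see what Windows currently thinks the system DPI and monitor scale factor are, you can query it from a script. A minimal Windows-10-only sketch via ctypes; these are real Win32 calls, but treat it as illustrative rather than production code:

```python
# Windows-only: query DPI state via Win32 calls through ctypes.
import ctypes

user32 = ctypes.windll.user32
shcore = ctypes.windll.shcore

# Opt this process in to per-monitor DPI awareness
# (2 = PROCESS_PER_MONITOR_DPI_AWARE). Processes that don't do this are
# the ones Windows bitmap-stretches, hence the blur.
shcore.SetProcessDpiAwareness(2)

# System DPI: 96 = 100%, 144 = 150%, 192 = 200%. Windows 10 1607+.
print('System DPI:', user32.GetDpiForSystem())

# Scale factor of the primary display as a percentage (100, 140, 180, ...).
print('Primary monitor scale:', shcore.GetScaleFactorForDevice(0))
```

When the system DPI and a monitor's scale factor disagree, you're in the mixed-DPI mode where the bugs live.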

In addition, the Management Console (MMC) is no longer DPI-aware in the Creators Update (it was before; now it's blurry here at 125%). On the other hand, using different DPIs on two monitors seems to be fixed.

BTW, if I reboot the PC (staying at the login screen), connect via RDP and open DO, then close it and log in on the local PC, DO's scaling is too large.

Might depend on which snap-in is loaded into it? It still DPI-scales here for the ones I've tried.

I still see several old problems with mixed DPI that haven't been fixed, from a quick check. Docked toolbars are a quick and easy way to tell things are still broken. The OS will mis-place or mis-size the taskbar and/or the toolbar if they are on the same edge of different monitors.

Haven't seen that with Opus but have seen it with other programs. Chrome's titlebars and scrollbars can be huge if the session was started via RDP with different scaling in effect.

Saying that, I've been somewhat surprised by how well RDP works from a 1080p/100% PC to a 4k/200% PC. There are some glitches, and I do sometimes reboot once I get back to the local machine to fix a few minor things, but it works a lot better than it did in the old days where everything was huge or tiny through RDP.


I keep hoping MS will fix the window borders that have random gaps in different places (e.g. the edges touching inactive titlebars) at high DPI.

I'm still torn between the two choices, 1440 or 4k. Both have about the same number of pros and cons for me, so I'll probably end up either flipping a coin over the expensive ones or buying some super-cheap and probably crappy 4k and hating it for several years before I can justify buying a better one (yes, I'm a rage-fuelled ball of spite).

As for Microsoft fixing something that minor, I wouldn't hold my breath: it took them at least 18 months after the release of 10 to fix the error with File History that was causing it to create 10-20 copies of files that never changed and were in folders I'd told it to ignore. Might not have been a problem but some of them were .flac files, so 25MB turned into 250MB, repeated for 3500 tracks, and suddenly my backup drive was full. Again.

@Leo: Yep, it's the lister's toolbar, which is wrong then.

@njmorf: It was hard to decide between a 32" 4K and a 34" 3K curved, but I went for the curved 21:9 (3440x1440 pixel) monitor. If you play games, edit photos/videos or make music, the extra width is nice to have (e.g. toolbars on the left/right and the full workspace in the middle). DO is fine too: with dual display and the preview/metadata panes open, everything has enough width.

But the monitor size should be 34" or larger (that's when it matches the height of a 27/28" monitor); anything smaller is too short. Same with 4K, which IMO only makes sense at 27" and larger, especially when working with it day after day.

I disagree there, although I see a lot of people say the same thing.

IMO, you want to use 100% scaling or 200% scaling. (Or 300%, etc.) Anything else and non-DPI-aware applications will be blurry because they can't just have their pixels doubled. (Even in DPI-aware applications, they may not have assets for all the in-between sizes, so icons etc. will have to be scaled more often and not be as crisp as they are at 100% and 200%.)

You could run a huge, normal DPI, 4k screen at 100% scaling, or a normal sized, high DPI 4k screen at 200% scaling. With 200% scaling, 4k works well at 24", as it's the same as having a standard 1080p monitor, just with more detail and without the old, enormous pixels. (It's also fine at 27". I'm not saying 27" is too large, but that 24" definitely is not too small, unless you already think 24" is too small for a 1080p screen, but I don't see many people saying that.)

1080p is too low a resolution for a 24" screen. The pixels are huge and you can see them when reading normal sized text. 4k on a 24" screen is a big improvement as you can no longer see the pixels. Maybe 4k is overkill, but it's 200% scaling or 100% if you want things to look good; there's no in-between, so it's overkill or nothing. (1440p at 200% scaling would mean a huge loss of effective space, so that's not an option.)

Of course, the downside is all those extra pixels take 4x as long to process, so things can run a bit slower, even on the desktop. Whether the extra detail is worth it depends what you're doing. I'm still not sure myself, as much as I was tired of seeing huge pixels in my face when I went from my phone to my much more powerful PC.

If you really work at high resolution with higher DPI, and don't have "eagle eyes", why would you use less than 27" on the desktop? A pixel is a pixel; when working with graphics, where's the advantage of a smaller monitor? Not to mention that there are still plenty of bugs when using DPI higher than 100%, and not only in Windows (see the new Adobe suite).

On mobile devices a higher resolution also consumes more power (GPU and CPU) and needs more RAM, so it only makes sense when you really work with the device, or use it as a reader, or always have power available. I had to laugh when the first 4k smartphone was presented a few months ago ("yeah, look at my 4K phone with its 4GHz CPU and my golden battery packs, so that it has enough power for approx. 8 hours" :-) ).

The proportions have to be right; it's not just about what's technically possible.

I'm still torn. With a 1440 I'd probably still have to downscale some games to 1080 to get decent frame rates, even with reduced quality settings, and the interpolation would probably annoy me. On a 4K there'd be no reason to run at anything other than 1080 or 4k, so scaled 1080 would be fine. The loss of the 120 vertical pixels compared to my current 1920*1200 monitor wouldn't bother most games, and I'm sure I'd even be able to work around it in my current WoW user interface (so many addons fighting for space...). That said, eventually there'll be a single graphics card available in my preferred price bracket that'll be able to handle native 4k in relatively recent games, at which point I'll either kick myself for only getting a 1440 or pat myself on the back for waiting long enough to get a 4k with more of the features I want on it.

There's no way my eyes will cope with a 4k desktop at native resolution without scaling; I wouldn't be able to read 9/10 point text comfortably, if at all. That said, a 27" 1440 monitor's pixels would be noticeably smaller than those on my current monitor, meaning I'd possibly have to scale up text and icons anyway, leaving me with the choice between non-integer scaled fuzziness or integer scaled far-too-largeness.

One thing I'm pretty much settled on (I think) is that I want a 16:9 ratio. I'm not convinced that I'd like a monitor to take up that much space on my desk, or that I really want that much extra width for either Windows or games. Some games, yes: extra peripheral activity would be more immersive, but I think I'd generally be happy enough without it. Given that ratio, a 24" 16:9 4k with 200% scaling would leave me with less physical screen area than my current monitor, even if the image was smoother. That wouldn't fit my main requirement for the upgrade, increased physical screen size.

My ideal monitor would be one that morphs to whatever size I currently want, or would be made up of myriad small monitors that could butt up against each other without visible seams when I want a bigger screen and hide out of the way when I don't. I saw a concept version of something like that a couple of years ago: two screens that automatically pushed together along the long edges to create something close to 16:9, and against the short edges to create a 21:9. That'd be a good start, I think.

@njmorf: Why would you have to downscale? With a GTX 1070 or better you can run full resolution at mostly max settings (at decent frame rates you don't need to drop the resolution, just turn some details from "ultra" down to "very high" - mostly you don't see a difference!). I run all current games at 3440x1440 without problems; of course you need a current upper-mid-range GPU with 6+ GB VRAM and a decent CPU.

24" and 27" are pretty similar sizes. Both are good, and which is better depends on budget, desk space, and what people actually sell with the features you want. The difference between the two is minor. I wouldn't mind having a pair of 27" screens, but I'm happy with the pair of 24" ones I currently use, which were a lot cheaper. (Nobody sells 24" 4k screens with GSync so that may be what pushes me to move to 27" one day. It won't be because I feel like the old screens are too small, though.)

A pixel is not just a pixel. You would not use a 320x200 monitor if you had the option of 1080p or 4k at the same size, even if that size was a postage stamp.

Even at 24", 1080p has gigantic pixels by modern standards. I can see them very clearly, and fonts are made out of little lego bricks with color fringes from ClearType. On the other hand, with a 24" 4k screen everything is smooth and I cannot see the pixels outside of certain situations where you'd see them no matter the resolution (e.g. dot crawl on thin, non-orthogonal lines drawn without anti-aliasing).

I've got a 960 with 2GB. Given its performance at 1920x1200, I have no expectation that it'll perform all that well at 1440, so it'll be the usual tradeoff of resolution vs quality, and I'll probably favour quality. I will, eventually, upgrade the card, but probably not all that soon: my last card was a 560, and my PC upgrade a year ago replaced an eight year old mobo and a first gen i7. By the time I get around to replacing the GPU, we'll likely be on the 1100 series cards.

The problem is 2GB VRAM... 4GB is the minimum these days, even at 1080p; with 8GB you're actually on the safe side (of course not talking about Solitaire :-)).

Yeah, I realised that I should have gone for the 4 gig version almost immediately, and shortly after that I wished I'd gone for a 970, but it was too late by then.

I've used a GTX960 to run two 4k monitors, in both the 2GB and 4GB versions. Either will work fine for two 4k monitors doing desktop work.

For 3D/gaming, you won't want to run many things at 4k on anything below a GTX1080ti, so you'd be switching to a lower resolution anyway. (Some older games run OK in 4k; Xbox 360 era stuff.)

The extra VRAM in the 4GB model can help if you're doing something in 3D at a lower res on one monitor and have the other monitor still at 4k displaying the desktop. Especially now that things are being designed for a lot more VRAM and every extra helps.

But for desktop work, 2GB is completely fine. (Although Chrome is eating more and more VRAM these days. :-) At least according to Process Monitor. Not sure how that translates into real/physical VRAM usage.)

I avoided the GTX970 as it can't do 4k/60hz over HDMI and I wanted to keep that option open. (I'd had issues with DisplayPort on older Dell monitors, although it turns out to work fine now so maybe it's no longer an issue, or only with certain models/makes.) Only the 960 and 980ti of the previous generation can do 4k/60hz over HDMI, and the 980ti was too much, so the 960 it was. Great card for the money, too.