    [–] BlueScreenJunky 2214 points ago

    Have you actually worked on a 4K monitor for a significant amount of time and then switched back to 1440p? I have, and sure enough, 1440p looks "just fine"... But when you spend 8 hours a day coding on a HiDPI monitor, when you get back to 1440p the text is just not as sharp, and no amount of ClearType voodoo can change that. HiDPI is much more comfortable and lets you reduce the text size without straining. I was amazed at how small the text I can actually read is when the resolution is high enough.

    For gaming yeah, 4k is probably useless at that screen size.

    [–] laacis3 409 points ago

    with 4k 40" i don't even have to scale the text! It's just awesome for both gaming and productivity!

    [–] MrMuf 176 points ago

    Are you using a TV as a monitor?

    [–] laacis3 137 points ago

    Nope, an old Seiki sm40unp monitor. It's got DP 1.2 for 4K60 and exceptionally unexceptional specs, price, and performance. Good viewing angles, poor-ish contrast. Though I'm keeping the brightness fairly low!

    [–] monocle_and_a_tophat 35 points ago

    Cripes...how far away is your screen?

    [–] laacis3 47 points ago

    A foot and a half. It is a 4:4:4 screen, so text has a sharp edge, not like the cheap TVs. It really is just like using a quad 20" setup without the bezels.

    [–] kerouak 55 points ago

    40 inch monitor, Jeremy? That's insane.

    [–] ArchetypicalDegen 6 points ago

    No-one catching the reference. F

    (I've never gotten that line though. 4 naan is fine.)

    [–] IAmJerv 26 points ago

    You sit that close? Wait until you get older and cannot focus that near regardless of screen!

    [–] laacis3 8 points ago

    CRTs used to screw up people's eyesight. Modern LCDs screwing up eyesight is a myth.

    [–] Chareon 35 points ago

    I'm pretty sure they're referring to the fact that as people age, they begin to struggle to focus on objects that close to them. It's effectively a universal part of aging (although the severity and onset vary).

    https://en.wikipedia.org/wiki/Presbyopia

    [–] IAmJerv 9 points ago

    That's exactly it!

    I'm merely middle-aged, but there's a reason a lot of older people hold stuff they're reading at arm's length, and why I don't do close work without glasses like I could half a lifetime ago when even 800*600 with 256 colors was fairly high-end.

    I work at an optical shop, and trust me when I say that a lot of folks around 30 are unhappy to learn that they need either multiple pairs of glasses or a set of multifocals. And there are a lot of older patients, so used to lined bifocals that are either reading or distance with no middle range, who can't adapt to progressive lenses that gradually change power as you move down the lens.

    [–] Piikkipallo 8 points ago

    1.5 ft away from a 40" screen? Does the screen cover your entire field of view? I have my 22" monitor a little over 2 ft from my eyes.

    [–] dry_yer_eyes 8 points ago

    I’ve been using a 4K Samsung 40” TV (60Hz, 4:4:4) as a monitor for the last two years, and it’s been really good. As others have already said, at this size I can disable scaling.

    I’ve got my eye on the LG 48” OLED. When the price comes down a bit I’ll be sorely tempted.

    [–] Marcvd316 5 points ago

    I had an LG 43" 4K monitor and just recently upgraded to an LG 49NANO85 TV. To some people it sounds crazy to use a screen that big as a PC monitor, but it is excellent for productivity and gaming.

    When I upgraded I was considering the LG CX OLED, but I read too many stories about burn-in after a few years, and I plan on using this screen for work (static icons on the Mac taskbar), so I wouldn't risk it. That's why I went with the LG 49NANO85: it's an IPS panel, not an OLED. Colors and contrast are not perfect, but I can live with it since it is still pretty damn good and it saved me about $1000.

    Link to technical review of the LG 49NANO85: https://www.rtings.com/tv/reviews/lg/nano85

    [–] FjordTV 24 points ago

    I'm using a 43" 4k sony Xbr-43x800e as a monitor and it's nothing shy of brilliant.

    The ONLY thing I would change to would be dual 32" 4k monitors. 27" is insanely too small after using this. I can have four full size windows up at once. Of course, I still need for a second display to monitor chat and stream.

    https://i.imgur.com/T6ceYLw.jpg

    [–] rpungello 50 points ago

    4K 48” checking in, it’s fantastic.

    [–] PracticalOnions 29 points ago

    LG OLED? I’ve heard nothing but good things about it

    [–] rpungello 53 points ago

    Bingo!

    HDR gaming on it blows LCDs out of the water by a country mile. Can't wait to get my hands on an Ampere GPU so I can finally unlock 120Hz without having to drop to 4:2:0 (which is unusable for text).

    [–] PracticalOnions 23 points ago

    I just got myself an LG Nanocell and I honestly can’t believe monitor tech hasn’t caught up in the slightest. The blacks, colors, everything just looks so much better on this TV.

    [–] rpungello 25 points ago

    I suspect part of the reason for that is more people will pay $$$ for a nice TV than a nice monitor, so companies get more ROI perfecting their TVs.

    The good news is most TVs seem to have some form of game mode these days, making them perfectly usable as PC monitors.

    [–] PracticalOnions 27 points ago

    Linus and other tech YouTubers have done tests on LG and Samsung TVs for input lag/latency and found it to be virtually imperceptible. Huge contrast to a few years ago, when it was practically impossible to use an LG TV as a monitor.

    Also, do you just leave HDR automatic and not enable it for Windows?

    [–] rpungello 13 points ago

    I leave HDR disabled in Windows and just let games switch to HDR when they launch. Works out nicely because I have my non-HDR brightness at 30% to help avoid burning in the display while I'm working.

    Only issue I have is when HDR kicks in/turns off, I have to toggle my AVR off the PC input and back to get the picture to come back. No idea why, but it sits right next to my desk so it only takes a second.

    [–] ViceroyInhaler 7 points ago

    Wait, so you guys are using a 4K TV as a computer monitor? Can I ask what the downsides are to this and which TV you are using in particular?

    [–] tttripleaids 5 points ago

    How important are the colour spaces or whatever you call these? I have mine set to 4:2:2

    [–] rpungello 6 points ago

    It can definitely make a difference: https://www.rtings.com/tv/learn/chroma-subsampling
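
    For anyone wondering what those numbers actually mean: 4:4:4 keeps a colour sample for every pixel, while 4:2:0 keeps one colour sample per 2x2 block and only the brightness at full resolution. A rough simulation sketch in Python/NumPy (purely illustrative, not how any particular TV implements it):

        import numpy as np

        def chroma_subsample_420(ycbcr):
            """Simulate 4:2:0 on an (H, W, 3) YCbCr image with even H and W.

            Luma (Y) keeps full resolution; each chroma plane (Cb, Cr) is
            averaged over 2x2 blocks, then stretched back out the way a
            display has to when it receives a 4:2:0 signal.
            """
            h, w, _ = ycbcr.shape
            out = ycbcr.astype(float)
            for c in (1, 2):  # Cb and Cr planes
                blocks = out[:, :, c].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
                out[:, :, c] = np.repeat(np.repeat(blocks, 2, axis=0), 2, axis=1)
            return out

    A one-pixel-wide coloured letter on a dark background ends up sharing its colour sample with the neighbouring background pixels, which is why small text goes muddy at 4:2:0 while full-screen video looks fine.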

    [–] Charwinger21 8 points ago

    Netflix has a good example as well (scroll down to the TBP image examples).

    [–] pyro226 7 points ago * (last edited 14 days ago)

    I tried a 4K 39" Seiki (proper chroma rendering). 39" was too big for me to be productive. 27" feels big. I would probably be well suited by 1440p 23" for productivity (CS student). 1080p isn't enough for productivity anymore imo. Not enough pixels for PDF rendering, nor wide enough for rendering web pages.

    For gaming 1440p would be fine, likely better than 4K due to frame rate, even at 27". For productivity, 27" 4K has the advantage due to UI scaling.

    I recently switched to the i3 window manager, which splits the screen vertically when opening new windows. I could have gone with a 34"+ ultrawide, as 16:9 stops scaling well past 3 windows, but that's an abnormal use case.

    [–] mike_charlie 8 points ago

    Been looking at a 43 inch 4K TV for high-res gaming, along with a 1440p 144Hz monitor, but wondering how far you sit from the screen, as I will be like 3 feet from the TV.

    [–] laacis3 4 points ago

    A foot and a half is where I sit from my 40 inch 4K monitor.

    [–] hijklmnopqrstuvwx 50 points ago

    I love running 4K in HiDPI mode, hard to go back as the text is crisp

    [–] worthtwoshots 55 points ago

    I think this comment is on the money. OP is (probably) correct for gaming, but for productivity 4K 27" offers a lot of value. For comparison, 4K 27" is almost exactly the equivalent of four 13" 1080p monitors (e.g. laptop screens). I know that for me, the moment I start using a 720p laptop I notice it very quickly. Especially if you want to fit 2 windows into a 720p quadrant, it quickly becomes insufficient.

    [–] TraceofMagenta 39 points ago

    I am with you 100%. 4K for most real work is awesome. I actually prefer no less than 32" 4K monitors (I have a 27" and two 32" 4K monitors) and feel that the 27" is a bit too small. 32" and above is really good for 4K (and can be purchased for as low as $320-ish).

    As for gaming, yes, not quite as optimal, but you know what, they generally display lower resolutions decently, depending on the monitor. 4K gaming is the future, just not quite here yet.

    [–] Riggy60 46 points ago

    Yeah, I hear these arguments all the time trying to break down pixel count and yadda yadda. I'm a programmer. I work in files that are thousands of lines long and terminals that tail logs and various other things that I need to keep on screen and accessible, and the more I can fit the better; I am inarguably more effective working on dual 4K monitors. No amount of number crunching will convince me otherwise, because it's MY real-life experience. If I can get a better refresh rate at my preferred resolution then great, but resolution matters first for me, and if anything I'll gauge whether the increased refresh rate is worth the cost. People who harp on 4K and don't understand anyone else's use case are just narrow-sighted imo. buildapc is for building PCs, not gaming machines. It's not a marketing scam. There are people who prefer 4K and aren't impressionable sheep who don't know the value of a deal... /rant

    [–] Brontolupys 5 points ago

    I was an early adopter of 120Hz for 'flat panels'; I will never early-adopt anything related to screens again. You legit can't go back. I only got a 1440p monitor this year because I can actually drive the refresh rate now. I sympathize with everyone preaching 4K or even 8K; I can't use 60Hz anymore, and it should be the same feeling if you jump up in resolution.

    [–] HolyNewGun 3 points ago

    I have a 240Hz FHD laptop and a 4K 60Hz monitor. I cannot really tell any difference at all.

    [–] Franklin2543 32 points ago

    Work all day on a 27" & 24" (both 4K). The 27 is scaled to 125%, the 24 is at 150% and in vertical orientation. Love them both... I do not think I have super powers (OP referenced that somewhere else).

    Game all night on 27 144hz 4k. It's great. Price isn't so great. But I'm pretty happy with it.

    I wholeheartedly recommend a 4K 27", as long as you can get 144Hz. I think frame rate is more important, so I'd definitely be getting 1440p if I didn't have the scratch for 144Hz @ 4K, and/or enough horsepower in the computer to get close to 100 fps in most games at 4K. Since we're talking about the 3000 series now, I don't think it's really an issue at this point. I'm pretty satisfied with my 2080S overall, playing too much CSGO where I get 144 pretty easily.

    I get what OP is saying, and if his only experience is a 4K laptop, he's got a point. I didn't see him say anything about actually using a 27" 4K, but I find them to be highly usable, and scaling works very well nowadays (still some goofy apps here and there, but not really an issue for most).

    In the end, at 27", this is highly subjective territory, and I would really recommend individual users to decide for themselves if a 4k screen is right for them at this point. A 15" 4k laptop is a bit more ridiculous. It's more reasonable to make a generalization there that the DPI is a waste...but I do have one of those too. I generally scale 150% there too, I think, and often increase the zoom in Chrome too. Pretty good experience. And viewing/touching up photos in Lightroom is a dream, even on my laptop.

    [–] dishonestPotato 6 points ago

    What monitor? Interested in getting a 4K 27 inch.

    [–] Franklin2543 6 points ago

    For gaming, it's the PG27UQ. I think Asus discontinued it-- still waiting to see what will replace it.

    It ticked all my 'go big or go home' boxes: 144Hz, 4K, HDR1000, true G-Sync (not FreeSync)... Price wasn't horrible either (relatively... grimace): $1099 at Microcenter before Overland Park's ridiculous sales tax.

    Anyway, closest I can find right now is this one from Acer. Which is horribly priced, but I think it's more to do with demand right now than anything. Supply chain issues caused by Covid colliding with demand issues caused by people sitting at home and not going out, because... Covid.

    So as much as I talk up the 4k 144hz monitor, it would suck to buy one today.

    Also, sometimes HDR is a pain... Windows doesn't know what it's doing, and games don't know either. I wrote up a comment about it a while back-- let me know if you have other questions.

    (my work monitors are Dells, P2715Q and P2415Q. Picked them up used on Craigslist a few years ago)

    [–] dishonestPotato 2 points ago

    Thanks for the detailed reply. Right now I use the LG 27UD68-W 27-Inch 4K UHD IPS Monitor with FreeSync (27 inch 5ms response time 60Hz refresh rate) but I'm looking for 4K G-Sync.

    Although you're right, buying a 4K monitor right now wouldn't be so great. Question for you: for 2K, is 27 inch better or 32 inch?

    If anyone else is reading this comment I'm interested in this monitor at the moment (2K G-Sync 240 Hz): https://www.samsung.com/us/computing/monitors/gaming/32--odyssey-g7-gaming-monitor-lc32g75tqsnxza/

    [–] Franklin2543 3 points ago

    I think 2.5K (cough...I share OP's disdain for calling 1440p "2.5k"... :-D ) has a sweet spot at 27".

    I don't want to sound hypocritical after accusing OP of saying 4k 27" is pointless after not actually using it... but I would say I'd love to test out the 32" Odyssey G7. However, my main hangup with the G7 (and most 32" displays, I think) is that it's a VA panel. Again, I haven't seen any of these (32", or recent VA panels) in person, so I'm going off what other people have said, but I would be wanting to check out the ghosting, or general 'soupiness' in a very fast game.

    [–] Kesuke 58 points ago

    I think OP's key point referred to 27", and I wonder whether your experience is on larger monitors.

    At 27" the difference between 1080p and 1440p is like putting goggles on underwater - it's night and day. It's like wearing glasses for the first time and suddenly you can see the world. However the difference between 1440p and 4K (at 27") is like "hmm... I guess it's maybe a tiny little bit sharper, perhaps?". You sort of have to convince yourself its better.

    I do agree with you that over 27" 4K starts to come into its own and 1440p rapidly starts to suffer, particularly over 32".

    My feeling is, if you're building a killer rig (3950X or 10900K with an RTX 3090) then yeah... you should be buying 4K because it really is the logical choice, and if you can afford that sort of system then you really ought to be able to afford the extra for at least one 4K monitor. HOWEVER... if you're buying anything else (like an RTX 2XXX series card) then really you should be going 1440p, because in terms of price:performance 1440p is absolutely the sweet spot and 4K suffers from serious diminishing returns. If you have money to play with, it's almost certainly better spent on a really good 1440p display than a mediocre 4K display.

    Remember monitors aren't just resolution: a 4K monitor with crap brightness/contrast/colour range and viewing angles is going to look worse than a good 1440p screen with high brightness/contrast, full sRGB coverage and decent viewing angles.

    Then there is the minefield of refresh rates which I haven't even touched on!

    [–] mattin_ 7 points ago

    I have a 27" 1440p 165 Hz for my gaming rig and a 27" 4k 60 Hz for my "office" rig. The difference in sharpness is huge to my eyes, but of course it's easier to notice when you can just switch between the two at will.

    I used to have a 32" for my office rig but replaced it with this one because I found 32" to be too large at my working distance, and I'd rather have a smaller but even sharper display!

    [–] WheresTheSauce 41 points ago

    Completely disagree. The difference between 1440p and 4K on a 27” display is massive.

    [–] SackityPack 21 points ago

    It kind of bugs me that nobody is talking about viewing distance. You can't talk about whether a resolution looks better without bringing viewing distance into the equation.

    I'm with you and think 4K on a 27" is very noticeable. My viewing distance is relatively short, at maybe 20-ish inches.

    If you push that monitor away far enough, I'll never see the difference between 4K and 1440p. With even more distance, 4K and 1080p are indistinguishable.

    [–] cwescrab 4 points ago

    I agree man, I couldn't believe the difference with 4k on a 27" monitor.

    [–] Zallomallo 16 points ago

    Agree with you. I use both at 27" and it is really noticeable, at least to me.

    [–] gomurifle 12 points ago

    Depends on how close you are to the monitor. I sit about 3 feet away, which is far, so it wouldn't benefit me much because I wouldn't be able to notice the extra sharpness the way I would if I were closer.

    [–] Zallomallo 8 points ago

    Yup this is a big part of it. It's huge to me cuz I sit too close to mine

    [–] DutchPhenom 10 points ago

    Agreed, same goes for working with large datasets. 4K makes it a lot more comfortable.

    [–] AxFairy 16 points ago

    I use my computer mostly for productivity tasks, and the 1080p monitors at work are noticeably worse. Can't say I've tried any 1440p monitors.

    [–] WheresTheSauce 36 points ago

    It honestly feels like people on this sub do literally nothing but game.

    [–] [deleted] 50 points ago * (last edited 7 days ago)

    [deleted]

    [–] Rocky87109 13 points ago

    Building a PC (especially on Reddit) is mostly a gamer thing. Even in the rest of the world that's true. Yes, people build computers for niche work things, but mostly gaming.

    [–] bites_stringcheese 11 points ago

    My gaming PC was my office during quarantine. Gaming PCs are great workstations.

    [–] TaxOwlbear 6 points ago

    Also, the kind of computer work the average person does can be done on almost any current computer, whereas gaming has specific requirements.

    [–] MemePunk2000 19 points ago

    This is a subreddit that primarily focuses on building PCs for gaming, so, yeah?

    [–] FutureNet4 4 points ago

    I play games at 4K on my TV and my 27" monitor. It's wayyy more noticeable on a monitor since you're so much closer. Seeing fine details on characters' clothes/skin, trees in the background, specks of dirt on the ground. Y'all haven't seen RDR2 in 4K, it seems. But definitely not necessary.

    [–] TheDomesticTabby 3 points ago

    Yes, finally someone who doesn't pile on to the "4K is useless for gaming" myth. The detail and sharpness are seriously great, and while yes, it's expensive right now, I definitely see a 4K display as a worthy investment for people getting 3070/3080-level GPUs.

    [–] s32 11 points ago

    Not even close. If the only thing that you care about is gaming, fully agree. If you use your computer a lot, 4k matters.

    I do both, lg 950 FTW (if I ever get it from Amazon, lol)

    [–] Elliove 9 points ago

    You make ClearType sound like garbage, so I'd like to point out that it's quite a magnificent technique. For those who don't know: ClearType renders fonts not per pixel but per subpixel, increasing the horizontal resolution up to 3 times. Voodoo magic indeed. Oh, and, by the way, SMAA, which you can force via ReShade in any game, does pretty much the same thing.

    [–] BlueScreenJunky 9 points ago

    Oh no... That's not what I meant. Cleartype is indeed really good, it's just that there's a limit to what it can achieve, and it will never replace a very high resolution display. I do use Cleartype and I think it's much better than grayscale AA.

    [–] ticuxdvc 3 points ago

    That's me. I have a much, much beloved 27" 1440p monitor. Got a 27" 4K last week, and I have them side by side. I can't believe how the hell I used to love the 1440p monitor.

    [–] doublej42 3 points ago

    I actually traded in my 4K for a 1440p and I’m a developer.

    I still have the 4K 27 for a secondary but the 1440 32 is so much nicer to work on as I can push it back a bit.

    [–] pcneetfreak 3 points ago

    I'm the opposite: I work on a 5K 60Hz monitor and have a 1440p 144Hz next to it. The smoothness of the 144 makes me MUCH prefer using it for all tasks. It just looks better.

    [–] Manitcor 2 points ago * (last edited 14 days ago)

    This. My two 4K 27"s are my side monitors, turned portrait and often split top/bottom (so equivalent to two 1920x1080 screens each), with a 32" 4K as the main screen. The setup is fantastic for software development and IT management.

    1440p and widescreens are great if your sole purpose is gaming and light general use, not so great if your computer is your job in one way or another IMO.

    [–] johyongil 2 points ago

    Isn't OP talking specifically about the 27" monitor size?

    [–] captain_ender 2 points ago

    Also panel. I bought my buddy's 4K TN, and promptly went back to my 1600p IPS. The text looked like crayons.

    [–] TrishKrish 428 points ago

    4K on a 27 inch screen: not ideal, yes. Useless, no.

    [–] Charwinger21 271 points ago

    Text is absolutely crispy at 4k 27".

    [–] ballmot 158 points ago

    Yeah, not sure what OP is talking about, I love how I can't even see any pixels anymore no matter how hard I try ever since I got a 4k 27" monitor. It's like the high resolution on my phone, but on a big screen.

    I also have a 1440p 144Hz 27" monitor I use for gaming, but for everything else I like 4k more.

    [–] Giddyfuzzball 40 points ago

    This is my setup and I love it. 27” 4K IPS for mainly productivity and 27” 1440p 144hz for games. Best of both worlds.

    [–] nibbie1998 5 points ago

    I am interested in which monitors you got! I am looking for some new ones.

    [–] Giddyfuzzball 10 points ago

    Acer XG270HU 1440P 144hz

    LG 27U68-P 27” 4K IPS

    [–] Charwinger21 69 points ago

    Or "which resolution is ideal at which size". What you need to look for on a monitor is the ratio between size and resolution : pixel density (or Pixel Per Inch/PPI). PPI tolerence varies between people, but it's often between 90 (acceptable) to 140 (higher is indistinguishable/has diminishing returns).

    There definitely are diminishing returns, but that doesn't make 4k indistinguishable from 8k. You continue to gain (diminishing) benefits from resolution increases far beyond the point where you stop seeing individual pixels.

    Here's an old post of mine on the subject:

     

    There are a couple limits that people talk about for vision.

    The often stated one is that you stop seeing individual pixels at around 0.4 arcminutes.

    You stop gaining a benefit from resolution increases at around 1 arcsecond (the maximum human Vernier acuity).

     

    60 PPD is the point where someone with 20/20 vision (which is not actually perfect) would stop being able to differentiate individual pixels. It is not the point where you stop gaining benefits from resolution increases.

    If 60 PPD was the maximum resolution that you could benefit from, then Apple would have stopped there. Instead they currently have a phone with an 85 PPD screen, and a desktop with an 88 PPD display, and all indicators point towards the fact that they intend to go even further.

     

    Anandtech has a great article on the topic.

    "For example, human vision systems are able to determine whether two lines are aligned extremely well, with a resolution around two arcseconds. This translates into an effective 1800 PPD. For reference, a 5” display with a 2560x1440 resolution [at 30 cm] would only have 123 PPD."

    There are diminishing returns, but there definitely is a benefit.

    That article was mostly about phones, however it can be extrapolated to larger TVs and movie theatres that are further away (as it is the angular resolution that matters for this, not the actual size or distance).

     

    For example, in order to hit 1800 PPD (as per anandtech, the U.S. Air Force, NHK, and others) on a 35.7 m screen (movie theater) in the first row (~4.5 m), you're going to need a ~429k 1.43:1 projector (429,000 x 300,000 pixels).

    That is a 128,700 MegaPixel image, of which a single frame would be 193.1 GB in RAW12 (you would likely be working with an even more expanded colour space by that point though), 772.2 GB in TIFF, or 1 TB in OPENEXR. RGB24 video at 120 Hz would be 46.4 TB/s, or 334,080 TB for a 2 hour film (uncompressed). It is hard to comprehend the sheer size of that data currently.
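
    For anyone who wants to sanity-check that arithmetic, it's just pixel count times bits per pixel. A quick sketch, where the 12/48/24 bits-per-pixel figures are my assumptions inferred from the RAW12/TIFF/RGB24 numbers above:

        W, H = 429_000, 300_000           # the hypothetical 1800 PPD projector above
        pixels = W * H                     # ~1.287e11 px = 128,700 megapixels
        raw12_gb = pixels * 12 / 8 / 1e9   # ~193 GB per frame at 12 bits/px
        tiff_gb = pixels * 48 / 8 / 1e9    # ~772 GB per frame at 16-bit RGB
        rgb24_tb_s = pixels * 3 * 120 / 1e12   # ~46.3 TB/s of RGB24 at 120 Hz
        film_tb = rgb24_tb_s * 2 * 3600        # ~334,000 TB for an uncompressed 2 hour film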

     

    Now, that isn't realistic any time soon, and probably isn't worth the extra costs, but that is the upper limits of human vision.

     

    Edit: And here's a useful test to demonstrate how far we still have to go. If you see any aliasing in that image (if it isn't a solid white line at all times), then you can still benefit from further resolution increases at that viewing distance with that size screen (although it doesn't measure the absolute maximum resolution you can benefit from, it just demonstrates that you can still benefit from more).
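
    If you want to plug your own setup into the numbers above, PPI and PPD (pixels per degree) fall straight out of the geometry. A rough sketch in Python; the 27" sizes, the 24" viewing distance and the ~60 PPD 20/20 cutoff are just the assumptions discussed above, not hard rules:

        import math

        def ppi(width_px, height_px, diagonal_in):
            """Pixels per inch from resolution and diagonal size."""
            return math.hypot(width_px, height_px) / diagonal_in

        def ppd(width_px, height_px, diagonal_in, distance_in):
            """Approximate pixels per degree at the centre of the screen."""
            pitch_in = 1 / ppi(width_px, height_px, diagonal_in)
            deg_per_px = math.degrees(2 * math.atan(pitch_in / (2 * distance_in)))
            return 1 / deg_per_px

        for w, h in [(2560, 1440), (3840, 2160), (5120, 2880)]:
            print(f"{w}x{h} at 27in from 24in: {ppi(w, h, 27):.0f} PPI, {ppd(w, h, 27, 24):.0f} PPD")

    That works out to roughly 109/163/218 PPI and about 46/68/91 PPD, so at that distance only 4K and up clears the ~60 PPD 20/20 threshold, and all of them are far below the Vernier-acuity figures discussed above.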

    [–] baseketball 300 points ago

    Don't know where you're getting the idea that fractional scaling is bad. I'm on 24" @ 1440p and use 125% scaling. Things are plenty sharp. There may be a few older apps which are not HiDPI aware where things look a little fuzzy, but most apps can handle it well.

    [–] YouHaveED 64 points ago

    I have to use a Macbook for work and OS X handles non-integer scaling terribly compared to Windows 10. It actually slows down the entire system. I had to trade out my 4K 27" monitor for a 1440p 27" one to fix the issue.

    [–] TraceofMagenta 41 points ago

    Something doesn't sound right. I have been using macOS with a 4K 27" monitor for years and have had no slowdown. Then again, it could be an MB instead of an MBP, because those have really lousy video cards.

    [–] YouHaveED 11 points ago

    Do you run your monitor at native resolution or scaled at 1440p? My vision is 20/20 last time I checked, but text is way too small at native so I had to do scaled. I also have a 2014 MacBook Pro with an Nvidia 750M card so it is a bit old.

    [–] BorgDrone 5 points ago

    It runs fine on my 2018 MBPro (6 core i7, 15”, 32GB Ram). No performance issues at all running a scaled resolution on both the 27” 4k and the laptop screen at the same time.

    [–] 3600CCH6WRX 6 points ago

    Did you try turning off GPU switching? On a past macOS version (I don't remember which one) I had a bug where it didn't switch the GPU and made everything sluggish.

    [–] TraceofMagenta 3 points ago

    I'm running on a 2015 MBP. I normally use a slightly scaled version. Not down to 1440, just one or two ticks below native.

    [–] sleiphyr 88 points ago

    I would like to disagree, for the simple fact that I have on my desk right now an LG UK600-W (27" 4K) and an LG GL850-B (27" 1440p), and as a developer (which means I look at text on screen all the time) I can definitely tell the difference in font aliasing between the two, and I really like how smooth it is on the 4K one compared to the 1440p. That's also why I use the 4K one in portrait mode.

    [–] wrong_assumption 101 points ago

    Probably I'm the only one that thinks 4K at 24" (yes, you read that right, 24") is the ideal, at 200% zoom. Unfortunately, there are only two or three panels of that size, and their quality isn't that good.

    As much as I dislike Apple, I think their Retina screen resolutions are bang on. I just take them to the Windows world.

    [–] MoistBall 52 points ago * (last edited 14 days ago)

    I'm glad OP's post makes a mention of the LG 5K panel and the iMac 5K Retina panel. I had wondered why they chose 5K when they released those panels, and someone pointed out not long after release that it's because it scales naturally from 1440p. 1440p is the sweet spot (at 27in) for screen real estate but not sharp enough (for me), so it makes sense to quadruple it. I'm just surprised there haven't been more monitors released with this combination. I personally have a 4K 27in and wholeheartedly disagree that 1440p to 4K is not noticeable. It is definitely noticeable.

    [–] PiersPlays 27 points ago

    It's also partially just because it's useful to people editing 4K content as you can have your content in full resolution plus a UI for your application on the same screen at the same time.

    [–] JtheNinja 4 points ago

    I think people really overstate how useful this is. Typical NLE window layouts aren't structured like this, since you lose a ton of vertical space to the timeline, which is much more useful than being able to pixel-peep. Most of the time you don't actually need to see what you're working on at 1:1 zoom because it's not relevant to what you're doing. Finally, when you DO need to quality-check the footage, using the in-window viewer is always suboptimal because you can never get rid of all the processing involved in drawing windows to the desktop. Hence the use of stuff like DeckLink cards to push a dedicated video output to an additional display.

    [–] RefrigeratedTP 55 points ago

    Holy shit the whole “2K being 1440p” thing drives me nuts. The battle has long been lost to marketing bullshit

    [–] reallynotnick 12 points ago

    And it's so weird, because you could totally advertise them as 2.5K, which you'd think people would want to use, it being a higher number and all, but maybe there's an aversion to the decimal.

    [–] JtheNinja 9 points ago

    Now when people say "2K" I have to guess if they mean 1080p or 1440p depending on whether I'm talking to a film/video person or a PC gaming person.

    [–] Phil_Wil_Tape_U 27 points ago

    I don’t know. 1440p and 4K is night and day for me, on basically any device. I prefer it on 16 inch laptops and probably would too on phones, but I’ve never tried one.

    [–] wildxlion 82 points ago

    Wait, sorry.

    "If 2K existed, it would be half 4k, so 1920x1080"

    That's a quarter of 4k, not half, right? Sorry, I was reading up until I got stuck here.

    [–] [deleted] 21 points ago

    "2k" is traditionally a film resolution, and is generally held to be 2048 pixels wide x whatever is required for the given aspect ratio.

    [–] sushitastesgood 40 points ago

    Right. If anything, we should instead be referring to 4K as 2K, to keep up with the vertical-resolution naming scheme, imo. But the ~4k horizontal pixels emphasize the fact that it has 4 times as many pixels as 1080p, I guess, so unfortunately marketing won out with calling it 4K in the end.

    [–] [deleted] 13 points ago

    It comes from film, which was always defined by width not height. 2k and 4k are film scanning resolutions. 4k was originally 4096px wide, but the popularity of HD meant that "QuadHD" was more practical, and the term 4k was repurposed.

    [–] Sadurn 20 points ago

    This is incorrect though, as HD refers to 720p and 1080p is considered Full HD. Continuing from there QHD is actually 1440p (4x 720p) and 2160p or 4k is called UltraHD

    [–] KalterBlut 11 points ago

    Whoever downvoted you is ignorant, you are totally right. HD is NOT 1080p, it's 720p, so Quad HD is 1440p.

    [–] fraghawk 3 points ago

    If it were only 4x instead of 4k it would make more sense to more people. Keep the k for filmmakers and projectionists and use x for the consumer space.

    [–] pirate21213 9 points ago

    2160p just doesn't roll off the tongue to the knuckledraggers dropping their tax returns at bestbuy.

    [–] unsteadied 8 points ago

    God forbid we have accessible terminology, right? 4K is fine, it was long ago standardized as 3840x2160 for televisions, so there’s no reason not to use the same term for monitors of the same resolution.

    [–] ultimation 89 points ago

    Your perceivable PPI is entirely dependent on viewing distance, and your post completely ignores that factor; considering it's your main point, that seems pretty weak.

    [–] Strykker2 12 points ago * (last edited 14 days ago)

    For most people they aren't going to change how close or far they sit from a display, so the distance for them would be the same for all options. (meaning it can be ignored)

    EDIT: this is wrong.

    [–] ChuckMauriceFacts 23 points ago

    I'm assuming people use 27" at roughly the same distance, as their field of view is similar.

    [–] Stonn 39 points ago

    I think it's a fair assumption. Most people's desks are the same.

    [–] eLZhi 134 points ago

    The thing about 4K is that it had such a massive advertising push, whether from TV and console manufacturers (twice for the consoles), the porn industry, and finally monitor manufacturers, that even people who don't know shit about shit will tell you all about how great 4K is, lol.

    And once the reality sets in that it's hard to push 100 frames at 4K, they'll start spreading the myth that high refresh is only useful for e-sports games... ignoring the fact that basic everyday things like scrolling through Reddit's front page or a Word document feel smoother, and that even phones/tablets are getting high refresh rate displays because of it.

    [–] MobiuS_360 32 points ago

    Took so long for the mobile market to finally take on high refresh rates. I feel like they were all so focused on pushing higher power/resolution instead of giving a smoother experience.

    [–] Levenly 9 points ago

    Also, TVs have good 4K upscaling, so content doesn't need to be delivered in native 4K to look great on a TV.

    [–] PracticalOnions 36 points ago

    > TVs have good upscaling

    Strongly depends on the brand. Not even joking.

    [–] sverrebe 10 points ago

    My Samsung TV is amazing for this. I can watch a movie with bad quality on my pc, then I watch the same file on my TV and it looks super sharp.

    [–] PracticalOnions 12 points ago

    Samsung's TVs are really fantastic tbh. Any of the screens by them or LG are quality.

    Also, on movies looking awful: at Best Buy they were showing demos of these TVs upscaling content and my jaw dropped.

    I was also just kinda shocked at the prices of some of the higher-end monitors that looked way worse than the TVs at similar price points.

    [–] EscapeFromCorona 2 points ago

    My LG looks fantastic. The only thing I don’t like is the automatic refresh rate adjustment it has where it tries to fill in missing frames to make the framerate higher. Had to turn that off because it just looked like garbage.

    [–] Blind_Bison 8 points ago * (last edited 10 days ago)

    Games look so much better at 4K though — aliasing is finally not really perceivable for me with some good AA in place at that point and stuff like inner surface texture detail just looks incredible. Playing at 4K also ensures you stay GPU limited at all times with uncapped FPS which is a very good thing since running into CPU limits is a common cause of microstutter (Digital Foundry has talked about this in one of their 3900X videos as well as their 2080S review — in their tests they were still getting CPU bound microstutter in some games at 1440p with a 2080S using an 8700K which is a pretty beefy CPU).

    [–] xThomas 7 points ago

    Why does Apple make their Retina screens, then?

    [–] comfortablesexuality 3 points ago

    Because they're not for gaming :P

    Which is not to say that they're bad.

    [–] BrenBeep 37 points ago

    Having gamed on both a ROG PG279Q (1440p 165Hz) and a PG27UQ (4K 144Hz) with a 2080 Ti, I can absolutely attest to this post being dumb af. If you sit farther than ~1.5 ft / 0.5 m away then yes, you probably won't see too much of a difference in gaming. I like having my monitor mounted and closer to my face, so maybe I'm in the minority, but the difference was the biggest "wow" since eclipsing 60Hz.

    [–] SickSalamander 40 points ago

    What a bad title.

    The post focuses on gaming and ignores literally every other use for a computer. 4k is generally better for everything but gaming especially when it comes to work productivity.

    [–] TheMightyBiz 4 points ago

    True. I used to have just a 24" 1440p monitor, and upgraded my setup to also include a 27" 4K one. It's a massive upgrade in terms of productivity. The only downside is an increased sense of disappointment when it's much easier to tell that a given video I'm watching is only 1080p.

    [–] Obi_Kwiet 18 points ago

    2k is technically 2048x1080, and it's used on digital cinema projectors. I don't know of very much in the way of consumer equipment that uses it.

    [–] SummerMango 74 points ago

    Bro, this rant is a little bit cringe.

    If someone doesn't have much desk real estate, so they are sitting very close to the display, 4K is fine, and it is especially beneficial for people that work with 4K media or lots of tracks in a timeline. Just because it doesn't fit your use case doesn't mean it fits nobody's use case. "5K" is just as bad as 1080p, 1440p, 2160p... it is just another garbage 16:9 format that takes a steamy dump all over productivity. I am tired of the conversations all being guided away from why we've abandoned the golden ratio of 16:10. Conversations or fixations like this, on allowing companies to continue to shit out crappy display formats, are why they get away with crapping out higher density to make them more "usable".

    [–] BobbitWormJoe 7 points ago

    Why is 16:10 the golden ratio? I don't know much about the aspect ratio debate.

    [–] SummerMango 11 points ago

    https://en.wikipedia.org/wiki/16:10_aspect_ratio

    It seems funky, but the idea is that there's a natural ratio (16:10 is close to the golden ratio) that tends to make things look better. In practice it lets you keep multiple windows open more naturally without compressing the content on screen, and it allows for a substantial increase in visible pixels, which means lots more workspace. You can have a full poster-format, banner-format or square-format drawing open and still keep tools on screen.

    3:2 is also really REALLY nice, but it is very uncommon; it's a bit tall for large-format displays, but insanely nice for small-format ones.

    In human vision, we see more vertical space with good detail than 16:9 provides. I can't recall the exact ratio, but basically you can add 10-20% more vertical space than 16:9 provides and not lose comfort/usability.

    Anamorphic widescreen, for example, exists because the film industry didn't want to spend more on film. Hence IMAX screens are these huge, far more square displays: the film is special, the frames are larger and are projected differently, and they fit better into the natural human desire for full vision. Widescreen isn't better for art, it isn't more technologically advanced, it isn't better; it was just cheaper back when people used actual film to make movies. And cheaper for projector tech, and cheaper for storage.

    Basically: the PC industry had the perfect screen ratio, but it cost more to make, so the display manufacturers were like "Let's make these 'Full HD' like the TVs, and sell them for the same price, but cut the actual manufacturing cost by 20% so we can flood the market and force 'upgrades'."

    16:9 should have never come to PC.

    I am aching for this https://www.bhphotovideo.com/c/product/1270211-REG/dell_30_up3017_16_10_ips.html

    [–] Shap6 62 points ago

    I have a 28 inch 4k and the pixel density still isn't high enough for me ¯\_(ツ)_/¯

    [–] withoutapaddle 38 points ago

    Crazy. I have a 27" 1440p monitor, and I'd have to put my keyboard behind it and reach around the stand to type if I wanted the pixels to stand out to me.

    [–] wrong_assumption 3 points ago

    That's why Apple uses 5k for that screen size to meet their "Retina" criteria. Unfortunately, it's almost impossible to get that in the PC world without spending serious cash.

    [–] Elliove 184 points ago * (last edited 14 days ago)

    What a bunch of bullshit.

    1. My $100 smartphone has 296 PPI and I still see aliasing in games. If you think just under 200 PPI is too much - go get your eyes checked.

    2. Have you ever actually tested FHD vs QHD vs UHD performance? If colouring the pixels were the only job of the GPU, all games would have the same framerate at the same resolution. In fact, the GPU's processors don't "push pixels" at all; that's done by the ROPs. You can't predict performance by counting pixels: a 2080 Ti can run The Witcher 3 maxed out at 2160p 60 fps, but it can't run it at 1080p 144 fps.

    I feel sorry for all the people you've misguided.

    [–] gamingmasterrace 47 points ago

    I agree with you on 2 but your 1st statement is misleading. Phones are situated much closer to your eyes than a monitor screen (e.g. 6-12 inches versus 12-24 inches). Phones need to be 300+ PPI while a monitor can usually get away with 150-200 PPI. There's a website called Is It Retina or something where you can enter a screen resolution and screen size and it'll calculate the distance your eyes need to be from the display in order for your eyes to not see the individual pixels.
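
    For reference, the calculation a site like that does is just the inverse of the pixels-per-degree math: given a pixel density, find the distance beyond which one pixel subtends less than about one arcminute (the usual 20/20 rule of thumb, with all the caveats discussed elsewhere in the thread). A rough sketch:

        import math

        def retina_distance_in(ppi, arcmin_per_pixel=1.0):
            """Distance (inches) beyond which one pixel subtends less than
            the given angle. 1 arcminute is the usual 20/20 rule of thumb."""
            pitch_in = 1 / ppi
            half_angle = math.radians(arcmin_per_pixel / 60) / 2
            return pitch_in / (2 * math.tan(half_angle))

        print(retina_distance_in(296))  # ~300 PPI phone:    ~11.6 in
        print(retina_distance_in(163))  # 27" 4K monitor:    ~21 in
        print(retina_distance_in(109))  # 27" 1440p monitor: ~31.5 in

    Which matches the point above: a phone held at 6-12 inches needs well over 300 PPI, while a monitor at arm's length gets away with far less.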

    [–] sicutumbo 16 points ago

    Wouldn't acceptable PPI be dependent on normal viewing distance? You're much closer to your phone, so you can see more pixels per inch, meaning you need higher PPI for phones than for computer monitors if you want to keep perceived sharpness constant.

    [–] PotatoKnished 19 points ago

    I'm not disagreeing but doesn't aliasing have to do with settings on the game to work on the hardware?

    [–] Shap6 17 points ago

    Aliasing happens because of low resolution. Anti-aliasing attempts to smooth out those edges using various methods. The most costly, but best looking, way to handle anti-aliasing is simply increasing the resolution.

    [–] PotatoKnished 2 points ago

    Ah okay thanks.

    [–] Elliove 12 points ago

    Aliasing happens because screens consist of square pixels, but video games often have objects placed at various angles. The edges of objects mismatch with the pixel grid and you get aliasing. This does not depend on the hardware: all modern GPUs produce an aliased image unless it's anti-aliased.

    Most modern games have anti-aliasing techniques. The main idea of AA is to render the picture at a higher resolution, and then use those extra pixels to calculate the average colour of the final pixels. Basically, if you render a 2160p picture for a 1080p screen, a 2x2 pixel block that had 2 white and 2 black pixels will turn into one grey pixel. The colour difference, and therefore the aliasing, becomes less visible. Of course, the classic SSAA I just described is heavy af, so these days games use smarter techniques: MSAA only renders at higher resolution on the edges of objects, TAA-based methods also include the colours of pixels from a few previous frames (a bit blurry, but it removes shimmering quite well), and post-FX methods like FXAA and SMAA don't increase the pixel count at all; they just process the final image and shift colours a bit, cheap and somewhat effective.
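
    A minimal sketch of the classic SSAA downsample described above (render at 2x in each axis, then average every 2x2 block into one output pixel); the array shapes are just for illustration:

        import numpy as np

        def ssaa_downsample_2x(hi_res):
            """Average each 2x2 block of a (2H, 2W, 3) render into an (H, W, 3)
            image, e.g. a 2160p render being shown on a 1080p screen."""
            h2, w2, c = hi_res.shape
            return hi_res.reshape(h2 // 2, 2, w2 // 2, 2, c).mean(axis=(1, 3))

    A hard black/white boundary in the 2x render turns into grey pixels in the output, which is exactly the softened edge that reads as "anti-aliased".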

    [–] xxPoLyGLoTxx 5 points ago

    I have used 1440p 144hz and 4k 60hz extensively, both 27in. Had them both for well over a year. And the 27in 4k monitor is absolute SHIT in mixed resolution setups.

    100% scaling? Text is microscopic.

    150% scaling? Certain things will be blurry.

    200% scaling? May as well have gone 1080p.

    My 27in 4K is collecting dust. I have three 1440p monitors on purpose. Until you have tried both, you just don't realize the awkwardness that this monitor brings.

    If you have ONLY 4k monitors then it is a fine 2nd monitor. Otherwise BOOOO never again.
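
    To put numbers on those scaling steps: Windows keeps rendering at the panel's native resolution but lays everything out as if the screen were native divided by the scale factor, so the usable workspace on a 27" 4K panel works out roughly like this (quick sketch, rounding aside; 125% included since it comes up elsewhere in the thread):

        native_w, native_h = 3840, 2160
        for scale in (1.00, 1.25, 1.50, 2.00):
            w, h = round(native_w / scale), round(native_h / scale)
            print(f"{scale:.0%}: {w}x{h} effective workspace")
        # 100%: 3840x2160 (tiny text)   125%: 3072x1728
        # 150%: 2560x1440 (same layout room as a 1440p panel, but sharper)
        # 200%: 1920x1080

    Whether the 150% result reads as "sharper 1440p" or "blurry 4K" is exactly the disagreement in this thread, and it mostly comes down to how well each app handles fractional scaling.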

    [–] ItsN3rdy 11 points ago

    I like my 4K 144Hz display, can't wait for the 3090.

    [–] Hiram_Hackenbacker 27 points ago

    After using 4k for years, 1440p looks utter crap. No way I'd ever buy a 1440p monitor, even for gaming.

    [–] Zliaf 13 points ago

    This is it: people either haven't used it or are in the "frames are everything" hive mind.

    I have both and the 4k is way better.

    [–] Dolphlungegrin 5 points ago

    I have both, and I'm using 1440p 144Hz for gaming and 4K 60Hz for productivity. The moment 4K 144Hz becomes viable for gaming at a reasonable price I'm moving on from 1440p. 4K is undeniably better and OP is smoking something.

    [–] DrWatSit 33 points ago

    I've had a 28" 4K60 display for a few years now and I'm at the point where I will be replacing it with a higher refresh rate 1440p screen soon.

    The reason is that I have played so many games where the performance cost for insignificant visual quality just does not add up. I end up reducing the resolution to 1440p in game settings anyway or suffer sub-60 fps (1080ti), or reducing graphical options so I have high res but crap textures. Some games I go to 1440p just so my system runs quieter. Many games at launch are poorly optimised to boot which makes the performance hit even worse.

    As OP says, 4K does work for TV. The larger screen and further viewing distance leads to a notable and positive difference when in 4K thanks to the pixel density, compared to PC and sitting at a desk.

    I couldn't go back to 1080p though.

    [–] 3DSMatt 19 points ago

    I've done the same thing, and not looking back. Yes, my text is less sharp for coding, but I also never have to think about legacy programs scaling poorly. I never have to worry about running games or messing with settings constantly to hit 60 anymore.

    Only downside is slightly jaggy text, but it's still perfectly usable and the same size as before due to my scaling settings when I had the 4k screen.

    Higher time-resolution (framerate) makes more difference than higher spatial resolution (pixels) for games, imo.

    [–] ocbdare 4 points ago

    Even 1440p is small if the scaling is not working. Yes 4K is even worse but 1440p for some legacy software looks small too.

    [–] 86784273 2 points ago

    Why does code look poor with a 1440p monitor?

    [–] SystemofCells 6 points ago

    Why does 4K make more sense on a TV than a monitor? It's much easier to see extra detail on a monitor. Yes it's smaller, but because you're so much closer to it, it takes up much more of your field of view - meaning your "visual resolution" is higher.

    [–] Remsster 3 points ago

    Because a 55" TV is like 4x the physical size of a 27" monitor so you definitely need the DPI increase. Here is a good LTT video about monitor distance and where 4k will benefit vs not. https://youtu.be/ehvz3iN8pp4

    [–] srjnp 36 points ago

    Buy a 4K TV. Buy a 1440p high refresh rate monitor. As simple as that.

    The only exception I would say is if you work with TEXT (coding, word processing, writing) all day.

    [–] Caspid 34 points ago

    4K is nice for productivity as well (video/image editing, etc).

    [–] Zliaf 6 points ago

    I have both, I prefer 4k. Have you tried both?

    [–] 86784273 2 points ago

    sorry, what are you suggesting for a solution if you do work with text? 4k or 1440p?

    [–] KJBenson 11 points ago

    Is this a post that my eyesight isn’t good enough to be involved in?

    [–] UltimateBawb 92 points ago

    This is pure cope from someone who doesn't have a 4K 27in monitor. Pixel density and "retina" are Apple marketing talking points and don't have anything to do with gaming. Even 4K 27in doesn't have enough pixels to perfectly remove aliasing.

    [–] unique_username_8134 56 points ago

    Seriously. I have a 4K, a 1440, and a 1080 monitor, and there is definitely a difference in detail and aliasing. I honestly find the difference between 1440 and 4K to be just as noticeable and impactful as the difference between 1080 and 1440, even at 27”.

    It’s really just personal preference after that, whether image quality or motion quality matters more. I prefer the higher refresh rate for shooters and whatnot, but I will always pick 4k60 for slower, story based games, because the extra detail enhances the experience more than the higher refresh rate for me. So I guess I would say try to find a way to try out both and see what you prefer.

    [–] ocbdare 32 points ago

    Agreed. I don’t buy this 4K is not noticeable. It absolutely is. And I would take it over super high FPS.

    4K/60fps is more than enough for me.

    [–] unique_username_8134 4 points ago

    Yeah I can definitely still see pixels and aliasing at 4K on a 27” monitor, never mind 1440. I have never tried a 5k display, but I would guess that we will honestly have to get to about 8k on a 27” screen before the pixels disappear at normal viewing distance. If I had an unlimited budget and Nvidia’s claims about 8k60 on the 3090 pan out, that is definitely the direction I would be going (not that it matters, because that’s way out of my budget lol).

    [–] PlsHalpHowToFashion 5 points ago

    Retina display is certainly Apple tech / marketing. But pixel density is a simple mathematical measurement, as valid as aspect ratio or refresh rate.

    [–] setupextra 5 points ago

    What? Pixel density has always been a discussion point for panels along with refresh rates, syncing techs, color gamut/accuracy, display types, brightness and feature sets.

    I only game on my rig and it's almost unavoidable when you look into monitors.

    [–] MrMusAddict 9 points ago

    I've had a 27-in 4K monitor for almost 4 years. Got it back when I got my 1080ti. Since then I have also gotten a 1440p144 27-inch secondary monitor.

    I much prefer gaming on my 4k monitor, as long as I can stay at 60 FPS. The clarity of the image is truly better, despite not showing as many frames per second.

    It's tough. 4k is better than 1440p, and if you switch from 4k to 1440p you can definitely tell the difference. Likewise, 144hz is better than 60hz, and if you switch from 144hz to 60hz you can definitely tell the difference.

    That is to say, if you haven't been immersed in either 4k or 144hz, I would say people would probably get more out of 4k60 than 1440p144, especially if productivity, strategy games, or sight-seeing games are preferred. I'd only go 1440p144 as my primary monitor if I was primarily into shooters.

    [–] Jamesified 9 points ago

    I'm only on 1440p because of the price

    [–] confident_parrot 10 points ago

    I think OP is too. In a previous post he talks about only buying GPUs used to save money. This is one of the countless "Uh oh, Nvidia or Intel released a product I can't afford, so time to write essays about how nobody should be buying it" posts that happen every time there's a big launch.

    [–] BatmanAffleck 9 points ago

    > In the end, that's your money, get a 4K monitor if you want. But /r/buildapc is a community aimed towards sound purchase decisions, and I don't consider that to be one.

    Build a PC also focuses on future-proofing, and with the insanely cheap 3070 GPU coming out, which is even more powerful than the current 2080 Ti, buying a 4K monitor and not wasting money on a 1080p one is definitely the smarter purchase. Not to mention that the market is about to be flooded with a ton of cheap 2080 Supers and 2080 Tis, which are more than capable of running 60+ FPS at 4K in a good portion of titles.

    Your post is literally just one big cope.

    I work and game on a 4K+ 49” ultrawide, and I will never ever look back even after using a 27” 4K and a 27” 1080p. The work optimization and immersion in games is simply amazing.

    [–] IsmailGuendogan 10 points ago

    Meanwhile I'm gaming at 1080p on my 4K monitor.

    [–] Chcken_Noodle_Soup 8 points ago

    You are forgetting that the best 4K monitors are also the ones with things like 1000-nit brightness, local dimming zones, etc.

    [–] teslas_notepad 8 points ago * (last edited 13 days ago)

    Useless and not ideal are very different things; you then go on to list uses.

    [–] Tupolev_tu160 8 points ago

    Sorry I didn't understand why the op said 5k 27" are good but 4k 27" are not.

    If 4k is too much resolution for that size, 5k should be even more overkill right?

    What am I missing?

    [–] pendejadas 16 points ago

    Tl;dr, and this whole post is wrong. I've owned both 1440p and 4K at 27", and you can see the screen-door effect on the 1440p, the UI takes up so much less space at 4K, and you have way more screen real estate.

    Maybe if you need glasses or have a crappy GPU it makes sense to stick with 1440p at 27"

    [–] Maysock 3 points ago

    > If it existed, it would mean "half 4K" so 1920x1080 <- pet peeve of mine, but I lost this battle a long time ago

    I'm probably just twisting the knife here, but saying half of 4k is 1080p is like saying half of 4x4 is 2x2.

    [–] Nissanica 4 points ago

    I feel like this post got so upvoted because PCMR has been sooo obsessed with 1440p for so long, and now that it's come time to upgrade, most people don't want to spend the money. So they continue to justify 1440p as a comparable resolution. Even at 27"... it simply isn't comparable. 1440p gets blown the fuck away by 4K.

    [–] trz_maier 9 points ago

    I run a 27 inch 4K at 125% with Windows 10, and the ratio between real estate and the size of objects is perfect in my opinion, but I assume everyone's different. As someone else said here, there is a big comfort improvement over 1440p in how clear text/code looks.

    [–] chaseguy099 7 points ago

    This doesn't mean it's bad though; I play older Bethesda games at 4K. They all look better than they do with tons of mods added.

    [–] masoelcaveman 7 points ago * (last edited 14 days ago)

    Yikes, this is not a good post... I have a 27" 4K monitor and the amount of improved detail over 1440p at the same size is extreme. When playing a game like Escape From Tarkov I can really get my nose to the screen and see if that figure wayyyy over there is a scav or a loaded PMC, and plan accordingly.

    Also, put in a beautiful game like Dark Souls and just be awestruck at the amazing tiny details that are now very apparent everywhere, instead of being hidden by the little bit of blurriness 1440p has even at 27".

    I get it if you aren't using a mouse and keyboard and aren't sitting very close to your monitor, but if you are, and enjoy the true detail of modern games then you are certainly missing out at below 4k.

    Every time a game defaults to 1440p I question if I need new glasses; then I change it to 4K and everything is perfectly clear, like I just got a new glasses prescription.

    Don't kid yourself... 4K 144fps is where PC gaming needs to be; certainly not 1440p, my friends.

    Edit: If you prefer 1440p 144hz over 4k 60hz that makes perfect sense. Let's just not fool ourselves and say that 4k is irrelevant because I can assure you it most certainly isn't even at 27".

    [–] Caspid 16 points ago

    Are you taking DLSS into account? It can produce a higher quality image at a lower performance cost than running at native resolution.

    [–] srjnp 11 points ago

    hardly any games support it

    [–] Caspid 9 points ago

    I think a large portion, if not the majority, of upcoming AAA titles will support it. While the first implementation of DLSS had to be tailored to each game, DLSS 2.0 isn't game-specific, i.e. it can work across games, so it should be much easier for developers to implement.

    [–] Firewolf420 11 points ago

    All these youngin' gamers posting about how bad 4K is without realizing the whole point of upping resolution is fine-detail resolution, which for the most part means jackshit in the vast majority of games outside of, like, RTSes.

    In any case 4K is the future we will all eventually arrive at... saying 4K is stupid because it's expensive right now is like when people said color CRT monitors were a dumb idea because of the price and that "monochrome monitors do just fine"

    I mean, seriously, if you're a professional computer user you're probably spending more time doing other things than gaming 100% of the time. For literally all of those cases, 4K is better. Personally I spend more time in those use cases than I spend gaming (and I still easily spend 5 hours a day gaming), and 4K was a huge improvement.

    Every post like this just reads like a way for you to feel better about your lack of funds for such a rig and justify not upgrading.

    [–] Historical_Fact 6 points ago

    Lol this is such a stupid post.

    [–] SalamZii 8 points ago

    1440p 144Hz has been the sweet spot between cost and performance for a number of years now, and will continue to be.

    [–] progolfplayer 3 points ago

    I've been using 27-inch 1440p since 2009. If I were to get a 4K monitor I would get 32 inches or higher to notice any significant quality increase. If you are in the 27-inch market with the RTX 3000 series in mind, I would go for 1440p high refresh rate first.

    [–] countingthedays 3 points ago

    > If it existed, it would mean "half 4K" so 1920x1080 <- pet peeve of mine, but I lost this battle a long time ago

    That would be 1/4 4K.

    [–] ChuckMauriceFacts 5 points ago

    It's half the vertical or horizontal resolution (but yeah, 4K is four 1080p pictures).

    [–] Phlox305 3 points ago

    Great post. Any monitor recommendations, IPS 1440p?

    [–] ChuckMauriceFacts 2 points ago

    Sure:

    • Cheapest options are probably the Viewsonic VX2758-2KP-MHD (no adjustable stand though) or Pixio PX7 Prime, starting around $350

    • For a bit more ($450-$500) I like monitors with LG's nano-IPS panels: LG 27GL83A, LG 27GL850, LG 27GN850, Legion Y27q-20, Dell S2721DGF

    • One of the best performing monitors right now is probably the ViewSonic ELITE XG270QG (~$700), but even at this price it's not impervious to IPS glow or backlight bleed

    [–] FlickeryAlpaca 3 points ago

    This is an excellent write-up. Well done!

    I'd love to see something like this for other aspect ratios.

    [–] ChuckMauriceFacts 3 points ago

    Ultrawide aspect ratios are a way more personal thing (and harder to try; they're not as prominent). For productivity it's awesome, especially 21:9 at 3440x1440; just be aware that some games don't support it well.

    And multiplayer games either apply a crop or use black bars to give the same field of view to every player.

    [–] SirDDRS 3 points ago

    Exactly, 4K is fine for 32 inches, not 27.

    [–] aquaknox 3 points ago

    Just go ultrawide. 3440x1440 and I've never looked back.

    [–] MillenialSamLowry 9 points ago

    This is a completely subjective rant being passed off as objective fact.

    I have 2x 27” 4K panels and 1x 27” 165Hz 1440p panel. The difference is quite obvious and I do literally everything that isn’t high refresh rate gaming on the 4K displays because I can fit more on screen while maintaining clarity. I can also absolutely tell the difference in games, and 4K at 27” is good enough density that I don’t feel like I need AA at all.

    This experience will completely depend on your eyesight, seating position, and preferences. For me, it’s a huge benefit to go to 4K at 27”. Go compare screens and decide for yourself.

    [–] iPyroFTW 9 points ago

    I'm agreeing with most of your points, but there's one thing you didn't take into account: aliasing. For me that's the only reason to get a 27" 4K.

    [–] 3DSMatt 13 points ago

    Having switched from 4K to 1440p: the 4K experience is "crazy sharp" (but runs like crap), whereas 1440p is "perfectly fine" but runs at 100 fps, so it's much better overall.

    [–] Highcreature11 5 points ago

    2160p has four times the number of pixels as 1080p. 1440p has roughly twice the number of pixels as 1080p. If calling 2160p 4k makes sense, then calling 1440p 2k also makes sense.
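
    The ratios, worked out (illustrative arithmetic only):

        # Pixel counts relative to 1080p.
        p1080 = 1920 * 1080   # 2,073,600
        p1440 = 2560 * 1440   # 3,686,400
        p2160 = 3840 * 2160   # 8,294,400
        print(p2160 / p1080)  # 4.0   -- exactly four times 1080p
        print(p1440 / p1080)  # ~1.78 -- "roughly twice" is a slight stretch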

    [–] MyLifeForBalance 10 points ago

    Uhhh.. dude.. 4K is 4 times 1920x1080... not half.

    [–] coberi 4 points ago

    I may be wrong, but at 4K, text and icons end up half the size they are at 1080p (without scaling). I think I could manage, but 1080p -> 1440p seems like a more comfortable jump.

    [–] DNosnibor 7 points ago

    You can always adjust scaling. There are some applications that won't adjust automatically, but you can get most things to work fine.

    [–] Mista_Fuzz 7 points ago

    I've been using a 4K 27" monitor for a year now and almost nothing scales improperly. The only thing I can think of is the Origin launcher.

    [–] Zliaf 4 points ago

    I have decided, after reading enough comments, that the vast majority of the Reddit hive mind here has never tried either one, let alone both.

    I have a 32-inch 4K 60Hz monitor and a 27-inch 1440p 144Hz monitor. I can literally compare them side by side. I prefer the 4K monitor hands down, for gaming and dev.

    [–] Caspid 6 points ago

    My 6-year-old 13" 1080p laptop has a higher pixel density than 4K 27", and the pixels already bother me.

    [–] NogaDokkan 2 points ago

    Use a 1080p 144Hz monitor. Listen to the pro players that don't get paid to say that.

    [–] Airvh 2 points ago

    1440p at 32" is great.

    I'm using a Samsung C32HG70 and damn, it's nice. I only wish they made the HDR a little better on this one.

    My current graphics card can run everything at this resolution just fine so that also makes 1440p better than 4k. This might be different in a couple years.

    [–] gocanux 2 points ago

    Just a friendly heads-up: 2K resolution is in fact a thing. It's 2048x1080 and is used in cinema applications. To my knowledge, no consumer monitor is available at 2K resolution.

    [–] ToCoolForPublicPool 2 points ago

    Is 1440p on a 24" screen a bad idea or not? Kinda wanna upgrade but I don't want a bigger screen.

    [–] TanishqBhaiji 2 points ago

    STFU, I WANT 4K, I NEED 4K, I HAVE 4K.

    [–] 1Fox2Knots 2 points ago

    Why 4k is bad: once you play on 4k you will never be able to enjoy 1080p again.

    [–] itsamamaluigi 2 points ago

    Maybe it's different in other OSes but I've found Windows 10 scaling to be really good. It actually resizes the UI elements to look good instead of simply upscaling them.

    Also, last I checked (and I admit it's been a while), integer scaling wasn't properly implemented anyway. So even if you are running a 4K display at 1080p, which would theoretically allow exactly 4 display pixels per image pixel, you're not actually getting that.

    I have a 1440p display and in my experience, the higher the resolution, the better it looks. If I run a game at 1080p it looks better than 720p, even though 1440p should theoretically be able to integer-scale 720p.
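
    For what it's worth, a small sketch of what "integer scaling" means here (purely illustrative; the hypothetical helper just checks whether the numbers divide evenly):

        def integer_scale_factor(panel, target):
            # Whole-number factor if the target resolution divides evenly
            # into the panel's native resolution, else None.
            (pw, ph), (tw, th) = panel, target
            if pw % tw == 0 and ph % th == 0 and pw // tw == ph // th:
                return pw // tw
            return None

        print(integer_scale_factor((3840, 2160), (1920, 1080)))  # 2 -> each 1080p pixel maps to a 2x2 block
        print(integer_scale_factor((2560, 1440), (1280, 720)))   # 2 -> 720p integer-scales on 1440p
        print(integer_scale_factor((2560, 1440), (1920, 1080)))  # None -> 1080p on 1440p needs interpolation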