Mini vs iMac

On my 27" monitor:
theoretical resolution is 5120x2880 → 218 ppi
the max I can set it to is 2880x1620 → 122 ppi
default is 2560x1440 → 109 ppi
I’m using it at 1600x900 → 68 ppi
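
For anyone who wants to check the arithmetic, here's a quick sketch of my own (assuming a 27-inch diagonal and square pixels):

```python
import math

DIAGONAL_IN = 27.0  # diagonal size of the panel in inches

def ppi(width_px, height_px, diagonal_in=DIAGONAL_IN):
    # pixels per inch = pixel count along the diagonal / physical diagonal in inches
    return math.hypot(width_px, height_px) / diagonal_in

for w, h in [(5120, 2880), (2880, 1620), (2560, 1440), (2048, 1152), (1600, 900)]:
    print(f"{w} x {h} -> {ppi(w, h):.0f} ppi")   # 218, 122, 109, 87, 68
```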

The monitor works at 5120x2880 behind the scenes at all those settings, and I can’t see pixels at any of them, at any distance, unless I take a screenshot and blow it up to at least twice its size on the screen.

(But I just got new glasses for the monitor and I’m switching to 2048 x 1152 ≈ 87 ppi)

Wait a second. The native resolution is in fact 5120 x 2880, right? It’s not theoretical. Otherwise this behind the scenes statement would make no sense.

Could you please explain how exactly you set those other resolutions? I imagine you mean via the display settings in macOS?

In that case you stare at the same 218 ppi all the time.

That’s the hardware native, but there is no setting that achieves it.

I suppose I do … but here’s the max software setting:



The same red dot, blown up a lot.

Okay, we’re on the same page. It’s definitely always 218 ppi. You can’t see a difference because there isn’t any, regardless of the scaling. Except for the UI dimensions.

It affects the size of everything, not just the user interface, but you’re not wrong. (I’ve learned a lot in the discussion. I’d never done the math before or seen the actual numbers.)

Yes, it affects the size of everything, but the hardware resolution stays the same. Your calculated ppi numbers would only be true (I guess) for displays whose native resolution and size actually matched those settings, and neither changes here.
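
A minimal sketch of that point, assuming the usual macOS HiDPI behaviour: apps render into a backing store at twice the “looks like” size, and that image is then resampled to the panel’s fixed native grid. The numbers are just the ones from this thread.

```python
# Sketch only: the exact backing-store sizes are Apple's implementation detail,
# but the panel itself never leaves its native grid.

PANEL = (5120, 2880)   # hardware pixels never change with the setting
PANEL_PPI = 218        # so neither does the pixel density you look at

def scaled_mode(looks_like):
    """A 'looks like W x H' setting: render at 2x, then resample to the panel."""
    w, h = looks_like
    backing = (2 * w, 2 * h)   # what apps actually draw into
    return backing, PANEL      # what the display finally shows

for mode in [(2560, 1440), (2880, 1620), (1600, 900)]:
    backing, panel = scaled_mode(mode)
    print(f"looks like {mode}: rendered at {backing}, shown on {panel} ({PANEL_PPI} ppi)")
```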


With the OP in mind, the question is whether any of the monitors in question are less than adequate. I doubt it.

Depends. For “image and video editing” (are there other requirements that I’ve missed?), cutting corners on the display wouldn’t be the smartest move. It’s hard to answer without knowing details. This isn’t just a question of resolution: color accuracy / “calibratability”, refresh rate, ergonomics, and other factors come to mind.

The article seemed to say 140 ppi is enough for image and video editing, and for my money color is more important.

Not really. But even if it did… why not 120? Or 175? It’s a completely random number. The article isn’t really helpful, to be honest, because the author(s) don’t fully understand what they’re talking about. Take this sample:

Gaming: 95 - 140 PPI
The best number of PPI for gaming depends on the type of game and the size of the screen you’re gaming on. An average of 95 to 110 PPI is enough for every gamer. This allows you to enjoy the details the game offers, without everything looking grainy. For open world games, a higher PPI is a good choice. Take GTA V, for example. The skyscrapers look a lot sharper from afar with a high PPI.

Talk to a gamer and you will hear a lot about (variable) high refresh rates (144 Hz and up), latency, plane switching, and so on. Skyscrapers looking sharper isn’t high on the priority list. Unless it is. 4K gaming is a thing. The article is full of misunderstood facts and weird blanket statements.

None of that says the gamer needs more than 140. You said yourself that skyscrapers looking sharper isn’t high on their priority list.

I’ll ask my gamer brother-in-law (I meant nephew-in-law), but it’s not necessary for us to debate it, either way.

I also said “4K gaming is a thing”. So “110 PPI is enough for every gamer” is a bullshit statement. Point being: they talk about the most irrelevant stuff for most gamers, but completely fail regarding those who have other priorities. It’s equally bad in other categories, for example:

For office tasks and school, all monitors will do. If you only open 1 window at a time, 75 PPI is enough. Multitasking requires a higher number of PPI.

When was this article written? 2007?

That’s not your fault. But the advice given there is questionable at best.

Where does “218 ppi” come from? What is the math behind that?

OK, I found it here: Pixel density - Wikipedia
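
For the record, plugging the 27-inch panel into the formula on that page (width and height in pixels, diagonal in inches, assuming square pixels):

$$
\mathrm{PPI} = \frac{\sqrt{w_p^{2} + h_p^{2}}}{d_i} = \frac{\sqrt{5120^{2} + 2880^{2}}}{27} \approx \frac{5874}{27} \approx 218
$$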

hmmm. don’t overcomplicate this stuff. a 27-inch 4K screen means you won’t see individual pixels without bumping your nose on the screen.

Human perception is way above anything stated in that article. It annoys me when general articles written by people who are not experts state “we can’t see or don’t need X” when we can/do. I’m talking about perceptual limits, and “good enough” is a different matter… As I mentioned specifically with the 24fps cadence of cinema, “good enough” or “what I’m used to” can depend on cultural association as much as objective truth. But the objective truth remains.

The reason we don’t use 400+ DPI displays is that pushing so many pixels is hardware-constrained, and manufacturing such small pixel grids at desktop sizes is tough and therefore adds greatly to cost. Articles stating we do not need anything >140 DPI remind me of the apocryphal: who needs more than 640 KB of RAM!? :stuck_out_tongue_winking_eye:

Scaling GUIs to handle high DPI is relatively trivial[1] and, with enough GPU power, is not a limitation. I for one welcome our future 4X high-DPI overlords eagerly…


[1] I think macOS was an excellent example: the introduction of Retina displays went relatively smoothly. Windows and Linux still have some problems, but that is more to do with maintaining compatibility and/or lack of focus…


My Intel i9 27-inch iMac sits unused. So do the two 27-inch external monitors I had attached to it (originally, this was for a digital audio workstation).

One reason that all now sits unused is portability. It would be difficult to take all of that to Starbucks (yuck—I haven’t been in a Starbucks for over a decade), or out onto my patio.

The other reason is comfort. I can’t sit at a desk for seven hours straight.

I’ve moved to the 13-inch MBP, which I use in two or three different locations. So I have Scrivener ‘to go’ (as well as Logic Pro ‘to go’).

My primary comfy writing chair has this, a Bluetooth keyboard, a Bluetooth trackpad, and a stand to elevate the MacBook, all of which sit on a rolling hospital bed table, which means I can roll that between the legs of the chair and write for seven hours a day (and that’s about what I average), in comfort.

I finally realized I don’t need all these monitors. I can swipe between applications or switch between views on Scrivener very easily.

Multiple monitors? They turn out to only be useful if you can look at two things at the same time. Since we can’t, and changing what we’re looking at can be done instantaneously with a swipe, one monitor is plenty.


Could you send a photo of your setup, so we can better visualize it?

I agree with you about swiping between different desktops; that’s how I work.

I also love my M1 MBA and can work at it happily when away from home, but I find working in Final Cut Pro on a 13" screen is not comfortable. Also, as I get older, I prefer to sit further away from the screen, so more and more, at home I find myself using the 5K iMac for everyday things like email, writing, image manipulation, etc.—which previously I did on my laptops—even though the MBA is more powerful, faster, and has twice the SSD capacity.

In fact, at home, the most important piece of software that I need to use the MBA for is Topaz Video Enhance AI, because it needs the MBA’s power.

With hindsight, I wish I had not gone for a customised MBA with extra RAM and SSD storage, so as to put the extra money towards a new M1 desktop set-up, which I’m now moving towards.

Oh, I have a comfortable office-type chair and all the kit on an ordinary table.

:smile:

Mark
