Any thoughts on unified memory?

Just to throw more gasoline on this bonfire of the vanities…

Remember that with the new Apple Intelligence hype comes a system footprint that might not be recoverable for most users. Even if the word from Apple HQ is correct, that doesn't mean the next major OS revision won't change everything… Anyone here remember how Lion, OS X 10.7, killed off PowerPC app support by dropping Rosetta?

My point here is that we may discover in the near future that Apple Silicon systems are splitting the unified memory between the CPU (and OS tasks), the GPU (and video tasks), the AI cores (and whatever temporary storage and working space they need), in addition to anything else like supplemental L4 or L5 caching for the system bus. For all I know, that unified RAM might be getting split amongst five or six different needs.

Think ahead to how things will look when 16K visuals become the bleeding edge, when some new breakthrough requires the Nine-Chasers to switch to a 128-bit OS, or when something similarly hard to divine today comes along.

64GB of RAM might be just fine as a top-end performance goal, much as 12GB GPUs work well at 4K/120Hz for gaming. 128GB feels like overkill if you're not involved in AR or VR authoring, 4K+ video editing, or other high-octane computing needs, especially given the cost.

In short, it comes down to what you think you will be doing with this machine in five years. If you can't see that far ahead in your workflow, get the most RAM you can afford. The 128GB is probably overkill, but remember that when the M1s came out, Apple tried – HARD – to convince people that Apple Silicon with 8GB was as good as Intel with 16GB… It wasn't exactly true, but the scarcity of third-party software that ran natively on Apple Silicon at the time let them fake it.

One last thing: the more RAM you have, the better you can make use of virtual machines to run software that isn't optimised for Apple Silicon, if that might be of interest.

I think the whole point of unified memory is to facilitate exactly that.

I am now speaking as an absolute Apple fanboy. Almost three years ago, I bought a Mac Studio with an M1 Max processor and 64 GB of unified memory.

Even today, the machine is insanely fast for everything I demand of it on a daily basis (4K video editing, graphic design, large Scrivener projects, etc.). Then as now, an Apple computer is an investment, and if your budget allows it, maximise the spec. Coming from an Intel MacBook, even an M1 processor is a huge leap.

Of course, you make your own purchasing decision, but I can assure you that even with extremely demanding work, the 64 GB of shared RAM of my Mac Studio has never reached its limits, not even close. The Mac will serve me for another few years, I’m sure.


My Mac Studio experience is similar, but…

The memory requirements of AI workloads are truly enormous. And there’s a pretty direct correlation between the size of the model and the accuracy of the results. If you anticipate using AI tools for real work – as opposed to just experimenting to see what they can do – you’ll want as much memory as you can afford. Especially if you want to keep as much of the work local as possible, for either performance or security reasons.


Coming at this from a straight system-design perspective: get more RAM, not just "what you need". Every software/OS release will require more RAM. Every technology change will require more RAM. And if you are running VMs, this demand spans all of them.

To me, 128GB would be the way to go, simply because the RAM is shared. You just don't know what will suddenly demand it, but you know something will.

That said, @Orpheus, if you are planning to use the machine and then give it to someone else once your needs make it underspec'd, then your plan is solid.


There is one area where memory is the most critical component of a system: generative AI. For running generative models locally, people are already maxing out 128GB easily. Current "sweet spot" local LLMs are around 80 billion parameters (which, depending on model compression, need more than 64GB), and frontier models are in the 400-billion-parameter range (needing multi-GPU rigs; the GPU processor itself is not so important, how much memory it has and its bandwidth are). Each model is tuned for different requirements, and it is not uncommon to want to load up several models at a time.

Apple has pushed all their systems to a 16GB minimum, driven by this AI use, and for anyone who wants locally-run "intelligence", there is no other spec that matters. I run 8B- and 13B-parameter models on a 16GB MacBook Air, and the 13B model can cause memory warnings when running other stuff. Running these models has made me totally re-evaluate my view on hardware components.
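For what it's worth, here is a rough sketch of that sizing arithmetic. The bytes-per-parameter figures and the 20% overhead factor are my own illustrative assumptions, not numbers from any particular runtime:

```python
# Back-of-the-envelope sizing for local LLM memory use.
# Bytes-per-parameter at each quantisation level, and the ~20% overhead
# for KV cache / activations / runtime buffers, are assumptions.

QUANT_BYTES = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}  # bytes per parameter

def model_ram_gb(params_billions: float, quant: str, overhead: float = 1.2) -> float:
    """Approximate decimal GB needed to load and run a model.

    Billions of params x bytes per param = GB, then add ~20% headroom.
    """
    return params_billions * QUANT_BYTES[quant] * overhead

for size in (13, 80, 400):
    row = ", ".join(f"{q}: ~{model_ram_gb(size, q):.0f} GB" for q in ("fp16", "q8", "q4"))
    print(f"{size}B -> {row}")
```

By that arithmetic, an 80B model wants roughly 96GB at 8-bit but only ~48GB at 4-bit (hence "depending on model compression"), and a 13B model at 8-bit sits right around 16GB, which lines up with the memory warnings on a 16GB Air.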

For everything else, I probably agree that 64GB will serve the next 5 years without breaking a sweat.


Even if I got 128GB, I would still give it to my wife in 3-4 years, because by that time the 2018 MBP she would have won't be able to support the latest macOS and browser versions, the same sort of problems she is experiencing with the 2009 MBP now.

As for AI, that is something I might dabble in, but for the next few years my focus will be elsewhere, so 64GB would be enough. But if I find that it is not enough, I will upgrade to whatever is available then. Who knows, by then 64GB might be considered the bare minimum to run the latest software – things change so fast. I still remember my first TRS-80, which had a whopping 48K of RAM (-:

Anyway, I will make the final decision next week.


Ha - I built a TRS-80 clone from a bare motherboard I got from the US. Soldered everything by hand, using a thermocouple-controlled soldering station I designed and built myself. I had two rows of Intel 1K x 4 DRAM; can't remember the total.

I was the Admin Mgr for an electronics manufacturer and our suppliers helped me out with quite a few ‘samples’.

We had a metalwork division to make our own enclosures for the kiln controllers we made and the CEO of that side had them make up a beautiful powder coated case for me.


I disagree about storage space. These days, cloud-based and unlimited is a better option. Devices break, or get stolen. You can go multi-cloud, and encrypt the stuff you don't need on demand all the time.

The MacBook Pro I'm typing on now has a 1TB SSD, with 600GB free because macOS offloads unused stuff into iCloud. I also have the entire machine mirrored onto BackBlaze. So I can always get at my files, but there is no need at all for me to spend loads on local storage any more.

(My 256GB iPhone sees the same storage and is practically empty for the same reasons, but I have access to all my files on demand.)

Many people, including me, do not consider internet (cloud) sync services to be "backup". Beware. Also, for Scrivener to work, the files have to be set to be available offline, and disk-space optimisation should probably be turned off.

How do you do that? There are a lot of files that BackBlaze documents they do not back up. Do you have some sort of special account?


Nothing special in my setup. My Scrivener files live in a Dropbox folder which is sync’d with my various devices. Scriv backup zips are in the iCloud filesystem, so live both on my local machine and in iCloud. Then I have BackBlaze picking up my user folder and a number of other manually selected folders. So the important stuff is in 3 clouds & on at least one physical machine (given I have various other machines also syncing, could be on as many as 4 other machines). All encrypted. You only lose a novel once before becoming super paranoid about all storage…

So not mirrored as in “restorable mirror”. But I have every file I could ever need.

All my day job stuff lives in Git…
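For anyone wanting to script a similar belt-and-braces approach, here's a minimal sketch of the idea; all the paths below are hypothetical placeholders for wherever your Dropbox, iCloud Drive, and locally backed-up folders actually live:

```python
# Minimal sketch: zip a project and drop copies into several sync'd
# folders, so each cloud service carries its own independent copy.
# Every path here is a hypothetical example, not a real mount point.
import shutil
from datetime import datetime
from pathlib import Path

PROJECT = Path.home() / "Dropbox" / "Writing" / "MyNovel.scriv"
DESTINATIONS = [
    Path.home() / "Library" / "Mobile Documents" / "com~apple~CloudDocs" / "Backups",
    Path.home() / "Backups",  # picked up by BackBlaze / Time Machine
]

def backup(project: Path, destinations: list[Path]) -> None:
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    # Zip the whole .scriv package (it is a folder under the hood).
    archive = shutil.make_archive(
        f"/tmp/{project.stem}-{stamp}", "zip",
        root_dir=project.parent, base_dir=project.name,
    )
    for dest in destinations:
        dest.mkdir(parents=True, exist_ok=True)
        shutil.copy2(archive, dest)  # each sync'd folder uploads on its own

backup(PROJECT, DESTINATIONS)
```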

Out of curiosity, what is the monthly cost for those services?

Dropbox is free tier
iCloud is £33/mo (includes x5 family accounts, Apple Music, Apple TV+, News+, Fitness+, 2TB, unlimited devices)
BackBlaze is about £100 a year (and unlimited space)

Just wanted to clarify your statement about the "entire machine mirrored to BackBlaze": unless you have something special with them, that is not the case. Didn't want others to think it was.

Yeah fair enough, poor choice of words from me.

Depends on your use case. Cloud-based assumes ubiquitous high-speed internet access. And even if you have that, things like video editing will often show just how much faster the connection to an external drive is.


The thought of putting my files on the cloud makes me feel as if I were walking naked in public. The only things I have in Dropbox are my library and my Scriv files, and I am now going to move the Scriv files.

The only reason I know of to use Dropbox for Scrivener is the ease with which one can sync between a desktop (Windows or macOS) and Scrivener on Apple iOS devices (iPhone and iPad).

It also provides easy syncing between devices, but none of that is needed if you're not running in a multi-device world.

I do use Dropbox for other reasons of convenience, but do not consider it “back-up” (as posted here so many times already).


I save my Scrivener backups to OneDrive, but also have multiple automated Time Machine backups per day.


Keep in mind, I’m a bit old school.

A solid-state NAS (4TB usable, using 2x4TB drives, with room for 2 more) running TrueNAS ran me less than $400. It's accessible from all my devices "from the net" with OSS apps (there is risk, but I know the devs). I don't use Scriv on iOS, but I do use this to back up iOS. I also run all my home media and a number of VMs on it.

I only say this to suggest that there are ways to have a bit of cake while consuming the cake simultaneously.
