@GreyT you have my sympathy. Sometimes you ask the internet “where can I buy my favorite chocolate?” and you get a lot of “you shouldn’t eat chocolate!” and “Have you tried eating carob?” instead of folks actually helping with your problem.
I wish I could help, but I don’t have a Mac Studio – also, my Scrivener projects are very light and small.
I would be surprised if Scrivener was multi-threaded, because it’s really not designed to do very heavy lifting – though I welcome correction from @AmberV or anyone else in the know.
My guess (and it is just an educated guess) would be that you could improve performance with a faster CPU, more RAM and a bigger boot drive for scratch/virtual memory use.
There is probably a real limit to speed improvements that can be had by upgrading hardware. Your chosen workflow puts a big burden on Scrivener.
If at some point improved speed & performance becomes more important than the comfort of your familiar workflow, you might look at DEVONthink. Putting non-text files in DT and linking back to Scrivener would improve your system’s responsiveness dramatically.
Until then, see how much faster the Mac Studio is at single CPU tasks, and get as much RAM and boot drive space as you can.
For indexing, the main bottleneck is going to be the speed of the disk or SSD. In order to rebuild the search index the software has to go through every single file in the project that relates to text data, opening and closing them as it goes, and that’s one of the slowest links in the chain.
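To make that concrete, here is a rough sketch in Python of what an index rebuild amounts to. This is not Scrivener’s actual code, and the file extensions and path are made up, but it shows why the per-file open/read/close overhead on the storage device, rather than CPU speed, sets the pace once a project contains thousands of small files.

```python
# Rough illustration only (not Scrivener's internals): rebuilding a search
# index means visiting every text file in the project one at a time.
# With thousands of small files, the per-file open/read/close round trips
# to the disk/SSD dominate, not the CPU work on the text itself.
import os
import time

def rebuild_index(project_dir):
    """Walk a project folder and build a naive word -> set-of-files index."""
    index = {}
    file_count = 0
    start = time.perf_counter()
    for root, _dirs, files in os.walk(project_dir):
        for name in files:
            if not name.endswith((".rtf", ".txt")):   # assumed extensions
                continue
            path = os.path.join(root, name)
            file_count += 1
            # Each open() here is a separate trip to the storage device.
            with open(path, "r", encoding="utf-8", errors="ignore") as f:
                for word in f.read().split():
                    index.setdefault(word.lower(), set()).add(path)
    elapsed = time.perf_counter() - start
    print(f"Indexed {file_count} files in {elapsed:.1f}s")
    return index

# Hypothetical usage:
# rebuild_index("/Users/me/Projects/BigProject.scriv")
```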
Getting over that bottleneck with consumer-grade hardware is going to be difficult. The answers to that problem typically involve specialised setups like RAID arrays, and the workstations that can handle them, which cost a lot to assemble. Frankly, the last time I encountered someone pushing Scrivener to a similar level, that’s exactly the kind of setup they had.
I would say in most cases, breaking things down into smaller projects and having a master index project that integrates with them via external item links is probably a more effective use of one’s time and wallet thickness. Unless you’re already into video production or something, that’s a lot of hardware and overhead to get into just to avoid a more fragmented workflow.
It happens, but I don’t see anyone doing that in this thread.
Exactly! The time and memory required by searching, indexing, and sorting grow with the number (N) of items involved: linearly at best, and often faster than that (sorting is typically O(N log N)). These operations all get slower as N grows, and resources are always limited. That being so, reducing the number of items is often the best solution.
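Purely for illustration (this has nothing to do with Scrivener’s internals), a few lines of Python show how even well-behaved operations stretch out as N grows, which is why shrinking N usually beats buying a faster machine:

```python
# Minimal sketch: timing a linear scan and an O(N log N) sort as N grows.
import random
import time

def time_it(label, func):
    start = time.perf_counter()
    func()
    print(f"{label}: {time.perf_counter() - start:.3f}s")

for n in (100_000, 1_000_000, 4_000_000):
    data = [random.random() for _ in range(n)]
    time_it(f"scan N={n:>9,}", lambda: sum(1 for x in data if x > 0.5))
    time_it(f"sort N={n:>9,}", lambda: sorted(data))
```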
As for whether anyone should buy a Mac Studio, the answer, of course, is to buy the biggest, fastest machine you can afford.
It’s not as if users answering questions on the forum have any power to find and squash bugs in the program or rewrite the algorithms. Nor can Literature & Latte fix it in a day.
Hi AmberV
The last part of your input is interesting to me. Can you say a bit more? I gather you are not thinking of syncing with folders, but I would like some pointers on how such a master index project would function together with the smaller projects. I took a glance at the manual, but did not find anything about what I think you are talking about.
Maybe I am indeed hoping for the possibility of syncing two or more projects from the Master project, so that changes made in one sub-project will be synced to the Master, and vice versa!
There are downsides, of course: searches need to be run more than once if you don’t know where something is at all. But a “meta project” whose function is to help you find where things are, and which can be embellished and improved as you find yourself looking for things, is part of the idea here.
Perfect advice AmberV…
Now I know of Bookmarks, and I rediscovered some Layouts I designed some time ago. So now… goodbye to the mega-monster-project and welcome to happy coexisting (my screen is 34").
The left layout is for the “live” project, and to the right, nicely stacked, are the 3 separate projects, each called up via an individual bookmark (to a Folder) when needed.
There is a bit of nostalgia over the demise of the monster… After all, I resonate with the Queen line:
“I want it ALL… and I want it NOW”
Just as an afterthought on the file size being bloated by huge photos: I have created a small routine, including a macro, whereby any selected picture/photo/graphic file is scaled down to 500 × 333 pixels, keeping the project file somewhat slimmer (a full-size photo adds about 40 MB to the project file size). Each trimmed item gets a ‘footer’ stating ‘Trimmed…’, making it easier for me to know whether trimming might still be needed.
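The macro itself isn’t shown above, but for anyone wanting the same effect without a macro tool, here is a minimal sketch of the idea in Python using the Pillow library. The function name and file paths are made up, and 500 × 333 is simply the bounding box mentioned in the post.

```python
# Not the poster's macro, just a sketch of the same idea with Pillow
# (pip install Pillow): scale an image to fit inside 500 x 333 pixels
# before it goes into the project, so a multi-megabyte photo shrinks
# to a few hundred kilobytes.
from PIL import Image

MAX_SIZE = (500, 333)  # target bounding box from the post above

def trim_image(src_path, dst_path):
    """Scale the image to fit within MAX_SIZE, preserving aspect ratio."""
    with Image.open(src_path) as img:
        img.thumbnail(MAX_SIZE)          # resizes in place, keeps aspect ratio
        img.save(dst_path, quality=85)   # quality applies to JPEG output

# Hypothetical usage:
# trim_image("holiday-photo.jpg", "holiday-photo-trimmed.jpg")
```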
Hash it! And ignore all the nonsense about using primes for the size of the hash table. Hopgood and Davenport (1972) showed conclusively that making hash tables a power of 2 made them easy to expand. Hash lookups are also ~O(1) on average.
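To illustrate the power-of-two point (this is just a sketch, not production code and not anything Scrivener uses): with a table size of 2^k the bucket index is a simple bitmask, and growing the table is just a doubling plus a redistribution pass.

```python
# Minimal power-of-two hash table sketch. The index is computed with a
# bitmask instead of a modulo, and _grow() simply doubles the table size,
# which is the "easy to expand" property referred to above.
class PowerOfTwoHashTable:
    def __init__(self):
        self.size = 8                        # always a power of two
        self.buckets = [[] for _ in range(self.size)]
        self.count = 0

    def _index(self, key):
        return hash(key) & (self.size - 1)   # bitmask, no modulo needed

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for i, (k, _v) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))
        self.count += 1
        if self.count > 2 * self.size:       # keep the load factor bounded
            self._grow()

    def get(self, key, default=None):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        return default

    def _grow(self):
        self.size *= 2                       # stays a power of two
        old = [pair for bucket in self.buckets for pair in bucket]
        self.buckets = [[] for _ in range(self.size)]
        for k, v in old:
            self.buckets[self._index(k)].append((k, v))
```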