Hi, I have a MacBook Pro - the latest model, with 8 GB of RAM.
My Scrivener project is approaching 20 MB, with thousands of folders inside it.
Looking ahead, I was wondering when I might start to see some slowdown.
Actually, I'm already beginning to see some spinning-wheel behavior when I click on the folder that contains the thousands of sub-items. But with that many items, I expect it to a degree.
What affects slowdown more: the size of the Scrivener file or the number of individual items in it?
Honestly, it takes a lot. When I last stress-tested 2.x, around a year ago, I was tossing encyclopaedias of information around with ease. There are some minor slowdowns once you get into the thousands, especially with corkboards, since those require more drawing. A corkboard with 2,000 items on it will load more slowly than one with 100, no doubt.

The other main slowdown that should be considered normal is loading a couple of hundred thousand words in Scrivenings. That will take a few seconds, especially if a few hundred items are involved in the assembly, but given the scales we are talking about here, that isn't really out of line with expectations. You are, after all, asking a program to load 200,000 words from 800 files and piece them all together into a pseudo-file session that allows atomic editing!

The largest tests I ran were in the 1.5 to 2 million word range. I had no problems editing, though it did take a while to load. Keep in mind this was about a year ago; many optimisations have been done since then.

One area that can cause slowdowns is dynamic outliner sorting. In large lists (500+ items), operations can be a little slow if the list is sorted. I think we've nailed most of the performance problems here, though, and it should be reasonable now.
20 MB, by the way, should be no problem. The user manual project is roughly that size, and I work in it daily. It probably doesn't have that many folders, though.
The total size of the project will impact automatic backups the most. Get over 500 MB, and closing the project will probably take longer than your patience allows. The Project/Back Up/Exclude from Automatic Backups option can come in handy here.

Load and close times will also slow down as word count increases. You'll notice progress bars (albeit briefly) in a normal large book project (150k to 200k words), as search indexes and such are updated.

The total quantity of items in the project, large corkboards and such aside, I would say has much less of an impact on performance, and it is easily mitigated by using more sub-folders. Try to keep folders under 1,000 items if possible. Overall, the program only loads into RAM what it needs, so even if the project is many gigabytes in size, it won't be putting all of that into memory.
Very informative. Thank you!