I am experimenting with a style of using the binder in which I might end up with several hundred binder documents (or more), all of them with some content. At what number of binder entries would I be pushing my luck, as in “it will break” or “hmm… has not been designed / tested / … for those kinds of sizes” ?
Technically, as I seem to remember from a long-ago thread, since Scrivener packages are just a special kind of folder under the hood, and each of your documents is just a file like any other as far as OS X is concerned, the limit is in the millions, or as restricted by the amount of disk space you can give it and the number of huge image/movie/audio files you include. You will run out of the ability to navigate your way through it long before Scrivener runs out of steam.
That said, I don’t think anyone has got far enough to explore the limits of the Scrivener solar system, let alone its galaxy or universe!
As Mark says, the only limit is hard drive size and the maximum number of files that OS X allows inside a folder (which is hundreds of thousands or more; I cannot remember off the top of my head, but you can easily find out online). That said, I don’t recommend making your projects enormous, as Scrivener isn’t optimised as a database.
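If you're curious how many files your own project actually contains, you can count them from Terminal, since a .scriv package is just a folder. This is a rough sketch; the `Demo.scriv` tree below is a stand-in built for illustration, so substitute the path to your real project:

```shell
# Build a tiny stand-in package (skip this and use your real .scriv path).
mkdir -p Demo.scriv/Files/Docs
for i in 1 2 3; do
    echo "text $i" > "Demo.scriv/Files/Docs/$i.rtf"
done

# Count every regular file inside the package.
find Demo.scriv -type f | wc -l
```

Even a few thousand files is nowhere near the filesystem's per-folder limits, which supports the point above that navigation, not the OS, is the practical ceiling.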
All the best,
Sophie, if you plan to use the data in other projects, you’re better off with a good textual database program. I use DevonThink Pro; others like EagleFiler, so there’s a choice. I use the Research folder in Scrivener only for light duty. As Keith just said, it isn’t suited to database work.
My largest project had over 600 text files in about 100 folders and sub-folders. The documents were average-sized (although I did have about 2 dozen large PDFs in the research folder). Everything worked at normal speed.
I know this isn’t conclusive but anecdotal evidence is all I have!
Both the number of entries in the Binder and the volume of content in each one seem to matter.
I HAVE had slowdowns. But on an older PowerBook G4, so it is probably not relevant to most current users or to most writers’ practical needs.
The machine’s horsepower seems to make a difference, but available disk space (for caching) is also relevant. Mine is running Scrivener 1.5 now with Leopard 10.5.5. I was running at 5% free HD capacity for a good while. Deleting my iTunes library opened up another 15% of space, but performance only increased somewhat, say 25% (subjectively). So I attribute the continued lag to a weak CPU. I have plenty of RAM.
In my case I have little inside each binder entry: a page of text on average. I am primarily a poet. But I have perhaps 4000 files in 60 folders? It takes 3 seconds to down-arrow from folder to folder in the binder, and I have considerable lag in typing and deleting as I write, with delays of 7 or 8 characters. It became unusable as a daily working project. I now use a working Project of about 500 entries and move them over to the Master every couple of months. That is necessary for me to effectively manage and search anthologies of work. Until I can leverage a new MacBook, that will have to do.
So yes, volume and number matter, but with today’s Core Duos in almost all Intel Macs, I doubt it will impact you unless you go to work for Oxford or Britannica, or try to manage subsets of Wikipedia. Hope this was a helpful perspective from my ‘off the beaten track’ usage.
I believe I ruled that out in time.
The simple test, of course, was to load the MASTER Project along with my working file. The fact that they behave differently when loaded simultaneously leads me to believe it is not such a conflict.
The WORKING Project remains relatively snappy, as do smaller projects. I believe it is the sheer size of the Project’s hierarchy and the relatively primitive 1.33 GHz PowerPC G4 CPU compared to today’s, possibly coupled with Leopard being more optimised for Intel than earlier versions were. My Intel iMac handles it fine as well.
FWIW, the MASTER backs up to 18 MB; the WORKING is only 5 MB.
One thing the above statement causes me to think is that the issue is not CPU but disk I/O. If it were CPU, then both would slow down. You might consider raising the auto-save interval to a larger value (say 120 seconds) and see if that helps. If it does, the issue is simply a slow disk/controller. The only fix for that would be a replacement system (sorry).
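If you want to rule the disk in or out directly rather than inferring it from auto-save behaviour, a crude write-throughput check from Terminal can help. This is just a sketch; `dd_testfile` is an arbitrary scratch name, and the 50 MB size is a guess at something large enough to be meaningful on an older drive:

```shell
# Crude disk write-speed test: write 50 MB of zeros and time it.
# A run that takes many seconds points to a slow disk/controller.
time dd if=/dev/zero of=dd_testfile bs=1048576 count=50
```

Compare the elapsed time against the same command on a newer machine, and delete `dd_testfile` afterwards.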
An interesting point. I’ll check out the setting change.
Would just scrolling through Project folders, or through contents within folders, show a similar tardiness on the Master file compared to the Working file? I/O should not affect that very often; much is cached in RAM besides disk. Without changes to the content, the save interval should be moot. Yet the two projects still show a proportional sluggishness to one another. If I were working with Photoshop I could see this, but Scrivener seems very cache-efficient.
All said, it is old technology and a drive that has seen its share of use. I am amazed daily at the machine’s hardiness. Five years and just batteries and AC issues. Knock on aluminium.