Scrivener as a (big, really big) cardbox?

Hi there,
First, let me say that I’m really happy to have found Scrivener; it’s a perfect tool for working out my texts.

Before starting now with a bigger project I have a question:
I’m thinking about using Scrivener as the main cardbox for ALL my notes, following the idea of Luhmann’s Zettelkasten (notes are sequentially numbered, not categorized, but linked via keywords).
I have several thousand index cards and, as you may guess, a mega pile of keywords. The texts within the notes are usually rather short.
So, I would like to know whether Scrivener can handle that large a number of documents (one document per index card) as well as that large a number of keywords.
Any responses appreciated.

BTW: I guess the Luhmann system could work well with Scrivener. Yes, a few thousand documents in the Binder could be unwieldy, but I would use it as a ‘slip box’ only and just focus on search and collections: do a keyword search, then put the search results (the documents shown) into a collection. That simple.

There are some performance issues when you get into truly enormous numbers of keywords (hundreds or thousands). Another user recently brought these to our attention, but fixing the problem is not going to be simple.

Even leaving that question aside, though, Scrivener was not really designed to be a large-scale database. You might be happier with either a personal wiki or something like DevonThink Pro.


Thanks for the answer.
Hm, if the keywords might create a problem, I could work around it by adding the ‘keywords’ directly into my notes, i.e. documents, like #keyword. So: solved.
But what about a huge number of documents? Mine are, as I said, relatively small, maybe approx. 150 words each.
Any experience with this issue?
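To make the inline-tag idea concrete, here is a minimal sketch (plain Python, nothing Scrivener-specific; all names and the sample notes are illustrative) of why `#keyword` tags embedded in the note text remain searchable and filterable, just like a keyword field would be:

```python
import re

# Inline tags use the #word syntax proposed above.
TAG_PATTERN = re.compile(r"#(\w+)")

def extract_tags(note_text):
    """Return the set of inline #tags found in a note's text."""
    return set(TAG_PATTERN.findall(note_text))

def filter_notes(notes, tag):
    """Return titles of notes carrying the given tag,
    i.e. a manually built 'collection'."""
    return [title for title, text in notes.items()
            if tag in extract_tags(text)]

# Hypothetical slip-box entries, keyed by card number.
notes = {
    "Card 21/3a": "Systems observe themselves. #systems #observation",
    "Card 21/3b": "Communication, not people, communicates. #communication",
}

print(filter_notes(notes, "systems"))  # ['Card 21/3a']
```

The same effect is achieved inside Scrivener by running a project search for the literal string `#keyword`, since the tag is just text in the document.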
Thanks again and best regards,

Quantity of files will rarely cause you trouble, as Scrivener does not open everything at once. You should have no trouble with even thousands of documents. The main situation I’m aware of that can slow things down a bit is if you have hundreds of high-resolution images in a single folder and view it as a Corkboard: it takes a while to build all of those thumbnail previews the first time per session (it should be fast after you’ve done it once). If you run into that problem, just split things up into smaller folders, or use Outliner view instead.

A few thousand small files like that is not a problem at all. I have a number of projects that could be classified as being around that large, and even working on a slower MacBook Air, the only slow-downs I see are with routine backups or with large-scale data operations, like searching across all 1,500 files at once and then sorting by the Modification Date column. That, understandably, takes a few seconds. :slight_smile:

The built-in keywords feature is extremely useful for projects of this type. I would only consider dodging it if you are certain you will have several hundred individual keywords (not assignments: “Keyword” counts as one single instance even if it is applied to 3,251 documents).

Sounds to me like no problem for my project - wonderful. You made my day - thx :smiley: