Scapple doesn’t use a database or any kind of large-scale storage system, but rather a single plain-text file (in XML format), so it’s probably running into memory issues trying to load and continually save a file as large as the one you describe. It’s worth noting that you are well beyond the design intent of the software. This is the sort of thing that can work fine up until a certain point, then start causing sporadic problems, and finally stop working altogether.
Salvaging could be difficult. If you have access to a more powerful computer, that might help; what I would do then is break the file apart into maybe a dozen smaller boards and see if those load more efficiently on your main computer. Of special note is the File/Export/Images… menu command: if you can get the file open at all, that is what I’d try first.
The next thing to do is get the data stable. One brute-force approach that comes to mind is to wipe the graphics data out of the file using a command-line tool. A command such as the following should suffice:
sed -E 's/<ImageData.*Name="([^"]*).*/<String>\1<\/String>/'
You would want to leave a space on the command line after that part, then drag and drop your Scapple file into the window, then type > ~/Desktop/cleaned.scap and press Return (so the basic structure of the command is sed -E 'search_replace_pattern' input.scap > output.scap). The result will be a new Scapple file on your Desktop with every image removed and, in each image's node position, the name of the original image placed in the text of the note; all connections and styles will be preserved. The original file will be untouched, but make a copy of it anyway if you haven't already. If the command doesn't work right, the result probably won't open at all: I only tested it on a few simple .scap files with test images. If you know a thing or two about regular expressions, you may be able to fine-tune the search pattern.
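Assembled, the whole thing looks like the sed line below. MyBoard.scap is just a stand-in name for your drag-and-dropped file, and I'm writing the cleaned copy next to it rather than to the Desktop; the first line creates a tiny one-line sample so there's something to run it against:

```shell
# A one-line stand-in for a real board, so the command below has input to work on.
printf '%s\n' '<ImageData Width="120" Name="diagram.png">AAAA</ImageData>' > MyBoard.scap

# Replace each embedded image with a <String> note holding the image's file name.
# sed never modifies the input file; the cleaned copy is written separately.
sed -E 's/<ImageData.*Name="([^"]*).*/<String>\1<\/String>/' MyBoard.scap > cleaned.scap

cat cleaned.scap   # prints: <String>diagram.png</String>
```

Note that the pattern replaces everything from <ImageData to the end of that line, so if anything else shares the line with the image data it will be swept away too, which is part of why the result may not open on a real file.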
Optimally, then, you would end up with all of the images exported to files, plus a copy of the board with their file names printed in their positions instead of the images, meaning the whole thing could gradually be put back together more efficiently.
To that end, and with the thought of not having to sacrifice how you work going forward, two tips can save you a lot of space and maximise the number of images you can use:
- Scale the images to size before importing them into text-based file formats like RTF and Scapple files. The little handles and resize tools in these programs may be convenient, but they only change how the image is displayed; they don't solve the underlying issue of how much data is stored.
- Use software that can save clean, simple images with little to no metadata. Some programs insert large amounts of metadata and thumbnail information by default, sometimes in excess of the image data itself. A tool like Photoshop's Save for Web export can keep a file down to just the bytes necessary to display it, and also helps keep the bytes you do use as efficient as possible.
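To put the first tip into rough numbers: assuming a nominal 144 pixels per inch (a common 2× screen scale), a note displayed around 4 cm wide only needs on the order of 230 pixels across, and anything bigger is stored but never seen. A quick integer-arithmetic check in the shell (the ×100 keeps precision, with 2.54 cm to the inch):

```shell
# Pixels across for a 4 cm wide image at 144 ppi:
#   px = cm / 2.54 * 144,  done in integer maths as cm * 144 * 100 / 254
echo $((4 * 144 * 100 / 254))   # prints 226
```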
Like I say, that's more of a strategy going forward. It would be a lot of work to apply retroactively, but if you can get all of the graphics exported and then clean the file using a technique such as the one described above, you could go through the images, optimise them, and paste them back in as you go.
To put it into perspective: if you used images around 4 to 6 cm square and compressed them reasonably well, you could probably fit around 25,000 of them into 320 MB. A better way of putting it is that under normal usage you wouldn't get anywhere near 320 MB in a single file.
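That estimate is easy to sanity-check: dividing 320 MB among 25,000 images leaves a budget of roughly 13 KB each, which a well-compressed JPEG at those dimensions fits comfortably:

```shell
# Per-image byte budget if 25,000 images share 320 MB:
echo $((320 * 1024 * 1024 / 25000))   # prints 13421, i.e. about 13 KB each
```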