Scrivener 2.0.2 now available

Hi all,

Just to let you know that Scrivener 2.0.2 is now available for download. This is a free update for all registered users of Scrivener 2.0. Although 2.0.1 was only released on Thursday, I received hundreds of crash reports over the weekend while I was away, most of them down to a bug I had introduced in the compile panel. Thus I thought I’d get this out to everyone as soon as possible. The changes are as follows:

  • Fixed bug introduced in 2.0.1 whereby Scrivener would crash when you selected “Custom” as the Compile format (and in various other situations when using the Compile sheet).
  • Changed “Copy without Inline Annotations or Footnotes” to “Copy without Comments and Footnotes”, and fixed bug whereby the links to inspector footnotes and comments would not get stripped when using this option.
  • Scriptwriting elements no longer show project auto-complete list items if “Include project auto-complete list” isn’t ticked for the element.

To download Scrivener 2.0.2, either:

  • Select “Check for Updates” from the Scrivener menu within Scrivener itself and follow the on-screen instructions; or
  • Download the update directly from the Scrivener page on the web site and update manually.
If you have already seen the update notification and followed the on-screen instructions to update, you don’t need to do anything else.

For 2.0.3, my plan is to start using the “Beta Testing” section of the forums again to release some public betas before it goes live (but 2.0.3 won’t be for a good few weeks yet, so please no questions about when it will be :slight_smile: ).

All the best,

Hey Keith: Is there a reason that “Check for Updates” is grayed out on my main Scrivener Menu? I’m “still” running 2.0.1. I could upgrade through the Home page as you also suggest…but a little concerned that it might think I already have 2.0.2 and might screw something up in some mysterious way. Thanks. David

There is no harm in downloading from the web site and updating manually.

I’m not sure why this is greyed out though; never seen that happen before. It isn’t a case of the program thinking you are already updated—it doesn’t check to see if there are updates before it checks to see if there are updates, in other words. :slight_smile: Even if you run it and you are up to date, it will still be available to try again. I tried disabling my 'net connexion to see if it detects that, but it doesn’t. Does this condition persist after a relaunch?


thank you for the fix! :smiley:

I’ve got a question.
I’ve absentmindedly accepted the automatic upgrade and Scrivener did indeed upgrade to 2.0.2.

I said “absentmindedly”. That is because I prefer to manually update my software, when I can. I also like to store the DMGs of what I have on my mac. Just in case.

I noticed, however, that the 2.0.2 version which comes with the manual download is smaller. 10MB smaller. The inner structure of the package is slightly different and it seems to have been last modified later than the automatic update.
Which one is better?

Thank you! :slight_smile:

There’s no difference between the one that comes by automatic update and the one on the DMG - the automatic update just downloads a zipped-up version of the exact same file that is on the DMG. So there shouldn’t be such a file size difference.
All the best,

Thanks, Amber. Upgrade went fine. And after the upgrade, the “problem” of the grayed-out “check for updates” went away (not after relaunch). So no big deal. I check for upgrades automatically anyway…just thought it was strange. David

Hello, sorry for the late reply.
I am afraid that they really are different. … 1at200.png
On the left the automatic update one. On the right the DMG one.

They both seem to work, but I’m still curious about the difference.

Actually, I looked into it a little more and it seems to be down to the zipping methods used. Because Sparkle update zip files need to be code-signed, I built a small utility program that automatically zips up Scrivener and does the code-signing. It uses the command-line zip tool to compress Scrivener, and this was the issue. I wasn’t using the -y option, so symbolic links were getting expanded into the underlying files, meaning some files were appearing twice. Now that I’ve added that option, though, the updater version of Scrivener ends up smaller rather than larger, because the zip utility compresses some image files to be smaller, even unzipped, than they are in the original, and so far I’ve had no luck preventing this even with the -n option. Who knew using the zip utility would be so complicated…?


FYI on my OSX 10.6.5 system, zip -n of a jpeg does the right thing. The invocation used was

zip -n .jpg zipfilename list-of-files-to-zip
zip -n .jpg:.png zipfilename list-of-files-to-zip

The ‘.’ before the extension seems to be optional, so
zip -n jpg zipfilename list-of-files-to-zip
also works. The space after -n also seems to be optional; in fact, I couldn’t get it to misbehave.

This is zip 3.0, adding -v to the command line will increase the verbosity and give a commentary on each file and the compression applied to it.

Hmm, maybe it makes a difference if you are zipping a directory rather than a list of files - my guess is that the -n option isn’t recursing into directories and may only get applied at the top level. Try this:

  1. In the Terminal, navigate to a folder with a copy of Scrivener in it.

  2. Type:

zip -r -q -y -n .tif:.jpg

You’ll end up with a zip file that is 38MB. Unzip it, and the unarchived version of Scrivener is 55.6MB, whereas the original is 64.8MB - and the discrepancy is caused by the extra compression of image files. I don’t seem to be able to find any way to prevent the zip utility from compressing files inside the /Contents/Resources folder (even -0 for “Store only” results in a file that is 55.6MB…).

All the best,

Oh, I see.

Thank you! :slight_smile:


tl;dr you’ve got two different file extensions for tiffs, .tif and .tiff

Your no-compress list only mentioned .tif, so the .tiff files were still being compressed. It should be:

zip -q -r -y -n .tif:.jpg:.tiff

Then I get
74328 -rw-r--r-- 1 epo admin 38053161 15 Dec 15:44
where before I got the smaller file
74288 -rw-r--r-- 1 epo admin 38033707 15 Dec 15:48

I discovered this in two stages. First, I changed your incantation (removing -q and adding verbose output) to:

zip -v -r -y -n .tif:.jpg

This produces loads of output, so I narrowed it down to lines mentioning .tif with:

zip -v -r -y -n .tif:.jpg|grep tif

This was still too verbose, so I excluded all lines showing non-compressed files (those containing the string ‘0%’):

zip -v -r -y -n .tif:.jpg | grep tif | grep -v ‘0%’

and got loads of .tiffs listed. Adding ‘:.tiff’ to the list eliminated all of those; you probably also want to add ‘:.jpeg:.png’ for safety.

Hmm, nope, for me that still results in a file that is 55.6MB rather than 64.8MB, even including .tiff (I’d already checked that some of the .tif files were still being compressed, I believe).

All the best,

Does this extra compression cause any issues/degradation of the graphics files? If it’s non-destructive, maybe part of your release work flow could include using the zip utility to compress and then uncompress the application. With that as your starting point, I would assume you could get the same (smaller) installed size no matter what distribution path you choose.

How odd, the files are identical (have the same contents) but occupy different amounts of disc space.

As a sanity check I tried to package up the file with tar, on unpacking the directory sizes were indeed identical.
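That sanity check is easy to reproduce; a minimal version with a made-up directory in place of the Scrivener bundle:

```shell
# Round-trip a directory (including a symlink) through tar and
# confirm the unpacked copy is identical to the original.
d=$(mktemp -d) && cd "$d"
mkdir -p bundle/Contents
echo 'payload' > bundle/Contents/data.txt
ln -s data.txt bundle/Contents/alias.txt

tar -cf bundle.tar bundle
mkdir unpacked
tar -xf bundle.tar -C unpacked

diff -r bundle unpacked/bundle   # no output: the trees match
```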

It is strange - if you ctrl-click on Scrivener in the Finder and choose “Compress” to create the zip file, when you unzip that archive the file size is the same as the original. So it is just down to the command-line zip utility doing extra compression.

Robert - I don’t know, to be honest. I imagine there will be some difference, but I don’t know how noticeable it is - I haven’t noticed any differences. I’d rather keep the automatic update version the same as the original rather than compress further, though.

All the best,

Here is my theory, after doing some examination in Path Finder and Terminal. I took two TIFF files: one had ZIP compression already applied to it, and the other had no compression at all. I wanted to see if the zip tool was actually adding zip compression to TIFF files, as unlikely as that would seem. The result of that test is: it’s not. However, I did get the curious drop in file size—not very large, but a drop. Test file one had 217,088 bytes prior to being zipped, and after being zipped it had 212,992 bytes.

I loaded both of these TIFF files into Photoshop layers and enabled the Difference blending mode on the top layer. For each pixel in the top layer, this compares it against the composite of the pixel data below; if the pixel data is precisely identical, the resulting colour will be black, and if there is any deviation, it will be non-black. The result: a perfectly black image, meaning the two files are identical at the bitmap level, despite the drop in file size.

So I went to Path Finder and examined the two files in depth. The report file generated for each was pretty much the same, excluding of course things like where the file is actually stored on the physical hard drive. Then I came across these lines:

Pre-compression: Resource Fork Size: 286 bytes logical, 4,096 bytes physical
Post-compression: Resource Fork Size: 0 bytes logical, 0 bytes physical

There you have it. That most certainly accounts for the discrepancy. With the many hundreds of TIFF files in the package, even if the resource forks are only a few bytes, each still has to occupy a 4 KB block, so at minimum each file carries 4 KB of useless data that zip is stripping out. This amount will increase depending on your Photoshop settings. I have pretty conservative settings, but if you have it set to insert icons and other meta-data into the file, this number could go up pretty quickly.
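The arithmetic lands at the right order of magnitude. Assuming, say, 800 image files (a made-up count for illustration), each wasting one 4,096-byte allocation block on a near-empty resource fork:

```shell
# Back-of-envelope: N files, each paying one 4 KB block for a tiny resource fork.
files=800
block=4096
wasted=$(( files * block ))
echo "$wasted bytes"   # 3276800 bytes, a little over 3 MB
```

That alone would account for a few of the missing megabytes before any icons or preview metadata are counted.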

My speculation is thus: the underlying filesystem link between a file’s resource fork and data fork is considered by the zip utility to be a type of link, and so when the -y flag is used to not flatten links, the resource fork gets dropped. This probably, by the way, produces a zip file that is “nicer” for Windows users to look at, as it might not have that __MACOSX folder at the top which stores filesystem meta-data. Not that such is relevant to the bundle.

Which is where filesystem tuning comes in. Reduce the allocation block size of the filesystem and the amount of empty padding will go down.

But who really wants to rebuild an FS for this?

I can’t see why a casual or even semi-pro user would need to bother with tuning. Now, if you are capturing and writing raw HD feeds to a RAID-0, you would definitely want to tune, but in that scenario a larger block size is more advantageous at the IO level. I bet Apple just banks on the larger block size to make sure multimedia playback is skip-free for the majority of users, even though it does mean a large amount of waste.

Given that dmg is the common packaging scheme for OSX why are zip files used at all? (BTW we know that tar seems to work.)

Keith, sorry for fixating on the tif-file-size red herring when it was the unpack size that was wrong.

EDIT: does the file compression discussion here help?