Any chance of a 64-bit version?

I also create 3D models when I am not writing, or if I need to see something to describe it, so the extra RAM doesn't go to waste. I would love to get the full 32 GB that my motherboard can support. When I added 8 GB I saw a substantial performance increase in my 3D modeling programs, so for me maxing out my RAM is worth it for my other hobby. :slight_smile:

For grins and giggles we loaded a 4-socket UCS blade with 2 TB of RAM and 40 cores. The SSD EMC chassis supports a stupid level of disk IO… let's just say that for the first time ever, I didn't feel that Windows was slower than my Unix desktops (and for the record, the issues with Windows are the non-MS-supplied components that are pretty much required these days, so I'm not knocking MS for it).

But yeah… not a normal system for comparison.

I need to get one of those for my house.

lol, yeah…I thought it might be fast. I think if I get an SSD and the 32 GB of RAM that my motherboard supports, I'll have a wicked fast machine. I just need to figure out what's hogging my 1 TB main drive; I've been trying to reclaim my space but haven't had much luck. I'll probably start working on moving my programs over to my other hard drives next. Then again, it could be my 3D library clogging up my hard drive too. :stuck_out_tongue:
Edit: yeah…103 GB of content, must go over to the storage drive lol. I've gotten a lot of Daz content over the last 6 years lol

Are you actually hitting 2 GB of RAM usage in Scrivener? Even with a massive project that should be very difficult, as the software doesn't load the entire thing into RAM at once (that would be crazy; some people store gigabytes of research). There could be a memory leak in the software that you're repeatedly running into if it does get that high. It may be worth keeping an eye on its usage with Task Manager while you work, and seeing if any operation causes a sudden spike that is never reclaimed.
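To make the "spike that is never reclaimed" idea concrete, here is a toy sketch of that heuristic. The sample values and thresholds are entirely invented; in practice you would jot down the working-set figures from Task Manager or Process Explorer at regular intervals and feed them in:

```python
def unreclaimed_spikes(samples, jump=1.5, slack=1.1):
    """Given periodic memory samples (e.g. working-set sizes in MB),
    return the indices where usage jumped by at least `jump`x over the
    previous sample and never fell back within `slack`x of the
    pre-spike level afterwards -- a crude leak signature."""
    spikes = []
    for i in range(1, len(samples)):
        if samples[i] >= samples[i - 1] * jump:
            # The spike counts as reclaimed if any later sample drops
            # back near the pre-spike baseline.
            if not any(s <= samples[i - 1] * slack for s in samples[i + 1:]):
                spikes.append(i)
    return spikes
```

For example, a trace like `[200, 210, 600, 610, 620]` flags the jump at index 2 as never reclaimed, while `[200, 600, 210]` is clean because usage fell back to the baseline.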

I agree with the above, disk IO speeds (especially) and raw CPU (for some operations like compile) are going to benefit a writing program more than RAM. Text, even formatted text, is just about as efficient a data type as can be.

I don't think I've hit the 2 GB wall yet, despite being 141k words in and splitting the file up (the program did slow down drastically when I got to around 150 pieces, but that could just be read/write overhead, since it doesn't happen with smaller chunks). It did crash to the desktop once, but I think that was more of an importing-file issue. I was working on my paper route and imported the original doc that my manager sent me last year instead of the Google Docs export of a similar name. Both my book and the route winked out of existence when I did that. I don't know what kind of doc the paper company uses, but Word doesn't like it either; I can only open it in LibreOffice. However, I have gotten as high as 1 GB with Scrivener, when I did split screen with the full document in one pane and the scene in the other, but I think that was more of a multitasking thing than anything.

Ioa, am I incorrect in thinking that folks like me, those who aggressively split and then use Scrivenings mode, would see higher RAM utilization? My thought is more file objects (handles, positions, read/write buffers, undo buffers, etc.).

So a person with 3k words in one file could possibly use less RAM than a person with 2k words across 30 files.

Yes? No?

I’ve been keeping an eye on it with Process Explorer, and noticed that the working set is slowly growing as I chop up my novel. I already adjusted Scrivener so it can use 4 GB of RAM instead of 2, but there wasn't much of a difference, so the slowdown I noted must be a read/write thing and not the RAM like I first suspected. Currently I am at just shy of 190K of RAM with 202 documents (I removed some that weren't needed once I discovered the chapter system).

Wanting a 64-bit text editor… Interesting, to say the least… O_o

I know, right? It might be overkill, but it's nice to know that if it ever reaches that wall it's not going to crash.

Speaking of going past 2 GB, it actually did, and all I did was save the file and then view the entire book so I could do a global save.
Scrvener beyond 2gb.png
And it's still growing O.o Makes me thankful I patched it to go past 2 GB.

Scrivenings is more of a memory issue with the Windows implementation. To get into the details of the differences a bit: on a Mac, if you take 30 files and stitch them into a Scrivenings session, the software uses magic to get all of the data of the included RTF files into a single text editor. In a sense it truly is a "single file", at least in RAM; it just has methods of extracting changes from that "file" on the fly back into the original data sources. Thus using 30 files to fill the text editor instead of 1 similarly sized file makes an insignificant difference in RAM usage.

On Windows, we unfortunately do not have the necessary ingredients to build a session inside one text editor. There is no static and safe way to say that this text here falls into section #23, and after this point, #24. So what we're doing there is building 30 text editors and stacking them in a scroll view. There is thus a more direct correlation between memory usage and session size in terms of included item quantity (not words so much; more memory goes to the chunk of code that makes up a text editor than to the words inside it).
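As an illustrative back-of-the-envelope model of why item count matters more than word count under the stacked-editor approach (the per-editor overhead figure here is completely made up, just to show the shape of the arithmetic):

```python
def session_estimate(doc_sizes, editor_overhead=512_000, single_editor=False):
    """Very rough memory estimate (bytes) for a Scrivenings session.

    doc_sizes       -- text size of each included document, in bytes
    editor_overhead -- invented fixed cost of one text-editor widget
    single_editor   -- True models the Mac approach (one editor holds
                       all the text); False models stacking one editor
                       per document, as described above.
    """
    text = sum(doc_sizes)
    editors = 1 if single_editor else len(doc_sizes)
    return editors * editor_overhead + text
```

With 30 small documents, the stacked model is dominated by the 30 editor widgets rather than the text itself, while the single-editor model barely notices how many pieces the text arrived in.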

Fortunately, the transition to the newer Qt version includes the necessary tools to build a more efficient system like the Mac uses for Scrivenings, so this won't always be a problem. It's something to consider for now, though, and it might indeed be easier to hit 2 GB than I had thought off-hand. Definitely some interesting testing there.

Either way, I still think the best approach for us is to continue to optimise the software so that it doesn't use so much memory, rather than just throw more bits at it and band-aid over the inefficient usage. That's just my opinion, though; I'm not precisely sure what the plans are for 32-bit versus 64-bit in the future. I'm sure eventually 64-bit will just be what everything is; it is already decidedly headed that way on the Mac.

Since I found that little 32-bit app 4 GB RAM patch 4gb_patch.zip (21 KB), I've applied it to all of my most-used software (I bought Scrivener yesterday, by the way, YAY!) on both of my machines. I've seen a significant improvement in the games I play to relax. It seems like they like stretching their legs out or something. :stuck_out_tongue:
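For anyone curious what that patch actually does: it flips the LARGE_ADDRESS_AWARE flag in the executable's PE header, which lets a 32-bit process address up to 4 GB on 64-bit Windows instead of the default 2 GB. Here's a rough sketch of checking that flag in Python; the offsets follow the published PE/COFF layout, but treat it as illustrative rather than a hardened parser:

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(data: bytes) -> bool:
    """Check whether a PE image has the LARGE_ADDRESS_AWARE flag set.

    This is the flag that '4 GB patch'-style utilities flip so a
    32-bit process can use up to 4 GB of address space on 64-bit
    Windows."""
    if data[:2] != b"MZ":
        raise ValueError("not a PE file")
    # e_lfanew at offset 0x3C points to the 'PE\0\0' signature.
    (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_offset:pe_offset + 4] != b"PE\0\0":
        raise ValueError("missing PE signature")
    # The COFF file header follows the signature; Characteristics is
    # its last 16-bit field, at offset 18 within the 20-byte header.
    (characteristics,) = struct.unpack_from("<H", data, pe_offset + 4 + 18)
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)
```

You could run this over an .exe before and after patching to confirm the only change is that one bit (patch tools typically also fix up the header checksum).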

Anyway, I was not in Scrivenings mode. I just saved, then started doing some global find-and-replaces. I looked over at Process Explorer (I have 3 monitors for my 3D hobby) and saw Scrivener had broken the 2-gig barrier. It hovered around 2 GB until I closed it. That's 2.41247940063477 gigabytes (I checked it here http://www.convertunits.com/from/kilobyte/to/gigabyte), so I guess something is going on in the background.
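Part of the wiggle between "about 2 GB" and "2.41 GB" can come purely from unit conventions: Process Explorer's "K" figures are binary (1024-based kibibytes), while some online converters divide by decimal millions instead. A tiny sketch of the two conversions, just to show the difference:

```python
def kb_to_gb(kb, binary=True):
    """Convert a kilobyte figure to gigabytes.

    binary=True treats the input as 1024-byte kilobytes and divides by
    1024**2 (matching Process Explorer's 'K' columns); binary=False
    uses the decimal convention (1 GB = 1,000,000 kB) that some
    online converters assume."""
    return kb / (1024 ** 2 if binary else 1000 ** 2)
```

The same raw number can thus land noticeably above or below the 2 GB mark depending on which convention the tool on screen is using.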

Currently I have 238 parts to my book, with 5 more chapters to chop up. Once chopped up, the entire book will get a good examination and overhaul, because it needs to go on a word-loss diet.

I'll keep an eye on it to see how big it really gets. On startup it only uses about 100K of memory, but it slowly grows the more I work with my file. I've used Windows most of my life; I've toyed with iPads and such, but they just aren't my thing. I've been thinking about getting a MacBook, but until I actually sit down and test the OS to see if it's for me, I'm not serious about it.

Edit:
Now this is weird: once I got past 260 documents (I added image folders), Scrivener dropped to 30-40K and sped back up to the way it worked before it lagged out. Quick and speedy. So it seems that once it has enough subdocuments, Scrivener's optimization kicks in and speeds it back up. If you could just get that same optimization working in the 160-260 range, it would really speed things up.

Edit 2:
I am working on the second book, importing it into Scrivener for easy reference. It's a longer book that I need to really chop up so I can decide which scenes need to be put aside for a new book. I glanced over at Process Explorer and found that Scrivener has gone up to 3 GB.
Scrivener mem hog.jpg
Perhaps you need to include the 4 GB patch in a future release of Scrivener. I don't know what would have happened had I not patched it, but it worries me a little that Scrivener can eat up this much memory.

At least under OS X, Scrivener uses bugger-all memory: ~200 MB with my current 150,000-word project (double that length with notes & dozens of reference images and docs). But all those files mean, potentially, a lot of disk IO, so a slow (non-SSD) disk could impact performance more than RAM.

My own call for 64-bit is to ensure the application can use Apple's modern APIs. For example, Force Touch under El Capitan doesn't work properly (discussed at length elsewhere) due to Scrivener being 32-bit.

I read this thread with interest because my PC crashed twice today while I was using Scrivener. The blue screen of death appeared, reporting "PAGE FAULT IN NONPAGED AREA". Is this the sort of error caused when the 2 GB limit of 32-bit software is reached?

I checked the memory usage of Scrivener via Task Manager and noticed that it creeps up after each 'search' where I then select all the documents found and review them in Scrivenings mode. Is this what you refer to as a "memory leak"?
I saw it go up from 222.8 MB when I first opened Scrivener to 299.6 MB after doing the process described below.

I can repeatedly make Scrivener crash by doing a search, then putting the results into Scrivenings mode. Here's some background on how I can make it do this…

If I use a common word like "a" as the text to search for, Scrivener shows all the documents found OK.
But if I then select all the documents found, I can watch in Task Manager as the memory used by Scrivener creeps up towards 2 GB.
The first time I did this, memory usage went up to 1,741.5 MB.

If I repeat the process and memory usage hits the 2 GB limit, Scrivener simply vanishes, as if I'd shut it down; i.e., Scrivener crashes. This didn't crash the whole PC to the blue screen of death described above. It just crashed Scrivener, so I still can't be sure whether it is Scrivener that is causing the more serious blue screen problem.

My ability to crash Scrivener, as described above, is easily repeatable.

I love Scrivener. Scrivener has been invaluable in writing my books. I just hope that this book doesn’t outgrow Scrivener’s capabilities.

If you’re able to repeatably crash Scrivener, please email our support address with details, here:
literatureandlatte.com/suppo … tion-email

Responding to a 2-year old thread in the Wish List forum is probably not the most effective way to bring the problem to our attention.

Katherine

It seems that 64-bit is more of a necessity than a nice-to-have now, with the High Sierra announcement that 32-bit applications will lose support. Is there a timeline for when you plan to support 64-bit?

Indeed, Scrivener 3.0 will be 64-bit and it will be out before 32-bit support is dropped from High Sierra.

All the best,
Keith

So any day now? macOS High Sierra is out next week (September 25th) :stuck_out_tongue:

macOS High Sierra will continue to run 32-bit applications; it will be the last version of macOS to do so. So Scrivener 2 will continue to run fine on macOS High Sierra. Apple will stop accepting 32-bit updates to apps on the Mac App Store as of January. Scrivener 3 will be out before then.

Thanks for the clarification and detail, Keith.

Zero pressure from me, by the way. I know intimately what development crunches are like.