
CTD at 3.1GB


viking

Question

Some key posts on this and related threads (experts feel free to note any errors or insights):

Wiki article (draft)

 

Thanks,

STEP

 

OP follows


First of all I wanted to thank you all for the great work you have done with STEP. Skyrim is the first game I installed on this computer and you guys have made it an AMAZING game. That being said, I have an issue that I hope you can help me solve.

 

My setup:

Vanilla Skyrim

GTX 670 w/ 4GB @ 1080p w/ latest driver

16GB Memory

3770K at about 4GHz

Windows 8 64bit

ENB 149

Ultra settings

Highest available texture/quality

Mod Organizer

Step 2.2.1 + Skyrim Revisited + others

 

I have noticed a post here and there saying that Skyrim can't really address more than about 3.1GB of memory without issues. This seems to jibe with my experience, meaning I CTD every time my memory hits that mark, but I couldn't really find anything definitive on the topic. The problem with googling the topic is that most results are about pre-1.3 Skyrim, which couldn't address more than 2GB of memory.

 

My mod list is mostly based on STEP, which is why I came here for help, with about ten mods added on at the end (Interesting NPCs, Detailed Cities, Economics, COT, and a couple of others). The reason I haven't included my mod list is that it doesn't seem to matter: as long as I keep the memory usage below 3GB, I can have pretty much any combination of mods.

 

What I have tried so far (in no particular order):

  • resetting INI files
  • removing ENB
  • not using attklt
  • only using a new game
  • removing all mods and adding them back one by one until the issue crops up
  • running as admin
  • watching the Papyrus log - it seems relatively clean, no obvious errors right before the CTD

Yes, I can run STEP just fine without any issues, but I also never get near 3GB of memory. I have tracked VRAM usage as well and have seen a max of 2.7GB/4GB.

 

As an example of where I might run into issues: I start a new character with Alternate Start. I start with Breezehome, run out of Whiterun, past the brewery, and up the hill to the bandits. I enter the cave (watching memory usage with Elys MemInfo), and it dies right after I see 3GB. I have this same issue not using AS, sitting through the intro, and then running over to Whiterun.

 

I'm sorry if this post is all over the place. I have spent more than a week trying to solve this issue, and the only solution I have found is to reduce memory usage. I have gotten to the point where I can exchange two texture packs and get into the cave without a CTD, but with both I get a CTD. I didn't think texture packs should even affect CTDs, but I'm relatively new to Skyrim on the PC, so I could be wrong. I also found I could get a bit further with ENB turned off, but I would still crash once I got above 3GB of memory. Finally, if I reload a game after a CTD, I can play just fine... until I reach 3GB of memory.

 

I really hope you guys can help. I'm more than willing to try anything at this point, besides just disabling all of the mods.

Recommended Posts

What I think he's saying is that Skyrim CTDs whenever it tries to address RAM outside of the 4GB it's allowed, which either crashes it outright or returns an error, after which it crashes. I'm a bit rusty on programming and never dealt much with memory allocation, but I think that's what he means.


Close.

 

Looking at the MSDN reference for memory limits in Windows, you can see that each 32-bit process running in a 32-bit Windows environment is limited to 2GB of virtual address space, or 3GB when the IMAGE_FILE_LARGE_ADDRESS_AWARE linker directive is used during build-time.  For 64-bit Windows systems, the virtual address space available to 32-bit processes is increased to 4GB when that same compiler flag is set.
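
As a side note, for anyone who wants to verify that their TESV.exe really is linked with that flag, the bit can be read straight out of the standard PE/COFF header. Here is a minimal sketch (the path is only an example, and this just checks the flag; it proves nothing about the crash itself):

```python
# Sketch: check whether an EXE was linked with IMAGE_FILE_LARGE_ADDRESS_AWARE.
# Standard PE/COFF layout; the path below is only an example.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def is_large_address_aware(exe_path):
    with open(exe_path, "rb") as f:
        data = f.read(4096)                     # the headers fit well within this
    if data[:2] != b"MZ":
        raise ValueError("not an MZ/PE executable")
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]        # e_lfanew
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("PE signature not found")
    # The COFF file header follows the 4-byte signature; Characteristics is
    # the last WORD of that 20-byte header (offset 18).
    characteristics = struct.unpack_from("<H", data, pe_offset + 4 + 18)[0]
    return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

if __name__ == "__main__":
    print(is_large_address_aware(r"C:\Games\Skyrim\TESV.exe"))  # example path
```

If patched Skyrim is LAA-flagged, as assumed above, this should print True; a False here would mean the old 2GB ceiling applies instead.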

 

Since the TESV.exe CTDs seem to happen when memory usage gets around 3GB even on 64-bit versions of Windows, this leads me to think the guys at Bethesda incorporated some address range checking within their own code.  After all, it isn't Windows that's throwing allocation errors...Skyrim just exits.  If it were being caused by Windows, I suspect

 (a) it would only happen on 32-bit versions, since that's where the 3GB process limit exists, and

 (b) you'd see Windows throw an error message about TESV.exe attempting to perform an illegal operation.

 

No, I suspect Bethesda set the address range checking boundaries internally during development to ensure compatibility with the limitations of the 32-bit Windows platform.  As a result, when some thread goes to perform a memory allocation that exceeds the internally-defined range, the main process exits (CTD).

 

This could be alleviated entirely by Bethesda compiling a 64-bit version of TESV.exe, of course.  Absent that, it's anyone's guess what will or won't trip the internal program limit.  It's a minefield in there... just going by some of my Papyrus logs, there are lots of quest scripts that never get cleaned up properly during the game, and every one of them takes up CPU time and RAM.  I also suspect the high-resolution textures are likely to blame; not because the available hardware can't handle them, but because the EXE file isn't handling them efficiently.


I also experienced CTDs, infinite loading screens (ILS), and missing textures (purple meshes), even with 16GB of RAM and an HD 7970 6GB (that's not CrossFire, it's a single 7970 with 6GB of VRAM).

Using the Skyrim Performance Monitor, I found that it was the 3.1GB limit (which I didn't know existed), and in my search for a solution I found this gem: https://skyrim.nexusmods.com/mods/32363/, specifically Section 8: Cleanmem.

 

It's a utility that launches periodically (set using Scheduled Tasks) and clears out a program's allocated or used RAM, so that Skyrim never hits 3GB.  It does cause a momentary pause, probably less than half a second, but that's much more tolerable than CTDs.  I have it set to 2 minutes and now hardly ever get CTDs or ILSs, usually only when fast travelling between too many locations too fast.
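
For what it's worth, I haven't checked exactly which API Cleanmem calls, but the usual mechanism behind these working-set trimmers on Windows is EmptyWorkingSet() from psapi. A rough sketch of that mechanism follows (the process name, interval, and helper names are just examples, not Cleanmem's actual code). Note that this shrinks the working set Windows reports, not the virtual address space Skyrim has committed, which is essentially the objection raised against these tools later in the thread:

```python
# Rough sketch of what a periodic working-set trimmer does on Windows.
# This is NOT Cleanmem's actual code, just the common EmptyWorkingSet() mechanism.
import ctypes
import time

import psutil  # third-party: pip install psutil

PROCESS_QUERY_INFORMATION = 0x0400
PROCESS_SET_QUOTA = 0x0100

kernel32 = ctypes.windll.kernel32
psapi = ctypes.windll.psapi

def trim_working_set(pid):
    handle = kernel32.OpenProcess(
        PROCESS_QUERY_INFORMATION | PROCESS_SET_QUOTA, False, pid)
    if not handle:
        return False
    try:
        # Asks the memory manager to page the process's working set out.
        # Committed memory is untouched, so this mainly shrinks the number
        # Task Manager shows, not the address space the game has used.
        return bool(psapi.EmptyWorkingSet(handle))
    finally:
        kernel32.CloseHandle(handle)

if __name__ == "__main__":
    while True:
        for proc in psutil.process_iter(["pid", "name"]):
            if (proc.info["name"] or "").lower() == "tesv.exe":
                trim_working_set(proc.info["pid"])
        time.sleep(120)  # example: every 2 minutes, like the scheduled task above
```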

 

Obviously it won't help if you have so many high-res textures that a location being loaded goes over the 3.1GB limit on its own.  That (for me) usually results in an ILS.


(quoting the reply above about the MSDN virtual address space limits and the suspected internal address range checking in TESV.exe)

Yes, I had read that article, so the part about the LAA linker flag under 32-bit Windows made sense to me. It also makes sense that the fault is likely TESV.exe and not Windows (as Monty seemed to suspect), based on your reasoning. OK, so what role does DX9 play, as far as you know?

 

What are your thoughts on the OP of the adjacent thread on memory optimizers (also see the previous post by scribble)?

 

Thanks for confirming ... am I right in assuming that you are a .NET developer or game developer yourself?


(quoting the earlier reply describing Cleanmem and the 3.1GB limit)

 

In regards to so-called "memory optimisers" and "memory cleaners", please refer to this thread and reconsider.


I highly doubt the cause of crashes is the VAS being exceeded.

 

If it were, the VAS on an x64 system should be 4GB for a 32-bit app, not 3.1GB (the same as on an x86 OS when 4GT is enabled). Still, we are seeing x86 and x64 systems crash at the same limit.

 

I suspect it's an engine limit, or something Bethesda set there on purpose (perhaps out of necessity), as I cannot imagine an engine like Gamebryo having such a limitation inherently; nothing else.


(quoting the reply above doubting that the crashes are caused by exceeding the VAS)

This is not the case. Many users are crashing at closer to 2.5GB, and I suspect that they are on 32-bit systems for the most part. See the following as a possible explanation:

Looking at the MSDN reference for memory limits in Windows you can see that each 32-bit process running in a 32-bit Windows environment is limited to 2GB of virtual address space, or 3GB when the IMAGE_FILE_LARGE_ADDRESS_AWARE linker directive is used during build-time. For 64-bit Windows systems, the virtual address space available to 32-bit processes is increased to 4GB when that same compiler flag is set.

Also, rolandito somewhat accounts for this:

In reality, each system required dedication of some memory for video addressing, memory-mapped I/O ...


(quoting, again, the reply doubting that the crashes are caused by exceeding the VAS)

What zed said, plus the other thing to take into account is that the monitors and Task Manager are only showing us the working set of RAM, and I would assume Windows holds some in reserve for the process. Also, thanks for correcting me, roland; my posts are sounding a bit erratic even to myself as I've gone over them. I've been having migraines and haven't gotten a lot of sleep due to stuff...
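
If anyone wants to see that distinction for themselves, psutil will report both figures for the process. A quick sketch (the wset/private fields are Windows-only, and the process name is just the obvious guess):

```python
# Sketch: compare the working set (roughly what Task Manager shows) with the
# private/committed bytes for TESV.exe. The wset/private fields are Windows-only.
import psutil  # third-party: pip install psutil

def skyrim_memory_mb():
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == "tesv.exe":
            m = proc.memory_info()
            return {"working_set_mb": m.wset / 2**20,
                    "private_mb": m.private / 2**20}
    return None

if __name__ == "__main__":
    print(skyrim_memory_mb())
```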

For me the challenge is managing the buffer. I do not have any scenes in my game that alone require 3.1GB. I'd say the max is about 2.6GB when I'm in city worldspaces, like the Solitude world or the Whiterun world. The problem is when I enter an interior and the game tries to load the entire interior on top of the 2.6GB from the exterior. However, sometimes, depending on the interior I'm entering, the game will flush the buffer before loading the interior cell. For me the holy grail would be finding a way to force that flushing whenever interior cells are entered from an exterior. I have suspicions that there is some way to do this via flags or something similar, but I have not done much investigating myself. Something must be telling the game when to do this, which is why it does it for some cell transitions but not others.

For now I use the pcb console command, but that only helps sometimes, depending on the situation.


Not that I know a damn thing about the specifics, but I'm wondering about what michaelrw just mentioned above. When I'm watching the monitor (thanks to Gopher and STEP for the heads-up at Nexus; I downloaded the performance monitor today to check out my own CTDs of late, and it happens at 3.1 as well) right before I head into an interior space, I notice that memory usage seems to stack (the interior space starts at what the exterior was at and then climbs until a CTD). Just chiming in that I had been wondering if something of that nature was contributing; michaelrw basically just articulated it for me.


I mentioned in my post that I do use the pcb (purge cell buffer) command, but it doesn't always help. The reason is that when you purge the buffer, it only empties out excess, so it cannot empty out the data for the cell you're currently in. So, if you run 'pcb' right before you enter the interior, it will only drop it to, say, 2000-2500MB, and then once you enter the new cell it will load all the new data on top of that. Now, if there were a way to essentially 'pause' the loading process between cells during the transition, then run the 'pcb' command to clear the data down close to 0MB (maybe more like 200-300MB, realistically), then 'un-pause' so it loads all the new cell data on top of that, we would not have a problem. Unfortunately, we can't clear the buffer during the actual transition.

But this is what I've been saying. The game will sometimes do this for us by completely flushing all the data and essentially re-loading everything starting from ~300MB. But it only does this for some cells, not all. If it did this for all cells, I would *never* have a CTD from the 3.1GB bug. Of course, one might argue that doing this for all cells could increase load times between cells, since the game has to re-load all the data completely each time you make a cell transition. However, if it's a choice between an extra 5 seconds of loading-screen time and a CTD, I would choose to accept the extra loading time.

 

Here is a scenario:

I'm running around outside of Windhelm. My current usage is at ~2300MB. I run up to the main city entrance, and before I enter Windhelm, I run the 'pcb' console command and notice my usage drops to ~2100MB. I then activate the main door to the city and enter a loading screen. I instantly notice that my usage drops all the way down to 400MB, then slowly climbs back up to 2100MB and stabilizes around 2450MB. When it is done loading I am standing directly inside Windhelm's main wall looking at Candlehearth Hall. My utilization at this very moment is hovering around the 2450-2500MB mark. YAY! No CTD. Success!!

 

Here is another scenario:

I'm traveling by foot through the wooded areas between Falkreath and the Dark Brotherhood sanctuary that is close by. When I arrive outside, I see my current usage is at 2700MB. I know I'm getting pretty close to that 3.1GB mark, so I run the 'pcb' command in the console. As a result, my usage drops to ~2200MB. I activate the door and enter a loading screen. However, this time is different. Instead of dropping down to only a few hundred megabytes of data, the game decides to load all of the DB sanctuary data on top of the 2200MB I was at before entering the cell. Before I know it, I see my usage climb rapidly towards 3.1GB; it jumps up to 3150MB for a split second, and the next thing I know I am standing inside the DB Sanctuary with my current usage at 3050MB. I quickly open the console and run the pcb command again, and it wipes out nearly 2GB of data that was in the buffer from the exterior cell I was in a moment before. I narrowly escaped a CTD.

(The pcb command saved my butt in that second scenario. If I had not run it, I would definitely have had a CTD. However, it does not always work this way. Let's say I ran the 'pcb' command but it only dropped my usage from 2500MB down to 2400MB; I would have had a CTD because I did not have enough "space" for an extra 200MB. Also note that if the game had behaved the same way it did when I entered Windhelm, I would not have even come close to a CTD.)

 

Both of these scenarios are real. They happen to me all the time, just as I have described. The 'pcb' command can be helpful, but it can only do so much. You cannot use it in the middle of a cell transition to force the game to do a complete flush of memory all the way down to a few hundred MB. When you can use it, it will usually only drop you down about 1000MB at the very most, and you would be lucky to see that. Most of the time we're talking about a few hundred megabytes.

Sorry for such a long response, but I felt it was necessary because I often get people saying "hey, just use the 'pcb' command in the console" and I need to explain to them that I have tried it and that it is only useful in certain situations. It is merely a tool to help give you ~500MB of extra headroom before you enter a new cell, and it can also be used to drop you down immediately after entering the new area. Use it once before, then, if you're still very close to 3GB after loading into the new cell, run it again. It is helpful indeed, but we need a much better solution. Unfortunately, it's the only thing we have at the moment (aside from memory management through the use of smaller textures and other game assets).
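
Since knowing when you're getting close to the limit is half the battle (that's what Elys MemInfo and the Skyrim Performance Monitor are being used for in this thread), a trivial external watcher can do the same job. Here is a sketch; the 2800MB warning threshold is just an arbitrary example, not a tested number:

```python
# Sketch: poll TESV.exe and warn when committed memory nears the ~3.1GB ceiling
# discussed in this thread, i.e. a good moment to try 'pcb' before a transition.
import time

import psutil  # third-party: pip install psutil

WARN_AT_MB = 2800  # arbitrary example threshold below the ~3.1GB crash point

def find_skyrim():
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == "tesv.exe":
            return proc
    return None

if __name__ == "__main__":
    while True:
        proc = find_skyrim()
        if proc is not None:
            try:
                used_mb = proc.memory_info().private / 2**20  # Windows-only field
            except psutil.NoSuchProcess:
                used_mb = 0
            if used_mb >= WARN_AT_MB:
                print(f"TESV.exe at {used_mb:.0f} MB - consider running 'pcb' "
                      "before the next cell transition")
        time.sleep(10)
```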


(quoting the earlier Cleanmem reply and the response pointing to the thread on so-called "memory optimisers")

I had a read through the thread, but from what I could see there was no scenario like what I found when testing myself.  While I understand what was being said by Monty, Cleanmem must be interacting with Skyrim in some way that is not intended but that, for whatever reason, works for me.

 

Essentially, neither Cleanmem nor any other tool (including 'pcb') can help if the cell being loaded has enough texture data by itself to push over the 3.1GB limit.  This includes even a brand-new instance loading a saved game into a location whose memory usage is too high due to large textures.

 

A classic for me was the Ragged Flagon, and in fact I used that area for trimming down textures so that it could load.

From a new instance, loading a save outside the Thieves Guild, if I went from Riften into the cistern and then ran to the Flagon: guaranteed ILSs.

If I went from Riften into the cistern, waited for Cleanmem to run (as above, every 2 minutes), and then into the Flagon: no issues.

I have repeated this test in many different locations (including running overland outside) with the same result: as long as there is sufficient free memory (the variable reported by Skyrim Performance Monitor) when the new data has finished loading, no CTD (overland) or ILS (fast travel or exterior/interior etc.).

 

To reiterate, turning textures on/off is not a valid test when the high-res textures in use cause the cell being loaded to go over 3.1GB, even if that cell is the first one loaded (i.e. the first load of a saved game in a new instance of Skyrim).  Nothing will help in that case other than reducing the texture load.


I know nothing of the intricacies of the hardware and coding issues behind the 3.1GB limit. A lot of you are talking about the buffer and how the game briefly loads a new cell's buffer on top of the old one when transitioning through areas.

 

Would zoning the Skyrim world work? Like in the old EQ days, where you would walk to a certain point in the world and it would load you into another zone? Basically making a set of cells all through Skyrim? Or would that be too onerous a task?


The cell structure of the game is already doing this. It only loads a small part of the game at any one point. Depending on how detailed said point is, it costs more to load it in.

 

In order to speed up the process, the game also stores a lot of stuff in cache in order to swap cells in and out of memory faster. This is why you get worse stuttering when entering new cells you have not been in before, but not so much when you go back the way you came.

 

However, buffer problems would only really be relevant for people with low amounts of RAM, where the game cannot store all of the info in RAM but has to rely on the HDD or SSD, as the case may be. I have 16GB of RAM, and when I check what my install is using, it shows what it is actually physically using but also has a boatload of cache reserved that vanishes at once if I terminate the game's process.

And I have very few issues with infinite loading screens or problems of that nature. The last few I have had have all been because I was experimenting with ENB settings that were too high.
