Everything posted by z929669
-
Thanks for confirming my incorrectness ;) Good to know
-
You probably won't gain much with your 2GB then, but it can't hurt. I am not sure, but Skyrim likely has a cap on how much available VRAM it will utilize, and it is probably not much more than 2GB.
-
Thanks for the info (we need more like this). I will verify when I have some time.
-
It makes sense to manually reduce anything that will likely cost VRAM in outdoor areas using DDSopt if you have 1.5 GB VRAM or less. Otherwise, you should be able to run most of the HR texture packs (or the pre-built performance packs) without worry. It is difficult to say for sure, but reducing clutter and misc items that often appear in the world space will likely have a positive impact and would be worth it. I have decided to do this myself, but only after I have installed all mods, so that I can do it all at once (rather than by each package). Additionally, I have opted to reduce standard normal maps only as a first pass. One DDSopt enhancement would be to allow mip stripping of only defined normal maps, so that we can reduce all existing ones rather than only a single mip level at a time ... perhaps that functionality is possible now using the INI and specific settings, but I will have to confirm with Ethatron.
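Since DDSopt does not (yet) let you target only normal maps for mip stripping, a first pass can at least be scoped by filename. Below is a minimal sketch in Python — the helper name is hypothetical, and it relies only on the Bethesda convention that normal maps carry an "_n.dds" suffix. It merely lists candidates to feed to DDSopt; it does not read or convert any DDS data itself.

```python
from pathlib import PurePosixPath

# Hypothetical helper: pick out normal maps (Bethesda "_n.dds" naming
# convention) from a list of texture paths, so they can be handed to
# DDSopt for a resolution-reduction first pass. This only classifies
# filenames; no DDS decoding happens here.
def normal_map_candidates(paths):
    out = []
    for p in paths:
        pp = PurePosixPath(p)
        if pp.suffix.lower() == ".dds" and pp.stem.lower().endswith("_n"):
            out.append(p)
    return sorted(out)
```

The case-insensitive checks matter in practice, since mod archives mix `_n.dds` and `_N.DDS` freely.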
-
-
Recommended settings for DDSopt
-
I am at my wit's end with this, but it seems that no ENB DoF works for me, no matter what version I use. I have tried messing with and resetting my INI files, ENB 0.113, 0.119 and 0.121 beta, Skyrim versions 1.7 and 1.8, double-checked that my bFloatPointRenderTarget is set to '1', as well as anything else I can think of. Tried Skyrealism, Winterheart and Project ENB, all to no avail. Everything works but the DoF ... I have to disable that every time. Never had this problem before with my hardware config, so this has to be a software issue. Anyone have any ideas? I am stumped. EDIT: I fixed this and offer up some potential solutions.
-
Many saves now posted to Dropbox. Details are in the ReadMes. EDIT: also included several great screenshot primers, which are based upon the "Testy" character. These were generated by moving around via COW x y, so these are not to be confused with mod-testing saves, which are strict vanilla clean saves.
-
I have begun creating the standard save games (WB backups) to be used by all mod testers (TBD). Fri, if you want to team up on this, you might want to grab the "Helgen Keep Finish" save and begin on a mage-class trajectory. Suggest you follow the Dawnguard quest path ... we should discuss ideal save locations and quest paths & behaviors, though. Admins: the location is Dropbox\STEP\Mod Testing Project\Standard Save Games. See 0_ReadMe for details.
-
Elder Scrolls Online Trailer Released
z929669 replied to mothergoose729's topic in General Skyrim LE Discussion & Support
I personally hate the whole idea, and I think it will steer future dev away from SP versions ... TES6 will likely be a lame SP title, with the majority of dev effort supporting the MP side (just like the FPS game types). I truly hope that the SP TES experience will prevail, though ... -
A smart-ass older brother might say .... What? You ARE a turkey
-
-
Doh! Farlo is falling behind in his game! Happy TG to you too (and all).
-
You want me to check all this! Are you serious :) Between my mixed and pure 1024k versions there is about an 85-100 MB difference. Btw, I use 2048k for Landscape, except trees. But Landscape can also be mixed, and you can use 1024-res normal maps. Be creative man :)

No, that is why I prefaced with "if you care to" (which I suspected you wouldn't). I was just trying to find a way to creatively coax more specifics from you and save myself some time ... it worked :P And I honestly cannot see any perceptible difference between many of your 1k/2k terrain textures (e.g., the forest floor around Riverwood) ... but I will play around some more as you suggest when I have some time.

Actually, I didn't know that bedroll textures are in the 1.6 update? These are HD DLC textures that I forgot to remove during the file transition. I probably will not release any more updates for Skyrim?

I knew they were just leftovers from the HRDLC, as I mention in the thread that Farlo linked ... not sure what he is talking about though, because I also noted that the resolution of Better Bedrolls was the same as HRDLC (but I think that the author simply sharpened the vanilla HR versions, albeit a bit too much) :?
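The "85-100 MB difference" between a mixed and a pure 1024 pack is easy to verify yourself. Here is a minimal standard-library sketch (folder names and function names are assumptions for illustration) that totals two texture trees and reports the delta in megabytes:

```python
import os

def tree_size_mb(root):
    """Total size of all files under root, in megabytes."""
    total = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total / (1024 * 1024)

def pack_delta_mb(pack_a, pack_b):
    """Absolute on-disk size difference between two texture packs, in MB."""
    return abs(tree_size_mb(pack_a) - tree_size_mb(pack_b))
```

Note this measures compressed on-disk size; VRAM cost once the DDS files are loaded will differ, but the relative comparison between two packs is still useful.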
425 replies · Tagged with: SKYRIMLE, 04-foundation (and 1 more)
-
I operate using only 1 GB VRAM (mirrored via CrossFireX), so I'd like to determine 'stretched' versus 'un-stretched'. Then I want to construct a custom hybrid of SRO and compare only the following + HRDLC over a virgin vanilla installation, without any post-processing or other 'noise':

- SRO 1024 full
- SRO hybrid full (need details on your recommendations for combining HI & LO - if I construct it, I will share it with you for upload to your mod)
- SRO 2048 full

I will likely also experiment with 4096 mountains.

It is easy to determine. Landscape, Dungeons and Towns are stretched in many places. But with Landscape and Towns you must be careful, because they cover huge areas, and this will eat VRAM easily. Dungeons and interiors operate like individual cells, which means when you enter some house or cave, only that cell is loaded. In this case you can use even 4096 textures. Clutter is tricky. Many clutter textures are placed outside, and high-res is not a good idea. I can't use all high-res clutter. Also, with 1 GB of VRAM, forget about 4096k mountains. What you need to do is find and save places where textures are stretched or not, and determine which res is best. First install all high-res textures and then throw in a low-res version, the Clutter file for example. Then enter the game, load those places with Clutter textures, and look. And do not overwrite high-res files with low-res; just rename them like in the attachment. Here you see Clutter, which is the mixed file that I use in game, and ClutterM, which is the high-res file. After deciding what to use, you can delete the high-res version file. This is how I do it when testing and mixing textures. Also, I have to mention that I use only parts of the HD DLC: armor, weapons, clothes and some clutter. Armor, weapons and clothes textures are resized to 1024 res. I think there is a mod which also did this? You also know which other mods I recommend and use, and this is included in my combination, except Vurt's textures. They look great, but I can't use them because the game stutters too much. And to be honest, those vanilla plants look pretty good. They are not stretched. Many SRO textures look very good even in 1024 resolution, because I have made them very detailed and clear in 2048 res, so resizing won't make them look much worse. And mods like this are very helpful: https://skyrim.nexusmods.com/mods/26126/

OK, this is a good start. I'll use DDSopt to reduce some of the HRDLC textures as you advised and then go through this exercise using SRO HI/LO to see what has the biggest effect. The mod you mention is a MUST for STEP I think (will post a thread). SMIM/USP do not do this? Thanks @Starac. I blended your v1.6 1k/2k textures, and the result is great so far. I simply replaced the following 2k texture folders (I did not go through each texture folder; I only replaced directories where I assume most textures occur in high density/frequency outdoors). I retained 2k trees, roads and dungeons, as well as all architecture directories not named after towns or that are not part of Skyrim exteriors (e.g., highhrothgar, skyhaventemple, falmer hut). Following is a complete list of 2k textures that I overwrote with 1k. So far, my frame rates and VRAM consumption are either the same as or only slightly higher than with pure 1k (if you care to, please let me know if I missed anything obvious or if I unnecessarily included anything):
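The rename-then-overlay workflow described above (keep the high-res file under an "M"-suffixed name, drop the low-res file in its place) can be scripted. This is only a sketch of that manual process — the folder layout and function name are assumptions; only the "M" suffix convention comes from the post:

```python
import shutil
from pathlib import Path

# Sketch of the rename-then-overlay testing workflow: before dropping a
# low-res texture over a high-res one, rename the high-res copy with an
# "M" suffix (e.g. barrel01.dds -> barrel01M.dds) so it can be restored
# or deleted after comparing in game.
def overlay_lowres(lowres_dir, install_dir, suffix="M"):
    low = Path(lowres_dir)
    inst = Path(install_dir)
    for src in low.rglob("*.dds"):
        dst = inst / src.relative_to(low)
        if dst.exists():
            backup = dst.with_name(dst.stem + suffix + dst.suffix)
            dst.rename(backup)          # keep the high-res original
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)          # drop in the low-res version
```

After deciding which resolution wins in a given spot, the suffixed backups can either be deleted or swapped back.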
-
Sucks that they changed it. I wonder what kind of ads we'd get anyway (not that I have to deal with them; yay Adblock!). The current search tool is a little lacking; if it's not too much trouble to set up (at least for trial purposes at the moment), it might be worth seeing if Google can do better.

I second the vote for a local Google search (and I use DNT+ ... works nicely) ... ours is pretty bad.
-
Finally got around to doing some testing with the Pre5b version, and following are my image compares of foliage. In each image triplet, I have:

- original vanilla textures
- DDSopt-ed vanilla textures using LARGE multipliers for:
  - Behave > Textures > Normal-map steepness raise
  - Behave > Textures > Foliage-map opacity raise
  - Behave > Textures > Color-map gamma
  - Behave > Textures > Alpha-map contrast
- DDSopt-ed vanilla textures using SMALL multipliers for the aforementioned settings

Note that there is a bit of a tradeoff between certain textures. The larger multipliers (second in each set) correspond to better-looking distant pine trees (see the first two image sets), but unrealistically "lush" grasses (of certain types) and distant branches (see the last image set). The opposite is true for the lower multipliers (third in each set). I have not been able to identify the texture files involved with the branchy trees and yellowish grasses in the distance of the third image set, but those (and others like them) require special treatment I think (i.e., lower-mip-level foliage maps should probably not be raised):
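DDSopt's exact math for these settings is not documented in this thread, but the settings above imply simple per-channel curves. A rough illustrative sketch of what a "gamma" and a "contrast raise" do to a normalized channel value (names and formulas are assumptions, not DDSopt's actual implementation):

```python
def raise_gamma(value, gamma):
    """Apply a gamma curve to a normalized [0, 1] channel value.
    gamma < 1 brightens (a 'raise'); gamma > 1 darkens."""
    return value ** gamma

def raise_contrast(value, factor, pivot=0.5):
    """Scale the distance from a mid-gray pivot by factor,
    clamped back into [0, 1]. factor > 1 increases contrast."""
    return min(1.0, max(0.0, pivot + (value - pivot) * factor))
```

This also makes the tradeoff above intuitive: a large opacity/contrast raise pushes semi-transparent alpha texels toward fully opaque, which is why distant foliage looks "lusher" than intended.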
-
In everything I've been able to dig up, there is driver overhead that will end up getting reported as dynamic VRAM use. That alone means that a baseline needs to be taken and subtracted (hence your first row as a baseline). But Process Explorer will be able to give you an accurate figure for just TESV.exe (which needs to be done, as we don't know how accurate the figure is when taking the system-wide usage as reported by GPU-Z and subtracting the Windows baseline). Testing on my system (using Process Explorer), my System GPU Memory for TESV.exe only fluctuated between 22.3 and 22.6 MB when switching between no SRO and SRO, but the global use fluctuated as much as 5 MB. Now, I admit, I didn't catch the two cases where the System GPU Memory (sorry, I don't like the term Dynamic VRAM) jumped over 100 MB usage. I wholeheartedly agree that it is most likely texture data that was offloaded. But that doesn't change the fact that we need to know the frequency with which data is swapped out. But, looking again at your footnotes, you have c to designate that stuttering occurred at > 0.5x for Dynamic VRAM. Am I interpreting that accurately to mean that in those two scenarios you experienced stuttering (which would correlate to a higher frequency of swapping)?

Good question ... I honestly forget now :(
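The baseline-subtraction step discussed above can be written down explicitly. A minimal sketch (the dict keys are illustrative; GPU-Z's actual log columns are named differently) that subtracts an idle-desktop reading from each in-game sample of GPU-Z's global figures:

```python
def per_game_usage(samples, baseline):
    """Approximate the game's own VRAM footprint by subtracting an
    idle-desktop baseline from GPU-Z's global (all-process) readings.
    Driver overhead common to both readings cancels out; per-process
    tools like Process Explorer are still needed to validate this."""
    return [
        {
            "dedicated_mb": s["dedicated_mb"] - baseline["dedicated_mb"],
            "dynamic_mb": s["dynamic_mb"] - baseline["dynamic_mb"],
        }
        for s in samples
    ]
```

As noted in the post, this only removes the *constant* part of the overhead; anything the driver allocates in response to the game's own load stays in the figure.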
-
Perhaps, but if it is only driver overhead, why does it jump suddenly from around 50 MB to >120 MB in the small window of Dedicated = 959-968 MB? If it were overhead, then I'd expect it to be fairly constant based on the number of active processes (which is absolutely consistent in all of those rows of data, except the very first, which is the Windows background that should be subtracted from each of the other rows). At this point, there really is no argument but the manner in which we speculate, as neither of us has provided any conclusive evidence (and it seems like it would be difficult to come up with any). We are left speculating as to the underlying causes of the data we see, and I think that the jury is out. I am anxious to concede though, given a bit of light in this seeming black box ;)
-
ACCEPTED Ruins Clutter Improved (by raiserfx)
z929669 replied to parasyte79's topic in Skyrim LE Mods
Here you say that you aren't familiar with any of my work. But earlier, in post 19, you said this: Does this make sense to you?

Yep, got it.

Got it what? You didn't try my mod but you criticize it? And then you expect people to take you seriously? I don't take you seriously. -
See my original theory on the point of overflow and probability of need. Setting data aside in system RAM is still much faster than reacquiring it from the process (i.e., disk), and if the MM algorithm is well conceived, then the probability that a given piece of data will be needed at any given time should factor into the equation.

You pretty much back my point. :P As an example, back when we were evaluating Vista for clients, we initially received results that didn't make sense at first in terms of system memory usage. By all accounts, we were showing that we were on the brink of needing more memory to support the applications, which didn't make sense given that nothing had changed in code. The fact is that Microsoft changed the memory sub-system in Vista to start pre-loading heavily used applications into memory (and naturally it tended to include their products even before first use), so reported use (nearly 100% memory utilization) wasn't realistic, and we had to adjust. The same concept should apply to the GPU as well. Hoard a texture in system RAM if it puts it there, just in case, because it's faster to load than from disk. So, just because it is sitting in system RAM does not equate to its being required ever again in the current session. My point is that speculating 1.X of overall GPU-related memory use as being safe, without taking into consideration its actual need, is misleading. I can only see a case for the frequency of swapping as the one real measure tying texture use to graphical anomalies. Consequently, a high frequency of swapping should line up with over-use of higher-resolution textures, which would mean anything beyond 1.0 is immediately noticeable.

All well said, but it still does not explain the reality of the points made previously. Please speculate on the relative change in dynamic and dedicated VRAM in the performance charts of the DDSopt guide. As I implied, I think that these should be taken as conservative "just in case" numbers, hence a soft threshold, hence a safe bet that we can exceed the reported VRAM threshold by some amount, and 1.1x seems perfectly reasonable, given the data.
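The "soft threshold" heuristic being argued over here can be stated as a tiny rule of thumb. This is only a sketch of the reasoning in the thread (the function name, categories, and the default 1.1x factor taken from the discussion are not an established formula):

```python
def vram_pressure(dedicated_mb, card_mb, soft_factor=1.1):
    """Classify reported dedicated-VRAM use against card capacity.
    Up to 100% is 'ok'; up to soft_factor (the 1.1x soft threshold
    discussed in this thread) is 'soft' - likely buffered in system
    RAM without visible stutter; beyond that, expect heavy swapping."""
    ratio = dedicated_mb / card_mb
    if ratio <= 1.0:
        return "ok"
    if ratio <= soft_factor:
        return "soft"
    return "over"
```

As the exchange above makes clear, the real arbiter is swap frequency during play, not the ratio alone; treat "soft" as "probably fine, but watch for stutter."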
-
That is true: system RAM will be used to swap out data when VRAM gets full. But one thing you also have to realize about the reported data from GPU-Z is that all of those figures are global figures as well (combined usage for all processes).

Oh, I realize that ... this is the reason that you see the base Windows stats at the top of that first table in the DDSopt Guide ;) It actually gets very near that barrier in my tests, but it does not ever seem to break it ... and that is precisely my point. Dynamic VRAM increases at those upper limits faster than dedicated does ... just look at the Skyrim HD table and note how you get nearly a 200% increase in dynamic VRAM for only a fractional increase in dedicated VRAM once the latter exceeds around 960 MB on my 1 GB card(s). The opposite is true up to that point, so I would say that there is a soft threshold, beginning at about 96% of dedicated VRAM utilization, where buffering into system RAM starts. Just look at the relative change within and among those two columns.

... That is the crux of the biscuit. We don't need physics; we just need a way to see/report on what is actually occurring during game play, specifically with regard to swapping between VRAM and system RAM. I have yet to find anything that can report that. The same concept applies in performance testing server deployments: stress the system and gather performance data, including whether or not any heavy paging occurs. The same concept applies here, just to a different sub-system. I wish Performance Monitor had sensors for the GPU.

Agreed, but it sounds like Process Explorer can help (and a bit of deductive reasoning).
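The "look at the relative change within and among those two columns" step can be automated over a logged table. A sketch (function name and tuple layout are assumptions) that finds the first load step where dynamic VRAM starts growing faster than dedicated VRAM, i.e. the soft threshold described above:

```python
def soft_threshold_row(rows):
    """Given (dedicated_mb, dynamic_mb) samples ordered by increasing
    load, return the index of the first step where dynamic VRAM grows
    by more than dedicated VRAM does - the point where the driver
    appears to start buffering into system RAM. Returns None if no
    such step exists in the data."""
    for i in range(1, len(rows)):
        d_ded = rows[i][0] - rows[i - 1][0]
        d_dyn = rows[i][1] - rows[i - 1][1]
        if d_dyn > d_ded > 0:
            return i
    return None
```

Run against numbers like those in the post (dedicated creeping from ~959 to ~968 MB while dynamic jumps from ~50 to >120 MB), it flags exactly that step.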
-
ACCEPTED Ruins Clutter Improved (by raiserfx)
z929669 replied to parasyte79's topic in Skyrim LE Mods
I appreciate the idea of incorporating realistic textures into the game ... more importantly, in context with the surroundings. Agree with Starac's general principles and Besidilo's point about subjectivity and in-game, evidence-based compares. We all should probably back up our opinions (me included) with real evidence-based arguments according to a basic set of assumptions. Plausible-looking, realistic textures in terms of:

- Materials (i.e., what is it made/formed from? Prevalence in Tamriel? Natural or magical?)
- Colors (i.e., see previous)
- Surface texture (i.e., smoothness/dullness/reflection/refraction ... and reason)
- Blending with adjacent textures (i.e., see Starac)

... to name a few off the top of my head. Furthermore, anybody has the right to like what they like for whatever reason they like it; however, textures should be graded according to a fairly strict set of well-defined criteria to leave as much subjectivity as possible behind. We will incorporate specific criteria into our future testing regime to avoid seeming bias. -
I disagree that it is misleading, because it is a fact that I get no stuttering on my system even with the 2k max overhauls and then some. It only happens when I pile on more demand, which means that the 1 GB limit on my card(s) is a soft threshold, so I reason that there must be something going on regarding buffers and memory management. I simply devised a plausible theory to explain it, and until an expert on this particular interaction between VRAM and system RAM can explain the reality in lay terms, I am sticking to it (although I agree that 1.5x is probably off the mark ... 1.1x or simply "soft threshold" might be more realistic ... but still inexplicable). Exactly my point. I'd really like to get a good explanation without digressing into the conceptual barriers of theoretical physics. I'd like a simple model (which is always possible, given an expert with enough creativity to extrapolate).

