Everything posted by stoppingby4now

  1. Sucks that they changed it, I wonder what kind of ads we'd get anyways (not that I have to deal with them; yay Adblock!). The current search tool is a little lacking, so if it's not too much trouble to set up (at least for trial purposes at the moment) it might be worth seeing if Google can do better. I second the vote for a Google search locally (and I use DNT+ ... works nicely) ... ours is pretty bad. I highly recommend checking out Ghostery. It has been proven to block far more than DNT+, and IMO it's a lot easier to allow things when you really do want to (if you want to). The only gotcha is that by default it doesn't block anything, so you have to go into settings and select everything (there's a very convenient Select All button). It also cleans up Flash and Silverlight cookies.
  2. I've been experiencing the same thing while doing testing. It always re-downloads installscript.vdf for some oddball reason now. You can easily tell what file(s) were downloaded by going into your Skyrim directory and sorting on Date modified (a quick script for that check is below).
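     For reference, a quick way to script that check (a minimal sketch; the install path below is just an illustrative default, so point it at your own Skyrim directory):

       import os, time

       # Hypothetical default path; adjust to your own install location.
       skyrim_dir = r"C:\Program Files (x86)\Steam\steamapps\common\Skyrim"

       # Collect (modification time, relative path) for every file under the directory.
       entries = []
       for root, _, files in os.walk(skyrim_dir):
           for name in files:
               path = os.path.join(root, name)
               entries.append((os.path.getmtime(path), os.path.relpath(path, skyrim_dir)))

       # Newest first; anything with a very recent timestamp is a likely (re-)download.
       for mtime, rel in sorted(entries, reverse=True)[:20]:
           print(time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(mtime)), rel)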
  3. Yes, you need to have updates enabled. A brief follow-up: it appears that Steam has changed the way the Automatic Updates setting is handled. It is no longer reading the value from the registry for Skyrim, but it still works for the Creation Kit. All of this seems to be very closely related...
     • Steam Client Update - Oct. 23
     • Skyrim 1.8 beta - Oct. 24
     • Skyrim 1.8 - Nov. 1
     Somewhere in and after that time frame, users report getting a message from Steam that it must optimize game files when attempting to launch Skyrim. Needless to say, it's back to the drawing board for Skyrim Unplugged.
  4. Yeah, some people don't like being tracked, and I'm one of them. Which is why I use a script blocker and cookie manager so I can choose what gets allowed through. I also have a Flash cookie wiper, as companies started getting smart to the cookie blockers and used Flash to store information. By default it allows for more storage than cookies, and they never expire. But it is a double-edged sword: without AdSense, a lot of what Google provides for free would not be free.
  5. Google changed things for the custom search engine. Free use requires AdSense (so there will be ads); otherwise it's $100/year to remove them. We definitely like being ad-free, but as long as folks don't mind the ads for a forum-wide search via Google, I can add that in.
  6. Just judging from the videos, the graphics still look very decent. I hadn't heard of this game till now. I loved the Diablo series and that style of game play (except D3...pretty disappointing), so I really look forward to trying PoE out.
  7. New version of Skyrim Unplugged is near ready for release, and I've had a few users test it so far. I still need to finish up the help documentation, but am releasing a preliminary beta here to a wider audience before I publish it on the Nexus. I have fixed a few minor issues that have been found so far, but am now looking for a wider audience to make sure there are no surprises (or at least hopefully catch them before release if any still exist). A few key points to note:
     • File and Directory definitions for each product (for setting file permissions) are now stored in an XML file in the application root. This allows for releasing an updated XML file should a new DLC come along, and others can test by just adding a new entry manually.
     • Rather than having a one-button click to do everything (Enable or Disable), each product has two buttons to give greater control (one for toggling Automatic Updates, and one for toggling file permissions per product). Breaking these functions out resolves the issues that plagued previous versions. The Automatic Update buttons will be enabled as long as the registry keys exist for those products, and so are not tied to the Skyrim Installation Path. This means if you uninstall the product, or accidentally change the path, you can still toggle the state. The file permissions buttons will enable as long as the Skyrim Installation Path exists, and the appropriate file flagged as the main Exe (defined in GameFiles.xml) exists.
     • Upon launch for the first time, SU will attempt to determine the Skyrim installation directory from the registry: first from the Bethesda Softworks/Skyrim key, second by basing it off the Steam installation (a sketch of this lookup follows below). Any path that exists, either grabbed from the registry or entered manually, will be stored in a config.xml file and used from there on out. If the path should ever change (e.g. you move/reinstall Steam), you would need to either manually re-enter the new path, or just zero out the path to an empty string, which will cause SU to attempt to resolve it from the registry again.
     • I have had one report of a user who re-installed Skyrim but was missing the registry Value required to Disable Automatic Updates. The tell-tale sign of this is that when launching Skyrim Unplugged, Automatic Updates is set to "N/A".
     SkyrimUnplugged1-6b-0045.7z
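     For anyone curious what that registry lookup looks like, here is a minimal sketch of the same idea in Python (SU itself is not written in Python, and the exact key/value names below are assumptions based on a typical Skyrim/Steam install, not necessarily what SU reads):

       import os
       import winreg

       def read_value(root, key_path, value_name):
           # Return a registry value, or None if the key/value is missing.
           try:
               with winreg.OpenKey(root, key_path) as key:
                   return winreg.QueryValueEx(key, value_name)[0]
           except OSError:
               return None

       def find_skyrim_path():
           # 1) Try the Bethesda Softworks/Skyrim key (value name "Installed Path" is an assumption).
           for key_path in (r"SOFTWARE\Bethesda Softworks\Skyrim",
                            r"SOFTWARE\Wow6432Node\Bethesda Softworks\Skyrim"):
               path = read_value(winreg.HKEY_LOCAL_MACHINE, key_path, "Installed Path")
               if path and os.path.isdir(path):
                   return path
           # 2) Fall back to basing it off the Steam installation ("SteamPath" under HKCU).
           steam = read_value(winreg.HKEY_CURRENT_USER, r"Software\Valve\Steam", "SteamPath")
           if steam:
               guess = os.path.join(steam, "steamapps", "common", "Skyrim")
               if os.path.isdir(guess):
                   return guess
           return None  # nothing found; the path would have to be entered manually

       print(find_skyrim_path())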
  8. In everything I've been able to dig up, there is driver overhead that will end up getting reported as use in dynamic VRAM. That alone means that a baseline needs to be taken to subtract from (hence your first row as a baseline). But Process Explorer will be able to give you an accurate figure for just TESV.exe (which needs to be done, as we don't know how accurate the figure is when taking the system-wide usage as reported by GPU-z and subtracting the Windows baseline). Testing on my system (using Process Explorer), the System GPU Memory for TESV.exe only fluctuated between 22.3 and 22.6 MB when switching between no SRO and SRO, but the global use fluctuated as much as 5 MB. Now, I admit, I didn't catch the two cases where the System GPU Memory (sorry, I don't like the term Dynamic VRAM) jumped over 100 MB usage. I wholeheartedly agree that it is most likely texture data that was offloaded. But that doesn't change the fact that we need to know the frequency with which data is swapped out. Looking again at your footnotes, you use c to designate that stuttering occurred at > 0.5x for Dynamic VRAM. Am I interpreting that accurately to mean that in those two scenarios you experienced stuttering (which would correlate to a higher frequency of swapping)?
  9. 'Ordinary' amounts of system RAM are allocated during initialisation for the regular data buffers & other support data, like any other multibeam CPU task, and that's fairly small. CUDA runtime libraries, however, being proprietary & closed source, do things internally to support the detected hardware, 'gobbling' up some resources for FFT libraries etc. Some of that would be host-side code, some target-card-specific internal data structures somewhat dependent on driver threads at a lower level, and some various OS/driver-level transfer caches & mirrored images. Some of those are part of WDDM specifications, some would be for device-specific performance optimisation, and some would be CUDA-specific to make the same CUDA code work on the different driver models, as part of an 'abstraction layer'.
  10. See my original theory on the point of overflow and probability of need. Setting data aside in system RAM is still much faster than reacquiring it from the process (i.e., disk), and if the MM algorithm is well conceived, then the probability that a given piece of data will be needed at any given time should factor into the equation. You pretty much back my point. :P As an example, back when we were evaluating Vista for clients, we initially received results that didn't make sense at first in terms of system memory usage. By all accounts, we were showing that we were on the brink of needing more memory to support the applications, which didn't make sense given that nothing had changed in code. The fact is that Microsoft changed the memory sub-system in Vista to start pre-loading heavily used applications into memory (and naturally it tended to include their products even before first use), so reported use (nearly 100% memory utilization) wasn't realistic and we had to adjust. The same concept should apply to the GPU as well: hoard a texture in System RAM just in case, because it's faster to load from there than from disk. So, just because data is sitting in System RAM does not mean it will ever be required again in the current session. My point is that speculating 1.X of overall GPU-related memory use as being safe, without taking its actual need into consideration, is misleading. I can only see the frequency of swapping as the real measure tying texture use to graphical anomalies. Consequently, a high frequency of swapping should line up with over-use of higher resolution textures, which would mean anything beyond 1.0 is immediately noticeable.
  11. Follow-up to further expand on why I think the 1.X speculation is misleading. At least in terms of any texture data that is pushed to System RAM, the system may just be holding data there that it doesn't currently need (and may not need for quite some time) because room had to be made for the current scenes. There may even still be some data in VRAM that isn't being used, but that hasn't needed to be swapped out yet. I just can't buy any speculation of 1.X times combined GPU memory use being considered safe given the potential for the above scenario, particularly because any texture data currently being held in System RAM just in case may not even be used again in the current session. Once swapping has to occur, any use above 1.0 immediately has the potential to cause problems, and my point is that the key is frequency (in turn influenced by system spec).
  12. That is true that System RAM will be used to swap out data when VRAM gets full. But one thing you also have to realize with the reported data from GPU-z is that all of those figures are global figures as well (combined usage for all processes).
     I disagree that it is misleading, because it is a fact that I get no stuttering on my system even with the 2k max overhauls and then some. It only happens when I pile on more demand, which means that the 1 GB limit on the card(s) in my system is a soft threshold, so I reason that there must be something going on regarding buffers and memory management. I simply devised a plausible theory to explain it, and until an expert on this particular interaction between VRAM and system RAM can explain the reality in lay terms, I am sticking to it (although I agree that 1.5x is probably off the mark ... 1.1x or simply "soft threshold" might be more realistic ... but still inexplicable).
     The fact that you don't get stuttering in those limited tests doesn't mean anything conclusive. The total amount of GPU System RAM (or Dynamic RAM as GPU-z calls it) reported is extremely low. The key piece of data is Dedicated VRAM, which doesn't even hit your VRAM barrier in your tests. So again, without knowing if any texture data was being swapped, it's all conjecture and guessing. Beyond data being swapped to System RAM to make room, the next real piece of data that we need is how often data is having to be swapped. Even if GPU System RAM is reported at 200 MB, and Dedicated GPU RAM is topped out, that is in no way an indication of a soft limit for stutter-free play. Again, you need to know the frequency at which the data is swapped. Even when the frequency of swapping increases, the results are going to vary from system to system based on factors such as System RAM type, RAM speed, number of GPUs, northbridge chip, etc. That is why I feel it is misleading, and I still stick to that.
     Exactly my point. I'd really like to get a good explanation without digressing into the conceptual barriers of theoretical physics. I'd like a simple model (which is always possible, given an expert with enough creativity to extrapolate).
     We don't need physics, we just need a way to see/report on what is actually occurring during game play, specifically in regards to swapping between VRAM and System RAM. I have yet to find anything that can report that. The same concept applies in performance testing server deployments: stress the system and gather performance data, including whether or not any heavy paging occurs. The same concept applies here, just a different sub-system. I wish Performance Monitor had sensors for the GPU (a rough polling sketch follows below).
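     For what it's worth, a crude way to approximate that kind of monitoring on an NVIDIA card is to poll nvidia-smi during play and watch for sudden swings in memory use. A minimal sketch, with the caveat that it only samples total VRAM use over time (not per-process figures or actual swap events), so it is an approximation at best:

       import subprocess
       import time

       # Poll total VRAM use once per second; large, frequent swings between samples
       # hint that data is being shuffled in and out of VRAM.
       prev = None
       for _ in range(300):  # roughly five minutes of samples
           out = subprocess.check_output(
               ["nvidia-smi", "--query-gpu=memory.used,memory.total",
                "--format=csv,noheader,nounits"], text=True)
           # First line only (i.e. the first GPU) to keep the sketch simple.
           used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
           delta = 0 if prev is None else used - prev
           print(time.strftime("%H:%M:%S"), f"used={used} MiB / {total} MiB", f"delta={delta:+d} MiB")
           prev = used
           time.sleep(1)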
  13. In practical terms, as one exceeds 1.0x, the potential for stuttering or graphical anomalies will exist. There is no solid case that can be made for any such number beyond 1.0. In relation to the tables you posted, the shared memory is very low, which is what you want. There is always going to be some small amount of overhead imposed by the graphical sub-system. But in terms of VRAM being over-utilized and having to start relying on system RAM, the actual point at which someone will notice any glitches will depend on their system, and on how often texture data is having to be switched out (the latter will have the biggest impact, as frequency of swapping is directly related to performance loss). In my recent tests (with a 3 GB card) my reported System Shared RAM never exceeds 22 MB, and I never see a drop in Dedicated GPU memory, which tells me the card is holding on to the texture data even if it's not using it (because it has the room to do so). That's the best scenario, and it also shows that it is not managed the same way as system memory. In terms of the data in the performance data tables, I feel that suggesting 1.1x is safe is misleading (and even more so to suggest anything beyond that). The key piece of data that is missing is whether or not texture data is being swapped out between System RAM and VRAM during those tests, and at what frequency. I have yet to determine any way of detecting that, if one even exists.
  14. Z929669 set up TT for us, he's the man!
  15. I think SU is going to have a link that pops up Chili recipes. :P
  16. One of the great things about Chili is it's very simple and easily customized. I use a slow cooker, as you can just set it on low when you go to work/school and dig in when you get home (and you can make all sorts of soups this way). You can get a 2 quart Rival for around $10 at Walmart, or a 4 quart CrockPot for about $17 I think. You don't need all those fancy electronics with timers, etc. They jack up the price and introduce another failure point (I'm still using a 4 quart slow cooker that is over 2 decades old and still going strong). All you need is a 3-position switch for Off-Low-High. The following is the base that I use in a 4 quart, and you can cut everything in half for a 2 quart.
     • 2 lbs ground beef (I use Bison from my very own backyard in Colorado)
     • 2 large onions, finely chopped
     • 1 carrot, chopped
     • 1 stalk celery, chopped
     • 5 garlic cloves, minced
     • 3 cans of diced tomatoes (you can chop them yourself, I just go the easy route here as it also has plenty of juice; I use diced canned tomatoes from Muir Glen Organics, and they have all sorts of varieties from plain, to garlic, chipotle peppers, etc.)
     • 1 tbsp paprika
     • 2 tbsp ground coriander
     • 1 tbsp ground cumin
     • 1 tbsp salt
     • 1 tbsp chipotle powder (this will add some heat, so if you want it mild just go with 2 tbsp standard chili powder)
     • 1 tbsp chili powder
     The carrot and celery are optional. I use celery in almost every soup I make so always have them around, and the carrot adds more nutritional value and very slightly sweetens. Garlic is also optional, I just love garlic and it has great health benefits. And, if you like beans, you can add those as well, but Z can probably give some pointers in that regard since I don't use them.
     The main prep is to heat a large skillet over med-high heat with 1 tbsp olive oil, and saute the onions (and carrot, celery, etc.) till everything gets soft (about 5 - 6 mins). Then I'll add the ground meat and cook that till it's just browned. Then dump all of that into the slow cooker, and add in the canned diced tomatoes and remaining ingredients. Give everything a good stir, set the slow cooker on low, and let it sit for at least 1.5 hours. You don't need to go too long since the meat is already browned. But the flavors will really develop the longer you let it sit (I shoot for 6 hours).
     Beyond that, you can add/remove/tweak to your heart's content to find your perfect chili. The above will also not be a thick chili as most folks are used to, but it will be hearty in terms of the meat.
  17. Are you kidding? Bison >>>>>>>>> Cow. :P
  18. Slow cooked some Bison Chili all day. Mmmmmmmm
  19. The reason Wrye Bash doesn't remove the ESP when you clean it after it is installed is because it keeps track of files based on a checksum. When you cleaned the file, the checksum changed, so Wrye Bash no longer recognizes it as the file it installed. Your alternative approach is valid. (A rough sketch of the idea is below.)
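     A minimal sketch of that kind of checksum-based change detection (CRC32 is used here purely for illustration and may not be the exact checksum Wrye Bash uses; the file paths are hypothetical placeholders):

       import zlib

       def file_crc32(path):
           # Compute a CRC32 checksum of a file, reading it in chunks.
           crc = 0
           with open(path, "rb") as f:
               for chunk in iter(lambda: f.read(65536), b""):
                   crc = zlib.crc32(chunk, crc)
           return crc & 0xFFFFFFFF

       # Hypothetical example: checksum recorded at install time vs. the file on disk now.
       recorded = file_crc32("Installers/SomeMod/SomePlugin.esp")  # copy the installer knows about
       current = file_crc32("Data/SomePlugin.esp")                 # the cleaned file in Data
       if recorded != current:
           print("File no longer matches the installed copy (e.g. it was cleaned).")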
  20. It's interesting that a new Steam ID popped up with the name of "The Elder Scrolls V: Skyrim", and it is 72869. If you use that ID on steampowered.com it redirects to the actual ID of 72850, so I'm guessing it's a placeholder for something. My first guess is Dragonborn, but their current DLCs are in the 200000 range. Now I'm very curious.
  21. I definitely had a lot of fun with Halo 1 on my Xbox, but I haven't used it in years. Never did get it for PC. Updated the OP to state that the series is live action.
  22. I've only played the first Halo game, and haven't been into it beyond that, but there is a new live-action Halo 4 series that is being released in parts, and I have to say it's really well done. Three parts have been released so far, and they get better with each one. First part below: https://www.youtube.com/watch?v=BfJVgXBfSH8