GeForce GTX 970 specifications


I run my games at 1080p, and even with Skyrim with maxed-out textures, SFO and an ENB I can still pull 60 fps, so these performance issues aren't enough for me to send my card back. Besides, if I were to do so, what would I use instead? A 980? An AMD card? That's not going to work for me. Exact specs be damned, the 970 is still my best option.

 

A bigger problem would be if the card were known for killing systems, but I've had no issues of that kind.


Unless you are at higher resolutions, and/or use silly amounts of memory and expect fully fluid gameplay, it won't matter in the slightest.

 

Most current games, and even upcoming ones, are still made within console specs... so they won't be able to kill your system.

The Witcher 3 is of course going to be a PC-first game, and most likely going to have the same silly options the second one had... so there you might be able to feel a difference.

 

However, at the end of the day it is memory, so the main culprit is going to be texture load... not so much resolution.

So Neo, I am not sure what you mean by "games gobbling up VRAM"... again, you can almost never use the reported VRAM figure for anything, since it does not take caching into account... the memory will just happily sit full even though most of it isn't actively being used.
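To illustrate the point (a minimal CUDA sketch of my own, not anything from the article): the driver can only tell you how much memory is reserved, not how much is actively in use, so a "full" reading proves very little.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t freeBytes = 0, totalBytes = 0;
    // cudaMemGetInfo reports free/total memory as the driver sees it.
    // "Used" (total - free) includes caches and idle allocations, not
    // just the textures a game is actively sampling from.
    cudaMemGetInfo(&freeBytes, &totalBytes);
    printf("Reserved: %zu MiB of %zu MiB\n",
           (totalBytes - freeBytes) >> 20, totalBytes >> 20);
    return 0;
}
```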

 

The reason people even noticed this at all is cases like Skyrim, where you can load such silly amounts of textures that you spill over into the slow part of the memory and then have to wait... swapping textures in and out at low bandwidth... that slowdown in particular would be noticeable.

However, as the original article stated... it would kinda require one to seek out the issue to find it at all.
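For the curious: the way people did seek it out was with chunk-by-chunk bandwidth probes. A minimal CUDA sketch in that spirit (the 256 MiB chunk size is my own choice, and real tools are more careful about how the driver places allocations):

```cpp
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 256ull << 20;   // 256 MiB per chunk (my choice)
    std::vector<char*> chunks;

    // Grab the card's memory one chunk at a time until allocation fails.
    char* p = nullptr;
    while (cudaMalloc(&p, chunk) == cudaSuccess)
        chunks.push_back(p);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time a device-to-device copy inside each chunk. On a 970, chunks
    // that land in the final 512 MiB segment should report markedly
    // lower bandwidth than the first ~3.5 GiB.
    for (size_t i = 0; i < chunks.size(); ++i) {
        cudaEventRecord(start);
        cudaMemcpy(chunks[i] + chunk / 2, chunks[i], chunk / 2,
                   cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // The copy reads and writes chunk/2 bytes, i.e. chunk bytes moved.
        printf("chunk %2zu: %6.1f GB/s\n", i, (chunk / 1e9) / (ms / 1e3));
    }

    for (char* c : chunks) cudaFree(c);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}
```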

 

I would highly doubt that any game... even The Witcher 3... will be able to provoke slowdowns because of this... most likely some overly epic DoF or other such effect will kick in way before that, which again is not related to memory.

 

 

As a final note... increased power consumption does not only mean more heat... it also means more electricity used, which means the card actually costs more to run. That efficiency is the main reason this chip is so impressive compared to the previous generation, and it's still the main selling point.
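Back-of-the-envelope on the running cost (my own assumed numbers, not from the thread: the 970's 145 W TDP against a 230 W previous-generation card, $0.15/kWh, four hours of gaming a day):

```cpp
#include <cstdio>

int main() {
    // Assumed figures for illustration only.
    const double watts970    = 145.0;  // GTX 970 TDP
    const double wattsOld    = 230.0;  // e.g. a GTX 770-class Kepler card
    const double pricePerKwh = 0.15;   // USD, sample rate
    const double hoursPerDay = 4.0;    // gaming time

    double kwhSaved = (wattsOld - watts970) / 1000.0 * hoursPerDay * 365.0;
    printf("~%.0f kWh/year saved, about $%.0f/year at $%.2f/kWh\n",
           kwhSaved, kwhSaved * pricePerKwh, pricePerKwh);
    return 0;
}
```

At those rates it's roughly $19 a year; not huge, but it adds up over a card's lifetime, on top of the reduced heat and noise.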

 

Hope that all made sense :) 


Okay, this might be a bit off topic, but I am curious... So you are telling me that Shadow of Mordor, at 1600p with AA on, will currently eat past 3 GB of VRAM and cause FPS drops and massive slowdowns?

 

That is just... wow. Are we talking 16x multisampling or what? (In which case I have to ask... why would you do that at 1600p? It would make almost no visual difference! Heck, even 8x would probably be overkill.)

Based on what I have seen of the game, the majority of its textures aren't even full 2K, since it is somewhat polished and optimized. So most of the extra cost would be purely rendering the same pixels up to 16 times... it just does not add up to that much memory actively being used. At least not in my head! But oh well, hard facts beat me down on a daily basis! :)
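For what it's worth, the raw framebuffer math backs that intuition up (a rough sketch; real renderers pile G-buffers, shadow maps and driver overhead on top of this):

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    // Rough framebuffer cost at 2560x1600 ("1600p") with MSAA.
    // Assumes RGBA8 color plus 32-bit depth/stencil per sample.
    const long long pixels = 2560LL * 1600LL;
    for (int samples : {1, 4, 8, 16}) {
        long long bytes = pixels * samples * (4 + 4);
        printf("%2dx MSAA: ~%lld MiB for color+depth\n",
               samples, bytes >> 20);
    }
    return 0;
}
```

Even 16x only accounts for roughly half a gigabyte, so if the game really passes 3 GB, it would be the textures doing the eating, not the AA.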

 

Anyway... of course, if one can get a refund here, that is an option... now the only question, Neo, is why you did not just go for the top models if you wanted future-proof(ish) grade hardware! :)


On release, the only known difference between the cards was some shader units. Now it's a memory difference too. The 4 GB of VRAM WAS the future-proofing part...

 

Basically, what you see when you get past the limit is frame spikes / stutter: the average frame rate looks good, but it's very hitchy and not very enjoyable. Funnily enough, I was fooled because I thought it was an SLI driver issue, but in reality it was this.
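That is also why it's so easy to be fooled: average FPS hides exactly this failure mode, while frame-time percentiles expose it. A minimal sketch with invented numbers:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    // Invented frame times (ms): mostly 14 ms with occasional 80 ms
    // spikes, the pattern you get when textures swap over the slow
    // 0.5 GB segment.
    std::vector<double> ms(100, 14.0);
    for (int i = 0; i < 100; i += 20) ms[i] = 80.0;

    double total = 0.0;
    for (double t : ms) total += t;
    double avgFps = 1000.0 * ms.size() / total;

    std::sort(ms.begin(), ms.end());
    double p99 = ms[static_cast<size_t>(ms.size() * 0.99) - 1];

    // Average FPS still looks fine; the 99th-percentile frame time
    // is what actually reflects the stutter.
    printf("avg: %.0f FPS, 99th percentile frame: %.0f ms (%.1f FPS)\n",
           avgFps, p99, 1000.0 / p99);
    return 0;
}
```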


I know Boris has made similar comments both one way and the other before, but it seems that he will be dropping any Nvidia-specific optimisations in his ENB in favour of AMD. For those of us with Nvidia cards, it would be advisable to retain the last known optimised DLL in case of poor performance going forward.



Good advice. I just downloaded the latest DLL.

 

If AMD releases the 300 series within the next few weeks, Nvidia will be hurting... Link: https://www.tweaktown.com/news/43239/amd-teases-new-fix3r-video-radeon-r9-300-series/index.html


I'm trying to decide what to do if/when I get my refund. I don't really feel like paying the premium for two 980s (+$440 on top of my initial investment of $750, a whopping 60% cost increase), though that is my only "real" option for the same or better performance without the memory/stutter problems. It also hands Nvidia more of my money for being douches.

 

That being said, the 290X is old tech by comparison (sorry, AMD...). The 300 series is interesting and would be very tempting if it were out. Unfortunately, this situation leaves me in a bit of a pickle, and I do not want to basically stop gaming for a few months and/or buy a cheap loaner card I'll replace before long. I no longer have my original 4 GB 670s; both are sold.

 

Anyone with any sage advice? No, keeping them with the 3.5 GB gimp is not an option, as I know for a fact that TW3 and other 2015 games will use all that VRAM.


Do you really need to run SLI? Can you return one at a time? My son just bought a single EVGA GTX 980 4 GB Superclocked model and he is quite pleased with it ($560 on Newegg, not including a $10 rebate). He is obsessive about having everything at max settings and getting 60 fps. That being said, his monitor does not even do 1080p; it is an old Dell Ultra I gave him. I know you're running higher than 1080p. Just curious, what monitor(s) do you use?
