
Disappointing Frame Rates (FPS)


Pillendreher

Question

I've upgraded from an HD 6850 with 1 GB of VRAM to an R9 380 with 4 GB, but the performance gain isn't nearly as big as I hoped. I noticed massive macro stuttering after installing the card, so I "stripped down the game" to find the culprit.

 

I started with the vanilla game plus the High Resolution Texture Pack DLC. With everything set to ultra and VSync off, I'm getting 41/81/60.3 fps during the first minute of the intro (with my FX-8320 at stock). That just seems off compared to these Skyrim benchmark results:

 

https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/35903-drei-modelle-der-radeon-r9-380-im-test.html?start=9

https://www.pcgameshardware.de/AMD-Radeon-Grafikkarte-255597/Specials/Radeon-R9-390X-Test-1162303/

 

The card itself is working fine, since my Unigine Heaven benchmark results match those of other R9 380 owners.

The FX-8320 runs at 3.7 GHz (Turbo) at 1.4 V. My fps dropped even further while running the CPU at a fixed 3.7 GHz (no Turbo, overclocked via BIOS) at 1.35 V: a solid 13% drop even though the clock speed is the same. How is that possible?

 

Before I start tackling the stutter issue again: is that kind of performance normal? We're talking about a four-year-old game with no mods whatsoever that barely hits 60 fps. My fps near Whiterun aren't any better, mind you.

Recommended Posts


/snip ...

I actually only made an account to report these two findings: the fSplitDistanceMulti= tweak and the relationship between the three primary shadow config settings (Bias, Distance and Resolution). Kind of a funny coincidence, but I still haven't had others test it to prove that I'm right. If I'm wrong and this doesn't help, I apologize in advance. But I'm pretty sure it will.

 

Like 99.997% disinfectant-sure.
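If anyone wants to compare before and after tweaking, here's a minimal sketch (Python, purely illustrative): it assumes the usual "My Games" location for SkyrimPrefs.ini and that the shadow keys sit under [Display] with names like fShadowDistance, iShadowMapResolution and fShadowBiasScale; the exact path and key names may differ on your install, so adjust them to match.

```python
# Minimal sketch: dump the shadow-related settings discussed above from SkyrimPrefs.ini.
# Assumptions: default "My Games" path and keys under [Display]; adjust for your install.
import configparser
from pathlib import Path

PREFS = Path.home() / "Documents" / "My Games" / "Skyrim" / "SkyrimPrefs.ini"  # assumed path

KEYS = [
    "fShadowDistance",       # shadow draw distance
    "iShadowMapResolution",  # shadow resolution
    "fShadowBiasScale",      # shadow bias
    "fSplitDistanceMulti",   # split distance multiplier, as named in the post above
]

parser = configparser.ConfigParser(strict=False)  # tolerate duplicate keys
parser.optionxform = str                          # keep the original case of key names
parser.read(PREFS)

if not PREFS.exists():
    print(f"Could not find {PREFS}; point PREFS at your own SkyrimPrefs.ini")

for key in KEYS:
    value = parser.get("Display", key, fallback="<not set>")
    print(f"{key} = {value}")
```

Run it once on an untouched INI and once after tweaking, and post both outputs if you want to compare.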

Please post any disputes about these INI settings over in the INI-tweaking forum or in the relevant Skyrim.ini/SkyrimPrefs.ini threads.

 

Thanks for the input!


You guys do realize FPS and monitor refresh rate are directly connected, right?

 

Motion blur is a technique that masks tearing and similar artifacts, which may be why you perceive ENB as being "good", or the experience with it as better than it actually is. That said, I don't know whether most people who use ENB get some form of blurring. The difference between the two, assuming ENB is blurring, is that motion blur can complement higher fps if you're into it, yet at high fps it's entirely unnecessary, because the frames are already moving fluidly in front of your eyes and no tearing should be happening.

 

If you have a high refresh rate and stable, high fps, you will know this is true. If you have a monitor with adaptive refresh, you will know this ~24-30 fps crap is the dumbest thing ever to reverberate through gaming culture.

 

 

- Exactly 30 fps on an older CRT monitor will look better than on a 60 Hz LCD, because of how a CRT refreshes and displays the image.
- Exactly 60 fps on a 60 Hz panel will probably look better than on a 120 Hz one, depending on driver settings, unless you have adaptive framerate control like FreeSync/G-Sync.
- 61-120 fps will only be fully displayed on a 61-120 Hz (or higher) panel, because the monitor only refreshes that many times per second; it physically cannot display more frames than its refresh rate.
- An unstable 120 fps on a 120 Hz adaptive-sync (FreeSync/G-Sync) panel looks better than an unstable 120 fps on any other 120 Hz panel, and so on.
- The cut-off is 144 Hz/144 fps, because 240 Hz+ monitors use technology that flashes the same exact image twice or more per GPU frame, which only cures part of the tearing and does nothing in terms of showing you new information. It's not showing you more than 120 unique images per second (FPS) even if you have 240 fps locked.
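To put a number on that last point, here's a tiny sketch (Python, with made-up example values) of the rule that the number of unique images you can actually see per second is capped by whichever is lower, the frame rate or the panel's true refresh rate:

```python
# Tiny sketch of the point above: unique images shown per second are capped by
# min(frame rate, true panel refresh rate). The example pairs below are made up.

def unique_images_per_second(fps: float, refresh_hz: float) -> float:
    """Unique frames shown per second: extra rendered frames are never displayed,
    and extra refreshes just repeat the previous frame."""
    return min(fps, refresh_hz)

for fps, hz in [(30, 60), (60, 60), (90, 60), (120, 144), (240, 120)]:
    shown = unique_images_per_second(fps, hz)
    print(f"{fps} fps on a {hz} Hz panel -> at most {shown} unique images/second")
```

Everything rendered above the panel's refresh rate is never shown; everything below it just means some refreshes repeat the previous frame.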

 

And, seeing as everything is assumed to run at a theoretical locked, stable 30/60/whatever fps unless it can make use of adaptive refresh, motion blur plus low fps seems really good. It isn't.

Edited by Yakuza

It gets even stupider when you realize that, at some point, the difference between one frame and the next may only be a ~30% change in the pixels your monitor displays, which is hardly noticeable in most scenes. Past a certain refresh rate/fps, you are completely limited by how different the game's successive images actually are.

 

 

Obviously, if you're spinning in circles in Skyrim with ultra-high mouse sensitivity, this doesn't matter at all, because then you will notice that sweet, buttery smoothness of high fps and a high refresh rate.

Edited by Yakuza

 

 

I don't think they realize that FPS and refresh rate are directly connected. I've tried to explain it several times, but have given up; it's not worth my time. There are several aspects to take into account: fps, refresh rate, the monitor's display lag, the display technology... It has never been "30 fps is good enough for gaming." It's good enough for movies and TV, not gaming, depending on your monitor. :)

 

Which is why I left this discussion a while back.
