FFXIV Exploration vs Character Benchmark Differences

#1 Aug 02 2013 at 8:37 PM Rating: Excellent
Sage
***
1,675 posts
Ok, I found the old benchmark from 7/3 or somewhere around there and decided to run it versus the new one, back to back.

They are both full screen and at their respective max settings:

(I'm also running the latest Nvidia drivers)

First the Old Exploration:

FINAL FANTASY XIV: A Realm Reborn Official Benchmark (Exploration)
Tested on:8/2/2013 6:33:55 PM
Score:4414
Average Framerate:36.616
Performance:High
-Easily capable of running the game. Should perform well, even at higher resolutions.

Screen Size: 1366x768
Screen Mode: Full Screen
Graphics Presets: Maximum
General
-Improve overall graphic quality. : Enabled
-Disable rendering of objects when not visible. (Occlusion Culling) : Disabled
-Use low-detail models on distant objects to increase performance. (LOD) : Disabled
-Cache LOD data only when necessary. (LOD Streaming) : Disabled
-Smooth edges. (Anti-aliasing) : Enabled
-Increase transparent lighting quality. : Enabled
-Grass Quality : High
Shadows
-Use low-detail models on shadows to increase performance. (LOD) : Disabled
-Display : All
-Shadow Resolution : High: 2048 pixels
-Shadow Cascading : High
-Shadow Softening : High
Texture Detail
-Texture Filtering : High
-Anisotropic Filtering : High
Effects
-Naturally darken the edges of the screen. (Limb Darkening) : Enabled
-Blur the graphics around an object in motion. (Radial Blur) : Enabled
-Effects While in Motion : Display All
-Screen Space Ambient Occlusion : High
-Glare : Normal
Cinematic Cutscenes
-Enable depth of field. : Enabled

System:
Windows 7 Ultimate 64-bit (6.1, Build 7601) Service Pack 1 (7601.win7sp1_gdr.130318-1533)
Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz
6143.176MB
NVIDIA GeForce GTX 550 Ti(VRAM 3793 MB) 9.18.0013.2049

Benchmark results do not provide any guarantee FINAL FANTASY XIV: A Realm Reborn will run on your system.

FINAL FANTASY XIV: A Realm Reborn Official Website http://na.finalfantasyxiv.com
(C) 2010-2013 SQUARE ENIX CO., LTD. All Rights Reserved.

Tweet
http://sqex.to/ffxiv_bench_na #FFXIV Score:4414 1366x768 Maximum Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz NVIDIA GeForce GTX 550 Ti


Next the Character version:

FINAL FANTASY XIV: A Realm Reborn Official Benchmark (Character Creation)
Tested on:8/2/2013 6:42:12 PM
Score:7036
Average Framerate:60.814
Performance:Extremely High
-Easily capable of running the game on the highest settings.

Screen Size: 1366x768
Screen Mode: Full Screen
Graphics Presets: Maximum
General
-Enable HDR rendering and improve overall graphic quality. : Enabled
-Disable rendering of objects when not visible. (Occlusion Culling) : Disabled
-Use low-detail models on distant objects. (LOD) : Disabled
-Cache LOD data only when necessary. (LOD Streaming) : Disabled
-Real-time Reflections : High
-Edge Smoothing (Anti-aliasing) : FXAA
-Transparent Lighting Quality : High
-Grass Quality : High
Shadows
-Self : Display
-Other NPCs : Display
Shadow Quality
-Use low-detail models on shadows. (LOD) : Disabled
-Shadow Resolution : High - 2048p
-Shadow Cascading : Best
-Shadow Softening : Strong
Texture Detail
-Texture Filtering : Anisotropic
-Anisotropic Filtering : x16
Movement Physics
-Self : Full
-Other NPCs : Full
Effects
-Naturally darken the edges of the screen. (Limb Darkening) : Enabled
-Blur the graphics around an object in motion. (Radial Blur) : Enabled
-Screen Space Ambient Occlusion : Strong
-Glare : Normal
Cinematic Cutscenes
-Enable depth of field. : Enabled

System:
Windows 7 Ultimate 64-bit (6.1, Build 7601) Service Pack 1 (7601.win7sp1_gdr.130318-1533)
Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz
6143.176MB
NVIDIA GeForce GTX 550 Ti(VRAM 3793 MB) 9.18.0013.2049

Benchmark results do not provide any guarantee FINAL FANTASY XIV: A Realm Reborn will run on your system.

FINAL FANTASY XIV: A Realm Reborn Official Website http://na.finalfantasyxiv.com
(C) 2010-2013 SQUARE ENIX CO., LTD. All Rights Reserved.

Tweet
http://sqex.to/ffxiv_bench_na #FFXIV Score:7036 1366x768 Maximum Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz NVIDIA GeForce GTX 550 Ti


Then I ran the Character version at Max, but also MAX through Nvidia Control Panel and got this:

FINAL FANTASY XIV: A Realm Reborn Official Benchmark (Character Creation)
Tested on:8/2/2013 5:11:31 PM
Score:5709
Average Framerate:46.914
Performance:Very High
-Easily capable of running the game. Should perform exceptionally well, even at higher resolutions.

Screen Size: 1366x768
Screen Mode: Full Screen
Graphics Presets: Maximum
General
-Enable HDR rendering and improve overall graphic quality. : Enabled
-Disable rendering of objects when not visible. (Occlusion Culling) : Disabled
-Use low-detail models on distant objects. (LOD) : Disabled
-Cache LOD data only when necessary. (LOD Streaming) : Disabled
-Real-time Reflections : High
-Edge Smoothing (Anti-aliasing) : FXAA
-Transparent Lighting Quality : High
-Grass Quality : High
Shadows
-Self : Display
-Other NPCs : Display
Shadow Quality
-Use low-detail models on shadows. (LOD) : Disabled
-Shadow Resolution : High - 2048p
-Shadow Cascading : Best
-Shadow Softening : Strong
Texture Detail
-Texture Filtering : Anisotropic
-Anisotropic Filtering : x16
Movement Physics
-Self : Full
-Other NPCs : Full
Effects
-Naturally darken the edges of the screen. (Limb Darkening) : Enabled
-Blur the graphics around an object in motion. (Radial Blur) : Enabled
-Screen Space Ambient Occlusion : Strong
-Glare : Normal
Cinematic Cutscenes
-Enable depth of field. : Enabled

System:
Windows 7 Ultimate 64-bit (6.1, Build 7601) Service Pack 1 (7601.win7sp1_gdr.130318-1533)
Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz
6143.176MB
NVIDIA GeForce GTX 550 Ti(VRAM 3793 MB) 9.18.0013.2049

Benchmark results do not provide any guarantee FINAL FANTASY XIV: A Realm Reborn will run on your system.

FINAL FANTASY XIV: A Realm Reborn Official Website http://na.finalfantasyxiv.com
(C) 2010-2013 SQUARE ENIX CO., LTD. All Rights Reserved.

Tweet
http://sqex.to/ffxiv_bench_na #FFXIV Score:5709 1366x768 Maximum Intel(R) Core(TM)2 Quad CPU Q9550 @ 2.83GHz NVIDIA GeForce GTX 550 Ti


So in short, the old benchmark was about 4400, the new benchmark was about 7000, and the new benchmark with the control panel settings was about 5700.

What does this mean Kierk? I don't know, I just thought it was interesting...but wait!

---

There are a few differences between both benchmarks, some subtle and some not:

I think the new benchmark is a bit longer. The last scene is noticeably different: the BLM plants the staff in the ground (he doesn't do this in the old bench, AND in the old bench the meteor is bigger), other camera angles are a bit different, the effects are toned WAY down or changed, the DoF is less aggressive, some textures are different, and the AA isn't as good as in the old benchmark. There are also some gamma and lighting differences as well.

Comparatively, the old benchmark is a bit prettier. HOWEVER, with the "upgrades" through the Nvidia control panel, the new one looks just as good AND it's running 1700 points higher than the old bench.

Lastly, I ran the old bench through the Nvidia control panel settings and got 4469. That's almost no change.

----

Conclusion?

There's no doubt we're getting better FPS, and I think it's an obvious mix of the above. I think SE cut some corners graphically, but a 35% increase is a big deal, and they cut them in a way where I really didn't notice a difference. I'd consider this cutting the fat.

More interestingly, when you compare the two Nvidia control panel runs there's still a 20%+ increase in score, meaning SE made some room, I think with textures, to get this bump.
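Just to put hard numbers on the gaps, here's the quick arithmetic on the four scores posted in this thread (4414, 7036, 5709, and the 4469 from the control-panel run of the old bench):

```python
# Scores copied from the benchmark results posted above.
old_default = 4414  # old Exploration bench, app settings only
new_default = 7036  # new Character bench, app settings only
new_forced  = 5709  # new Character bench, NVCP forced up
old_forced  = 4469  # old Exploration bench, NVCP forced up

def pct_gain(before, after):
    """Percent increase going from one score to another."""
    return round((after - before) / before * 100, 1)

print(pct_gain(old_default, new_default))  # default vs default -> 59.4
print(pct_gain(old_default, new_forced))   # forced new vs default old -> 29.3
print(pct_gain(old_forced, new_forced))    # forced vs forced -> 27.7
```

So the "like for like" forced-settings comparison is actually closer to a 28% gain than 20%.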
#2 Aug 02 2013 at 9:21 PM Rating: Decent
*
233 posts
I noticed this too, but I didn't keep my numbers so it was just a gut feeling. Thanks for posting, I'm not going crazy, I guess.
#3 Aug 02 2013 at 9:49 PM Rating: Decent
****
4,175 posts
Kierk wrote:
Then I ran the Character version at Max, but also MAX through Nvidia Control Panel and got this:

Please explain what is meant by MAX through nVidia control panel. There is no default max setting and the options you're afforded vary depending on the hardware and configuration you're running.


____________________________
Rinsui wrote:
Only hips + boobs all day and hips + boobs all over my icecream

HaibaneRenmei wrote:
30 bucks is almost free

cocodojo wrote:
Its personal preference and all, but yes we need to educate WoW players that this is OUR game, these are Characters and not Toons. Time to beat that into them one at a time.
#4 Aug 02 2013 at 10:30 PM Rating: Good
Sage
***
1,675 posts
FilthMcNasty wrote:
Kierk wrote:
Then I ran the Character version at Max, but also MAX through Nvidia Control Panel and got this:

Please explain what is meant by MAX through nVidia control panel. There is no default max setting and the options you're afforded vary depending on the hardware and configuration you're running.




Sure thing, these are the values for my 550ti

Ambient Occlusion: N/A
Anisotropic Filtering: x16
Antialiasing - Gamma Correction: On
Antialiasing - Mode: Override application
Antialiasing - Setting: CSAAx32
Antialiasing - Transparency: Supersample x8
CUDA - GPUs: N/A
Maximum Pre-Rendered Frames: 3
Multi-display/Mixed-GPU Acceleration: Single Display Performance Mode
Power Management Mode: Prefer maximum performance
Texture Filtering - Anisotropic Sample Optimization: On
Texture Filtering - Negative LOD Bias: Clamp
Texture Filtering - Quality: High Quality
Texture Filtering - Trilinear Optimization: On
Threaded Optimization: On
Triple Buffering: On
Vertical Sync: Adaptive
#5 Aug 03 2013 at 12:08 AM Rating: Decent
****
4,175 posts
Kierk wrote:
FilthMcNasty wrote:
Kierk wrote:
Then I ran the Character version at Max, but also MAX through Nvidia Control Panel and got this:

Please explain what is meant by MAX through nVidia control panel. There is no default max setting and the options you're afforded vary depending on the hardware and configuration you're running.




Sure thing, these are the values for my 550ti

Ambient Occlusion: N/A
Anisotropic Filtering: x16
Antialiasing - Gamma Correction: On
Antialiasing - Mode: Override application
Antialiasing - Setting: CSAAx32
Antialiasing - Transparency: Supersample x8
CUDA - GPUs: N/A
Maximum Pre-Rendered Frames: 3
Multi-display/Mixed-GPU Acceleration: Single Display Performance Mode
Power Management Mode: Prefer maximum performance
Texture Filtering - Anisotropic Sample Optimization: On
Texture Filtering - Negative LOD Bias: Clamp
Texture Filtering - Quality: High Quality
Texture Filtering - Trilinear Optimization: On
Threaded Optimization: On
Triple Buffering: On
Vertical Sync: Adaptive

Wait, you're only running one GPU? You don't need triple buffering or VSync as they're SLI related.

Texture Filtering - Negative LOD Bias should be set to its default 'Allow'. The reason is that it's DX9 tech and you'd want to know if the game is attempting to use a negative LOD bias to improve image quality. The only reason to adjust it from default is to change only that setting and compare the benchmarks' texture quality. Threaded Optimization can be left at Auto since nearly nothing makes use of it.

Different setups will have different options so this really isn't a MAX value. Folks running SLI can choose to have their second GPU dedicated to AA and the cap is at least twice as high as what you have.

I guess it's good to know the settings, but I don't know that it deserved its own thread. If you're trying to get to the bottom of why there is such a difference in benchmark performance, then you're looking under the wrong rocks, so to speak. The gaps in transitions and slightly altered scenes aren't going to contribute as much as people are reporting. I'm taking people at their word when they post their new and previous scores, but some of them are reporting up to 100% gains in performance with no changes to hardware OR nVidia settings. Doesn't add up.

#6 Aug 03 2013 at 12:27 PM Rating: Good
Sage
***
1,675 posts
FilthMcNasty wrote:
Kierk wrote:
FilthMcNasty wrote:
Kierk wrote:
Then I ran the Character version at Max, but also MAX through Nvidia Control Panel and got this:

Please explain what is meant by MAX through nVidia control panel. There is no default max setting and the options you're afforded vary depending on the hardware and configuration you're running.




Sure thing, these are the values for my 550ti

Ambient Occlusion: N/A
Anisotropic Filtering: x16
Antialiasing - Gamma Correction: On
Antialiasing - Mode: Override application
Antialiasing - Setting: CSAAx32
Antialiasing - Transparency: Supersample x8
CUDA - GPUs: N/A
Maximum Pre-Rendered Frames: 3
Multi-display/Mixed-GPU Acceleration: Single Display Performance Mode
Power Management Mode: Prefer maximum performance
Texture Filtering - Anisotropic Sample Optimization: On
Texture Filtering - Negative LOD Bias: Clamp
Texture Filtering - Quality: High Quality
Texture Filtering - Trilinear Optimization: On
Threaded Optimization: On
Triple Buffering: On
Vertical Sync: Adaptive

Wait, you're only running one GPU? You don't need triple buffering or VSync as they're SLI related.

Texture Filtering - Negative LOD Bias should be set to its default 'Allow'. The reason is that it's DX9 tech and you'd want to know if the game is attempting to use a negative LOD bias to improve image quality. The only reason to adjust it from default is to change only that setting and compare the benchmarks' texture quality. Threaded Optimization can be left at Auto since nearly nothing makes use of it.

Different setups will have different options so this really isn't a MAX value. Folks running SLI can choose to have their second GPU dedicated to AA and the cap is at least twice as high as what you have.

I guess it's good to know the settings, but I don't know that it deserved its own thread. If you're trying to get to the bottom of why there is such a difference in benchmark performance, then you're looking under the wrong rocks, so to speak. The gaps in transitions and slightly altered scenes aren't going to contribute as much as people are reporting. I'm taking people at their word when they post their new and previous scores, but some of them are reporting up to 100% gains in performance with no changes to hardware OR nVidia settings. Doesn't add up.



VSync can be used with single GPUs; this has always been the case, and SLI has nothing to do with it.

If Anisotropic Filtering is high, then Texture Filtering - Negative LOD Bias should be set to Clamp, AFAIK.

These are the "max" settings for my card. I was looking to make the game look as good as possible, and I did that.

----

The point of this post is discussion and getting to the bottom of these differences in scores. I benchmarked in my controlled settings and my controlled settings only.

These are the things that I've noticed.

If people want to expound on why the scores are different then they can do so in this thread. If not, they won't. I didn't want to muddle the benchmark thread with this.

You say I'm looking under the wrong rocks, but guess what? I just turned over a huge chunk of the pile. If you or others want to do some digging, be my guest.
#7 Aug 03 2013 at 1:46 PM Rating: Good
****
4,175 posts
Kierk wrote:
VSync can be used with single GPUs this has always been the case, SLI has nothing to do with it.

If Anisotropic Filtering is high then Texture Filtering - LOD should be set to Clamp AFAIK.

These are the "max" settings for my card. I was looking to make the game look as good as possible, and I did that.

----

The point of this post is discussion and getting to the bottom of these differences in scores. I benchmarked in my controlled settings and my controlled settings only.

These are the things that I've noticed.

If people want to expound on why the scores are different then they can do so in this thread. If not, they won't. I didn't want to muddle the benchmark thread with this.

You say I'm looking under the wrong rocks, but guess what? I just turned over a huge chunk of the pile. If you or others want to do some digging, be my guest.


VSync deals with refresh issues that cause screen tearing. That generally happens when you have two GPUs feeding frames, but I guess it could happen with one. It doesn't affect performance, at least not on the level we're seeing here, and is something you really only need to enable if you're having tearing problems.

Negative LOD bias is something usually dealt with by the program. If you clamp it, then it basically keeps it where you have it set and the program won't adjust it. This is not always a good or bad thing, but you'd have to actually measure results in the program prior to making the call to allow it or clamp it. Here is an example image. The image on the left is clamped at zero and the image on the right has a negative LOD. Look at the tracks in the road and you'll see that allowing a negative LOD results in a better quality image (at least to me it looks better). Again, this is case by case, so you'd have to run the benchmark with clamp and with allow and try to discern whether or not one gives a clear graphical advantage over the other.

I understand what the point of your testing is. You basically took and ran with something I brought up in several other threads. No big deal; it doesn't bother me if the credit for actually figuring it out goes to someone else. What you need to understand is that when testing for something like this, you need to remove as many variables as possible. The scores change regardless of your nVidia control panel settings, so tweaking them isn't going to provide any insight.

The only exception to this is that I noticed that in the first benchmark, the AA option wasn't clearly labeled as FXAA as it is in the second. The control panel only really comes into play because it can be used to force AA settings, so you could simulate what the benchmark might have been if it was originally using a different form of AA.
#8 Aug 03 2013 at 3:25 PM Rating: Good
Sage
***
1,675 posts
FilthMcNasty wrote:


VSync deals with refresh issues that cause screen tearing. That generally happens when you have two GPUs feeding frames, but I guess it could happen with one. It doesn't affect performance, at least not on the level we're seeing here, and is something you really only need to enable if you're having tearing problems.

Negative LOD bias is something usually dealt with by the program. If you clamp it, then it basically keeps it where you have it set and the program won't adjust it. This is not always a good or bad thing, but you'd have to actually measure results in the program prior to making the call to allow it or clamp it. Here is an example image. The image on the left is clamped at zero and the image on the right has a negative LOD. Look at the tracks in the road and you'll see that allowing a negative LOD results in better quality image(at least to me it looks better). Again, this is something that is case by case so you'd have to run the benchmark with clamp and allow and try to discern whether or not one gives a clear graphical advantage over the other.

I understand what the point of your testing is. You basically took and ran with something I brought up in several other threads. No big deal as the credit for actually figuring it out going to anyone else doesn't bother me. What you need to understand is that when testing for something like this, you need to remove as many variables as possible. The scores change regardless of your nVidia control panel settings so tweaking them isn't going to provide any insight.

The only exception to this is that I noticed that in the first benchmark, the AA option wasn't clearly labeled as FXAA as it is in the second. The control panel only really comes into play because it can be used to force AA settings, so you could simulate what the benchmark might have been if it was originally using a different form of AA.


I always have tearing issues. I don't usually turn on VSync, but when I do, it works.

I'll have to mess around with the Negative LOD more, but I'm going to assume it's not going to be a big deal.

---

In my original post I went with your first assumption and forced the highest AA I could in the new bench vs the old bench with its unlabeled AA, and there's still a 20%+ increase, even with other settings increased. When I get time I'm just going to force the highest AA I can and let everything else be app controlled...
#9 Aug 03 2013 at 4:09 PM Rating: Decent
****
4,175 posts
Kierk wrote:
In my original post I went with your first assumption and forced the highest AA I could in the new bench vs the old bench with its unlabeled AA, and there's still a 20%+ increase, even with other settings increased. When I get time I'm just going to force the highest AA I can and let everything else be app controlled...


You might edit that post to reflect this then. To me it reads that you ran the original 2.0 benchmark, ran the new creation bench and then ran the new bench again with all of the adjusted options. If we're trying to peg AA as the culprit then you should only be adjusting that and not the rest of what is listed.

Loop the original 2.0 bench for a score closer to average. Next loop the creation bench with the same settings for an average there. Lastly, loop the creation bench with forced AA and compare to the original score.
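The loop-and-average procedure described above boils down to something like this sketch. The repeat scores here are made-up placeholders (the benchmark reports one score per run, so you'd jot each one down yourself):

```python
# Sketch of averaging repeated benchmark runs to smooth out run-to-run noise.
# The score lists are hypothetical, not real results from this thread.
from statistics import mean, stdev

def summarize(scores):
    """Average the runs and report the run-to-run spread (sample stdev)."""
    return {"mean": round(mean(scores), 1), "spread": round(stdev(scores), 1)}

old_runs = [4414, 4390, 4450]   # hypothetical repeats of the old bench
new_runs = [7036, 6980, 7100]   # hypothetical repeats of the new bench

print(summarize(old_runs))
print(summarize(new_runs))
```

If the spread is small relative to the gap between the two means, the gap is real and not just variance between individual runs.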

I'm basically just worried that FXAA being considered 'maximum' settings means that we won't have better graphical options in game at launch. It still looks good enough and the gameplay is more important to me honestly, but it's something to take into account.


#10 Aug 03 2013 at 5:11 PM Rating: Good
Sage
***
1,675 posts
FilthMcNasty wrote:

I'm basically just worried that FXAA being considered 'maximum' settings means that we won't have better graphical options in game at launch. It still looks good enough and the gameplay is more important to me honestly, but it's something to take into account.



That's the thing.

It's looking like if you want the game to look as good as possible (which is about the same as the old benchmark sans the "artistic" differences) you're going to have to tweak the settings in the control panel or catalyst (or whatever AMD uses).

-----

Why SE didn't include the option for "better" AA as well as other things in their settings, I don't know.

Like you say, I don't really care about graphics either and I hardly have a monster rig, so those of you that do have one, just note that you can make the game quite a bit better looking by going through the control panel.



#11 Aug 03 2013 at 5:17 PM Rating: Good
****
6,899 posts
Kierk wrote:
Like you say, I don't really care about graphics either and I hardly have a monster rig, so those of you that do have one, just note that you can make the game quite a bit better looking by going through the control panel.


I think this is the important thing to take out of this debate. Thanks for taking the time to do this and test it out, it actually does interest me. I don't run a monster rig either, but I hit just under 10k on max in fullscreen on my desktop, so I feel like I could probably still tweak my AA in Catalyst Control Center (yep, that's what it is for AMD) and make a noticeable difference in quality without really affecting performance much. I really can't comment on the rest of the argument, as I just don't know enough about graphics processing to add to the discussion.
#12 Aug 03 2013 at 6:29 PM Rating: Good
***
1,163 posts
This thread has gone... WAYYYYY over my head. I hope someone posts the cliff notes at the bottom for me. Thanks.
#13 Aug 03 2013 at 6:54 PM Rating: Decent
****
4,175 posts
WFOAssassin wrote:
This thread has gone... WAYYYYY over my head. I hope someone posts the cliff notes at the bottom for me. Thanks.

Basically, the idea is that people using the exact same hardware and settings are seeing extreme (in some cases 100%) performance increases in the new benchmark with character creation compared to the older benchmark without it.

I mentioned in the threads about benchmark scores that it may be due to SE removing or changing Anti-aliasing, referred to here as AA. AA basically renders the images in a way that cuts down on jagged lines and edges in graphics. Click me for an example. The images are rendered in a higher resolution and shrunk down to give them a smoother appearance.

I wanted to understand exactly what it was that is causing the increase for a few reasons. If they're removing AA from the equation then people would be stuck with gimped quality regardless of how capable their systems are of handling it. If it's something else, then there really should be an explanation as to why the same hardware that was previously only good for medium settings is now capable of running the game at extremely high settings.
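The "rendered in a higher resolution and shrunk down" description above is classic supersampling; FXAA instead runs a cheap blur-like pass on the finished frame. The shrink-down step can be illustrated with a toy box-filter downsample on made-up grayscale values (purely illustrative, not what either benchmark actually does internally):

```python
def downsample_2x(img):
    """Average each 2x2 block of a grayscale image (list of lists of 0-255
    values) into one pixel -- the 'shrink down' step of supersampled AA."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
# ...picks up an intermediate gray along the jagged edge when shrunk:
print(downsample_2x(hi_res))  # -> [[0.0, 255.0], [127.5, 255.0]]
```

That intermediate 127.5 is what smooths the staircase look; the cost is rendering 4x the pixels, which is why dropping it for FXAA buys so much performance.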
#14 Aug 03 2013 at 8:57 PM Rating: Good
****
6,899 posts
FilthMcNasty wrote:
I wanted to understand exactly what it was that is causing the increase for a few reasons. If they're removing AA from the equation then people would be stuck with gimped quality regardless of how capable their systems are of handling it. If it's something else, then there really should be an explanation as to why the same hardware that was previously only good for medium settings is now capable of running the game at extremely high settings.


I do think part of it can be attributed to the reduced battle animations, as it's relatively clear in a side by side analysis that they reduced the overall magnitude of them. Multiply that out by the number of chars using them in the benchmark and I think that will definitely be part of the difference. I'm not savvy enough at seeing the minute differences in AA to really determine how much of an impact that might have made, although I can easily see how that could be a relatively large factor as well, which seems to be backed up by Kierk's findings.
#15 Aug 03 2013 at 10:48 PM Rating: Decent
****
4,175 posts
BartelX wrote:
FilthMcNasty wrote:
I wanted to understand exactly what it was that is causing the increase for a few reasons. If they're removing AA from the equation then people would be stuck with gimped quality regardless of how capable their systems are of handling it. If it's something else, then there really should be an explanation as to why the same hardware that was previously only good for medium settings is now capable of running the game at extremely high settings.


I do think part of it can be attributed to the reduced battle animations, as it's relatively clear in a side by side analysis that they reduced the overall magnitude of them. Multiply that out by the number of chars using them in the benchmark and I think that will definitely be part of the difference. I'm not savvy enough at seeing the minute differences in AA to really determine how much of an impact that might have made, although I can easily see how that could be a relatively large factor as well, which seems to be backed up by Kierk's findings.


Thing is, running the old bench I didn't have any periods of drag during those instances. The few times my ticker did drag along were due to spots with a lot of particles (when the chocobo are running through the dusty paths). It's still not enough to explain such a massive performance gain, and I think that's why the over-exaggerated spell effects made it into the game in the first place.
#16 Aug 04 2013 at 2:50 AM Rating: Excellent
Sage
***
1,675 posts
I really wanted to do a side by side, but someone else already beat me to it... Benchmark comparison

It really doesn't help whatever argument I had, but you can see some of the differences I glanced over earlier.

Some things look better due to FXAA "cheating" and blurring the grass, for example, but in other areas you can see that some of the textures don't look as good, and you can see where DoF was overused in the old bench.




#17 Aug 04 2013 at 4:52 AM Rating: Decent
****
4,175 posts
Kierk wrote:
I really wanted to do a side by side: But someone else already beat me to it...Benchmark comparison

It really doesn't help whatever argument I had, but you can see some of the differences I glanced over earlier.

Some things look better due to FXAA "cheating" and blurring the grass, for example, but in other areas you can see that some of the textures don't look as good, and you can see where DoF was overused in the old bench.


When I asked in the other benchmark threads people didn't seem to think there was much of a difference in quality between the two, but I think the side by side shows it a lot more clearly. It's really inconsistent. Good find.

I'm on an AMD 1090T with a GTX 670, both at stock clocks. He's running an unlocked 3770 and a GTX 680. I ran the new benchmark with the same program settings on a single GPU and default settings in the nVidia control panel. I got this. He should have spanked me.
#18 Aug 04 2013 at 6:21 AM Rating: Good
**
728 posts
WFOAssassin wrote:
This thread has gone... WAYYYYY over my head. I hope someone posts the cliff notes at the bottom for me. Thanks.


Cliff Notes: Turn on V-Sync if you get more than 60 FPS and only have a 60Hz monitor, regardless of how many video cards you have.
#19 Aug 05 2013 at 5:34 AM Rating: Excellent
Is it just me or does the new bench look better than the old one?
#20 Aug 05 2013 at 8:30 AM Rating: Good
Sage
***
1,675 posts
Wint wrote:
Is it just me or does the new bench look better than the old one?


I think the lighting, AA, certain textures, and effects were a bit better on the old benchmark.

The DoF, grass (due to FXAA) and HDR come together a bit better on the new benchmark.

I'd take the better performing one regardless.