Author Topic: Why do Nvidia cards run Blockland shaders better than AMD cards  (Read 5447 times)

I got to test out a GT 620 recently. It ran Blockland shaders at max settings without hiccups, at a somewhat adequate framerate (around 30 FPS) with around 30k bricks loaded. The GT 620 is a very low-end card. In other games its performance was comparable to my Intel HD 4600 (with some exceptions such as Skyrim), and newer games were completely unplayable.

I tried out a 7770, which is a significantly better card than the GT 620. It can play everything: the newest games at around medium settings, older games such as Arma 2 at very high settings. I tried Blockland, and I got pretty much identical performance to the GT 620.

The GT 620 was on par with the 7770 in Blockland. What the hell is up with that?

As a 7770 ghz user, I get 10-20 fps on shaders.

How many bricks loaded? I should actually retest the 7770, since I tested the Randomizer map with the GT 620 and I think ACM city with the 7770. Fairly sure that the Randomizer map has more bricks, so all in all the GT 620 outperforms it


Any amount of bricks, it seems

That's odd. Maybe shaders take up a large chunk of RAM? Or maybe there's some other bottleneck? idk

Each vendor (Nvidia, AMD) ships its own graphics driver, which includes its own shader compiler. That means the compiler the 620's driver uses to compile Blockland's shaders could apply optimizations like loop unrolling, etc. Maybe AMD's compiler doesn't do as many? The main drawback of such optimizations is usually extra memory use, but it seems both cards have 1 GB..
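To illustrate what loop unrolling means, here's a sketch in Python of the transformation a shader compiler can perform. The function and numbers are purely illustrative, not Blockland's actual shader code:

```python
def dot4_rolled(a, b):
    # Straightforward loop: one counter update and branch test per iteration.
    total = 0.0
    for i in range(4):
        total += a[i] * b[i]
    return total

def dot4_unrolled(a, b):
    # Unrolled version: the compiler replaces the loop with straight-line
    # code, eliminating the counter and branches at the cost of code size.
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3]
```

Both compute the same result; the unrolled form just trades instruction-cache/memory footprint for fewer branches, which is the trade-off mentioned above.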

> That's odd. Maybe shaders take up a large chunk of RAM? Or maybe there's some other bottleneck? idk

CPU and RAM are decent, 8gb RAM and an FX-4100 (3.6ghz quad core)

i get 30 fps on a gt610 with "low" shaders, it was a loving $45 card

Maybe the 7770 scales better with load? Because there's no way a lack of optimization can make up for an order-of-magnitude difference in dedicated shader compute units. Unless, of course, the shader code is optimized so poorly that it doesn't use the dedicated shader compute units at all...

AMD and Nvidia also have completely different architectures when it comes to compiled GLSL code, so there ought to be a difference

I also did some digging around, and apparently AMD likes to use a combination of simple & complex shader units at high counts, while Nvidia likes to use just complex shader units (clocked really high) at low counts. I guess that just means Nvidia put more effort into shader development? No wonder my 7870 sucks when I put shaders on max
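As a rough back-of-the-envelope, peak shader throughput is roughly unit count × clock × operations per clock. The figures below are made up purely to illustrate the two design philosophies described above; they are not real GPU specs:

```python
def peak_throughput(units, clock_mhz, ops_per_clock):
    # Peak shader throughput in MFLOPS: units x clock x ops per clock.
    return units * clock_mhz * ops_per_clock

# Hypothetical "many simple units" design (AMD-style in this thread's terms):
many_simple = peak_throughput(units=512, clock_mhz=900, ops_per_clock=2)

# Hypothetical "few complex units, high clock" design (Nvidia-style):
few_complex = peak_throughput(units=96, clock_mhz=1600, ops_per_clock=2)
```

The catch is that peak numbers say nothing about real performance: the wide design only wins if the driver's compiler can actually keep all those units busy, which is exactly where driver quality comes in.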

> i get 30 fps on a gt610 with "low" shaders, it was a loving $45 card

$45 card as in it should be working better or as in it is working well? A $45 card can't be expected to run everything perfectly.

Owner of a $576 AMD card here. Minimum is fine, but anything higher runs like stuff even with just a few thousand bricks. I also have 12 GB RAM and a 2.8 GHz hexa-core CPU.

Yes, I have the same problem. It's a CPU bottleneck. I have an AMD Radeon HD 7850 card with an AMD Athlon II X4 635 CPU (2.9 GHz). Also, Blockland doesn't work with CrossFireX, does it?
« Last Edit: July 04, 2014, 09:52:17 AM by Hammereditor5 »

Drivers. Whether you like it or not, Nvidia has put more effort in and made higher-quality drivers.

> Yes, I have the same problem. It's a CPU bottleneck. I have an AMD Radeon HD 7850 card with an AMD Athlon II X4 635 CPU (2.9 GHz). Also, Blockland doesn't work with CrossFireX, does it?
I really doubt it's a CPU bottleneck. I'm fairly sure Blockland uses next to no CPU power
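One common way to settle the CPU-vs-GPU question is to drop the resolution or shader quality: if the framerate barely moves, the GPU wasn't the limit. A hypothetical helper sketching that rule of thumb (the name and the 10% tolerance are my own, not from any tool):

```python
def likely_bottleneck(fps_high_load, fps_low_load, tolerance=0.1):
    # If framerate barely changes when GPU load drops, the CPU (or something
    # else upstream of the GPU) is the limiting factor.
    if abs(fps_low_load - fps_high_load) <= tolerance * fps_high_load:
        return "cpu"
    return "gpu"
```

For example, going from max to minimum shaders with no FPS change would point at the CPU; a big jump would point at the GPU.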