Author Topic: Which of these GPUs  (Read 1246 times)

i got a 275 you can have for like 20 bucks steam card
http://www.videocardbenchmark.net/video_lookup.php?gpu=GeForce+GTX+275&id=21
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-275/specifications

its old but it can still rock some new games at medium settings (1080p on new games may be hard, since they require a lot more video memory)

http://www.videocardbenchmark.net/

This should answer any performance related questions when comparing cards and prices.
stuffty site which does not reflect actual performance you would see in games/other applications

do not ever use videocardbenchmark, cpubenchmark, etc

stuffty site which does not reflect actual performance you would see in games/other applications

do not ever use videocardbenchmark, cpubenchmark, etc

the stats there are averages of benchmark results submitted by thousands of users on the net.
even with all the hardware combo possibilities out there, the numbers on those sites reflect gpu performance pretty legitimately.
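a rough sketch of how that crowd-sourced averaging works (the scores and the plain-mean approach here are assumptions for illustration; the real sites use their own sample sizes and weighting):

```python
# Hypothetical user-submitted benchmark scores for a couple of GPU models.
# Sites like videocardbenchmark aggregate thousands of these submissions;
# the numbers below are made up just to show the idea.
submitted_scores = {
    "GTX 275": [1720, 1805, 1690, 1750, 1770],
    "GTX 660": [4100, 3950, 4200, 4050],
}

def average_score(scores):
    """Plain mean of all user submissions for one card."""
    return sum(scores) / len(scores)

for gpu, scores in submitted_scores.items():
    print(f"{gpu}: {average_score(scores):.0f} (from {len(scores)} submissions)")
```

with enough submissions the oddball rigs average out, which is the whole argument for trusting the site's number as a ballpark.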

what YOU are referring to is specific game profiles and developers optimizing (or not optimizing) games for hardware.
that is of no concern to consumers, who shouldn't and can't consider those things when they spend money. there is no way to predict future updates or how well your hardware will even handle a new game.

that's why there are benchmark standards that consumers shop by and that developers SHOULD tier their games around.
« Last Edit: June 30, 2014, 01:08:09 PM by Bisjac »

the numbers on those sites pretty legitimately reflect gpu performances.
not even close. for example, the R9 295X2, the best consumer gaming video card on the market (besides the Titan Z, but that thing is handicapped and useless in every way), supposedly scores 14% WORSE than the GTX 780, while in real life it has been shown to perform around 100% to 200% BETTER in games

not even close. for example, the R9 295X2, the best consumer gaming video card on the market (besides the Titan Z, but that thing is handicapped and useless in every way), supposedly scores 14% WORSE than the GTX 780, while in real life it has been shown to perform around 100% to 200% BETTER in games

this is because amd can't make stuff work on DirectX. it has nothing to do with actual hardware specs. this is the fault of amd, not the benchmarks.
why don't you instead complain that amd should get up to modern gaming standards? we shouldn't have to downgrade the rest of the industry (yes, these benchmark sites are part of the industry) to meet the low standard of one specific company.

the problem isn't really that the 295X2 is so weak. it's that amd wants to claim last-gen hardware should be top tier and priced as such. clearly they put it in the wrong model line then.
don't ask a 3rd party to adjust accordingly.
« Last Edit: June 30, 2014, 01:12:42 PM by Bisjac »

hey while we are at it did you know a gtx 295 can keep up with an r9 280x
it even beats the 660 in benchmarks lol

old graphics cards are weird

older cards tend to run higher clocks more easily. they just lack memory.

newer cards just keep adding cores and memory. its much less power and heat hungry.

its much less power and heat hungry.
yeah i mean 300 watts to get the forgeter running lol

i was reading that nvidia's 800 series is going to need much less power.
the new re-released 750 Ti's use that same tech.

this is because amd can't make stuff work on DirectX. it has nothing to do with actual hardware specs. this is the fault of amd, not the benchmarks.
why don't you instead complain that amd should get up to modern gaming standards? we shouldn't have to downgrade the rest of the industry (yes, these benchmark sites are part of the industry) to meet the low standard of one specific company.

the problem isnt really that the 295x2 is so weak.
how exactly is AMD's new Hawaii XT GPU even outdated? or any of AMD's chips from the last 2-3 generations, for that matter.

those synthetic benchmarks are obviously forgeted up anyways... tell me how the Nvidia Titan Z is below the R9 295X2 when it should be far superior? passmark benchmarks seem to be generally biased against AMD, and dual-gpu cards are just completely messed up. it just seems like a poorly-optimized synthetic benchmark

its that amd wants to claim last gen hardware should be top tier and price as such.
tell me about it. you can get an R9 290 for $370 nowadays (went as low as $320 recently), which is on par with the GTX 780 (at $500). the R9 290X goes for $450, which is just 5% behind the GTX 780 Ti ($700) and matches it at 4k resolution due to superior bus width and VRAM amount. don't even get me started on lower-end cards lol
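to put numbers on that price/performance point, a quick sketch using the prices above and rough relative performance figures (the performance indexes are assumptions for illustration, not measured benchmark data):

```python
# (price in USD, rough relative performance index) — the performance
# values are illustrative assumptions, not real benchmark results.
cards = {
    "R9 290":     (370, 100),
    "GTX 780":    (500, 100),   # roughly on par with the R9 290
    "R9 290X":    (450, 107),
    "GTX 780 Ti": (700, 112),   # ~5% ahead of the 290X
}

def perf_per_dollar(price, perf):
    """Performance index per dollar spent."""
    return perf / price

# Best value first.
for name, (price, perf) in sorted(cards.items(),
                                  key=lambda kv: -perf_per_dollar(*kv[1])):
    print(f"{name}: {perf_per_dollar(price, perf):.3f} perf/$")
```

under those assumptions the amd cards come out well ahead on perf/$, which is the point being made about pricing.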

hey while we are at it did you know a gtx 295 can keep up with an r9 280x
it even beats the 660 in benchmarks lol

old graphics cards are weird
bullstuff

older cards tend to run higher clocks more easily. they just lack memory.

newer cards just keep adding cores and memory. its much less power and heat hungry.
GTX 295 has FAR lower clocks than most modern cards

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-295/specifications

yeah i mean 300 watts to get the forgeter running lol
lol you think thats a lot for a dual-GPU card?

i was reading that nvidia's 800 series is going to need much less power.
the new re-released 750 Ti's use that same tech.
yes, the maxwell architecture; supposedly it will be the first built on 20nm
« Last Edit: June 30, 2014, 01:38:18 PM by Shitty Puns »

GTX 295 has FAR lower clocks than most modern cards

i meant comparatively. that ratio is dead. by now clocks should have grown 6 times over, but they aren't even close.
same as cpu hardware: clocks are less and less important since a few extra cores make up for it.
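the clocks-vs-cores tradeoff can be sketched as a crude throughput estimate (a deliberate oversimplification: it assumes perfect scaling across cores, which real workloads never achieve):

```python
def theoretical_throughput(cores, clock_ghz, ipc=1.0):
    """Very crude upper bound on instructions/sec: cores x clock x IPC,
    assuming perfect scaling across cores (real workloads fall short)."""
    return cores * clock_ghz * 1e9 * ipc

# One fast core vs four slower ones — identical aggregate throughput
# on paper, which is the sense in which cores "make up for" clocks.
single_fast = theoretical_throughput(cores=1, clock_ghz=4.0)
quad_slow   = theoretical_throughput(cores=4, clock_ghz=1.0)
print(single_fast == quad_slow)  # True: both 4e9 instructions/sec
```

the catch, as the next posts argue, is that games rarely spread their work evenly across all cores, so the paper equivalence breaks down in practice.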

'cept modern games still mainly depend on raw clock speeds, not cores. this is why modern developers suck and there are so many hardware issues with brands every time a big new game comes out. nvidia is just much, much faster to pump out game-profile driver updates than amd seems to be.

'cept modern games still mainly depend on raw clock speeds, not cores.
wrong. they depend mostly on neither. why does an i3-4330 (2 cores/4 threads) clocked at 3.5GHz outperform the FX-4300 (4 cores) clocked at 3.8GHz???

the i3 has a higher IPC (instructions per clock). a single-core chip with an absurdly high IPC would theoretically be the best option, but since technology constrains us, we made multiple cores as a workaround. games typically use 1-2 cores, and newer games use up to 4, so if you have 4 excellent cores you have a great gaming CPU. hence why the i5 is so popular: it outperforms anything from AMD unless the AMD chip is overclocked to the goddamn limits (actual throughput depends on clock speed times IPC).
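to make the IPC point concrete, a toy per-core comparison (the IPC values below are made-up assumptions purely to show why a lower-clocked chip can win; they are not measured figures for these CPUs):

```python
def per_core_throughput(clock_ghz, ipc):
    """Instructions retired per second on one core: clock x IPC."""
    return clock_ghz * 1e9 * ipc

# Assumed IPC values for illustration only — not real measurements.
i3_4330 = per_core_throughput(clock_ghz=3.5, ipc=2.0)   # higher IPC
fx_4300 = per_core_throughput(clock_ghz=3.8, ipc=1.4)   # higher clock, lower IPC

# The lower-clocked chip wins per core because its IPC advantage more
# than makes up for the 0.3 GHz clock deficit.
print(i3_4330 > fx_4300)  # True: 7.0e9 vs 5.32e9
```

this is why comparing raw GHz across architectures (or across intel and amd) tells you almost nothing on its own.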