Poll

x86 Or ARM?

x86
ARM

Author Topic: [MEGATHREAD] Personal Computer - Updated builds thanks to Logical Increments  (Read 1329355 times)

The ram automatically clocks down when it isn't needed as much to conserve power.
oh neat, that's cool.

I do have a bit of a bug with my current Windows install.
It's 8.1, and I'll be upgrading to 10 in the near future when I can, but for now whenever I right-click a taskbar icon, the menu shows up like this and it really sucks.



any ideas on how to fix?

Ran my PC through some benchmarks. As a general note: only one GPU, everything is stock (the CPU is locked, no components are overclocked, and I'm only using the stock cooling that was supplied). The 3DMark 11 Basic test had f.lux running in the background, so I'm unsure if that skewed the result in any way, but I can't be arsed to check again.

Specs:
CPU: i7-4790
GPU: GTX 980
RAM: 16GB
Chipset: Z9X

3DMark 11 Basic - http://www.3dmark.com/3dm11/10154590
P14130

3DMark Basic - http://www.3dmark.com/3dm/8118545
Fire Strike 1.1  - 10837
Sky Diver 1.0 - 27156
Cloud Gate 1.1 - 26125
Ice Storm 1.2 - 155250

All but confirms that my CPU is holding my GPU back a little bit, but otherwise I'm basically in the top tier for all the benchmarks. Very happy with the results. Might step it up with a better CPU and another GTX 980 when I cash in another payday.

That's one of the most powerful consumer CPUs on the market. It's not going to hold you back on anything more than a benchmark.

You don't need to upgrade your CPU; it's more than powerful enough, even for benchmarks. Don't go with SLI; my experience with it has been disgustingly lacklustre. I would've gone with a single 980 instead of SLI 970s if I had known that scaling would be as bad as it has proven to be in literally every game I play.

I must ask, however: what size monitor are you using? Call it a bit of a conspiracy theory, but I'm fairly certain that GPU utilisation, at least on my 970s, is bottlenecked by the rendering resolution. I recently put this to the test by upgrading to a 1440p, 144Hz monitor. While I struggled to hold 60FPS in War Thunder and World of Tanks on my old 1080p, 60Hz monitor (with vsync disabled for the purposes of the test), I can easily pull 100+ FPS in both of those games on the new monitor without changing any settings in either the games themselves or my drivers, despite the fact that the new monitor has about 175% as many pixels as the old one.
I also saw the same thing in GTA V on my old monitor by using the DSR option in the Nvidia drivers to render the game at 4K. Framerates and their stability increased noticeably, even though I was pushing my graphics cards harder than when I could barely manage a stable 55FPS.

It makes no sense unless you consider it to be intentional, I'm just saying.
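For what it's worth, the pixel-count claim above checks out with some quick arithmetic (a throwaway sketch using the standard 16:9 resolutions, nothing from anyone's actual setup):

```python
# Compare total pixel counts for common 16:9 resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

base = pixels["1080p"]
for name, count in pixels.items():
    # Express each as a percentage of 1080p's pixel count.
    print(f"{name}: {count:,} pixels ({count / base:.0%} of 1080p)")
```

1440p works out to about 178% of 1080p's pixels (so "about 175%" above is close), and 4K is exactly 400%.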

I got 100% utilization and >70fps in GTA V with my 1080p monitor. What was your utilization % before switching?

Low 40s to mid-50s on both cards, no matter what was happening or how low my framerate got. After switching to a 4K rendering resolution it went up to high 70s to high 90s on both cards, depending on what was happening.

It was probably an SLI related issue.

I just noticed my RAM is running at a low frequency, and I bumped it up to 2133 MHz in my BIOS, but now it says it's running at half that (1065).
Is that normal?
Actually, mine does this too. I was told it's because it's double data rate: it displays at half the value.

Yeah that's normal.
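The doubling described above is just DDR transferring on both clock edges, so tools that report the I/O clock show half the rated transfer rate. A trivial sketch of the relationship (the 1066 figure is illustrative, rounded from the rated 2133):

```python
def effective_rate(reported_clock_mhz):
    """DDR = double data rate: two transfers per clock cycle,
    so the effective MT/s is twice the reported I/O clock."""
    return reported_clock_mhz * 2

# A kit rated at 2133 shows up as roughly a 1066 MHz clock.
print(effective_rate(1066))  # -> 2132
```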


Yeah, I know, that's why I'm telling him.

I was backing up your statement.


So many bugs with Windows 10 for you guys.

Will upgrading to DDR4 show an increase in performance in applications like Blender?

It might, but I don't think you'll see much of a performance difference. I think you mostly see the difference with things like server workloads.