Author Topic: [MEGATHREAD] Personal Computer - Updated builds thanks to Logical Increments  (Read 1600182 times)

Hey guys I heard you all liked lots of RAM and CPUs so...
Just a heads up for anyone seriously considering using multiple physical processors: you will need to be running either Windows 7 Professional or Ultimate, or Windows 8 Professional. None of the Windows 7 Home editions or regular Windows 8 support more than one physical processor. Windows desktop operating systems don't support more than two physical sockets, period, but I haven't seen many boards with more than two sockets (and certainly none on Newegg). If you want more than two physical processors you'll need to run Server 2012/2008/etc. There is technically a limit on the number of logical processors* Windows can use as well (I think it's 256), but you will never run into it on a two-socket board running Windows directly. The only time I think you might hit it is if you've got a huge server running VMware and you tried to allocate 257 cores to a Windows VM.

* Small error here: physical cores, physical processors, and the virtual cores from hyper-threading are all considered logical processors.

There are also memory limits. Windows 8 is limited to 128GB, and Windows 8 Professional is limited to 512GB. Windows 7 Professional and Ultimate are both limited to 192GB. With 16 8GB sticks you can max out regular (non-Professional) Windows 8's RAM limit.
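
If you want to sanity-check a build against those caps, here's a rough Python sketch. The edition limits are just hard-coded from the numbers above (nothing is queried from Windows itself), and it assumes the psutil package is installed for the RAM reading:

[code]
# Rough sketch: compare this machine's logical processor count and installed
# RAM against the edition limits quoted above. The limits are hard-coded from
# the post, not read from Windows itself.
import os

try:
    import psutil  # optional third-party package, used only for the RAM total
    total_ram_gb = psutil.virtual_memory().total / 2**30
except ImportError:
    total_ram_gb = None

EDITION_RAM_LIMIT_GB = {
    "Windows 8": 128,
    "Windows 8 Professional": 512,
    "Windows 7 Professional/Ultimate": 192,
}

print("Logical processors visible to the OS:", os.cpu_count())
if total_ram_gb is not None:
    for edition, limit_gb in EDITION_RAM_LIMIT_GB.items():
        status = "within" if total_ram_gb <= limit_gb else "over"
        print(f"{total_ram_gb:.0f} GB installed is {status} the {edition} limit of {limit_gb} GB")
[/code]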

For both Windows 7 and 8, Enterprise is almost identical to Professional, but Enterprise is pretty much only available through volume licensing.

The limits on RAM and processors are mostly artificial, to encourage you to purchase the server operating systems. It's a little weird, but it makes sense when you think about it: putting desktop Windows on a machine with 8 processors and 1TB of RAM is not a good idea. It's not a desktop, it's a server, and people cutting corners on cost by buying Windows Vista/7/8 for servers are going to be disappointed with the quality and performance, especially compared to a Linux server OS, which you can run on there for free.
« Last Edit: April 09, 2013, 12:34:38 AM by Wedge »

Hey guys I heard you all liked lots of RAM and CPUs so...
why would a server motherboard even need 7.1 audio?

http://www.microcenter.com/product/396983/Tactical_TC-128_ATX_Mid_Tower_Computer_Case
Is this some sort of massive bargain? I feel like I'm missing something for it to have a $30 rebate bringing it down to $5.

How many hz is the crossover monitor

Quote from: NalNalas
How many hz is the crossover monitor
This is an interesting question, because there are a lot of people on the Internet who think they know what they're talking about but are just making things up. So I'll go ahead and clear this up once and for all. I'm pulling this straight out of Scott Mueller's Upgrading and Repairing PCs. It's a book I highly recommend to anyone even remotely interested in building computers. There are some people here who might think, "Oh, I've been doing this for years, I know everything about computers," but you don't, and this book will show you that when you start reading it. I know this because I was one of those people: I've been working on PCs since the mid 2000s and do it professionally, and I still learned a lot from the book. You can probably find some edition of it in a library, although it may not talk about LED/LCD screens (there are over 20 editions of it!).

LCD and LED screens do not have a scanning frequency (or refresh rate, or whatever you want to call it). That frequency refers to the number of times an electron beam rescans the entire image on a cathode ray tube, which actually involves moving a beam of electrons around and pointing it at every single individual pixel at some point or another. LCD and LED screens are solid state (transistor based) technology that refreshes every pixel at once. This happens either continuously (basically instantly, as the changes are made, with no fixed frequency) or at a very, very high frequency (much faster than your graphics card can actually send data to the monitor).

Video cards just pretend it's around 60 Hz and work at that. 60 Hz would be really slow for a CRT and would cause flickering and eyestrain, but an LCD or LED screen doesn't care and looks fine. You will get no benefit from running it at a higher frequency; in fact, you'll just waste energy running your graphics card harder with no image improvement.
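
Just to put numbers on that, here's a throwaway Python snippet showing how long one refresh cycle actually lasts at a few common rates:

[code]
# How long one refresh cycle lasts at a few common rates.
for hz in (60, 75, 120):
    print(f"{hz} Hz -> one refresh every {1000 / hz:.1f} ms")
[/code]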

If you look online you'll see people telling you that running an LCD/LED screen at 75 Hz will make it look better and flicker less. They have no idea what they're talking about and you should just ignore them. Or better yet, you can tell them they're wrong and then refer them to Scott Mueller's Upgrading and Repairing PCs, 21st Edition, page 659.

You can get weird blurring/ghosting/flicker effects on an LED/LCD monitor, but this is not a refresh issue. Make sure you're running it at the native resolution, and check that you're using a DVI/HDMI/DisplayPort connection rather than a VGA cable. If you have to use VGA and the monitor looks weird, try a shorter or heavier gauge VGA cable. You can also try putting a ferrite core on it.

So to wrap it up and answer your question: I'm not sure what a Crossover monitor is, but I bet it's an LCD or LED monitor, so the answer is that it has no Hz, because that's something that isn't really applicable to it.

EDIT: I see the Crossover 27Q is a monitor mentioned in the thread title. I don't know if it's meant to be a joke or serious. It's not a bad monitor, especially for the price, but it's also almost $500, which is definitely not a budget monitor. It's also DVI only. Last time I bought a monitor it was this one for $179: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236052. I think that's a more reasonable price for a monitor, although there are probably nicer and cheaper monitors than that one. The sound quality on it was also not very good (laptop speakers were better!), but I have a pair of headphones and speakers I use with it anyway.


EDIT: See this post
« Last Edit: April 10, 2013, 01:39:58 PM by Wedge »

Did you write your thesis on LCD and LED monitors?

Haha. To make it more complicated, I should also mention that LED monitor brightness is set by turning the LEDs on and off really fast, so technically they do flicker, but not for the same reason or in the same way a CRT does. It actually uses something called pulse width modulation, and that sets the brightness. You will never be able to see the flickering just by looking at an LED monitor, and it only affects brightness and power consumption, not image quality. The same goes for LCD monitors, but they use cold cathode fluorescent lamps as backlights.

Many monitors on the market use PWM even at full brightness, so they technically "run" at around 200 Hz.

Source and further reading: http://www.squidoo.com/led-backlight-flicker
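
If the PWM idea isn't clear, here's a tiny Python model of it; the 200 Hz figure is just the example number from above, and real panels vary:

[code]
# Minimal PWM model: the backlight is either fully on or fully off, and the
# perceived brightness is just the fraction of each cycle it spends on.
pwm_frequency_hz = 200   # example figure from above, not a measured value
duty_cycle = 0.25        # 25% brightness setting

period_ms = 1000 / pwm_frequency_hz
on_time_ms = period_ms * duty_cycle
print(f"cycle length = {period_ms:.1f} ms, backlight on for {on_time_ms:.2f} ms per cycle")
print(f"average brightness is about {duty_cycle:.0%} of maximum")
[/code]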


EDIT: See this post
« Last Edit: April 10, 2013, 01:39:20 PM by Wedge »

I don't think the whole hertz argument on LED and LCD monitors comes down to flickering; it really comes down to frames per second. 120Hz monitors, provided that you're running a game at 120 fps, will display a much smoother game (ignoring the whole argument about how many frames the eye can see). Correct me if I'm wrong though, as you seem to have done much more research on the topic.

I don't get the recent craze for cheap monitors, though. I don't care what panel they use, it's still more prone to failure. If I'm staring at it for hours at a time, I'm probably not going to spare any expense on it. It's like buying a car because it has the same engine as a Bentley, but is constructed entirely of rejected 2x4s.

I watched a video from Linus from NCIX about 120Hz vs 60Hz. He was able to pretty accurately tell the difference between 60 and 120; he said there was a difference in how smooth it was. Though they brought someone in who had never used 120Hz and they couldn't tell the difference.

I don't think the whole hertz argument on LED and LCD monitors comes down to flickering; it really comes down to frames per second. 120Hz monitors, provided that you're running a game at 120 fps, will display a much smoother game (ignoring the whole argument about how many frames the eye can see). Correct me if I'm wrong though, as you seem to have done much more research on the topic.

I don't get the recent craze for cheap monitors, though. I don't care what panel they use, it's still more prone to failure. If I'm staring at it for hours at a time, I'm probably not going to spare any expense on it. It's like buying a car because it has the same engine as a Bentley, but is constructed entirely of rejected 2x4s.
Frequency is independent of fps. You can use PWM on a monitor, run a game at 300fps if you like, and still get 300fps out of it, because not all the LEDs are turning on or off at the same time. If you run it at full brightness the LEDs are always on and change the moment the graphics card gives them new information. Same with an LCD monitor: the PWM just changes the intensity of the backlight, and you could run some game at 300fps with the backlight off if you modified the monitor. You won't see anything, but the frames are still being updated 300 times a second.

I guess technically if you run an LED monitor at a really low brightness, all the frames that land during the off part of the PWM cycle are being thrown away, so that might cap your effective frame rate, but I'm not really sure. LCD monitors actually have a bit of a fade between the bulb turning on and off, so I imagine it would be less noticeable there.

Here is an example of using PWM to change the brightness of an LED:
http://www.waitingforfriday.com/index.php/Controlling_LED_brightness_using_PWM
It's one LED, but it's the same thing as a monitor with just one pixel on it.
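
Along the same lines, here's a rough Python simulation of that one-pixel case. It just counts, for a made-up PWM rate and brightness, how many of the 300 frame updates per second arrive while the LED happens to be lit; the numbers are purely illustrative and this is my mental model rather than anything out of a datasheet:

[code]
# Rough simulation of one PWM-dimmed "pixel": frame updates keep arriving at
# the GPU's rate no matter what the PWM is doing; we just count how many of
# them land during the LED's "on" window. All numbers are illustrative.
gpu_fps = 300
pwm_hz = 200
duty_cycle = 0.25   # 25% brightness

total_frames = gpu_fps          # one second's worth of frame updates
frames_while_lit = 0
for i in range(total_frames):
    t = i / gpu_fps                  # time this frame update arrives
    phase = (t * pwm_hz) % 1.0       # position within the current PWM cycle
    if phase < duty_cycle:           # LED is in the "on" part of the cycle
        frames_while_lit += 1

print(f"{total_frames} frame updates in one second, "
      f"{frames_while_lit} arrived while the LED was lit")
[/code]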

Here's another page on PWM (and it's much better than the one I linked to in the previous post):
http://www.tftcentral.co.uk/articles/pulse_width_modulation.htm

You'll see that the 2009 MacBook actually doesn't show any lines at all at different brightnesses. PWM isn't the only way to change brightness, so on that display the backlight is just always on. The only limit on the frame rate you could really show on such a monitor is how fast your graphics card can send data to it and how fast the transistors in the monitor's circuitry can switch. There is probably some awesome calculation we could derive to figure out the theoretical max frame rate of a monitor at full brightness, but I don't know where to start, and it's probably something insane like 10^6 fps anyway.

When we look at the frequency of the display in LCD/LED monitors, the only thing it affects is brightness.

EDIT: I didn't get to post everything I wanted before I lost my connection.

I think part of the issue here is that I'm not talking about frame rate but focusing on frequency and flicker. The problem is that a lot of people built up knowledge about how CRTs work and then tried to apply it to LED and LCD monitors, and it doesn't carry over, because they're not the same thing. One is a vacuum tube with a beam of electrons that excites a phosphor on a screen; the other is transistor based and can turn every pixel on at the same time if you want it to. On an LED or LCD monitor, you won't notice the difference between a monitor with a refresh rate of 5 Hz and one with 500 Hz unless there is actually a change on the screen, for example if you type a word or watch a video. On a CRT, you're going to see flicker. There may be flicker introduced from the PWM power source, but this will not affect the number of fps you can see, especially if you run it at full brightness.

EDIT 2:
Quote from: devildogelite
I watched a video from Linus from NCIX about 120Hz vs 60Hz. He was able to pretty accurately tell the difference between 60 and 120; he said there was a difference in how smooth it was. Though they brought someone in who had never used 120Hz and they couldn't tell the difference.
I watched the video. There is something I want to say about this but I need to make sure it's right before I say it.

Okay, so here's the issue.

Frame rate and frequency are not the same thing. Frame rate is a measure of how fast your graphics card can dump out frames to the monitor. You can run this as fast as you want; in 2D applications I've seen frame rates well over 1000 fps. If your frame rate is greater than the frequency your monitor runs at, you're not going to see those extra frames, they just get thrown out. So if you're running a monitor at 60 Hz, there is no point in having more than 60fps. Everything else gets thrown out, and you can get the visual effect known as tearing. I think this may be related to PWM in LED and LCD screens.
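
As a rough illustration of the "extra frames get thrown out" point (very simplified, with made-up numbers and no tearing modeled):

[code]
# Simplified model: the monitor shows at most one frame per refresh, so any
# frames the GPU renders beyond that are simply never displayed.
gpu_fps = 300
monitor_hz = 60

rendered = gpu_fps               # frames rendered in one second
displayed = min(rendered, monitor_hz)
print(f"rendered {rendered} frames, displayed at most {displayed}, "
      f"about {rendered - displayed} never shown")
[/code]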


EDIT 3:

All of these different terms are far more complicated than I thought, and I'm seeing a lot of contradictory information. I'm going to read through a couple of monitor patents and some specifications, and I'll come back when I actually know what I'm talking about. It doesn't help that frame rate and frequency are used interchangeably when sometimes they're the same thing and sometimes they're not. I'm going to leave the old information in small text here for reference.

I'll make a new thread when I figure it out. This stuff is awesome and it's fun reading about it but it's not straightforward at all.
« Last Edit: April 10, 2013, 01:43:24 PM by Wedge »

The refresh rate of a monitor is how many frames per second it can display, correct?

Alrighty, since my old monitor won't work, it's time to find some decently new broken ones at the dump and fix them up for my new monitors.

Do you program at all? If so, maybe graphics libraries can shed some light on the subject; most of my knowledge about refresh rates and frequency comes from learning a bit about OpenGL and DirectX.

looks like wedge confused some things :P

I'll try to explain it simply.

An LED display doesn't produce pictures with LEDs. The name just tells you that the backlight uses LED technology instead of CCFL tubes (which gives you lower power consumption and more homogeneous lighting).
The brightness is controlled by PWM, that's right. But the PWM uses a frequency of around 1kHz to 10kHz and only applies to the backlight.

An OLED display, on the other hand, has self-illuminating pixels.

And now to the frequencies:

Refresh rate of the LCD panel (can be found in the datasheet):
This is the native frequency the LCD panel runs at and how often it can be refreshed by the monitor controller.

Refresh rates of the monitor (i.e. its controller):
This just tells you which frequencies you can feed it. Internally it will still use the LCD panel's refresh rate.
If the LCD is a 120Hz panel and the input signal is a 60Hz one, the controller will simply stretch it and display every frame twice to get to 120Hz.

Frame rate:
It's just how many frames the GPU produces in a second.

Now an example of how that chain works:
Let's say we've got a 60Hz monitor.
The GPU is producing 300fps and the output signal is set to 60Hz.

Because the frame rate is higher than the refresh rate, every fifth frame will be sent to the monitor (300/60 = 5) and the others will just be dropped.

BUT if the actual frame rate is not a multiple of the refresh rate, we get problems which lead to frame tearing due to unfinished frames in the framebuffer. This is why we have V-sync, which limits the frame rate to the refresh rate.

To get the best picture, just set the refresh rate to the native rate of the monitor and activate V-sync so you don't waste GPU power on useless frames.
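
To make that chain concrete, here's a small Python sketch using the numbers from the example above. The "possible tear" check just flags refreshes that land while the GPU is still partway through a frame, which is a simplification of what really happens in the framebuffer:

[code]
# Sketch of the 300 fps -> 60 Hz chain described above. Without V-sync the
# monitor grabs whatever frame the GPU last started; if a refresh lands while
# a frame is still being written you can get tearing. With V-sync the GPU
# delivers exactly one new frame per refresh.
from fractions import Fraction

gpu_fps = 300     # try 143 to see what a non-multiple of the refresh rate does
monitor_hz = 60
vsync = False

frames_per_refresh = Fraction(gpu_fps, monitor_hz)
for refresh in range(5):                       # first few monitor refreshes
    if vsync:
        frame, torn = refresh, False           # one new frame per refresh
    else:
        pos = refresh * frames_per_refresh     # where the GPU is at this refresh
        frame = pos.numerator // pos.denominator
        torn = pos.denominator != 1            # refresh lands mid-frame
    note = "  <- possible tear" if torn else ""
    print(f"refresh {refresh}: shows GPU frame {frame}{note}")
[/code]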

This was just a small overview. Of course there are exceptions and other technologies.
If you want to know more just ask.

« Last Edit: April 10, 2013, 05:53:07 PM by Warground »

Does anyone know of a good site to buy a custom build from?  I found this but I'm not sure if the dealer is any good.
http://www.ebay.com/itm/Custom-Desktop-AP9-221809-410-Intel-Core-i5-3570K-3-4GHz-/310648643183?