Off Topic > Off Topic
[MEGATHREAD] Personal Computer - Updated builds thanks to Logical Increments
devildogelite:
The refresh rate of a monitor is how many frames a second it can display, correct?
Momentum:
alrighty, since my old monitor won't work, it's time to find some decently new broken ones at the dump and fix them up for my new monitors
Marcem:

--- Quote from: Wedge on April 10, 2013, 01:46:06 AM ---Frequency is independent of fps, and you can use PWM on a monitor, run a game at 300fps if you like, and still get 300fps out of it because not all the LEDs are turning on or off at the same time. If you run it at full brightness the LEDs are always on and change the moment the graphics card gives them new information. Same with an LCD monitor; it just changes the intensity of the backlight. You could run some game at 300fps with the backlight off if you modified the monitor. You won't see anything, but the frames are still being updated 300 times a second.

I guess technically if you run an LED monitor at a really low brightness, all the frames shown during the off part of the PWM cycle are being thrown away, so that would cap your frame rate, but I'm not really sure. LCD monitors actually have a bit of a fade between the bulb turning on and off, so I imagine it would be less noticeable.

Here is an example of using PWM to change the brightness of an LED:
http://www.waitingforfriday.com/index.php/Controlling_LED_brightness_using_PWM
It's 1 LED, but it's the same thing as a monitor with just one pixel on it.

Here's another page on PWM (and it's much better than the one I linked to in the previous post):
http://www.tftcentral.co.uk/articles/pulse_width_modulation.htm

You'll see that the 2009 MacBook actually doesn't show any lines at all at different brightnesses. PWM isn't the only way to change brightness, so it's always on. The only limit on the framerate you could really show on this monitor is how fast your graphics card can send data to the monitor and how fast the transistors on the circuits in the monitor can switch. There is probably some awesome calculation that we could derive to figure out the theoretical max framerate of a monitor at full brightness but I don't know where to start and it's probably something insane like 10[sup]6[/sup] fps anyway.

When we look at the frequency of the display in LCD/LED monitors, the only thing it affects is brightness.

EDIT: I didn't get to post everything I wanted before I lost my connection.

I think part of the issue here is that I'm not talking about frame rate but focusing on frequency and flicker. The problem is a lot of people built up knowledge about how CRTs worked and then tried to apply it to LED and LCD monitors, and it doesn't carry over because they're not the same thing. One's a vacuum tube with a beam of electrons that excites a phosphor on a screen; the other is transistor based and can turn every pixel on at the same time if you want it to. On an LED or LCD monitor, you won't notice the difference between a monitor with a refresh rate of 5 Hz and one with 500 Hz unless there is actual change on the screen, for example, if you type a word or watch a video. On a CRT, you're going to see flicker. There may be flicker introduced by the PWM power source, but this will not affect the number of fps you can see, especially if you run it at full brightness.

EDIT 2: I watched the video. There is something I want to say about this but I need to make sure it's right before I say it.

Okay, so here's the issue.

Frame rate and frequency are not the same thing. Frame rate is a measure of how fast your graphics card can dump out frames to the monitor. You can run this as fast as you want; in 2D applications I've seen frame rates well over 1000 fps. If your frame rate is greater than the frequency your monitor runs at, you're not going to see those extra frames; they just get thrown out. So if you're running a monitor at 60 Hz, there is no point in having more than 60fps. Everything else gets thrown out, and you'll get the visual effect known as tearing. I think this may be related to PWM in LED and LCD screens.

EDIT 3:

All of these different terms are far more complicated than I thought and I'm seeing a lot of contradictory information. I'm going to read through a couple of monitor patents and some specifications and then I'll go ahead and come back when I actually know what I'm talking about. It doesn't help that frame rate and frequency are used interchangeably when sometimes they're the same thing and sometimes they're not. I'm going to go ahead and leave in the old information in small text for reference here.

I'll make a new thread when I figure it out. This stuff is awesome and it's fun reading about it but it's not straightforward at all.

--- End quote ---
Do you program at all?  If so, maybe graphics libraries can shed some light on the subject; most of my knowledge about refresh rates and frequency comes from learning a bit about OpenGL and DirectX.
Warground:
looks like Wedge confused some things :P

I'll try to explain it simply.

An LED display doesn't produce pictures with LEDs. The name just tells you that the backlight uses LED technology instead of CCFL tubes. (Gives you lower power consumption and more homogeneous lighting.)
The brightness is controlled by PWM, that's right. But the PWM runs at a frequency around 1 kHz to 10 kHz and only applies to the backlight.
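A rough sketch of how PWM dimming works, in Python. The eye averages the fast on/off switching of the backlight, so perceived brightness scales with the duty cycle. The PWM frequency and the 300-nit full brightness are made-up illustrative values, not from any real panel's datasheet:

```python
# Sketch of PWM backlight dimming: the backlight LEDs switch fully
# on/off at a fixed frequency, and perceived brightness is set by the
# fraction of each cycle they spend on (the duty cycle).

def perceived_brightness(duty_cycle, max_nits=300.0):
    """Average luminance of a PWM-dimmed backlight.

    The eye integrates the fast switching, so perceived brightness is
    simply duty_cycle * full brightness. max_nits is a hypothetical
    full-brightness value.
    """
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be between 0 and 1")
    return duty_cycle * max_nits

pwm_freq_hz = 1000               # backlight PWM is around 1-10 kHz
period_s = 1.0 / pwm_freq_hz     # one full on/off cycle
for duty in (1.0, 0.5, 0.25):
    on_time_ms = duty * period_s * 1e3
    print(f"duty {duty:4.0%}: on {on_time_ms:.2f} ms per cycle, "
          f"~{perceived_brightness(duty):.0f} nits")
```

Note that at 100% duty cycle the backlight never switches off at all, which is why flicker only shows up at reduced brightness settings.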

An OLED display, on the other hand, has self-illuminating pixels.

And now to the frequencies:

Refresh rate of the LCD panel (can be found in the datasheet):
This is the native frequency the LCD runs at and how often it can be refreshed by the monitor controller.

Refresh rates of the monitor (its controller):
This just tells you which frequencies you can feed it. Internally it will still use the LCD refresh rate.
If the LCD is a 120Hz panel and the input signal is a 60Hz one, the controller will simply stretch it and display every frame twice to get to 120Hz.
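The stretching step above can be sketched in a few lines of Python. The frame labels are placeholders; the 60 Hz input and 120 Hz panel are the example rates from the post, and this sketch assumes the panel rate is an integer multiple of the input rate:

```python
# Sketch: a 60 Hz input signal driven onto a 120 Hz panel by showing
# every input frame twice.

def stretch_frames(frames, input_hz, panel_hz):
    """Repeat each input frame so the output matches the panel rate.

    Assumes panel_hz is an integer multiple of input_hz, as in the
    60 Hz -> 120 Hz case.
    """
    repeat = panel_hz // input_hz   # 120 // 60 = 2 repeats per frame
    out = []
    for frame in frames:
        out.extend([frame] * repeat)
    return out

print(stretch_frames(["A", "B", "C"], 60, 120))
# each frame is displayed twice: ['A', 'A', 'B', 'B', 'C', 'C']
```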

Frame rate:
It's just how many frames the GPU produces in a second.

Now an example of how that chain works:
Let's say we've got a 60Hz monitor.
The GPU is producing 300fps and the output signal is set to 60Hz.

Because the frame rate is higher than the refresh rate, only every fifth frame will be sent to the monitor (300/60 = 5) and the others will just be dropped.
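A small sketch of that dropping, assuming (as in the example) that the rendered rate divides evenly by the refresh rate. The function and its name are made up for illustration:

```python
# Sketch: GPU renders 300 fps but a 60 Hz output only samples every
# fifth rendered frame (300 / 60 = 5); the rest are dropped.

def sample_frames(rendered_fps, refresh_hz, n_frames):
    """Return indices of rendered frames that actually get displayed.

    Assumes rendered_fps is an integer multiple of refresh_hz; the
    non-multiple case is exactly where tearing shows up.
    """
    step = rendered_fps // refresh_hz   # 300 // 60 = 5
    return list(range(0, n_frames, step))

print(sample_frames(300, 60, 20))   # -> [0, 5, 10, 15]
```

Over a full second, only 60 of the 300 rendered frames would ever reach the screen, which is the GPU power V-sync saves you from wasting.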

BUT if the actual frame rate is not a multiple of the refresh rate, we get problems that lead to frame tearing due to unfinished frames in the framebuffer. This is why we have V-sync, which limits the frame rate to the refresh rate.

To get the best picture, just set the refresh rate to the native rate of the monitor and activate V-sync so you don't waste GPU power on useless frames.

This was just a small overview. Of course there are exceptions and other technologies.
If you want to know more, just ask.

Kreon:
Does anyone know of a good site to buy a custom build from?  I found this, but I'm not sure if the seller is any good.
http://www.ebay.com/itm/Custom-Desktop-AP9-221809-410-Intel-Core-i5-3570K-3-4GHz-/310648643183?