So what's so special about 4K and 5K?


on specs, 4k and up is where you would need multiple high-end graphics cards. 4k monitors are just bigger monitors with a higher resolution. It looks much nicer, but it's also pricey.

i'll keep that in mind thank you

also with a 4k monitor, the need for anti-aliasing is basically gone. the jagged edges between objects become so tiny at that resolution that it's no longer needed
To clarify this, a 4k monitor has 4 pixels in the space of 1 pixel on a 1080p monitor, so it's equivalent to 4x supersampled antialiasing on a 1080p screen. The problem is it takes more effort to render every pixel of a 4k monitor than to antialias every pixel of a 1080p monitor, so this isn't so much a plus as it is a side effect.
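Easy to sanity-check with plain arithmetic on the standard resolution figures:

```python
# Pixel counts for UHD "4k" vs 1080p; the ratio is exactly 4,
# which is where the "4 pixels per 1080p pixel" claim comes from.
uhd_4k  = 3840 * 2160   # 8,294,400 pixels
full_hd = 1920 * 1080   # 2,073,600 pixels
print(uhd_4k / full_hd)  # 4.0
```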

my gpu's maximum resolution is 2560x1600 (is it 2k?)
True 4k is 4096x2160; Ultra-HD, which is sometimes called 4k, is 3840x2160.

EDIT: Oh, you were asking if it's 2k
Yeah that's a bit higher than 2k
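For reference, a quick Python tally of pixel counts for the resolutions mentioned in this thread (the labels in parentheses are just the common marketing names):

```python
# Width x height and megapixel count for each common resolution.
resolutions = {
    "1080p (Full HD)":   (1920, 1080),
    "2k (DCI)":          (2048, 1080),
    "1440p (QHD)":       (2560, 1440),
    "2560x1600 (WQXGA)": (2560, 1600),
    "UHD '4k'":          (3840, 2160),
    "4k (DCI)":          (4096, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:20s} {w * h / 1e6:5.2f} MP")
```

2560x1600 lands at about 4.1 megapixels, nearly double DCI 2k (about 2.2 MP) but only half of UHD.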

With 1440p or 4k you don't even need anti-aliasing. People forget that.
But aa is brutal on video cards and can kill fps easily.

My main monitor is a 4k, and 2x aa has no noticeable effect on games at that resolution, so I keep it off.

But it's worth it to have high res. 1080p is just garbage.

With 1440p or 4k you don't even need anti-aliasing. People forget that.
But aa is brutal on video cards and can kill fps easily.
The problem is it takes more effort to render every pixel of a 4k monitor than to antialias every pixel of a 1080p monitor

of course it takes more. native resolution is harder on a gpu than any aa setting. aa is just an extra to make games like blockland even bearable to play.

but aa also destroys framerates. on a 1440p or 4k monitor you never need to use aa. i don't see why you are so against that. aa is a stuffty gimmick to make up for stuffty low resolutions. it's great we aren't forced to use it anymore.

Also, downscaling 4k and 5k to 1080p still looks better than something that was originally 1080p.

Downscaling 4k to 1080p, assuming you're using a good reduction algorithm, will look essentially identical to 4x antialiased 1080p. It will look better than no antialiasing at 1080p, obviously.
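For the curious, here's a minimal sketch of that reduction, assuming numpy and a simple 2x2 box filter (real scalers use fancier kernels, but the averaging math is the same as 4x supersampling):

```python
import numpy as np

def box_downscale_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one (e.g. 4k UHD -> 1080p)."""
    h, w = img.shape[:2]
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Group pixels into 2x2 blocks, then average over each block --
    # the same 4-samples-per-pixel averaging that 4x SSAA does.
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)   # stand-in for a rendered 4k frame
frame_1080p = box_downscale_2x(frame_4k)
print(frame_1080p.shape)  # (1080, 1920, 3)
```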

And if someone wants to run an experiment here, try measuring average fps (using some sort of benchmarking tool) at 1080p with 4x AA and average fps at 4k with no AA, all on the same card, and post the results. I'm honestly curious what they will be. I'm not against 4k monitors, for the record.
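A rough sketch of what that experiment boils down to, in Python; render_frame and the engine.draw calls are hypothetical stand-ins for whatever game or benchmarking hook you actually have:

```python
import time

def average_fps(render_frame, num_frames: int = 1000) -> float:
    """Render num_frames frames and return the average frames per second."""
    start = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
    elapsed = time.perf_counter() - start
    return num_frames / elapsed

# Same card, two runs (engine.draw is a hypothetical API, not a real one):
# fps_1080p_4xaa = average_fps(lambda: engine.draw(w=1920, h=1080, msaa=4))
# fps_4k_noaa    = average_fps(lambda: engine.draw(w=3840, h=2160, msaa=0))
```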

define normal
hint: don't use the term "Linux"
Well, based on what most people use, Windows.
Unless it has a setting for high pixel density displays.

i'm guessing you would have asked the same question back when 480p was high-tech and 1080p was suddenly a thing

i think it's kinda neat how sometimes with technology people are like "wtf how could it get any better," and then it totally is. i'm not being sarcastic or anything; that's seriously cool to me

i do enjoy how tech tends to nearly double in specs either about every 2 years or every tech generation (about 6 years), depending on what you are referring to (rough math below).

makes the hobby always interesting to do and follow news on.
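That doubling cadence is just compound growth; a quick check, assuming the 2-year doubling mentioned above:

```python
# Doubling every 2 years means a 6-year "tech generation"
# is a 2 ** (6 / 2) = 8x jump in specs.
def spec_growth(years: float, doubling_period: float = 2.0) -> float:
    return 2 ** (years / doubling_period)

print(spec_growth(2))  # 2.0 -> one doubling
print(spec_growth(6))  # 8.0 -> one "generation"
```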

that tech generation thing is why i am angry at consoles, and 1080p in general. they are horribly outdated. way way way way WAY outdated. and that would be fine since it doesn't technically have to apply to me, but the other aspects of the industry get held back because of consoles and 1080p tech ; ;.

(thats why we should be boycotting ubisoft, among others)

yeah any popular platform that artificially limits itself will limit the industry as a whole
the good it does is that the limiting usually comes from cost-effectiveness, so more people get interested in, and therefore involved in, the industry

but it's just hard for consoles to keep up with technology by their nature. they can't release new consoles fast enough to keep up with rising standards without making their consumer base upset, but the platform is popular enough that standards level out to match it, so the rise of standards is curbed unnaturally. in some ways this is a good thing, because 'spectacle creep,' as it's been called (game devs seeking to make grander and grander games to beat past works), can be damaging as well, but i think if they had larger parameters to work with, the effect could be softened.

uhh not that that's entirely relevant to the topic at hand but those are my thoughts

well that's why i think the steam machine is a great idea.
with so many different models and brands involved, no single aging "console" will be any one company's fault. so games don't have to be limited to accommodate older ones; devs can lower specs properly to adjust if needed.
they can even be upgraded if the user wants to try.
and it can wean people away from xboxes and playstations in a way they are already used to gaming anyways, except they can game properly like a pc user.

I tried 4K on YouTube and it was so sharp I couldn't read any text
but it was an animation so it wasn't that amazing

also I don't care about 4K, let me stick with 1080 dammit