Does anyone think Real-Time ray-tracing will be revolutionary?



yeah Man trans people are scary

yeah Man trans people are scary
what are you talking about



its gonna make minecraft look really loving epic


minecraft doesn't look good with shaders and stuff

minecraft doesn't look good with shaders and stuff
Garry's mod will be good with shaders and RTX, but minecraft shaders are clearly just used for screenshots instead of actual gameplay because they lag like hell.

aide loving destroys me and sleep cause im handicapped

minecraft doesn't look good with shaders and stuff
i agree entirely (also this includes both screenshots and in-game. the artstyle of the game doesnt loving work with 'realistic' shaders)

aide loving destroys me and sleep cause im handicapped
ben_shapiro_destroys_libtards_compilation_hd.wmv.mp4

me and a friend of mine have been theorizing what it would actually take to bridge the gap between "photorealistic" and actually realistic graphics. ray tracing is definitely the thing that completes real time photo realism, but im more interested in what it would take to make something look legitimately real. currently you can make a game look real as forget, but it looks real through the eyes of a picture or movie camera, hence the term "photo realistic". its as realistic as a photo could look. but i want it to look as real as what lies right in front of your eyes.
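(side note for anyone wondering what ray tracing actually computes: here's a toy sketch in python, one ray fired through every pixel and tested against a single sphere. this obviously isn't how RTX implements it, real engines do it on the GPU against whole scenes with acceleration structures and bounce lighting, it's just the core idea)

import math

WIDTH, HEIGHT = 80, 40                    # tiny "screen"
CENTER, RADIUS = (0.0, 0.0, 3.0), 1.0     # one sphere sitting in front of the camera

def hits_sphere(d):
    # camera at the origin, ray is t*d with d normalized:
    # solve |t*d - CENTER|^2 = RADIUS^2 and check the discriminant
    b = -2.0 * (d[0]*CENTER[0] + d[1]*CENTER[1] + d[2]*CENTER[2])
    c = sum(v*v for v in CENTER) - RADIUS*RADIUS
    return b*b - 4.0*c >= 0.0

for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        # map the pixel to a direction through an imaginary image plane at z=1
        x = (px / WIDTH - 0.5) * 2.0
        y = (py / HEIGHT - 0.5) * 2.0
        n = math.sqrt(x*x + y*y + 1.0)
        row += "#" if hits_sphere((x/n, y/n, 1.0/n)) else "."
    print(row)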

i cant say for sure, but we've theorized that models constructed from particles, or some other kind of individual per-pixel construction, rather than models built from polygons, could potentially be a step towards revolutionizing graphics as a whole.
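to make that concrete, here's roughly what a renderer built on particles instead of polygons looks like at its simplest: project every colored point to the screen and let the nearest one win each pixel. toy python, the "model" is just a randomly sampled sphere shell i made up:

import math, random

WIDTH, HEIGHT, FOCAL = 64, 32, 30.0

# stand-in "model": a sphere shell sampled as individually colored particles
particles = []
for _ in range(20000):
    theta = random.uniform(0.0, math.pi)
    phi = random.uniform(0.0, 2.0 * math.pi)
    x = math.sin(theta) * math.cos(phi)
    y = math.cos(theta)
    z = math.sin(theta) * math.sin(phi) + 4.0          # push it out in front of the camera
    particles.append((x, y, z, "#" if y > 0 else "o")) # crude per-particle "color"

depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
frame = [[" "] * WIDTH for _ in range(HEIGHT)]

for x, y, z, color in particles:
    # simple pinhole projection, no triangles or rasterization anywhere
    px = int(WIDTH / 2 + FOCAL * x / z)
    py = int(HEIGHT / 2 + FOCAL * 0.5 * y / z)         # squashed vertically for terminal aspect
    if 0 <= px < WIDTH and 0 <= py < HEIGHT and z < depth[py][px]:
        depth[py][px] = z                              # nearest particle wins the pixel
        frame[py][px] = color

print("\n".join("".join(row) for row in frame))

real point-cloud / splatting renderers blend overlapping particles and fill holes, but the primitive really is just "a colored point in space" instead of a triangle.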

imagine instead of developing polygonal 3d environments and spending 10-20 yrs developing that, we built off of a 3d concept like Power Drift instead:



and spent the next 20 years developing 3d environments where pixel scaling and sprites reigned supreme. (fun maybe fact, there isnt a single polygonal model in power drift apparently)

the concept of 3d models made out of particles is nothing new, the idea's been exploited in real life for really neat effects, such as gatorade's "water made active" campaign:




the same idea has also been used for video rendering. it doesnt look like much when the particles are really big and single colored, but if you provide millions of small particles, all individually colored, you can generate some really spectacular looking visuals as well (the waves in this gif are created purely from individual pixel particles)
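here's a tiny version of the same idea in code: every particle is just a position plus its own color, and the wave shape exists only because of where the particles happen to sit, there's no surface or mesh underneath (made-up wave function, obviously nowhere near the gif):

import math

WIDTH, HEIGHT, N_PARTICLES = 78, 20, 4000
SHADES = ".:-=+*#%@"                        # stand-in for per-particle color

for t in range(3):                          # a few "frames"
    frame = [[" "] * WIDTH for _ in range(HEIGHT)]
    for i in range(N_PARTICLES):
        x = i / N_PARTICLES * WIDTH
        # each particle rides two overlapping sine waves
        h = 0.6 * math.sin(0.3 * x + t) + 0.4 * math.sin(0.07 * x - 0.5 * t)
        row = int((h * 0.5 + 0.5) * (HEIGHT - 1))
        shade = SHADES[int((h * 0.5 + 0.5) * (len(SHADES) - 1))]
        frame[HEIGHT - 1 - row][int(x)] = shade
    print("\n".join("".join(r) for r in frame))
    print("-" * WIDTH)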



id be interested to see if this kind of construction can be applied to model construction in any way. it would require a whole new way of generating models, but i think motion capture and actual geometry capture, aka the same technology used for Rockstar's L.A. Noire facial animation, but scaled to encompass someone's entire body, would be how you do that. photogrammetry already reconstructs a cloud of points before it turns them into 3d polygons. imagine just keeping those points and applying them to particles instead!
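as a sketch of that "skip the mesh" step (the file name and format here are totally made up, just pretend the scan dumps one "x y z r g b" line per captured point):

# normal photogrammetry pipeline: photos -> point cloud -> triangle mesh -> textures
# the idea above: photos -> point cloud -> ...and just render the points as particles
def load_scan_as_particles(path):
    particles = []
    with open(path) as f:
        for line in f:
            x, y, z, r, g, b = map(float, line.split())
            particles.append({"pos": (x, y, z), "color": (r, g, b)})
    return particles

body_scan = load_scan_as_particles("full_body_scan.xyz")   # hypothetical scan file
print(len(body_scan), "particles, zero polygons generated")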

maybe thats the turn we'll take when quantum computing becomes a home technology. you're basically emulating cellular structures at this point. especially if particles can have individual properties applied to them

don't forget matthew has a diaper special interest

this is a really interesting post. the problem with voxel-based engines is that you have to apply physics to every single "atom" in the scene, and this is massively computationally intensive
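yeah, the numbers blow up really fast. quick back-of-envelope in python (the resolution and room size are made-up assumptions, just to show the scaling):

voxel_size_m = 0.001      # pretend every "atom" is a 1 mm voxel
scene_side_m = 10.0       # a 10 m x 10 m x 10 m room
frames_per_s = 60

voxels = (scene_side_m / voxel_size_m) ** 3          # 1e12 voxels
updates_per_s = voxels * frames_per_s                # 6e13 physics updates per second
print(f"{voxels:.0e} voxels, {updates_per_s:.0e} updates/sec at {frames_per_s} fps")

that's roughly an entire high-end gpu's worth of throughput just to touch each voxel once per frame, which is why voxel and particle engines only simulate the "active" parts of a scene and aggregate or fake the rest.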

this concept is super far into the future but it might be sooner than we think considering ray tracing is taking off

video games in the future will be spectacular