Author Topic: 2012/08/23 - Blockland r1713  (Read 22563 times)

Badspot definitely deserves a cookie for all his hard work.
 :cookie:
Really? He deserves more
:cookie: :cookie: :cookie:

screw cookies
He deserves a cake!

I don't have EXT_Texture3D :C

Shaders don't love people without it.

I don't have EXT_Texture3D :C

Shaders don't love people without it.
Maccccccc

No offence, but when you're trying to play video games on a computer with a freebie graphics card built into the CPU, you can't expect much performance when the industry standard cards are about $100. I know Blockland is not exactly extremely demanding, but come on.

That being said, I also have an integrated GPU, and while I can only run minimal shaders, I love the performance increases in the recent updates, and hope to see more in the future.
Wrong. I have a Galaxy 210 in my current computer; it's a budget card and only cost me 60 US dollars (normally it would cost around 80, but I got it on sale). It could run shaders at Minimum and Low well before r1701. I have yet to test it after r1701, but I expect it will be able to run them at Normal with a tad bit of FPS drop, and at Low with nearly no slowdown at all.

I have a higher-grade card in one of my other computers, but this one is built for Blockland (before v21, yeah, I need to upgrade it a bit). But yes, for today's games a card well above $100 would be preferred.

But a card around $80 would run Blockland at Normal shaders with little lag.

For those who think Intel is impossible to fix for shaders,
it's not. It's just a bug in the script that the Intels hate. If it is, there might be a way to run a virtual graphics card.


You have no idea what you're talking about, please stop talking.

For those who think Intel is impossible to fix for shaders,
it's not. It's just a bug in the script that the Intels hate. If it is, there might be a way to run a virtual graphics card.

you are stupider than me

Also false; I have an Intel chip that can run shaders fine, but not shadows.


No, that is far from it.  Intel graphics literally don't have the hardware to support the shaders on any basis.  Just stop trying to run shaders on grandma's email reader.
Having an Intel CPU doesn't imply that your computer is stuff.  My brother's computer is one of the fastest, most adaptable machines I'm aware of, and it's running on Intel.  I cannot run shaders, and it isn't because my video card is Intel, but because I'm running a Pentium 4 processor.  It isn't out-of-date software, so it shouldn't be treated as if it were.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814162080
This is $40 and can run them extremely well.
An Nvidia GeForce 9500 GT is something that should run shaders without a problem.  I cannot even see them because of the Intel issue.  I do not like that people are trying to justify this bug by simply stating that everyone's computer is stuff.  The ability to run shaders on high quality isn't the problem; it's the simple fact that some people can't even start it without them simply failing.
« Last Edit: August 25, 2012, 10:11:31 AM by Lalam24 »

http://www.newegg.com/Product/Product.aspx?Item=N82E16814162080
This is $40 and can run them extremely well.

Wow, that's a good price for that card.

Having an Intel CPU doesn't imply that your computer is stuff.  My brother's computer is one of the fastest, most adaptable machines I'm aware of, and it's running on Intel.  I cannot run shaders, and it isn't because my video card is Intel, but because I'm running a Pentium 4 processor.  It isn't out-of-date software, so it shouldn't be treated as if it were.
An Nvidia GeForce 9500 GT is something that should run shaders without a problem.  I cannot even see them because of the Intel issue.  I do not like that people are trying to justify this bug by simply stating that everyone's computer is stuff.  The ability to run shaders on high quality isn't the problem; it's the simple fact that some people can't even start it without them simply failing.

Intel makes good CPUs. Their GPUs are barebones processors designed for people who only need basic graphics capabilities, such as HD video. Sandy Bridge processors normally come with Intel HD 3000 graphics. These are nominally 1 GB graphics chips; however, they use system RAM, not video RAM, and suffer because of this. If you check the settings on an Intel graphics adapter, you'll see that the system only sees 64 MB of ACTUAL video RAM; the rest is crap. This is only about the RAM though; if someone knows some stats on the processors themselves, please post.

Badspot

  • Administrator
There is no Intel CPU issue.  All of my computers have Intel CPUs.  Intel CPUs are fine.  The problem is the woefully underpowered Intel GMA line of graphics chips.  The cpu and gpu are physically two different things.

Intel CPU = good
Intel GPU = stuff

Are we clear on this now? 

Can we still take megashots?

In the future, I would like a way of excluding some entries from trace. Nowadays, there are lots of add-ons with looping code in the background, including RTB. It really mucks up tracing, and having to either force part of it off or entirely disable the add-on every time you want to trace can be impractical, especially if you depend on another part of the add-on for some functionality.

There is no Intel CPU issue.  All of my computers have Intel CPUs.  Intel CPUs are fine.  The problem is the woefully underpowered Intel GMA line of graphics chips.  The cpu and gpu are physically two different things.

Intel CPU = good
Intel GPU = stuff

Are we clear on this now? 

Do you think the new Intel HD 4000 (I think from Ivy Bridge processors) would be able to run the shaders, or are the chips themselves just not powerful enough?

Do you think the new Intel HD 4000 (I think from Ivy Bridge processors) would be able to run the shaders, or are the chips themselves just not powerful enough?

http://en.wikipedia.org/wiki/Comparison_of_Intel_graphics_processing_units#Intel_HD_Graphics

I'm having a hard time reading that, but it looks better, although it's up to Badspot whether that makes sense. I don't know where VRAM is in there, or if it's listed at all.