Topic: Physics override override.

Is there any way to force the physics not to cut out when framerates take a turn for the worse?

Do you mean turning the auto-disable off?

Random: have you tried turning down the time scale? No clue if it works, but it might help.
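
For what it's worth, the idea maps onto how engines step physics: scale down the delta time you hand the engine, and each frame simulates less. A minimal sketch in Bullet terms (which, as it turns out later in the thread, is the engine behind brick physics); the stepScaled helper and timeScale knob are illustrative, not an actual Blockland hook:

Code:
#include <btBulletDynamicsCommon.h>

// Illustrative helper (not a real Blockland hook): shrinking the delta
// time fed to the engine means fewer fixed substeps per frame, so the
// physics does less work, at the cost of running in slow motion.
void stepScaled(btDiscreteDynamicsWorld* world, btScalar dtSeconds, btScalar timeScale)
{
    // Bullet chops the (scaled) delta into fixed 1/60 s substeps,
    // capped at maxSubSteps.
    world->stepSimulation(dtSeconds * timeScale, /*maxSubSteps=*/10);
}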

Quote: Is there any way to force the physics not to cut out when framerates take a turn for the worse?
I'm pretty sure it's hardcoded in a different language, I think PhysX. Not changeable as far as I know.

Quote: I'm pretty sure it's hardcoded in a different language, I think PhysX. Not changeable as far as I know.
PhysX isn't a language.  C++ is.

Going off topic: I don't even think the actual PhysX hardware exists anymore. I know Nvidia took over support for the drivers and the games that used it; now it comes standard in any modern video card.
That's what happens when third-party companies try to make expansion hardware: it's instantly outdated and unneeded the second they throw it on a shelf.

Quote: Going off topic: I don't even think the actual PhysX hardware exists anymore. I know Nvidia took over support for the drivers and the games that used it; now it comes standard in any modern video card. That's what happens when third-party companies try to make expansion hardware: it's instantly outdated and unneeded the second they throw it on a shelf.
PhysX is still being developed.

No, it got bought out. They're dead. The physics engine is too old-school already; it exists in name only to carry the fanboys. The old drivers are already ignored by Nvidia's newer drivers and hardware.

It's called Nvidia PhysX now, isn't it?

Eh... physics... it was exciting for the first, like, two servers... now it's dead to me, not exciting anymore.

PhysX still exists.  Nvidia developed a programmable architecture for their cards known as CUDA, which allows the physics hardware acceleration (PhysX) to be performed on the GPU instead of requiring separate hardware.  PhysX is the only hardware physics acceleration package I know of, which really gives Nvidia a leg up in my opinion.  However, Blockland's brick physics don't make use of CUDA / PhysX.
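
For the curious, in the modern PhysX SDK the CUDA offload is opt-in per scene. A rough sketch using the 4.x-era API, which is an assumption here (it postdates the separate PPU cards being discussed, and again, Blockland doesn't use any of this):

Code:
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator gAllocator;
static PxDefaultErrorCallback gErrorCallback;

// Sketch of enabling GPU rigid-body simulation in PhysX 4.x (an assumed
// SDK version, not anything Blockland uses).
PxScene* makeGpuScene()
{
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxCudaContextManagerDesc cudaDesc;
    PxCudaContextManager* cuda = PxCreateCudaContextManager(*foundation, cudaDesc);

    PxSceneDesc desc(physics->getTolerancesScale());
    desc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    desc.filterShader = PxDefaultSimulationFilterShader;
    desc.cudaContextManager = cuda;                    // hand PhysX a CUDA context
    desc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;   // rigid bodies on the GPU
    desc.broadPhaseType = PxBroadPhaseType::eGPU;      // GPU broadphase too
    return physics->createScene(desc);
}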

Now, to address the topic:

Quote: Is there any way to force the physics not to cut out when framerates take a turn for the worse?

Badspot intentionally gives people as few ways as possible to grind the game to a halt. Brick physics will always be disabled once your computer starts to show a noticeable performance drop. You might see some tweaks in a future version which make the physics more efficient; otherwise, get better hardware.
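
To illustrate the concept (purely a sketch, not Badspot's actual code): physics stays on while a smoothed frame time stays under a budget and cuts out once it's exceeded.

Code:
// Purely illustrative sketch of a performance-based auto-disable, not
// Blockland's actual logic.
class PhysicsAutoDisable {
public:
    explicit PhysicsAutoDisable(double budgetMs = 33.0) // ~30 FPS floor (assumed number)
        : budgetMs_(budgetMs) {}

    // Call once per frame with the measured frame time in milliseconds.
    bool shouldRunPhysics(double frameMs)
    {
        // Exponential moving average smooths out one-frame spikes.
        avgMs_ = 0.9 * avgMs_ + 0.1 * frameMs;
        return avgMs_ < budgetMs_;
    }

private:
    double budgetMs_;
    double avgMs_ = 0.0;
};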

I thought Blockland ran on the Bullet physics engine? I keep getting Bullet "AABB overflow" errors or similar things in my console.

Quote: I thought Blockland ran on the Bullet physics engine? I keep getting Bullet "AABB overflow" errors or similar things in my console.

Yeah, brick physics run on Bullet.
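
Those AABB overflow messages are Bullet's sanity check: when a body's bounding box blows past the broadphase's numeric limits (typically because it has flown very far from the origin, or its position went NaN), Bullet warns and drops it from the simulation. Here's a hedged sketch of the kind of guard a game could apply before that happens; the function and radius are hypothetical, not Blockland's code:

Code:
#include <btBulletDynamicsCommon.h>

// Hypothetical guard (not Blockland's actual code): put bodies to sleep
// once they wander past a sanity radius, before Bullet's own overflow
// check removes them from the simulation entirely.
void deactivateRunawayBodies(btDiscreteDynamicsWorld* world, btScalar maxRadius)
{
    for (int i = world->getNumCollisionObjects() - 1; i >= 0; --i) {
        btCollisionObject* obj = world->getCollisionObjectArray()[i];
        if (obj->getWorldTransform().getOrigin().length() > maxRadius) {
            obj->setActivationState(DISABLE_SIMULATION);
        }
    }
}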

Quote: Badspot intentionally gives people as few ways as possible to grind the game to a halt. Brick physics will always be disabled once your computer starts to show a noticeable performance drop. You might see some tweaks in a future version which make the physics more efficient; otherwise, get better hardware.
This is all well and good, but I feel like my computer could do more. It handles a lot of new games really well, but the physics cut out whenever bricks move far away from where they started.

Quote: Yeah, brick physics run on Bullet.
Bullet does have a CUDA API, though, even if it's not fully mature yet.