I know you are not a web developer but I'll be blunt:
300mb of memory is a lot, sure, but guess what: I don't care.
The average smartphone has 4GB of ram. This means if the operating system takes up *half* of the RAM at any given moment, the user can run 6 apps simultaneously.
And that's on the average phone! No one is running these apps all at once!
Like I said previously, the ease of use and rapid deployment outweigh being able to multitask 1304 apps locally, which no one does. I wouldn't be surprised if in the future we saw a native OS/browser sandbox environment so apps don't have to ship with a web browser.
You know well that smartphones are not the devices we are talking about here. Nobody is going to gripe about devices which are by design not intended for multitasking. It's entirely apples and oranges.
On desktop, users will be running multiple applications (not counting the upwards of two to three digits' worth of daemons/services running in the background) all at once, most of which do not have the luxury of writing application state to disk when not in use. It simply does not work that way.
Having your webapp or website consume 100 MB to 2 GB of memory just because "everyone has at least x GB of memory" is absolutely inexcusable and an awful mindset for any developer to have. It's also inconsiderate to end-users, who will want to use other applications without having to throw down $250+ for a new set of 32 GB DIMMs. But as you said: you do not care.
This is an outdated stereotype from a time when ECMA standards didn't exist and the entire industry wasn't behind React. Web standards are very well documented and set in stone, a lot of the really annoying stuff is deprecated, and JavaScript is no longer the only language people develop with on the web. (I'll expand on this in my next point)

Your idea that JS interpreters and HTML+CSS rendering are slow comes from another era; this is no longer the case. Web browsers have literally become one of the most optimized pieces of software ever made. I remember looking at the stats for how fast JS has become in every engine available and being blown away by how far we've come. Your statement that "we should be looking for new replacements" is fairly ignorant because people have been pouring blood, sweat, and tears into optimizing the web. And guess what? You can run bytecode! You can run binaries!
And the fruits of all that blood, sweat, and tears have manifested in busy CPUs and monopolized memory space.
JS has been JITted to hell and back and yet still lags significantly behind other scripting languages like Lua and Squirrel. Sure, you can throw every last SIMD instruction, data-oriented design technique, -Ox flag, and inlined function at the problem for both the parsing and bytecode evaluation, but the end results still speak for themselves.
I still maintain that HTML+CSS are an awful bunch. Ever tried writing a parser for HTML? If not, ask someone who has whether they had fun doing it. Better yet, interview people who have tried to parse it with regular expressions. DOMs in general are an awful way to lay out information on a screen. They worked well when simple, static, linear pages ruled supreme, but are horribly inefficient for the dynamism of the modern web. CSS is just a giant band-aid over the problem.
And just because a bunch of developers poured their blood, sweat, and tears into it does not exempt it from consideration for replacement.
OpenGL was arguably the worldwide standard API for 2D and 3D rendering for years; many man-hours were put into extending it and optimizing drivers for it, and it worked well for what was required of it. Nonetheless, effort was still put into a replacement that better matched modern hardware than the fixed-function pipelines it was originally designed around 30 years ago, which is how we got the infinitely better Vulkan API. You can say the exact same about the X Window System and Wayland.
Every major browser supports WebAssembly! You can precompile any language and run it in a sandbox on the client-side just like JS.
WASM's IR is a step forward, but from the few benchmarks I've seen it still lags behind Oracle's Java bytecode and Microsoft's CIL. I don't hold that against it, though; it's still young and has plenty of time to get better.
The reason apps are slow is that companies prioritize profits and quick delivery over making good things. This is not unique to web development. It's just a lot more visible because people use the web a lot.
This is what I was trying to get at. Clearly there's a gap between the tools and how they're actually being used, given the mountain of issues plaguing web development. I doubt either the developers or the companies are going to budge, so changing the tools to fit the demands of the modern web is the clearest option, at least in my eyes.
I know I'm probably making enemies with all the web developers reading this. Sorry, I'm from the other side of town.