Author Topic: Software engineering trends that annoy you  (Read 32629 times)

I'm not very smart when it comes to the inner workings of software engineering stuff, but what pisses me off is the fact that my workhub websites, which I've been using for the past 5 years, keep changing into more and more dogstuff versions of each other.

We managed to have one in the last 5 years I've worked here that actually worked and was an easy-to-navigate, functional hub for all my personal working data and assets, but it's gone, we've gotten like 2 or 3 different ones since, and they are a monster to use.

Worse yet, we have multiple websites we have to use for various things, and even those are changing now too. They just deleted everyone's extra-curricular data to move it to a new website and system and lost most people's stuff they'd had for years; I lost most of my certs and people lost their career packets because of it. It loving sucks, and I don't understand why everything gets changed up on like a bi-yearly basis for no reason, right when something finally seems fixed enough to be usable.

the text
it has literally nothing to do with the technologies underlying hashtag The Web. HTML, CSS, and JS themselves are all very performant today, and the first two in particular always have been as they existed at any given time (that is to say, HTML 2.0 performed well in the mid-90s when it was created, although modern HTML 5 may not on the same machines). JS has a more complicated history, but it's been fast for like 15 years, since the introduction of V8 (and now other fast runtimes). and in actual practice today, HTML and CSS are never going to be the bottleneck; they're both implemented with compiled languages in every case, and largely by people who are the most skilled at that stuff in the world, especially Blink because google has enough money to recruit whoever the hell they want. and the same goes for V8 obviously

the issue is in some of the frameworks used. and not in others! several modern ones are really fast, and don't even jeopardize developer experience for that sake, like Svelte, Solid, Preact, or Inferno (the last three of which are all React-inspired, not because React is fundamentally the best, but simply because React is popular and people know it and these frameworks aim to be used). the problem is in how frameworks are used, with a lack of care and attention (as i previously complained about). the problems come from state changing in components that contain other components, which in turn likely contain more, and so on and so forth. meaning that a single minor change can cause large sections of the page to be re-rendered almost from scratch, and as little as scrolling or moving your mouse around can sometimes trigger these changes on particularly badly-architected web apps
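
to make that concrete, here's a rough React-flavored sketch of the cascade (every component name here is made up, this isn't any particular app's code): because the mouse position lives at the top of the tree, every mousemove re-runs the whole subtree

Code:
import React, { useState } from "react";

const Sidebar = () => <nav>sidebar</nav>;
const Feed = () => <ul>feed</ul>;
const Settings = () => <form>settings</form>;
const Cursor = ({ pos }: { pos: { x: number; y: number } }) => (
  <span>{pos.x}, {pos.y}</span>
);

function Dashboard() {
  // state stored at the very top of the tree
  const [mouse, setMouse] = useState({ x: 0, y: 0 });
  return (
    <div onMouseMove={(e) => setMouse({ x: e.clientX, y: e.clientY })}>
      <Cursor pos={mouse} />
      {/* these subtrees never read `mouse`, but they re-render on every state change anyway */}
      <Sidebar />
      <Feed />
      <Settings />
    </div>
  );
}

export default Dashboard;

moving the mouse state down into a leaf component, or wrapping the unrelated subtrees in React.memo, keeps the churn contained. that's the kind of care and attention i mean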

memory usage is also related, because (again due to lack of care and attention) foolish developers will often replicate state over and over again; for example, if a bunch of components need access to some basic info about the user (which is common), they will all be given their own unique copy of the entire user object, because they write their code in naive but easy-to-write ways that result in them doing that instead of sharing a single copy of the object and simply referencing it from all of its users. now multiply this by every single piece of data an app may need, and every single component that may use some data, and you've got 300MB chrome tabs
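
as a sketch of what that duplication looks like (all names here are hypothetical, and fetchUser stands in for whatever call an app would actually make):

Code:
type User = { id: string; name: string; prefs: Record<string, unknown> };

async function fetchUser(): Promise<User> {
  // stand-in for a real API call
  return { id: "u1", name: "example", prefs: { theme: "dark" } };
}

// naive version: every component fetches and keeps its own private copy,
// so N components means N copies of the user sitting in memory
class Header   { user?: User; async init() { this.user = await fetchUser(); } }
class Sidebar  { user?: User; async init() { this.user = await fetchUser(); } }
class Settings { user?: User; async init() { this.user = await fetchUser(); } }

// shared version: fetch once, every consumer references the same object
let sharedUser: User | undefined;
async function getUser(): Promise<User> {
  if (!sharedUser) sharedUser = await fetchUser();
  return sharedUser;
}

same data either way; one approach holds a copy per component (plus a network round trip for each), the other holds exactly one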

avoiding these issues requires extra work, yes, but honestly not that much. the faster frameworks I mentioned before include optimizations to minimize the impact of these stupid, thoughtless choices, but that can only go so far. some of them have different designs that specifically discourage you from doing that kind of thing, but again, in the hands of a lazy developer, any framework can be made slow. Svelte in particular does that discouragement. I can't really speak for the others I mentioned, but their decision to take after React doesn't bode well as far as that goes frankly



as for what you said about developers being reasonable not to care, I dunno. I don't really buy that. having a bad boss doesn't excuse you from making irresponsible decisions that negatively affect other people; the stakes may (sometimes!!) be lower for developers than for doctors or something for instance, but I don't think it's a good justification nonetheless

but perhaps more importantly, that is not an issue directly with web development at all. bad developers, regardless of why they're bad, will create bad software whether it's a web app or a native ui app or an operating system. that is not a web development problem. I think the issue is that web development being relatively easy to enter (which is a good thing in and of itself) means it's also easier for idiots and people who just don't give a stuff about anything to enter it. people writing rust are more likely to care in the first place because you don't really get into rust by convenience

and to be clear I'm not saying that every developer must be passionate about software. there is a big difference between the "passion" that bad recruiters look for and a fundamental concern for how your work impacts other people. we don't expect doctors to be passionate about fixing people's respiratory diseases, but we do expect them to want to help their patients above all else



to state my ideology explicitly, I think all people should care about the work that they do to the degree that it impacts other people*. I do have more sympathy for, for example, people working at mcdonalds for pay that is less than the HHS's poverty guidelines making the same stupid burgers every day. they can get some extra forgiveness for being "lazy" about what they do, making mistakes (so long as they aren't food safety-related; endangering other people crosses a line). but developers making 80k+ don't get that leeway in my eyes

*not investors, managers, etc. forget those guys. I'm talking about customers, clients, etc
« Last Edit: April 10, 2024, 01:53:14 AM by Foxscotch »

snip
tbh as my domain is exclusively in native applications, I know nearly nothing about that field so I'll just take your word for it.

Because those applications are bloated and do use a lot of memory. All the electron-based apps running on my system right now consume at least 300 MB of memory each.
I don't think that's entirely the fault of the end developers though because Electron itself is the issue, namely every app spinning up its own Chromium instance.

Does a rendering engine plus HTML and CSS parsing and a JS interpreter/JIT compiler really need 300 MB of memory? I highly doubt it. If by chance they do, then I think it's high time to supersede these old standards.
I know you are not a web developer but I'll be blunt:

300mb of memory is a lot, sure, but guess what: I don't care.

The average smartphone has 4GB of ram. This means that even if the operating system takes up *half* of the RAM at any given moment, the user can still run 6 of these apps simultaneously (2GB left ÷ ~300mb per app ≈ 6).

And that's on the average phone! No one is running these apps all at once!

Like I said previously, the ease of use and rapid deployment outweighs being able to multitask 1304 apps locally, which no one does. I wouldn't be surprised if in the future we saw a native OS/browser sandbox environment so apps don't have to ship with a web browser.

Idk one of the things keeping me far away from web development is the constant cyclical fad of new frameworks to solve the never-ending issues of web development - largely caused by the shortcomings of Javascript.
This is an outdated stereotype from a time when modern ECMAScript standards didn't exist and the entire industry wasn't behind React. Web standards are very well documented and set in stone, a lot of the really annoying stuff is deprecated, and JavaScript is no longer the only language people develop with on the web. (I'll expand on this in my next point)
I'm gonna play devil's advocate and say that if I were working as a web developer in an environment where the only thing my manager/boss/employer cares about is getting a product to launch as quickly as possible (especially with how cutthroat it is now), I probably wouldn't care much about performance either.
I think the only way around this is to change the tools and standards to ones that can perform better in that kind of industry.

If even the best JS interpreters and HTML+CSS renderers are too slow then it's probably an issue with the languages themselves and we should start looking for a new replacement.
Maybe a scripting and a markup language that are quicker to parse and interpret? What about having them be compiled to bytecode and binary formats? Who knows.
Your idea that JS interpreters and HTML+CSS rendering are slow comes from another era; this is no longer the case. Web browsers have literally become one of the most optimized pieces of software ever made. I remember looking at the stats for how fast JS has become in every engine available and being blown away by how far we've come. Your statement that "we should be looking for new replacements" is fairly ignorant because people have been pouring blood, sweat, and tears into optimizing the web. And guess what? You can run bytecode! You can run binaries!

Every major browser supports WebAssembly! You can precompile any language and run it in a sandbox on the client-side just like JS.

This further proves my point that, as a standard for creating cross-platform reactive UIs, the web platform is a no-brainer. You can literally do anything you want with it, and make it as fast as you want. In fact, many websites and high-performance web-based tools use WebAssembly to make their stuff fast. For example: the design tool Figma is a web application whose rendering engine is written in C++ and compiled to WebAssembly. It was so good that it became the main rival to Adobe's own design tools; in fact, Adobe almost bought it.
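
To show how low the barrier actually is, here's a minimal sketch of the standard WebAssembly JS API. The "module.wasm" file and its exported "add" function are hypothetical, but the API calls themselves are the real thing:

Code:
// fetch a precompiled .wasm module (built from C, C++, Rust, whatever),
// instantiate it in the same sandbox JS runs in, and call an exported function
async function runWasm(): Promise<void> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("module.wasm"), // hypothetical module
    {}                    // imports the module expects, if any
  );
  const { add } = instance.exports as { add: (a: number, b: number) => number };
  console.log(add(2, 3)); // near-native speed, fully sandboxed
}

runWasm();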

The reason apps are slow is that companies prioritize profits and quick delivery over making good things. This is not unique to web development. It's just a lot more visible because people use the web a lot.

300mb of memory is a lot, sure, but guess what: I don't care.
ok i decided i dont care enough to articulate this thought properly on the blockland forums but that's cringe and exactly what I'm talking about and it will be the downfall of human civilization in 15 years
« Last Edit: April 26, 2024, 11:36:27 PM by Foxscotch »

ok i decided i dont care enough to articulate this thought properly on the blockland forums but that's cringe and exactly what I'm talking about and it will be the downfall of human civilization in 15 years
to be charitable, there's probably a point to be made about overconsumption or some stuff but I really don't think it's that serious. I think that a lot of developers really drill into their brains that optimization is the most important part of software development and I don't think it's required that often. I've worked with too many developers that spend their time micro-optimizing things that don't need to be. Sometimes good enough is good enough.

THAT BEING SAID, I do think that a lot of websites and apps are slow as stuff, not because the developers want them to be, but because the pressures of capitalism make them deliver stuff products to appease middle-management. If stuff is really slow and unusable, it should be optimized. I'm not anti-optimization, I'm anti time-wasting, and a lot of companies make developers waste everyone's time by not optimizing while others waste time by making people optimize things that don't need it.

I would say one of the more important steps in software development is hardening for cybersecurity, and that gets overlooked about as frequently as optimization.

There's a problem in the industry of mature, popular products becoming bloated with unnecessary features and getting constant meaningless UI refreshes.
At large tech companies, to get promoted, you need to demonstrate "impact" with something lovey like a product launch or a UI refresh. Other important things, like tackling technical debt, or optimization, or improving testing, or anything else prophylactic that isn't particularly visible, aren't weighted highly in performance or promo evaluations.
I'm not sure what could be done about this; it might just be a capitalism problem. After all, it's often hard to quantify what would've happened if X testing/refactoring/optimization hadn't occurred.

OTOH maybe I shouldn't complain - I'm fine with some slightly annoying UI refreshes if the alternative is the whole front-end team getting fired lol

to be charitable, there's probably a point to be made about overconsumption or some stuff but I really don't think it's that serious. I think that a lot of developers really drill into their brains that optimization is the most important part of software development and I don't think it's required that often. I've worked with too many developers that spend their time micro-optimizing things that don't need to be. Sometimes good enough is good enough.
i think the most offensive part of what you said was that a website requiring 300mb of ram to run is acceptable or normal. i think that's unjustifiable outside of extremely rare and specific circumstances (eg running some full-fledged video game or youtube), and the fact you didn't think so speaks volumes: to onlookers, you're part of the problem with modern website performance, whether or not you want to be. i definitely recoiled at that number myself, but didn't deem it important enough to respond at the time.
« Last Edit: April 30, 2024, 06:20:10 PM by Conan »

There's a problem in the industry of mature, popular products becoming bloated with unnecessary features and getting constant meaningless UI refreshes.
this is so true. first thing that comes to mind is windows itself. not to mention countless other programs that have done this very same stuff. core functionality should be top priority.

I know you are not a web developer but I'll be blunt:

300mb of memory is a lot, sure, but guess what: I don't care.

The average smartphone has 4GB of ram. This means that even if the operating system takes up *half* of the RAM at any given moment, the user can still run 6 of these apps simultaneously (2GB left ÷ ~300mb per app ≈ 6).

And that's on the average phone! No one is running these apps all at once!

Like I said previously, the ease of use and rapid deployment outweighs being able to multitask 1304 apps locally, which no one does. I wouldn't be surprised if in the future we saw a native OS/browser sandbox environment so apps don't have to ship with a web browser.
You know well that smartphones are not the devices we are talking about here. Nobody is going to gripe about devices which are by design not intended for multitasking. It's entirely apples and oranges.
On desktop, users will be running multiple applications all at once (not counting the two to three digits' worth of daemons/services running in the background), most of which will not have the luxury of writing application state to disk when it's not in use. It simply does not work that way.

Having your webapp or website consume 100 MB to 2 GB of memory just because "everyone has at least x GB of memory" is absolutely inexcusable and is an awful mindset for any developer to have. It's also inconsiderate to the end-user, because they will want to use other applications without having to throw down $250+ for a new set of 32 GB DIMMs. But as you said: you do not care.

This is an outdated stereotype from a time when modern ECMAScript standards didn't exist and the entire industry wasn't behind React. Web standards are very well documented and set in stone, a lot of the really annoying stuff is deprecated, and JavaScript is no longer the only language people develop with on the web. (I'll expand on this in my next point)
Your idea that JS interpreters and HTML+CSS rendering are slow comes from another era; this is no longer the case. Web browsers have literally become one of the most optimized pieces of software ever made. I remember looking at the stats for how fast JS has become in every engine available and being blown away by how far we've come. Your statement that "we should be looking for new replacements" is fairly ignorant because people have been pouring blood, sweat, and tears into optimizing the web. And guess what? You can run bytecode! You can run binaries!
And the fruits of the 3 have manifested in busy CPUs and monopolized memory space.

JS has been JITted to hell and back and yet still lags significantly behind other scripting languages like Lua and Squirrel. Sure, you can throw every last SIMD instruction, data-oriented design technique, -Ox flag, and inlined function at the problem for both the parsing and bytecode evaluation, but the end results still speak for themselves.

I still maintain that HTML+CSS are an awful bunch. Ever tried writing a parser for HTML? If not, ask someone who has whether they had fun doing it. Better yet, interview people who have tried to parse it with regular expressions. The DOM in general is an awful way to lay out information on a screen. It worked well when simple static, linear pages ruled supreme, but it's horribly inefficient for the dynamism of the modern web. CSS is just a giant band-aid over the problem.
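
For anyone who hasn't had the pleasure, here's a toy illustration (not anyone's real parser) of why the regex approach falls apart the moment elements nest:

Code:
// the naive pattern stops at the first closing tag it sees, so nested
// elements come out mangled
const html = "<div>outer <div>inner</div> tail</div>";
const naive = /<div>(.*?)<\/div>/;
console.log(html.match(naive)?.[1]); // prints "outer <div>inner", which is wrong
// a real parser has to track nesting, implied end tags, error recovery,
// foreign content like svg/mathml, and so on, which is why nobody enjoys writing one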

And just because a bunch of developers put in their blood, sweat, and tears does not exclude it from consideration for replacement.
OpenGL was arguably the worldwide standard API for 2D and 3D rendering for years; many man-hours were put into extending it and optimizing drivers for it, and it worked well for what was required of it. Nonetheless, effort went into a replacement that better matched modern hardware than the fixed-function pipelines OpenGL was originally designed around 30 years ago, which is how we got the infinitely better Vulkan API. You can say the exact same about the X Window System and Wayland too.

Every major browser supports WebAssembly! You can precompile any language and run it in a sandbox on the client-side just like JS.
WASM's IR is a step forward but still lags behind Oracle's Java bytecode and Microsoft's CIL in the few benchmarks I've seen. I don't hold that against it though, as it's still young and has a lot more time to get better.

The reason apps are slow is that companies prioritize profits and quick delivery over making good things. This is not unique to web development. It's just a lot more visible because people use the web a lot.
This was what I was trying to get at. Clearly there's a gap between the tools and how they're actually being used, given the mountain of issues plaguing web development. I doubt either the developers or the companies are going to budge, so changing the tools to fit the demands of the modern web is the clearest option, at least in my eyes.


I know I'm probably making enemies with all the web developers reading this. Sorry, I'm from the other side of town.

You know well that smartphones are not the devices we are talking about here. Nobody is going to gripe about devices which are by design not intended for multitasking. It's entirely apples and oranges.
On desktop, users will be running multiple applications all at once (not counting the two to three digits' worth of daemons/services running in the background), most of which will not have the luxury of writing application state to disk when it's not in use. It simply does not work that way.

Having your webapp or website consume 100 MB to 2 GB of memory just because "everyone has at least x GB of memory" is absolutely inexcusable and is an awful mindset for any developer to have. It's also inconsiderate to the end-user, because they will want to use other applications without having to throw down $250+ for a new set of 32 GB DIMMs. But as you said: you do not care.
And the fruits of the 3 have manifested in busy CPUs and monopolized memory space.
My point was: if the average phone can run many of these applications at once, then the average computer should be even better at it. Unless you are implying that desktop computers are somehow on average worse??

Remember, we are talking about a framework that is made for creating interactive UIs: most users cannot context switch between like 8 different graphical apps at once. I'll be running 4 MAYBE 5 different electron apps at once. Hell, even in worst case scenarios (running 10 vscode instances because ??) I've been able to use dozens of instances of these programs with no problem.

I don't think a few hundred megabytes is that big of a deal anymore when the average image on the internet is like 50mb.

JS has been JITted to hell and back and yet still lags significantly behind other scripting languages like Lua and Squirrel. Sure, you can throw every last SIMD instruction, data-oriented design technique, -Ox flag, and inlined function at the problem for both the parsing and bytecode evaluation, but the end results still speak for themselves.

Can you give me the source for this claim? I cannot find anything saying JS is significantly behind every interpreted language.

Almost every single recent article and benchmark shows that the V8 engine behind Chrome and NodeJS makes JS one of the fastest interpreted languages (and I haven't even mentioned the Bun runtime). Your knowledge on the subject seems to be lagging by like 5 years. Lua and LuaJIT have similar or worse performance.

This is my criticism of people who still complain about things like this: JS has already come out of its dark age and has emerged as THE web standard, but people still harp on about stuff that was fixed ages ago. The V8 engine for JS literally has teams of people at Google working around the clock to make it the fastest interpreted language, because they have the incentive to make their browser the fastest. If what you said were true, what would've prevented another company from making a browser with a different scripting language and then swaying the W3C to change the standard because their stuff is faster?

And again, all these benchmarks are incredibly close. The people who obsess over them are micro-optimizing code. The average website doesn't need a 4% speed boost from using C or C++, and the ones that do need it for graphics performance can just use WASM.
Quote
WASM's IR is a step forward but still lags behind Oracle's Java bytecode and Microsoft's CIL in the few benchmarks I've seen. I don't hold that against it though, as it's still young and has a lot more time to get better.
It's fairly new but it's becoming really good at an alarming rate because of the potential upsides. The value proposition of sandboxed apps running on every platform ever is a really good incentive to work on it.

Quote
This was what I was trying to get at. Clearly there's a gap between the tools and how they're actually being used, given the mountain of issues plaguing web development. I doubt either the developers or the companies are going to budge, so changing the tools to fit the demands of the modern web is the clearest option, at least in my eyes.
I know from your perspective this statement seems like it makes sense, but there are a lot of problems with it. This is like saying "The Unreal Engine editor is hard to use and it's really easy to make slow games with it, so instead of improving the UI and patterns to fit what the devs need, let's replace it overnight with Unity and make all the devs port everything to it."

The real issue behind devs not optimizing stuff is that it is really easy to get into web development, which leads to an oversaturation of bad developers. The tools for the job were also really poorly made up until ~5-7 years ago, and a lot of the bad developers never updated their skills. For example, a lot of developers still reimplement their own "deep object cloning" algorithms or import one from a library (ballooning their memory footprint) when a function to deep-clone objects has been part of the web APIs for years now. These kinds of development mistakes will not stop happening until the developers get better and the resources on the internet get updated. JavaScript is unfortunately plagued by the fact that most of the content on the internet about it is wrong/outdated (or just lies), and that breeds an environment that creates bad developers.
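
To make the cloning example concrete, the built-in in question is structuredClone (a global in modern browsers and Node); the object below is just made-up data:

Code:
const user = { id: "u1", prefs: { theme: "dark" }, history: [1, 2, 3] };

// the built-in: a real deep clone, no library needed
const copy = structuredClone(user);

// the patterns it replaces:
const jsonCopy = JSON.parse(JSON.stringify(user)); // mangles Dates, drops Maps, Sets, undefined, etc.
// import cloneDeep from "lodash/cloneDeep";       // or: pull in a whole extra dependency
// const lodashCopy = cloneDeep(user);

console.log(copy !== user && copy.prefs !== user.prefs); // true: fully copied
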
« Last Edit: May 02, 2024, 09:58:19 PM by Aide33 »

i think the most offensive part of what you said was that a website requiring 300mb of ram to run is acceptable or normal. i think that's unjustifiable outside of extremely rare and specific circumstances (eg running some full-fledged video game or youtube), and the fact you didn't think so speaks volumes: to onlookers, you're part of the problem with modern website performance, whether or not you want to be. i definitely recoiled at that number myself, but didn't deem it important enough to respond at the time.
I think you misunderstood my point in context. Let me clarify:

In the context of the conversation, I was talking about Electron apps. Stand-alone single-page webapps require a lot more resources than the average website. Apps that run standalone on your computer should obviously have way more leeway in how they operate, and that should be taken into account. For example, it's not unheard of for Photoshop, text editors, IDEs, etc. to go above 300mb. They need to constantly cache things like documents, files, and web requests. In fact, the instances of terminal I have open right now are sitting at like 80-100mb of RAM! It sounds horrible for a text-based application, but it's the cost of being able to scroll up through the entire history since I opened it! RAM is meant to be used to make the user experience snappy and useable (within reason); imagine if every time I scrolled up 100 lines in vscode it had to hit the HDD to read more of the file, that stuff would be slow as forget.

I wholeheartedly agree with you when saying a normal CRUD website (like a forum or a wiki) should never ever ever use more than 10mb of ram, 300mb is insane for that scenario.
« Last Edit: May 05, 2024, 05:14:38 PM by Aide33 »