Hacker News

I suspect what will doom any such project is that it won't be designed for the primary use case that made HTML/JS succeed: teenagers with nothing more than Notepad.exe and a search engine. It will be made super elegant and efficient, and no one will use it because it will take more than three lines to make text appear, and a typo will result in a bunch of errors. HTML/JS are garbage languages and web standards are cobbled-together trash, but their strength is in their utter simplicity and in how forgiving browsers are in interpreting them. If you don't have that, you've already lost.


I think the effect of teenagers with Notepad is greatly exaggerated. IMO, the reason HTML/JS succeeded is that it was (and still is) literally the only option available if you want universal discovery and security. Software distribution logistics used to be a nightmare compared to an application that only requires the user to visit a website. It also eliminated an entire class of UX issues with regard to application stability, OS compatibility, and over the air updates. Additionally, as the general population became more educated about the risks of executing random binaries downloaded from the internet, and the features/performance available to browsers simultaneously exploded, HTML/JS became the de facto application platform of contemporary computing.

The eventual advent of the iPhone and the subsequent explosion of the App Store also demonstrate that ease of entry is unimportant. The iOS platform was and still is the public platform with the highest barrier to entry, yet none of that matters, because getting secure software distributed easily to users is the ultimate killer feature.


> It also eliminated an entire class of UX issues with regard to application stability, OS compatibility, and over the air updates.

Did it? Web applications are not inherently more (or less) stable than native applications, and OS compatibility issues were replaced with browser compatibility issues.


> Web applications are not inherently more (or less) stable than native applications

Absolutely. The browsers are extremely stable and generally don't crash. The individual applications may still be bug-prone, but those bugs generally don't take down the system.

> OS compatibility issues were replaced with browser compatibility issues

Sure, but it's a much less severe problem (the potential for bugs rather than binary incompatibility), and one that developers can easily remedy based on visitor statistics without customers having to download, update, or take any action. This problem also becomes much less of a problem every day as the browsers converge on common standards.


> This problem also becomes much less of a problem every day as the browsers converge on common standards.

Only Firefox and Chromium are able to keep up. It's an improvement over native apps working only on Mac OS and Windows, since both browsers are mostly open source, but it's still only two platforms.


It doesn't matter because when a webapp crashes, it doesn't bring down your whole machine (or these days, even your whole browser).

The killer feature is effective per-app sandboxing.


What? Native app crashes stopped bringing down the machine decades ago. Sure, there's the occasional bug in the OS (= kernel and system services), but browsers have bugs too.


Yeah the barrier to entry with HTML/JS is so low for anyone to make a thing that the whole world can see and frankly ... that's kinda awesome.

From there you can just be a simple site or scale up to some really impressive applications with a TON of free resources available on the internet. That's pretty amazing IMO.

As a webdev who has been dipping my toes into C#... I fire up a new command line application in VS and the fans on my laptop take off and there's just a lot of boilerplate ... come on man.


But your C# application is debuggable, unlike those "impressive applications" written in JavaScript with thousands of async callbacks, where you have no idea where the originating problematic call came from in your minified JavaScript. And even if you find it, you still have to deal with minified code that is unreadable.
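The classic failure mode can be shown with a toy sketch (all names here are invented for illustration; this is not any real API): a callback scheduled for later runs only after its caller's stack frame is gone, so the error trace no longer names the originating call.

```javascript
// Toy sketch of callback-style async losing the originating call site.
// We mimic an event loop synchronously: handlers are stored, the code that
// registered them returns, then the "loop" invokes them from the top level.
const queue = [];

function request(cb) {   // stand-in for e.g. an XHR/fetch wrapper
  queue.push(cb);        // schedule for later; the caller's frame will be gone
}

function startApp() {    // the "originating" call we'd want to see in traces
  request(function onReply() {
    throw new Error("server said no");
  });
}

startApp();              // returns; its stack frame is popped

let stack = "";
try {
  queue.shift()();       // the fake event loop fires the callback
} catch (err) {
  stack = err.stack;
}

console.log(stack.includes("onReply"));  // true: the callback itself is named
console.log(stack.includes("startApp")); // false: the originating call is gone
```

This is the problem that async/await (together with the async stack traces modern V8-based runtimes record) largely mitigates: awaited frames are stitched back into the trace.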

What boilerplate are you talking about in C#? If you think that's bad, you should try something like C++ (although it's gotten far better in recent years), and I say this as a fan of C++.

C# is compiled ahead of time, JavaScript is not. Your C# application costs high CPU to compile ONCE but forevermore requires half the resources of JavaScript in webpages, which must be parsed and compiled again and again and again on every load.

This is worth bearing in mind for energy usage, and it has a significant impact on our future if we just default to "easy to write, expensive to run" web languages.

I'd say C# is easier to write than JavaScript and is easier to debug for the most part.


I do agree that debugging async JS code is tricky depending on how your calls are structured. However, JavaScript is definitely debuggable. Stepping through minified code is a solved problem thanks to sourcemaps.
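To make the sourcemap point concrete, here is a hand-written toy (the file names and identifiers are invented): a build tool appends a magic comment to the minified bundle, and devtools use the referenced JSON map to translate positions back to the original source.

```javascript
// The minified bundle (app.min.js) ends with a magic comment like:
//
//   //# sourceMappingURL=app.min.js.map
//
// and the .map file is JSON following the source map v3 format.
// A toy, hand-written example of its shape:
const exampleMap = {
  version: 3,                  // source map spec version
  file: "app.min.js",          // the generated (minified) file
  sources: ["src/app.js"],     // the original source files
  names: ["renderPage"],       // original identifiers, so renamed vars map back
  mappings: "AAAA,SAASA",      // Base64 VLQ-encoded position pairs
};

// Devtools read this to map a position in app.min.js back to src/app.js,
// so breakpoints and stack frames show the original, readable code.
console.log(exampleMap.sources[0]); // "src/app.js"
```

Node can apply the same translation to stack traces at runtime with the `--enable-source-maps` flag.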


I get the differences, I'm not making an argument for any single language to be dominant or anything.

I'm not learning C# because I think it is bad ;)

As for debuggable, I'm not sure I know enough about C# to comment on that but I find JavaScript ... "debuggable".


Some things that are going to impress you down the line with C# debugging: the parallel stacks view for debugging multithreaded apps (without it this can be significantly harder), remote debugging (sometimes you simply can't debug locally for practical reasons, e.g. your app interfaces with something expensive or impossible to deploy on your dev machine), the Watch window, which lets you evaluate complex LINQ queries and even expression trees on the fly, and attaching the debugger to external assemblies. VS is heavy on resources, but it delivers value, unlike a gazillion JS apps.


Now with Quarkus on the JVM (and to some degree also ASP.NET), compiled languages have become so lightweight that I feel the last reasons for using problematic languages like PHP, JavaScript, and Python are disappearing.

They'll hopefully be restricted to small niches where they won't do much harm :-)

I realize this might be annoying but I'm not trying to annoy anyone, I'm trying to get a point through.

(And yes, I do have some experience here; I was significantly better and more productive with JS, PHP, and Python before I really learned Java.)


Yep. The fact that XHTML lost to HTML is all the proof I ever needed of this. XHTML wasn't perfect, but it sure as hell was easier to process than HTML. I guess this is the new Embrace, Extend, Extinguish: Embrace, Expand, Monopolize. Basically make it too complex for anyone to compete.

An example would be Chrome: why would anyone choose to run the All-Seeing Eye edition from Google when it's F/LOSS and de-Googled alternatives exist? Because modifying it is complex and expensive; testing it is complex and expensive; keeping up with changes is complex and expensive (Microsoft basically owned the desktop by introducing changes so fast that the desktop application competition was always spending a significant amount of their budget just supporting the latest change); distributing modified versions quickly and reliably is expensive and complex; getting the community to agree on which anti-features to remove is impossible; and marketing it to end users is expensive. Thus we end up with obviously user-hostile open source software, which would be impossible with simpler software.



