C# versus Java is a no-brainer for me. That said, I see that Java is still bigger than C#, and C#'s share is actually shrinking (in favor of Node.js / Python and, surprisingly, C++). Granted, the numbers are a year old, but GitHub's Octoverse is usually a good source.
Java used to be ahead of the competition for cross-platform applications. That's not strange, given that it barely had any competition at all, and what competition there was had a steep learning curve compared to Java. .NET has been ahead of Java since .NET 3.5, if I recall correctly. Java 9 (to be precise: from version 8 onwards) does a fine job of adding features to Java that .NET has had for ages (such as unsigned integer arithmetic). I have always hated Java because it's outdated and prone to many security issues. Sure, Windows is also prone to security issues because of its market share, but Oracle simply fails to patch security flaws in time, every time. Modern browsers actually started disabling the Java plugin by default because of those prolonged security issues, and that's when Java died in the web scene. Even the most recommended Java IDEs (NetBeans and Eclipse) look aged, and (yes, this still frustrates me) the lack of unsigned integers was a design flaw from the very beginning if you ask me (and they added support for unsigned integers in a rather crappy way as well, IMO).
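To illustrate the "rather crappy way" point: Java 8 didn't add unsigned types, it added static helper methods on the boxed classes that reinterpret the bits of the existing signed types. A small sketch of what that looks like in practice:

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        // There is still no `uint` in Java; -1 has all 32 bits set,
        // which reinterpreted as unsigned is 4294967295.
        int allBits = -1;
        System.out.println(Integer.toUnsignedString(allBits)); // 4294967295
        System.out.println(Integer.toUnsignedLong(allBits));   // 4294967295

        // Ordinary operators still treat the value as signed, so
        // comparison and division need dedicated helper calls:
        System.out.println(Integer.compareUnsigned(allBits, 1) > 0); // true
        System.out.println(Integer.divideUnsigned(allBits, 2));      // 2147483647
    }
}
```

Compare that with C#, where `uint` is a first-class type and `x / 2` or `x > 1` just do the right thing without helper methods.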
In the modern age, with microservices for example, I can't for the life of me explain why an idle JVM running a single, relatively small application takes 512 MB of memory. I don't believe Java is suited for modern application design, but since most of the applications out there are pure legacy (especially in the finance / insurance market), I'm not surprised to see Java is still so common. I think the numbers change significantly if you look at more modern, e.g. cloud-native, applications. I bet those are mostly written in C#, Python, or JavaScript / TypeScript, because they have a much smaller footprint, which is ideal for microservices.
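For what it's worth, the 512 MB figure depends heavily on the JVM's default heap sizing (the default `-Xmx` is derived from the machine's RAM, so a small app on a big box reserves a lot). A quick way to see what a given JVM actually reserves, using only the standard `Runtime` API:

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        // maxMemory(): the -Xmx ceiling the JVM may grow the heap to.
        // totalMemory(): heap currently reserved from the OS.
        // freeMemory(): reserved but not yet used by objects.
        System.out.println("max   = " + rt.maxMemory() / mb + " MB");
        System.out.println("total = " + rt.totalMemory() / mb + " MB");
        System.out.println("used  = "
                + (rt.totalMemory() - rt.freeMemory()) / mb + " MB");
    }
}
```

Running this with an explicit cap (e.g. `java -Xmx64m HeapInfo`) shows the footprint is tunable; the complaint stands for the defaults, not for what the JVM can be forced down to.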
Another thing I'd like to mention: I've been programming for about 6 years now, so I'm only a newbie. However, I had the opportunity to join a multinational about 2 years ago as a Software Engineer, have traveled a ton since, and have worked together with Software Engineers from completely different backgrounds than my own (not only ethnic but also in terms of education: some having PhDs, others master's degrees, and some no degree at all).
One thing I've been able to confirm time and again over those 6 years: it's not the programming language that matters, it's really all about solving the problem. A programming language is a tool for solving that problem, and if you pick the wrong tool, you won't solve it. But unlike problem-solving itself, picking the appropriate tool and learning how to use it is easy.