These recommendations are pretty arbitrary and don't even attempt to scratch the surface of what is actually materially different between the JDKs. Don't use Corretto outside of Amazon... why? Don't use Dragonwell because... China bad? Use Red Hat OpenJDK if you're running on Red Hat servers, Microsoft OpenJDK if you're on Azure, SapMachine if you're on SAP, because... the name matches so that's nice? At least they're consistent on that point. There are (generally pretty niche) reasons to pick specific distros, but those reasons certainly aren't discussed here.
I feel like the actual decision process is fairly straightforward in nearly all cases:
- Use whatever vendor happens to be most convenient to install. If nearly everyone using your OS is installing Java one way, and you're installing it some other unusual way, you should be crystal clear about why exactly you need to do that.
- Generally go with JDK 11, if you may have to deal with older software then maybe go with 8, if you want the shiny new stuff and don't mind some extra hassle then go with JDK 17 (it's not well-supported by everything yet).
That's pretty much it. There are exceedingly few cases where it actually matters whether you installed Corretto or Oracle OpenJDK, and in those cases you'll likely end up either testing all the JDKs anyway to make your decision or writing your own patches for whatever you need.
> Use Red Hat OpenJDK if you're running on Red Hat servers, Microsoft OpenJDK if you're on Azure, SapMachine if you're on SAP, because... the name matches so that's nice?
Presumably this has to do with support and possibly testing. People buy Red Hat EL for the long-term support; if they use a different vendor for the JDK, they have to set up a whole new contract with a different company. By contrast, people using a free Linux distro may not get any benefit from using Red Hat's JDK.
These statements about support I just don't get at an objective level. Red Hat will not assign a strong developer to investigate your bespoke application running on Tomcat or whatever to identify the bug. There may be pragmatic reasons to pay anyway, but realistically you will get an L2 sysadmin or an intermediate developer. If for some reason the root cause is nailed down to a reproducible bug, then it will go into the bug tracker.
A few years back I contacted SUSE support due to a confusing package change in a SUSE upgrade. The first-level support person did not know the answer, so they reached out to the relevant developer within SUSE, and got the explanation within a couple of days. Do you have any reason to expect Red Hat support to perform worse?
i think it has to do with which image is quicker to load. If you are using the Corretto image (AWS) on Azure, you are not guaranteed that the local Docker repository has it cached, so it will take very long to load. Also, support will be more difficult if you use the non-native image.
> Generally go with JDK 11, if you may have to deal with older software then maybe go with 8
That's a good recommendation for library authors, but if you're deploying an application that's regularly maintained, I would recommend using the most recent version -- that's the cheapest, safest way -- and if it isn't, consider an old version (like 11) with one of the LTS offerings, choosing a vendor you trust to support OpenJDK (https://news.ycombinator.com/item?id=28821316).
Of course, new applications should now target 17. It's both the current version and it has LTS.
Here's my recommendation (I work on OpenJDK at Oracle):
If you're using the current JDK version (recommended for regularly maintained applications), it doesn't matter which distribution you choose, as they're all pretty much identical. If you're using an old version (LTS, intended for legacy applications, which might benefit from it), pick a vendor you trust for OpenJDK support, as the builds are not the same, and neither is the support. After Oracle, which contributes about 90% of the work on OpenJDK, the companies distributing builds that contribute to the project and have experience with it are (in rough order of experience and/or contribution): Red Hat, SAP, Azul, Bellsoft, and, more recently, Amazon, and Microsoft.
There are, however, a couple of standouts: Alibaba's Dragonwell, which, last I looked, did not meet the Java specification, and Eclipse Adoptium, built by IBM, which is the only distribution built by a team that isn't involved with the OpenJDK project, isn't very familiar with it, and isn't a member of the OpenJDK Vulnerability team, and so gets security patches only after the other vendors have delivered their builds.
> Eclipse Adoptium, built by IBM, which is the only distribution built by a team that isn't involved with the OpenJDK project
Wasn't AdoptOpenJDK the "legit" recommendation a couple years ago? I only heard about all of these other versions sometime later. (Looking again, it looks like the Oracle OpenJDK is shipped in Ubuntu's repos, so maybe I was mistaken).
> Wasn't AdoptOpenJDK the "legit" recommendation a couple years ago?
It was the recommended choice of random internet users who posted blogs, just like this website is: a random website by some random person.
The best way to understand which JDK to use is to understand, at least superficially, how the source code, the build process, and the licenses differ between vendors.
The source code starts out exactly the same; that's why they are all OpenJDK: they all take the source from the main branch of the public OpenJDK repository.
From that point on, they can choose to apply source code patches from OpenJDK that have not yet been merged into the main branch, or they can apply their own patches. Those patches will often be security fixes or backported features. Here you have to understand that there is only one main branch of Java. So if you want security fixes or new features but are still on, say, Java 8, someone has to pull the old commit that was tagged Java 8 and selectively cherry-pick a bunch of later commits over it, retrofitting all the security fixes and possibly a few select new features (say, some performance-improvement patches). And if you've ever had to cherry-pick onto old code, you know it can be tricky: sometimes there are conflicts and you may even need to resolve them manually.
And even if they are not grabbing the source from an older tagged commit but from the latest one (as of this writing, Java 17), there might already be newer commits that fix some security bug or other bug, or improve performance or startup, etc. So in their build of Java 17 they could decide to apply some of the patches that landed after the latest Java 17 release. They might even apply patches that haven't made it to the main branch at all: ones that still have an open PR, or their own fixes for something.
After having grabbed the main source of OpenJDK, and potentially applied some patches to it, they proceed to build it for one or more platforms.
In the process of building, they will choose which platforms to build for, such as Windows, Linux, macOS, x64, ARM64, x86, etc. And they will choose what to include in the build: for example, whether to bundle Java Mission Control, javafxpackager, jarsigner, jstatd, visualvm, etc.
Finally, they can choose to test each build on each platform they built it for by running the full test suites, but they could also test only partially: run only a subset of the tests, or test only a subset of the platforms they built for.
You'd want to run tests especially if you patched the source, to make sure none of your patches introduced bugs, but the build itself could also have created an issue which tests could uncover, like forgetting to include some important C lib or resource, or building with the wrong optimization options, etc.
And last but not least, the license they choose. There are two parts here. The first is that if they made any changes of their own to the source, their changes might be under their own license, or might not even be open source. That means you might not be able to fork the exact source they used for your own needs, or even get to see it in full. The second is the terms of use for everything bundled in the build. Do they let you use the JRE for cloud applications? Can you redistribute it to others? Can you use it for commercial work, etc.?
Hopefully that better equips you. Most of the companies who make money by building OpenJDK and offering support will probably do a good job of backporting security fixes as soon as possible, and of testing everything to be sure their backports and custom patches didn't break anything, but they might not always do so for your chosen platform. And like any company that wants to make money, they need some of their customers to pay them at some point; that's where licensing and terms of use come in. More and more, though, they go fully open source on their patches and customizations, allow anyone to use everything for free in all settings, and offer paid support instead. To be sure, read their license and terms of use.
And if you can't be bothered to read their license or terms of use, that's why people have been recommending AdoptOpenJDK which is now Eclipse Adoptium. Since they're a community effort managed by a non-profit, you can be more confident that their license and terms will be and remain fully open source and free to use in all cases. The downside is that it's a community effort, you don't know if they'll apply security patches quickly, or fully test, etc. And if there's any issue with the build you encounter, there's no real support, no SLA for it, etc.
P.S.: There also exist some alternative JDKs that are not based on OpenJDK's main source branch, such as OpenJ9, GraalVM, Zing, JamaicaVM, etc. Those should be considered alternative implementations of Java; they often have very different runtimes, garbage collectors and so on, though they can still partially reuse some of OpenJDK's code. The OpenJDK builds I was talking about before always implement everything using the OpenJDK source; all they do is add security fixes and bug fixes, retrofit some newer features into older releases, etc. They don't provide alternative implementations of anything the way GraalVM or OpenJ9 et al. do.
I too was under the impression that AdoptOpenJDK was the current default choice (if you're making a choice). Can anyone else comment on this? I have to use the JRE on Windows, so no distribution-provided option.
I was using AdoptOpenJDK since it's much easier to download the JDK (and especially the JRE) for Windows and Linux from there. The new Adoptium didn't get a JRE build, which is sad :(
In most environments it is better to use the latest JDK release, not some old one.
LTS is associated not with a version number but with the vendor that provides the JDK. So if you don't pay any vendor, then you don't have LTS; you just have a number, and hope it is in some way better than any other number, which it is not.
OpenJDK has no notion of LTS; it releases a new version every 6 months, and vendors decide which of those they will support with fixes. E.g. Azul decided that their JDK 13 and JDK 15 will be supported with fixes for longer than their JDK 12 and 14.
It's not managed by IBM anymore; it's managed by a community of volunteers and the Eclipse Foundation. IBM gave it back to the community a while ago.
The general guidance is to always use the last LTS release unless there is a specific feature in the non-LTS releases that you can't wait for. So any new development should target 17 at this point.
Unless you want serious long term pain then you shouldn’t be more than one LTS release behind. So if you are not running on JDK 11 or later you should be strongly thinking about upgrading.
Currently upgrading a ~1M SLoC Java enterprise app (with regular Spring, Jetty in there, scheduled processes, PrimeFaces for the web UI, REST API services, SOAP services, the whole shebang) from Java 8 to Java 11 (since 17 wasn't out when that change was approved) and it's largely proving to be a pain. Since the version of Spring is ancient and a change over to Spring Boot was also approved, i now have to rewrite parts of it and also get rid of the XML configuration and other old approaches which are simply no longer compatible with the upgraded tech stack. It feels like something that a team should do, instead of one dev over a month or so, but scope creep happens and estimates for something like that are nigh impossible, so we'll see.
My point is that migrating to new releases isn't always trivial, especially the more complicated and complex a project gets. If i knew that it'd run as long as it compiled successfully, it wouldn't be too bad, but with the amount of reflection, dynamic class loading etc. that frameworks like Spring favor, my workflow to date has been: fix a bug, build and run, something else breaks, fix that bug, build and run, something else breaks, realize that i cannot fix this one because upgrading the logic would be inherently "lossy" due to a mismatch in what the new framework versions provide, meaning breakages that aren't covered by tests will also be created, and so on ad infinitum.
Sometimes it makes me wonder why JDK 9 onward just didn't have a compatibility module: "Here, install this to have all of the old Java classes that were removed from the standard library after JDK 8 and retain that old functionality for software projects that would otherwise be stuck in development hell short of a full rewrite." Or something like that for Spring Boot, that lets you use web.xml instead of having to use hacks to load its contents and register all of the servlets, with half of them not working anyways, because some class names were changed along the way.
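To make that concrete, here's a hedged sketch of the kind of change those removals force. `javax.xml.bind.DatatypeConverter` (part of the JAXB module dropped from the JDK in 11) was a common way to do Base64; its replacement, `java.util.Base64`, has existed since JDK 8, so this particular migration can even be done before the upgrade:

```java
import java.util.Base64;

public class Base64Migration {
    public static void main(String[] args) {
        byte[] payload = "hello".getBytes();

        // Pre-JDK 11 code often called javax.xml.bind.DatatypeConverter
        // (e.g. DatatypeConverter.printBase64Binary(payload)), which was
        // removed from the JDK together with the rest of java.xml.bind.
        // java.util.Base64, available since JDK 8, covers the same need:
        String encoded = Base64.getEncoder().encodeToString(payload);
        byte[] decoded = Base64.getDecoder().decode(encoded);

        System.out.println(encoded);             // prints aGVsbG8=
        System.out.println(new String(decoded)); // prints hello
    }
}
```

For the removed pieces that have no in-JDK replacement (JAXB itself, JAX-WS, etc.), the fix is instead adding the standalone artifacts as regular dependencies.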
Software doesn't always age beautifully.
Furthermore, it feels like new runtime and package versions are made in ways that accidentally break backwards compatibility with no clear ways to avoid this.
I got stuck in this conundrum at my previous gig as well, and I think I reached a very different conclusion than you. In short, while I'm skeptical of some of the churn across modern software development as a whole, I'm not sure this is actually an issue with the Java ecosystem, which has been, and I think continues to be, way more sensitive to backwards compatibility than almost any other ecosystem.
In some ways, this is just a consequence of a kind of evolutionary leap forward for many applications. We don’t deploy our WARs into application servers anymore, we try to pack a fat JAR, or maybe even a statically linked runtime, into a container. We rebuild the whole world on CI all the time. I mean, Java 8 has issues just respecting cgroup limits, which many a despondent ops engineer has discovered.
I think there’s still room for the 1M SLoC monolith even in this new world, but there are real benefits on the horizon for upgrading to Java 17 and beyond. We’ve reached a point where it’s just more expensive not to make upgrades a regular part of your development cycle. And no one is stopping you from just staying on 1.8. I mean, I’m sure there are still decaying enterprises stranded on 1.3.
Personally, i'd never let a single monolith grow that far. I'd rather have a problem component or two that could be left on JDK 8 until they die and are rewritten (which would actually be possible with systems that have clearly defined boundaries and smaller scope), while migrating everything else.
Sadly, i don't get that choice, nor do i get the choice to make the judgement call to leave the monolith on JDK 8 in the name of stability. But hey, at least i'm paid a bit of money for it, so i have some motivation to work on all of it to the best of my abilities, learn a bit more about the JDK internals, have a word or two to tell others about my experience before i inevitably burn out from the churn.
Personally, i do still think that Java is pretty good when it comes to stability and backwards and forwards compatibility, especially since tooling like Maven is actually pretty good when compared to the alternatives (well, the parent POM functionality is confusing sometimes, but oh well, it has its use cases). That said, JDK 8 to newer releases was indeed a generational shift (probably one for the better) and you see similar things with Spring Boot 1.5 to Spring Boot 2.X, those breaking releases are inevitable.
I only wish that the things blocking people from writing more modular systems would disappear over time, so that huge monoliths that are incredibly hard to work with wouldn't be such an issue. I'm not necessarily advocating for microservices here, since people tend to go straight from one ditch into the opposite one (multiple services per person, nightmarish service mesh, needlessly large % of the code being for shuffling data around, suddenly building a distributed system even within a single domain), but at the very least it would be really nice to have someone look at it and say: "Hey, this PDF generation logic looks really brittle and perhaps should be a service of its own." or maybe: "Oh, hey, we're serving our RESTful API calls and front end resources from the same application, maybe we should have a separate front end app, served through a regular web server?"
Then again, if nothing else, i always have the choice to choose where i'm employed, even though i don't feel like letting my coworkers down at the moment either.
In general you are right that the Java ecosystem is highly backwards compatible, but 9 is a bit of an exception. The module system breaks a lot more stuff than usual, and is the reason a lot of legacy is stuck on 8.
Isn't your problem above essentially with Spring rather than the new JDK? Presumably Spring is using internal APIs that have changed in JDK11.
And yes, the reflection and dynamic behaviour in Spring are not type safe (i.e. runtime failures). Spring XML is a manifestation of "Greenspun's tenth rule": it gives people the flexibility of Lisp/Python by, unsurprisingly, removing static type safety; not a trade-off that many will agree with. You should tell management that your job has been made harder by the choices of the original architects of the system (who I assume are long gone).
It sounds like the problem is the scope creep. Do the JDK upgrade or the spring boot migration first. Yes it is tempting to do both at once but you are adding a lot of risk.
Agreed! Sadly, it wasn't my choice to make. Then again, if it's an older version of Spring, chances are that a lot of changes would be needed regardless.
Can you give a specific technical problem with updating from 8 to 11? Everyone says it's hard - nobody's ever able to give a practical example of a problem.
Oh, I have a fun one. The company I work at leverages Nashorn for running some user-customizable scripts, and when upgrading from 8 to 11 we found out that Nashorn changed the way it represents numbers under the hood. Basically, every number is now either an integer or a float, whereas it used to be that every number was a long or a float.
This silently broke a bunch of scripts and callbacks because of class cast exceptions and such. The solution was to rewrite the scripts in such a way to import the Java Long class and use that directly to express large integers.
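The failure mode is reproducible without Nashorn at all. This is a hypothetical callback (the method name is made up) showing the cast that breaks when the boxed type handed over by the script engine changes:

```java
public class NumberCastDemo {
    // A hypothetical callback written against JDK 8 Nashorn,
    // where whole numbers surfaced as java.lang.Long.
    static long takeId(Object numberFromScript) {
        return (Long) numberFromScript; // throws if an Integer arrives
    }

    public static void main(String[] args) {
        System.out.println(takeId(42L)); // fine: a boxed Long, prints 42

        try {
            takeId(42); // JDK 11 Nashorn hands over an Integer instead
        } catch (ClassCastException e) {
            System.out.println("broken after the upgrade: " + e.getMessage());
        }
    }
}
```

Because the cast only runs when that code path is hit, nothing fails at compile time; the breakage surfaces in production, script by script.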
The most important breaking issues I've found were not necessarily in our code base, but in dependencies. Libraries that include libraries that include libraries, all doing their own thing, a lot of the time involving some pretty scary reflection stuff for some unexplained reason. Blindly upgrading libraries only gets you so far, sometimes a dependency gets abandoned and you need to find a replacement and rewrite your code to call the APIs in the right manner to be backwards compatible enough.
Because Java, particularly pre-8 Java, sucks so bad if you try to actually write your whole application in it.
(I always found this really frustrating when making the argument for Scala. Yes, Scala-the-language is more complex than Java-the-language. But it's not more complex than the pile of reflection magic that anyone who tries to write a nontrivial system in Java actually ends up using).
From what I've seen: to add automagic functionality, syntactic sugar and wrapper methods for libraries. The JVM/Java language doesn't seem to have the tooling necessary to accomplish this natively, so it's the only reliable way to do it without relying on annotation processors having properly run (which still require some reflection magic, depending on the use case).
It is the dependencies. We bit the bullet and cut (most) everything over to Java 11 last year. Things that were removed from the JDK into new dependencies were the easier bits to sort: JAXB, some old Sun classes, and the Java EE bits. All of that was less than a week to sort. The real work was the older 3rd-party libraries not making the jump. You end up with turtles all the way down in some cases. Those took a serious amount of hunting for replacements. Mind you, the code was almost always something that had no budget or appetite for maintenance, so the moment you had to jump to a new package/API... folks often gave up.
All that said, for the 'normal' Spring Boot applications we saw almost a 25% performance jump for the same code (with updated libraries) running on K8s. Java 11 was a substantial performance improvement. Our jump to Java 17 will happen this winter, and looks to be a non-issue so far.
When you stumble upon something like that, at best you can import a Maven dependency with a version that has what you need, at worst you need to rewrite the code to use another library or set of libraries.
If you have any low level logic like custom class loaders written against older JDK versions (think before 8), then they'll be forwards compatible until 8 for the most part, but will break afterwards. Coincidentally, reading code that deals with low level logic is also not easy to do, especially if it's not commented well.
If you rely upon reflection, or use advanced language features (like the JasperReports framework for generating PDFs, which also has a build step for building the reports), in some cases things might compile but not work at runtime due to class mismatches.
Many frameworks need new major versions to support releases newer than JDK 8; for example, Spring Boot 1.5 needs to be upgraded, so you're also dealing with all the changes that are encapsulated by your dependencies. In another project that i also migrated, i needed to rewrite a lot of web initialization code for Spring Boot 2.X.
Not only that, but with those framework changes certain things can break in the actual environments. For example, if you package your app as a fat .jar, you'll no longer be able to serve JSP files out of it. It makes no sense, but packaging it as a .war, which can still be executed with "java -jar your-app.war", will work for some reason.
In some other libraries, method names remain the same but signatures change, or things just get deprecated and removed; you have to deal with all of that, which is especially unfun in code that isn't commented but that the business depends on to work correctly. Throw in external factors such as insufficient test coverage and you're in for an interesting time. I'm not saying it's the type of environment that should be condoned, but it's the objective reality in many software projects.
Oh and i hope that you're also okay with updating all of your app servers (like Tomcat) or JDK installs on the server as well, especially if you depend on some of the Tomcat libraries to be provided and for your frameworks/libraries that depend on them being present at runtime to accept those new versions effortlessly. It all feels very brittle at times.
This is especially a nightmare if your servers were configured manually - personally i'm introducing Ansible and containers to at least isolate the damage of Ops rot, but it's been an uphill battle since day 1.
Here's an exceedingly stupid one: sometimes there are checks in code to make sure that you're using the right version of the JDK (or even the Oracle JDK specifically), and they get confused by the new version numbering. It's easy to fix when it's your code, but really annoying when it's external tools - personal nitpick.
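A sketch of the classic version of this bug. Checks written against the old "1.8.0_292"-style `java.version` string misread "11.0.2"; `Runtime.version()` (available since JDK 9, with `feature()` since JDK 10) is the robust way:

```java
public class VersionCheck {
    public static void main(String[] args) {
        String v = System.getProperty("java.version");

        // Pre-JDK 9 style check, assuming versions look like "1.8.0_292".
        // On JDK 11 the property is just "11.0.x", so this returns false
        // on every modern JDK, and code that did v.substring(2, 3) to
        // extract "8" now extracts "." or "0" instead.
        boolean oldStyleSaysJava8 = v.startsWith("1.8");

        // The robust way on JDK 10+:
        int feature = Runtime.version().feature();

        System.out.println("java.version = " + v);
        System.out.println("feature version = " + feature);
        System.out.println("old-style 1.8 check passed: " + oldStyleSaysJava8);
    }
}
```

When the check lives in a third-party tool rather than your own code, all you can do is upgrade or patch the tool.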
Addendum: here's something that i expected to break, but didn't. We use myBatis as an ORM. It has XML mapper files that define how to map entities and construct queries against the DB based on that. Also, it uses Java interfaces and dynamically calls the underlying XML code as necessary. So essentially you have an interface, which has a corresponding XML file in which you have quasi-Java code (e.g. checking what parameters are passed in to the query) that's used alongside a number of tags to dynamically build SQL queries. Here's an example: https://mybatis.org/mybatis-3/dynamic-sql.html
Instead of something breaking in myBatis, what broke actually was Hibernate in the newer versions of Spring. Oh, and Flyway for DB migrations simply used to work with a particular Oracle DB version as well.
I hope you move to Spring Boot first before upgrading to JDK 11, or vice versa.
I'm in the middle of a migration to 11, and my general approach has been to get it compiling on 11 while targeting 8 (with --release 8). That seems to catch a great many things that would otherwise fail. The hardest part thus far was figuring out the appropriate libs to replace the Java EE stuff, mostly wrt SOAP and XML. I'm hoping the move to 11 goes smoothly, but we'll see. I'm hoping to get the artifacts to a point where they can run on either 8 or 11 without issues.
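If the build is Maven-based, that approach might look like the following compiler-plugin fragment (a sketch; your plugin version and JDK baseline will differ). Unlike the older `-source`/`-target` pair, `--release` also checks that you only use the Java 8 API surface, not just Java 8 bytecode:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Compile on JDK 11 while rejecting any API that
         does not exist in Java 8 -->
    <release>8</release>
  </configuration>
</plugin>
```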
JDK 8 -> 9 migration was painful, but all other upgrades should be painless (well, maybe the 16-17, but I assume that most libraries will be fixed by the time you upgrade - either add `add-opens` instructions or use appropriate new classes from JDK).
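For reference, those `add-opens` workarounds are just JVM flags that re-open a module's internals to reflective access; `app.jar` here is a placeholder for your own artifact:

```shell
java --add-opens java.base/java.lang=ALL-UNNAMED -jar app.jar
```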
We had that about a year ago in our product as well, though from Java 7 to Java 11. That JDK 9+ hump was definitely bad, but since then each incremental JDK has been smooth. The idea of using "the latest JDK" is fairly modern in the Java world, so people are still getting used to it. I'd say it's good now though, with perhaps 16 -> 17 being a minor hump since things that were deprecated but allowed in 9+ are now removed.
I think this is one of the few benefits of micro service architectures. You end up with much smaller pieces to migrate over. And if the service is well defined and you have ample logging for endpoints you actually could re-write it.
Mind you, you usually end up with the services all in different states of Node/JDK/etc. versioning (at least at the companies I worked at).
As somebody who has had to do a Spring upgrade in the past, from 2.x to 4.x, I suggest you do Spring first, then the JDK after. Spring still supported 8 well after it started supporting 11. Once you reach a certain version of Spring, upgrading the JDK is as simple as changing a 1.8 into an 11.
I don't know PrimeFaces, but I have worked with JSP, which does have a flag to set the compiler version (in case you were ever thinking about putting Java lambdas in JSP files, which is possible but not recommended due to mixing code).
The hardest part of upgrading the JDK is going from 8 to 9. After 9 it is much more forgiving. As for Spring, major versions are a pain, as is Hibernate. The other parts of Spring are generally quite upgradable if you read the Spring changelogs.
Upgrading Spring and especially major Boot versions is usually much more work than bumping Java version.
Boot changes how it does config and such, which can be confusing.
There is more to this now though, because even their top-line recommendation misses the fact that Temurin (an anagram of "runtime") only provides a small subset of the tested builds you may need.
They are in there but easy to miss: Bellsoft actually has builds of both JDKs and JREs for more than mainstream x86_64. Temurin doesn't even have JREs anymore (last time I checked).
JREs have basically been replaced by building custom runtimes via jlink, which produces a barebones JVM with only the modules that you need. You no longer have to distribute, or ask your users to download, a JRE that includes the kitchen sink; instead you use jlink to create a custom distribution that only includes the modules your code actually uses. For example, if your application does not use Swing, jlink won't include it.
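You can see what that trimming means from inside a running image. This sketch lists the modules the current runtime actually carries; on a full JDK it's dozens, while in a minimal jlink image it collapses to just what you asked for:

```java
public class ModuleList {
    public static void main(String[] args) {
        // The boot layer holds exactly the modules this runtime was
        // assembled with (requested modules plus their transitive deps).
        ModuleLayer.boot().modules().stream()
                .map(Module::getName)
                .sorted()
                .forEach(System.out::println);

        // On a full JDK this prints java.desktop, jdk.compiler, and many
        // more; in an image built with something like
        //   jlink --add-modules java.base --output my-runtime
        // the list is just java.base.
    }
}
```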
If you are distributing software based on Java, a 40 MB JRE in place of a 400 MB JDK is surely preferred.
You can build them yourself for each OS you target with jlink, or you can just grab the latest built and tested one from e.g. Bellsoft - but not Adoptium or Oracle.
Java has been making breaking changes between releases that would make Python blush for a few years now; 17 finally looks like a worthy migration target from 8.
The JRE is like 44 MB, the JDK 190 MB. The difference is not 400 MB. And with the JDK you get jshell, javac, all the nice tools that I missed when some wise guys in a distro or corp decided the JRE was enough.
Officially at least, it seems there are no post-Java 8 JREs (or maybe Java 11, I don't remember exactly), which is a shame. I know I can build one myself, but having a small off-the-shelf runtime sure was convenient.
There is no "officially" anymore; Java has gone open source, with distributors kind of like Linux distributions.
Adoptium isn't the best distribution if you want to run on more than win/lin/mac x86_64.
Why should I stick to LTS?
All other versions aren't in any way less tested; you should always stick to the newest released JDK version, be it LTS or not. That way you get all the benefits (language features, performance gains) plus the security ones (security fixes always land in the newest version first and are backported to the older ones).
Upgrades now are pretty straightforward if you are past JDK 9, with JDK 16-17 being a bit tricky, but less so than JDK 9.
Maybe, but the ones I use didn't have any trouble with any Java release.
Using so-called LTS without paying for it is in no way different from using the latest. You get similarly tested software and no one to call if you need a security fix asap.
> All other versions aren't in any way less tested
That depends. If most people follow the advice of "sticking to LTS", these versions will have way more users, and thus will be more "battle-tested", as in tested in production.
Support for vulnerability patches, stability, and so on. You can automatically update, and maybe that goes well, and maybe problems are caught in testing. Regardless, those events and changes have increased costs over time. LTS should mean lower costs through increased support focus, an increased population of active users, and reduced forced change.
After 6 months you move to the next release, just like most people do with libraries; the JDK is just another library, no need to worry about upgrading, just do normal regression testing like with upgrades of other libs. There are no major releases of the JDK anymore, all current ones are effectively minor ones; JDK 8 likewise got new features in minor releases (e.g. changes to toString behavior, classes added to the standard library), and no one was worried about that then.
OpenJDK serves as the upstream for many Java distributions (maybe all?) and it has no notion of LTS. LTS is just something to which vendors commit as outlined at https://openjdk.java.net/projects/jdk/17/ : “JDK 17 will be a long-term support (LTS) release from most vendors.”.
In other words, OpenJDK does not follow any LTS/non-LTS release cycle. Hence there is nothing special about Java 17 vs 16 when it comes to stability.
Even preview features are production ready (I used them in production). The API might change or a feature could be removed, but quality-wise it is as good as it can be.
There haven't been any notable language features added since java 9 besides some basic syntax sugar (which is already covered by stuff like lombok anyway).
For features and performance you might as well just target .NET 6. It has things that have been perpetually 'too hard to implement' (read: Oracle doesn't want to pay their engineers to implement them, and will sue you if you do it yourself) like value types, generics without type erasure, no checked exceptions, etc., and with .NET 6 performance is better than OpenJDK across the board.
Yes - .NET 5 and 6 both have runtimes and SDKs available for Mac and Linux. At least in my opinion, Linux is the preferred deployment platform for it now unless you have a specific (usually older) library that only works on Windows.
> .NET 5 finally delivered Linux support but only eleven months ago
The previous versions in this stream (.NET Core 2.1, .NET Core 3.0 and 3.1) also run fine on Linux. For web server / API server workloads, it's been stable for a lot longer than 11 months, and people do indeed "bet the farm" or "run the business" on it, quite happily.
Technically, this starts with .NET core 2.1 on 2018-05-30. The prior version, 2.0, released on 2017-08-14, wasn't _quite_ there yet although some people were happy with it for production use.
.NET has been on Linux for years now. And my company is using it in production on Kubernetes (Linux/Debian). It's crazy how dependable and fast it is. Startup times are amazing and it's only getting better each year. We're looking to upgrade to .NET 6 in production by January.
Have you even seen the benchmarks compared to Java and other languages?
I can also confirm that I've been using .NET on non-Windows OSes for years and it works really well. However, the benchmarks are rigged. The code they wrote for that one benchmark at the top is a complete cheat. Check out the source and compare it with the other ones in the top 10. .NET is fast, but Java is truthfully still faster.
Then use Kotlin, and take advantage of the jvm and the ecosystem, and avoid basically all the stuff you listed about the java language.
"no real reason" is a stupid take. I could list multiple, but a big one is that there are a hundred Java developers for each .NET developer in my city. The Java environment is well tested and well understood; if anything, one should argue why not use it instead of a hip and trendy alternative.
Their point about using .NET wasn't really valid, since one of the biggest and best reasons to use Java is that you already have a Java codebase. But things like value types and generics without type erasure aren't solved by Kotlin either, so Kotlin isn't really a good answer to their (not really valid) point.
In what way? It's slower and has less features than the .NET runtime.
> "no real reason" is a stupid take. I could list multiple, but a big one is that there are a hundred Java developers for each .NET developer in my city. The Java environment is well tested and well understood; if anything, one should argue why not use it instead of a hip and trendy alternative.
So, your only credible excuse for using java is inertia from boomers and middle managers refusing to adapt from the standard of the early and mid 2000s?
> So, your only credible excuse for using java is inertia from boomers and middle managers refusing to adapt from the standard of the early and mid 2000s?
If you come in and freely migrate all of the Java codebases to .NET while maintaining code quality and functionality, I'm sure many people would let you do it. If you don't understand why people stick to one language, that means that you've never worked on a big codebase, or completely ignore the business side of the developer job. In both cases, that's a lack of wisdom on your part.
I guess you missed the 'new development' part of my original comment? Obviously it would be nonsensical to port a large extant codebase to a new language and tech stack if it's just being maintained.
In the past it has sent command line arguments and the path of the current working directory, among other things, according to comments on that issue. Even if Microsoft doesn't intend to deliberately collect sensitive data, they don't seem to think it's worth putting much effort into preventing accidental data leaks from broken anonymization, unreliable opt-out mechanism, etc.
yeah, you got it. My city has very few job listings requiring .NET for actual back-ends, whereas there are like a thousand Java jobs, with almost half mentioning Spring.
These discussions inevitably end up as a flame war sooner or later.
Regardless, i actually compared Java with .NET and their web frameworks as a part of my bachelors', everything from synthetic benchmarks for encryption and data processing and transformations, to things like shuffling JSON around. Now, it's all in Latvian and was a number of years ago, so it's not entirely relevant at this point, but i did have some tangible findings.
In short:
- both Java and .NET (then Core) are inconsistent in their performance - there are certain things which are slower in one technology than in the other when not using external optimized libraries. For example, Java had problems with writing deeply nested dynamically generated JSON with JavaEE libraries (now Jakarta), whereas .NET Core had problems with handling large amounts of text
- their performance was largely comparable in most other tests, neither was faster by a factor of 10, like you'd see with Python and Ruby compared to either
- thus, it's largely a matter of choosing the actual frameworks that you'll want to utilize properly and consider both the job market and business factors (familiarity with the tech stack, job market etc.)
- in summary, they're close enough for it to not be a technical decision most of the time in real world circumstances (save for a few exceptions), but rather is a decision that depends on social elements
Since then:
- i don't believe that the observation of them being "close enough" has changed much, both in legacy code and otherwise
- .NET Core and now .NET 6 has improved bunches with its runtime; Core was so successful it's essentially the future of the platform (i feel bad for developers who'll be tricked into working on legacy code with the old .NET and IIS, versus the new one and Kestrel)
- JDK has improved bunches with its runtime and GC; the runtime situation is a bit complicated and cumbersome, considering the OP's article, but overall it's pretty usable, especially with frameworks like Quarkus
If you jump around the different tabs that compare which frameworks do what better, on average:
- .NET is better for plain text
- .NET is noticeably better for data updates
- Java is noticeably better for JSON serialization
- Java is noticeably better for single DB queries
- Java is noticeably better for multiple DB queries
- as for cached queries and other use cases, there's more variance
- neither is better than the other for it to matter a lot
These are probably better than me linking the bachelors' because it's done in a better controlled environment, with more resources, and a lot of people contributing to the source code of the benchmarks: https://github.com/TechEmpower/FrameworkBenchmarks/tree/mast...
In short, you're not necessarily wrong - there have indeed been great improvements to the .NET platform and it's good to see it finally being a capable and performant tech stack that you can use on *nix. But you're also not right: Java is getting similar attention, even though it's still lagging behind a few years in regards to "having its stuff together" (e.g. going from Oracle JDK to a more stable model + the JDK 8 to newer version shift, which is similarly painful with old .NET versions).
To lighten the mood, i'd consider suggesting that anyone also have a look at Go, which is similarly performant and allows you to build runtime independent executables by default, which is easier than in either .NET or Java. It's also pretty easy to write and has seen good use for the development of tools, since its startup time is a tad better than either that of Java or .NET as well. Here's the three compared: https://www.techempower.com/benchmarks/#section=data-r20&hw=...
As if implementing value types in a backwards-compatible way that also modifies generics to work with them would be a trivial task.
Also, Oracle does pay plenty for the development of Java, and is a (surprisingly) good steward of the language, let’s drop all the blind hate.
And in terms of both GC and JIT, the JVM ecosystem is ahead; the reason they can be so head-to-head is that the CLR doesn't hide lower-level controls all that much, allowing for better hand-tuned programs (at the price of the resulting code not getting "automatically" faster in a possible future release).
> read: oracle doesn't want to pay their engineers to impl it and will sue you if you do it yourself) like value types, generics without type erasure, no checked exceptions, etc.
All of this is currently being implemented in project Valhalla...
I never buy into the tech stack argument that things are being worked on (e.g. you mention Project Valhalla, but I see this across many languages and tools). I've seen this argument used on a number of different technologies. It compares a future state to a current state to put the favored tech (the future state) in an equal or better position. It usually punishes innovative platforms as well, because it dismisses any reason to take them up.
It's comparing apples to oranges - in this case .NET is also being actively worked on so may have other things by then. Compare current state only.
The truth is each platform has prioritized features relevant to its context. For what it's worth, in my experience, while the JVM has many more JIT optimisations and the like, it tends to need more of them given the lack of some of the features you mention (e.g. value types). Whereas .NET code allows value types, better management of memory (i.e. Span), reified generics, etc., so the focus has been on letting the user optimise themselves where required, while still allowing a decent performance default. Many of the optimisations in the JVM wouldn't have the same bang for buck in .NET and vice versa.
On a personal note, I'm more of a fan of the .NET philosophy because the code is usually fast enough, and when I need to tune memory, avoid allocations, and write fast code, it seems to offer more tools to do so outside an unsafe context. It allows a better "upper bound" of performance for core things IMO while keeping to bytecode/IL. Many benchmarks where the same level of application optimisation has occurred have, from what I've seen, confirmed this bias for me. YMMV
You’re missing the point. He claimed Oracle refuses to develop these features or will sue you if you do it yourself (they hired the guy who implemented fibers to implement them in the jvm), but they are actively working on it. The point of my comment was that OP’s claim was just patently false. I wasn’t claiming the JVM has feature parity or is better or anything of the sort.
But they ARE working on them and they ARE paying their engineers to do it. They even hired pron to implement fibers in the JVM, and didn't sue him for doing it in Quasar, which is the opposite of what you're claiming.
> Records, switch expressions, multi line strings and that is only language changes.
None of those are features; they were desperately needed shorthands for common Java idioms. This is like calling braceless if/for/while statements 'features' when they're purely syntactic sugar, one that has fallen completely out of favor because several major security vulns in major projects were traced to their use combined with development oversight/code review failures (https://nakedsecurity.sophos.com/2014/02/24/anatomy-of-a-got...), to the point where most companies ban writing them.
Just off the top of my head:
* LINQ. Added to C# in 2007, Java didn't get streams until 2014.
* structs/value types, implemented in .NET Framework and .NET Core for 15+ years, but not present in Java 17. Some Oracle engineers have been working to implement them under Project Valhalla, but it seems to be vaporware at this point after 8+ years of work.
* unsigned types, no plans to implement in Java (though IIRC it was originally planned as part of Valhalla)
* async/await
At the feature level Java is stuck in the mid 2000s. Nobody wants to do the actual work on keeping it competitive.
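For reference, the streams API mentioned above (Java's rough analogue to LINQ, shipped in Java 8) looks like this; names here are made up for illustration:

```java
import java.util.List;

public class StreamsDemo {
    public static void main(String[] args) {
        List<String> names = List.of("Ada", "Alan", "Grace", "Edsger");

        // Filter, map, collect: roughly what a LINQ Where/Select/ToList
        // chain does in C#.
        List<String> aNames = names.stream()
                .filter(n -> n.startsWith("A"))
                .map(String::toUpperCase)
                .toList(); // Stream.toList() needs JDK 16+

        System.out.println(aNames); // prints [ADA, ALAN]
    }
}
```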
Thank God they didn't shove this crap in. When Loom ships (I believe the next LTS is their target, more or less), JVM will have the same concurrency story as Go does, without blue/green function split like in C#.
> unsigned types, no plans to implement in java (though IIRC it was originally planned as part of valhalla)
Functions that compile to efficient bytecode do exist for them, even if they're not too comfortable to use; and with Valhalla definitely coming, it will be trivial to create a custom primitive class for unsigned ints.
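Those helper methods (on Integer and Long since Java 8) look like this:

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        int x = -1; // bit pattern 0xFFFFFFFF, i.e. 4294967295 read as unsigned

        // Reinterpret the same 32 bits as an unsigned value:
        System.out.println(Integer.toUnsignedLong(x));    // 4294967295
        System.out.println(Integer.toUnsignedString(x));  // 4294967295

        // Arithmetic and comparison that treat operands as unsigned:
        System.out.println(Integer.divideUnsigned(x, 2)); // 2147483647
        System.out.println(Integer.compareUnsigned(x, 1) > 0); // true
    }
}
```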
> async/await
With project Loom, it will avoid the mistake of function coloring that async introduces. In a managed language, why not let the runtime automatically transform blocking calls to non-blocking, when it already knows what’s up?
> With project Loom, it will avoid the mistake of function coloring that async introduces. In a managed language, why not let the runtime automatically transform blocking calls to non-blocking, when it already knows what’s up?
Not saying it's a good or bad thing, but .NET, and C# by extension, has had these features for years (decades in some cases), while the only things Java has are half-baked prototypes and plans to 'maybe' implement things. In many cases these plans just get endlessly pushed back, and new Java major versions become a pile of simple bugfixes which in the past were just pushed as minor JRE updates.
It’s not like async/await is a must, they are on the surface just syntactic sugar.
In the background, coroutines are a strictly less useful feature than what will happen with Loom. And please show me any "promised feature" that was only a half-baked prototype? We already have one half of Valhalla (the vector API) under an experimental flag in JDK 17, and several language-related JEPs have shipped already. That Loom and primitive classes take their time only means they are not vaporware, because these are actually ridiculously complex problems.
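(For reference: Loom's virtual threads eventually shipped as a final feature in JDK 21. A minimal sketch of the model, assuming a JDK that has them - plain blocking code, no async/await coloring:)

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class LoomDemo {
    public static void main(String[] args) {
        // One cheap virtual thread per task; blocking calls like sleep()
        // just unmount the virtual thread from its OS carrier thread.
        try (ExecutorService pool = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 10_000; i++) {
                pool.submit(() -> {
                    try {
                        Thread.sleep(100);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                });
            }
        } // close() implicitly waits for all submitted tasks
        System.out.println("done");
    }
}
```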
With your explanation everything is just a shorthand. All you really need is NAND.
Streams are a library feature, records are a language feature. Streams could be implemented outside the JDK, records could not (no, Lombok abominations are not records).
comparing braces to records/multi line strings/switch expressions is so funny that I won't even comment on that.
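For anyone who hasn't seen them, the three changes under discussion in one snippet (records need JDK 16+, switch expressions JDK 14+, text blocks JDK 15+):

```java
public class NewSyntaxDemo {
    // Record: the compiler generates constructor, accessors,
    // equals/hashCode/toString from this one line.
    record Point(int x, int y) { }

    public static void main(String[] args) {
        Point p = new Point(1, 2);
        System.out.println(p); // prints Point[x=1, y=2]

        // Switch expression: yields a value, no fall-through.
        String side = switch (Integer.signum(p.x())) {
            case 1 -> "right";
            case -1 -> "left";
            default -> "on the y-axis";
        };
        System.out.println(side); // prints right

        // Text block: multi-line string literal without escape soup.
        String json = """
                {"x": 1,
                 "y": 2}""";
        System.out.println(json);
    }
}
```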
> and also what do you mean by oracle suing if you implement value types?
this one shouldn't need an explanation, oracle very commonly pursues frivolous lawsuits as a way of bullying money out of businesses that don't have the budget to fight them for years in various courts.
where does it say .NET 5 was faster than OpenJDK? HotSpot is a very sophisticated JIT compiler that probably has 100 man-years put into optimizations alone. Given that .NET only recently added monomorphic/bimorphic call-site devirtualization, which is considered quite basic in HotSpot, it would be good to see a real-world usage comparison.
Where required, there are ways to force it to inline/devirtualise yourself. For example, using reified generics is one way I've seen: there is no interface/virtual casting, since the method takes a type that implements the interface rather than the interface itself. It allows you to make polymorphism compile-time rather than runtime. I've seen comparison libraries to Java (closed source) that run much faster as a result.
I do find people comparing Java and .NET Core are often comparing apples to oranges, however. Having worked in both languages, it is just my opinion, but the .NET platform is newer: it has a better "base" even without the same man-hours. Much of the engineering time in both ecosystems is spent optimising for code typical of that ecosystem, which is affected by history/legacy like any other software system.
The safest, cheapest choice for applications that are heavily maintained is to use the most recent version. LTS is designed for legacy applications that are no longer heavily developed, and might benefit from security and bug fixes only (note that this model is new; in the past, back when there were major versions, minor updates included both patches and big features).
It also depends on your target environment, and how much control you can exert over the machine where your software will run.
Sometimes you're creating an app that other users need to run in their own environments - and in that case, targeting "latest LTS" is safer, as they may have requirements on what they can run that you don't.
True, there are, indeed, times when you can't use the current version, but note that Java applications are now encouraged to bundle their own runtime (generated with jlink).
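For illustration, a hypothetical jlink invocation (module name and paths are made up; this assumes a modular application):

```shell
# Build a trimmed runtime image containing only the modules the app needs.
jlink --module-path mods:"$JAVA_HOME/jmods" \
      --add-modules com.example.app \
      --launcher app=com.example.app/com.example.app.Main \
      --output build/runtime

# Ship build/runtime with the app; no system-wide JDK required:
build/runtime/bin/app
```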
With major releases gone, this isn't hard, and it is overall cheaper and easier than upgrading less frequently. But it is some work, which is why the LTS service was added for the benefit of legacy applications. Note that under the old model, LTS didn't exist. There was a new feature release every six months that everyone had to upgrade to. The names of those releases didn't stand out as much (8u20, 8u40 were feature releases [1], as opposed to, say, 8u25, 8u45, which were patches) as they do now that we did away with major releases and gave every new feature release a new integer number.
Because now the feature releases can add new APIs rather than just new JVM features as they did under the old model, this process can be slightly more work, but still significantly less work over time. As regularly maintained applications update code and dependencies regularly, anyway, this is a clear win. Not only is the upgrade process easier and cheaper, but you get to enjoy the regular and significant performance improvements that have always been part of the six-monthly feature releases, but aren't in the LTS patches.
[1]: Those feature releases under the old model also added major new features with significant changes to the VM. For example, the G1 GC and JFR were added in feature releases.
Which is quite painless. I've been doing it since JDK 10, upgrading within a week of each JDK release, and it has been painless (with one exception around JDK 12 or 13, where Spring had an issue for a month before I could upgrade).
> The mission of the Eclipse Adoptium Top-Level Project is to produce high-quality runtimes and associated technology for use within the Java ecosystem. We achieve this through a set of Projects under the Adoptium PMC and a close working partnership with external projects, most notably OpenJDK for providing the Java SE runtime implementation. Our goal is to meet the needs of both the Eclipse community and broader runtime users by providing a comprehensive set of technologies around runtimes for Java applications that operate alongside existing standards, infrastructures, and cloud platforms.
> Along with the move to the Eclipse Foundation, the project has a new name: Eclipse Adoptium. The move and name change are being made to ensure vendor independence and long-term viability of the project. As you know, the Eclipse Foundation has a number of key projects, such as Jakarta EE, under its umbrella, with the foundation providing the necessary legal and operational support for its projects.
I suspect that part of this is that it has more to it than just the JDK.
Disclaimer: I am involved with the eclipse foundation (not as part of the core staff)
Hi, I just want to expand on this a bit...
The "eclipse foundation" is actually not just sponsored by IBM. Nor is the IDE. As for what support it provides, that's actually spot on and is under a similar idea to what apache and linux provide. There are of course differences: but for most people the vague idea that open source foundations exist and provide some similar functions as a way to host projects is enough for this discussion
A lot of eclipse's revenue actually comes from working groups with Jakarta EE being the biggest one. There are also IOT, Automotive and many other working groups + projects under the eclipse umbrella.
I wish there were a section like this one on every tool's home/download page.
A section called "For production use this" or "if you don't know what you're looking for, use this".
And "this" can be a particular version (Ubuntu 20.04.6) or a rule of thumb (for production, always use x.y.1 version or above).
Or it can be a table like this one. Django is, as it is in many fields, the gold standard in this regard.
It gets confusing sometimes. Yesterday I was trying to update the Python version of one of my applications and was wondering: should I do it now, or do I need to wait for 3.10.1? I did it regardless, because the app barely gets traffic anyway, but more clarity is always welcome.
When Oracle acquired Sun in 2010-11, Sun had not shipped a new release of Java in more than four years.[0] That's hardly a "stability" worth treasuring.
Since then, Oracle completely open-sourced the JDK and the associated tools and gave the compatibility kits to the community to use. The result is that there are now several different distributions of the JDK. This is undeniably a good thing.
As to the 6-month release cycle, it allows the community to get new features on a much faster cadence than any of the other major languages. If it's too fast for your preferred way of doing things, stick with the LTS releases, which come out on a three-year cycle and include all the features from the prior releases.
There might be reasons not to like Oracle, but their stewardship of Java has been pretty good.
> When Oracle acquired Sun in 2010-11, Sun had not shipped a new release of Java in more than four years.[0] That's hardly a "stability" worth treasuring.
Sun went several years without releasing a new major version, but that same page lists frequent minor updates. So yes, I would call that exactly a stability to treasure.
I mean, yes, there were fewer major features, but there were feature releases in those minor updates; there's nothing wrong with having stable software. Are you also one of those people who complain that stable Debian and RHEL releases include old software? Not everyone wants to live on the bleeding edge, and Java was a heavily enterprise ecosystem (for better and worse:]).
And the latest release is no bleeding edge. After, e.g., "feature freezing" JDK 17 before release, there were no subsequent changes to the codebase, meaning they found no bugs whatsoever (which did happen with earlier releases).
Exactly. I don't get why people think JDK 12-16 are inferior in any way; they are normal software, tested the same way as any other JDK version.
The only difference appears six months after release: if you don't want to upgrade (you should), you have to find a vendor that will sell you support for the given release.
E.g. Azul provides it for 11, 13, 15, 17. Oracle for 11 and 17.
Adoptium provides builds (not LTS support) from the 11 and 17 branches, where some vendors push their bugfixes. So it is like a semi-LTS: they won't fix your bugs, but if someone fixed a bug, it will most likely end up on that branch.
Java language development itself wasn't stagnating, as far as I understand only the release process was stuck on a licensing dispute. IBM and other members of the Java Community Process stonewalled every new feature in the hope that they could force Sun to certify Apache licensed Java implementations like Harmony.
Agreed, with the caveat that what has been messed up is the clarity rather than the stability. AFAIK, there are plenty of options for freely-available, unencumbered, high-quality, reliable runtimes, and have been since Oracle got involved. It's just that it's been a headache to work out that this is true, and what you should use.
I think the situation is much better now than it was. The daft renaming of AdoptOpenJDK is the only recent low point.
This is not an issue in reality, where devs just use whatever Chocolatey, brew or apt-get pull for them to develop, and use whatever is the latest Docker image for prod.
The author doesn't justify it but I suspect the thinking is related to how fixes are prioritized.
If there is a bug/performance issue that only happens in the context of AWS, then Corretto would be the most likely to rapidly patch the issue as it impacts their customer base. Potentially patches for those sorts of issues would be discovered by AWS through their own use and testing.
Likewise for the other vendors and their own platforms.
Not sure. The JDK works pretty well, but for whatever reason, I had issues with Corretto crypto provider and running it with Metabase. Switched to Adopt and didn't look back
Technically it's the Groovy runtime that doesn't support 17 (or specifically, JVM bytecode v62+) yet. And then only because it seems to do an explicit whitelisted-version check during some buildscript static-analysis bootstrap phase.
If you switch your Gradle buildscript files over to being written in Kotlin, the problem goes away, as Kotlin's runtime doesn't seem to use any similar explicit checks.
(Doing so also allows you to go further and test out EA JVM builds, e.g. Project Loom, which Groovy-based buildscripts have never been, and will never be, happy with.)
Yes, it does now; but the Groovy runtime still has the general problem of not working with each new JVM version for months after it comes out. Whereas, with the Kotlin runtime, you can just forget about this problem and upgrade as soon as you like.
No, because Gradle isn't checking the JVM version, the Groovy runtime is. (Specifically, the Groovy compiler seems to run fine on the buildscript, producing a buildscript bytecode file; but then, upon load, the Groovy runtime seems to check that buildscript bytecode file's JVM bytecode version metadata before doing whatever-it-does when loading it.)
Gradle itself isn't written in Groovy; it's 5% Java, 95% Kotlin. The Groovy runtime is only spun up to compile and run Groovy buildscripts (i.e. the default kind of Gradle buildscript with a ".gradle" file extension.) If none of your buildscripts are written in Groovy, then you're not compiling or running any Groovy code, so that runtime check never executes.
> Gradle itself isn't written in Groovy; it's 5% Java, 95% Kotlin
This doesn't appear to be true just by looking at the Gradle GitHub repository. It claims that the code in the repo is 46% Groovy, 44% Java and 6% Kotlin
Isn't that kind of thing you'd expect using Gradle? It's not the first time it happened, they have been late to the party for a lot of the recent releases.
I guess it serves folks right who endlessly hated Maven and liked this "new", "next generation", "modern" build system with no XML. Turns out that as long as JDK 8 is a supported version of Java, a lot of tools like this only look modern.
Gradle does fix plenty of Maven's shortcomings, e.g. proper parallel builds and a proper task graph (you never have to do clean install with Gradle, while it's almost always needed with Maven), and it is much more performant.
If I understand that page correctly, you can use Java 17 to compile and run your program, as long as you also have Java 16 or older installed to run Gradle itself.
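Gradle's toolchain support is another way around this: Gradle itself keeps running on whatever JDK launched it, while your code is compiled and run with a different one. A sketch in the Groovy DSL (toolchains require Gradle 6.7+):

```groovy
java {
    toolchain {
        // Gradle locates (or downloads) a JDK 17 for compilation,
        // independent of the JDK running Gradle itself.
        languageVersion = JavaLanguageVersion.of(17)
    }
}
```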
I think this advice only applies to the end users of the JDK and libs, e.g. people who develop webapps. We maintain an OSS library and plan to support JDK 8 for as long as possible (though some of our dependencies made a move to JDK 11 and most likely we'll have to follow suit). With this approach, our libraries can be used by developers under JVM 8, 11, or 17.
I've never heard of Adoptium Eclipse Temurin before. It looks like all releases were just within the last few months, but this is the recommendation? Seems a bit strange.
It's a mess because for the average joe such as myself it's very confusing and foggy.
There's an Oracle JDK whose usage terms need to be deciphered by a lawyer to understand precisely when and how you can use it. So much so that the standard advice is to not use it. It's as if they told you "hey, node.js is cool, but don't use the build from those who maintain and develop the code". This would be a huge red flag for any language other than Java, which can leverage 30 years of sunk costs for companies that now will not switch lightly to something else.
The alternative builds are good, yet not official, and they might introduce subtle incompatibilities or different behaviors. They shouldn't, but this whole article suggests that they actually do.
The simple existence of articles like this one testifies that this is a mess for a lot of people.
If you're on linux, use the openjdk build from your distro's official repository. If you're on windows, download the latest openjdk build from oracle (currently https://jdk.java.net/17/), extract somewhere, put /bin folder on path. Optionally set JAVA_HOME env variable to the main folder path. No clue about OSX but it's probably similar to windows.
Ignore oracle jdk, ignore any concept of LTS/not-LTS, that's only for companies with big pockets and specific needs.
People really like to make it sound more complicated than it actually is.
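For the manual route described above, the whole setup on Linux/macOS amounts to something like this (assuming the archive was extracted to ~/jdk-17):

```shell
export JAVA_HOME="$HOME/jdk-17"
export PATH="$JAVA_HOME/bin:$PATH"
java -version   # should report the freshly unpacked build
```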
For those of us supporting more than a few legacy apps: use the oldest version (7 in our case), because then there's less chance of things going wrong (depends, of course, on whether you bill per support call or a fixed annual fee).
Sometimes there are just not enough resources (time, developers, expertise) to update the project. Those projects are often internal and never see the outside of a company's network.
Is there a way to apt-get the new JDK 17 as a package on Ubuntu, instead of piping a shell script from curl or unzipping a tar? Does anybody use package managers on Linux anymore?
I don't like the 'curl -s "https://XXX" | bash' approach much either, but I've given up trying to work around it.
With SDKMan switching between JDKs is painless, despite the install method of SDKMan itself being one of those.
But it's not just Java. To build our frontend webapp I need NodeJS, and there too some specific version depending on the version of the webapp or just because we've updated.
I'd love to be able to just apt install all of this, but the downsides are just too many.
Personally, i just use containers to manage business applications - which often need a very particular runtime that's tested, verified and signed off on for maximum stability and predictable deployments.
On the other hand, i only use standard packages through apt/yum/apk/whatever for server software, a distinction that's lost too often in my opinion. Everything that the server needs to operate should have automatic (at least security) updates enabled and ideally the configuration should be fully automated through Ansible with something like GitOps and read only access through SSH (with fallback account with write access for special cases) for maximum auditability and being sure that this is less likely to happen: https://dougseven.com/2014/04/17/knightmare-a-devops-caution...
Furthermore, the servers themselves can be viewed as pretty much disposable at that point. A VM gets corrupted? Wipe it, create a new one, run Ansible against it to set up the environment, then just give the node a label in your container orchestration platform of choice and watch as your business software is automatically provisioned on the node, bringing the total capacity of your cluster up once again.
For personal devices with no important credentials, or development boxes which are similarly unimportant, piping random stuff and trying to work around the problems with multiple SDKs is more permissible, however. Seeing as PHP, Ruby, Go, Java, .NET, Python and other technologies aren't always pleasant to work with if you have different projects that need different environments, as humorously pointed out here: https://xkcd.com/1987/
Containers aren't always comfortable for local development (especially on Windows without WSL2, due to problems with bind mounts), though they work well for components like databases, Redis, MongoDB, etc. On certain dev boxes they are viable for the whole stack, making development even easier.
Last time I checked, at least in Ubuntu 20.04, there was already a package for OpenJDK 17, so I think that's the straightforward and recommended way to get Java 17 in Ubuntu.
I think a custom package repository would be a proper solution then; this way one would get version upgrades for free. Much better than curl-piping yet another package management tool, `sdkman`, with its own syntax.
Though, I might be missing some important details, and I would love to understand it better.
Edit: a sibling comment suggested that now there's a package repository for that, and indeed there is.
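For reference, the distro-package route on Ubuntu is roughly the following (a sketch; package names vary between releases and distros, so check `apt search openjdk` first):

```shell
# Install the distro-packaged OpenJDK 17 on Ubuntu 20.04+.
sudo apt update
sudo apt install -y openjdk-17-jdk

# Verify which runtime is active.
java -version

# If several JDKs are installed, update-alternatives switches between them.
sudo update-alternatives --config java
```

This gets you security updates through the normal apt upgrade cycle, which is exactly the "version upgrades for free" benefit mentioned above.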
I was literally just looking at JDK comparisons out of interest (I already have my app running under a JDK). But I'm still not convinced which is the "best" one.
What are the differences between OpenJDK, Adoptium, Azul, BellSoft, etc., besides company dependencies and lifecycle? Which JDK is the fastest, or do certain JDKs perform better under certain scenarios? Do any of these JDKs have special features, or are any unable to do certain advanced reflection (e.g. for efficiency)?
I use IntelliJ so switching JDK is very easy, and most apps work on any JDK. Plus I doubt anything I release would be relevant in the long term (at least without the ability to upgrade JDK). So I really don't have to depend on any JDK, and seems like I don't have to worry about my JDK getting obsolete. Is there still a reason why I would prefer to use one JDK over another?
I think most JDKs these days are running mostly the same code (Oracle's Java) though a lot of them have little extra improvements or fixes (and some have longer support periods too).
I try and stick with OpenJDK as that's the default Java implementation and it's probably a sensible default unless you have a requirement that one of the other ones specifically meets (like extended Java 8 security patches).
Thanks for crafting this document, it’s a great start and good to get the discussions started. For me, when I look at which distribution to use I would probably consider other factors that this post doesn’t touch upon completely. What sort of upstream contributions and activity, support across both cloud and on premise and multi architecture support are just a few that come to mind.
This page should definitely have a "end of free public updates" column as is shown on the "Java version history"[0] page, so it's clear updates stop much sooner than the LTS.
I've tried more than once to learn Java, and I find it very confusing. This website resolves one of the issues. The other issue I have is that most Java tutorials use IntelliJ or some other IDE, which makes it difficult to figure out what a non-IDE workflow actually looks like, so managing dependencies is confusing.
The strongest plus I could give Java is that it's a stable workhorse that pays the bills, and that the tooling is pretty good. Non-IDE workflows practically don't exist; compiling something "by hand" is a nice curiosity, but the knowledge is nonessential. Switching from one IDE to another is not that big of a deal workflow-wise, so you shouldn't worry about it.
As a side note, I appreciate that we have multiple tooling for Java. It doesn't sit well with me when the whole language, and its IDE and compiler and everything is bundled up in one package and that's all you have.
What programming language are you comfortable with? I'd say syntactically Java is fairly simple and at least superficially similar to C/C++. Environment-wise, it's definitely very large and oriented toward "powerful but complex" rather than "simple and orthogonal". For build and dependency management, Gradle and Maven are your tools. I personally prefer Gradle since it uses a programming language for its configuration, rather than XML.
There is definitely a lack of "end-to-end" tutorials that don't rely on some form of tooling support. I would love to see more content on how to make a Java program with nothing but an editor and a command line, but I suspect there isn't much of a need for that. Java's tooling is exceptionally good, and its libraries are mature and robust. Part of the reason every tutorial out there uses an IDE is that, through quirks of history, Java is basically an enterprise programming environment where "Build the Right Thing" predominates over "Worse is Better".
EDIT: There is a REPL, called JShell, included with every JDK since version 9. It's useful if you just want to get up to speed or test something quickly.
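A quick JShell session looks something like this (banner abbreviated, output sketched from memory):

```
$ jshell
|  Welcome to JShell -- Version 17
|  For an introduction type: /help intro

jshell> int x = 2 + 2
x ==> 4

jshell> "na".repeat(4) + " Batman"
$2 ==> "nananana Batman"

jshell> /exit
```

Named variables keep their names; unnamed expression results are bound to scratch variables like `$2` that you can reuse in later snippets.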
Forget about this website, just download the most recent OpenJDK version and get at it!
As for IDEs, I do recommend IntelliJ for writing the code, but in case you want to understand what happens behind the scenes, you can compile the source code directly in the terminal with `javac YourClass.java`, or *.java, depending on how many files you have.
It will by default place the resulting files in the current directory with the convention of creating directories for packages. If you had no package keyword in your files it will result in a simple YourClass.class file that you can run with `java YourClass` (do note that you run classes, not the file itself). If you had something like com.example as a package, it will be placed under WORK_DIR/com/example/YourClass.class.
It should be run from the working directory, though!
The reason is the notion of a class path, an analog of the PATH variable on Unix systems. It lists the locations of package-aware class files and defaults to the working directory, meaning java will find com.example.* classes from there, but not elsewhere.
Dependencies are just that: they get appended to this class path and thus will be included.
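To make the class-path mechanics concrete, here's a minimal sketch (the class name, package, and jar path are my own illustration):

```java
// Greeter.java -- note the package declaration.
//
// Compile from the project root:   javac com/example/Greeter.java
//   (or: javac -d . Greeter.java, which creates com/example/ for you)
// Run from that same directory:    java com.example.Greeter
//   (equivalently: java -cp . com.example.Greeter)
//
// If a dependency lived in lib/dep.jar, you would append it to the class path:
//   java -cp .:lib/dep.jar com.example.Greeter   (use ; instead of : on Windows)
package com.example;

public class Greeter {
    public static void main(String[] args) {
        System.out.println(greeting());
    }

    public static String greeting() {
        return "hello from com.example.Greeter";
    }
}
```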
EDIT: You can also just issue `java YourClassWithMain.java`; it will compile and run your class in one go (single-file source launch, available since Java 11).
C# in mono has the csharp interactive prompt, and you can also use it to run script files (like .csx).
Java development often revolves around an IDE, but there are tools to run a REPL.
Groovy had this before Java got it. You can use it to write Java, since Java code is Groovy. It comes with a GUI console into which you can type and evaluate too, or you can write script files and run them. It's not as fashionable nowadays as it used to be (now it's Kotlin, Scala, etc.) but it works great and can be useful for learning Java too.
I think most languages have a really difficult-to-configure environment if you mess it up or need to keep multiple versions around. Windows is especially bad about it because of the JAVA_HOME variable. Mono is a mess to configure on Linux, or at least it was the last time I tried it while learning F#.
Java is actually nice in that a lot of distros package a version manager for it. Arch and CentOS lead the pack here, although CentOS continues to be a mess about how many JDKs and JREs are available.
Java itself ships no build or dependency management tools, so you either use the built-in facilities that IDEs offer or reach for Maven, Gradle, or Ant.
For toy programs you can also invoke javac directly, but as a project grows this quickly gets as impractical as compiling large C codebases by hand.
For a non-IDE workflow, use Gradle as the dependency manager and task runner; you can use javac on its own, but for any non-trivial application the command-line arguments get very long.
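As an illustration (a minimal sketch, not a template for any real project; the dependency coordinates and main class name are purely illustrative), a build.gradle for a small application might look like:

```groovy
// build.gradle -- minimal sketch of the Gradle Groovy DSL.
plugins {
    id 'java'
    id 'application'
}

repositories {
    mavenCentral()   // where dependency jars are resolved from
}

dependencies {
    implementation 'com.google.guava:guava:31.1-jre'   // example dependency
    testImplementation 'junit:junit:4.13.2'
}

application {
    mainClass = 'com.example.Main'   // class whose main() `gradle run` invokes
}
```

`gradle run` then assembles the (potentially very long) javac and java invocations, including the class path of every resolved dependency, for you.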
OpenJDK builds by Oracle are updated only for 6 months, even for LTS versions.
Plus these builds are provided for limited platforms only and have no official ready-to-use Docker images.
I've seen, over the past few years, at least a dozen large threads on HN, Reddit, Slashdot, and elsewhere arguing over the JDK.
Usually they're a passionate mix of developers and PMs claiming they're still very confused by the licensing and update / security fix terms since JDK 8, while various Oracle engineers shout back that the licensing issues are straightforward.
Personally, I've developed with Java for over a decade, and I have never really understood when I should use one version or the other. Oracle, Eclipse Foundation, IBM, and the other big players have not done a good job of marketing the various offerings, nor have they done a good job of clarifying for engineers what they should be installing on their local machines, development and test servers, production, etc.
This is probably the clearest fact sheet I've seen yet, and the links to different distributions with a concise "Use / Don't Use..." rating are indispensable.
What a great site! I have often wondered about these things but I'm not a Java guy so I never investigated for myself. I've never heard of Adoptium before but it sounds like the right choice.
I have no problem running web services on the latest JVM, I just migrated a bunch of http4s apps to Temurin Java 17 based docker images. Scala isn't the issue if you can update to the latest patch version; it's more frameworks with outdated Java libraries doing illegal reflective access that are a problem.
It protects visitors on compromised networks--and that includes things like ad injectors at coffee shops that might push nasty code to them, not just people dealing with oppressive regimes and so on. It also provides some benefit around "well, that page is HTTPS, so it's more interesting"--if every page is HTTPS, the signaling value of switching to HTTPS is destroyed, and that is a good thing.
HTTPS everywhere is a positive, and it is a good thing to do.
“Specialists” drinking the Kool-Aid are the most depressing part. Don't you find it strange that every proponent believes it's important to mention a stereotypical script kiddie on public WiFi, something that doesn't bother a lot of people at all because of the way they connect to the internet, and which hasn't been a common occurrence even in the days of completely broken wireless security protocols?
What is/was common is internet providers' interest in making money on personal behavioral data in the traffic they transfer. DPI boxes to passively gather statistics or actively inject ads (and even rewrite existing ads) have been offered and tested since the 2000s across the world. Scale of big ISPs would make them Google's (&Co) competitors on personal behavioral data market, and mobile ISPs would combine it with location data, too. Moreover, they would be able to use Google's own tracking cookies to track individual users instead of inventing the classification systems (either by observing them in clear text, or by injecting scripts). The security and income of web services is the real reason for the global “HTTP is deprecated, switch to HTTPS” campaign, not you and your “privacy”.
Ad injectors on public wifi is a marginal 'risk' considering this page is targeted to professionals and informed hobbyists anyway, who will predominantly just be browsing from home, work or with a VPN that tunnels traffic through either a server they control or a service provider's server whom they trust anyway.
Not that many years ago you'd be writing the exact same thing about USSR. It's pretty funny to watch as an outsider who has zero stakes in either side.