> The same Lambda function with 3008 MB of memory that took 3.6 seconds to start with the JVM, started in under 100 milliseconds once compiled to a native executable using GraalVM's native-image tool. We were able to achieve sub-second response times as shown in the graph below, and as an added bonus we were also able to reduce the memory size and cost.
They go on to describe the main caveat (you have to predeclare anything that will be accessed via reflection) and how some frameworks like Micronaut do that work up front, at source compile time, to ensure the needed metadata is generated. So if your app is compatible with native image, the benefits are really there.
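To make "predeclare" concrete, here's a minimal sketch (the class name and config values are illustrative, not from the article). A class reached only through reflection is invisible to native-image's static analysis, so you list it in a reflect-config.json that the tool picks up, typically from META-INF/native-image/ on the classpath:

    public class ReflectiveLoad {
        public static void main(String[] args) throws Exception {
            // Only reached via reflection, so the static analysis can't see it
            // and would leave it out of the image unless it's declared in config:
            Class<?> cls = Class.forName("com.example.PluginImpl");
            Object plugin = cls.getDeclaredConstructor().newInstance();
            System.out.println(plugin);
        }
    }

    // Matching reflect-config.json entry:
    // [
    //   { "name": "com.example.PluginImpl",
    //     "allDeclaredConstructors": true,
    //     "allDeclaredMethods": true }
    // ]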
There are some other caveats:
• In some cases you may need config files to make libraries compatible with the ahead-of-time compilation. There's a central collection of them (the GraalVM reachability metadata repository), and libraries are increasingly shipping their own metadata. The biggest app compat problems are with apps built on old versions of frameworks like Spring, where you can't afford to update them to the newest versions of things. (See the agent sketch after this list for one way to generate the config.)
• Out of the box the native executable runs a bit slower. To get throughput that's competitive with HotSpot you'll need a C++-style workflow with profile-guided optimization (sketched below), which is more runtime-efficient but less devops-time-efficient than what HotSpot does.
• The actual compile process is slow, so you'll be developing on HotSpot.
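For the config-file caveat, a common approach (a sketch; the paths are illustrative) is to run the app on HotSpot with the native-image tracing agent, which records the reflection, resource, proxy and JNI usage it observes and writes the corresponding config files for the build to pick up:

    # Exercise the app (or its test suite) with the agent attached; it writes
    # reflect-config.json, resource-config.json, etc. into the given directory:
    java -agentlib:native-image-agent=config-output-dir=src/main/resources/META-INF/native-image \
         -jar app.jar

    # Later native-image builds find that config on the classpath automatically.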
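And the profile-guided optimization workflow from the throughput caveat looks roughly like this (flag names as in the GraalVM docs; the jar and binary names are illustrative, and PGO availability depends on the GraalVM distribution):

    # 1. Build an instrumented binary and run it under a representative workload;
    #    on exit it writes a profile (default.iprof) to the working directory.
    native-image --pgo-instrument -jar app.jar
    ./app        # drive typical traffic through it

    # 2. Rebuild, feeding the collected profile back into the compiler.
    native-image --pgo=default.iprof -jar app.jar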
Disclosure: I work part time with the GraalVM team.
https://aws.amazon.com/blogs/opensource/improving-developer-...