Kotlin 1.4-M1 Released

Posted by Sebastian Aigner

We are happy to announce the first preview version of the new major release: Kotlin 1.4-M1.

A few months ago, we published an announcement of what to expect in Kotlin 1.4. As the release approaches, we’re offering you a preview in which you can try some of the new things for yourself.


In this post, we’ll highlight the following new features and key improvements available in 1.4-M1:

  • A new, more powerful type inference algorithm is enabled by default.
  • Contracts are now available for final member functions.
  • The Kotlin/JVM compiler now generates type annotations in the bytecode for Java 8+ targets.
  • There’s a new backend for Kotlin/JS that brings major improvements to the resulting artifacts.
  • Evolutionary changes in the standard library: completing deprecation cycles and deprecating some additional parts.

You can find the complete list of changes in the change log. As always, we’re really grateful to our external contributors.

We highly encourage you to try the preview, and we will appreciate any feedback you provide in our issue tracker.

More powerful type inference algorithm

Kotlin 1.4 uses a new, more powerful type inference algorithm. You were already able to try this new algorithm with Kotlin 1.3 by specifying a compiler option, and now it’s used by default. You can find the full list of issues fixed by the new algorithm in YouTrack. In this blog post, we’ll highlight some of the most noticeable improvements.

SAM conversion for Kotlin functions and interfaces

SAM conversion allows you to pass a lambda when an interface with one “single abstract method” is expected. Before, you could only apply SAM conversion when working with Java methods and Java interfaces from Kotlin, and now you can use it with Kotlin functions and interfaces as well.

Kotlin now supports SAM conversion for Kotlin interfaces. Note that it works differently than in Java: you need to mark functional interfaces explicitly. After you mark an interface with the fun keyword, you can pass a lambda as an argument whenever such an interface is expected as a parameter:
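A minimal sketch of a functional interface and SAM conversion (the names here are illustrative, not from the original post):

```kotlin
// 'fun' before 'interface' marks this as a functional interface,
// enabling SAM conversion for it.
fun interface Action {
    fun run(value: String): String
}

// A lambda can now be passed wherever an Action is expected.
fun perform(action: Action, input: String): String = action.run(input)

fun main() {
    // SAM conversion: the lambda is accepted in place of an Action instance
    println(perform({ it.uppercase() }, "hello")) // HELLO
}
```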

You can read more details about this in the previous blog post.

Kotlin has supported SAM conversions for Java interfaces from the beginning, but there was one case that wasn’t supported, which was sometimes annoying when working with existing Java libraries. If you called a Java method that took two SAM interfaces as parameters, both arguments needed to be either lambdas or regular objects. It wasn’t possible to pass one argument as a lambda and the other as an object. The new algorithm fixes this issue: you can pass a lambda instead of a SAM interface in any case, which is the way you’d naturally expect it to work.

Inferring type automatically in more use-cases

The new inference algorithm infers types in many cases where the old inference required you to specify them explicitly. For instance, in the following example, the type of the lambda parameter it is correctly inferred as String?:
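A sketch of the kind of code this enables (the map of predicates is illustrative):

```kotlin
// The expected type Map<String, (String?) -> Boolean> flows into the
// lambdas, so 'it' is inferred as String? without an explicit parameter.
val rules: Map<String, (String?) -> Boolean> = mapOf(
    "nonNull" to { it != null },
    "nonBlank" to { !it.isNullOrBlank() }
)

fun main() {
    println(rules.getValue("nonNull")(null))  // false
    println(rules.getValue("nonBlank")("hi")) // true
}
```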

In Kotlin 1.3, you needed to introduce an explicit lambda parameter, or replace to with a Pair constructor with explicit generic arguments to make it work.

Smart casts for lambda’s last expression

In Kotlin 1.3, the last expression inside a lambda isn’t smart-cast unless you specify the expected type. Thus, in the following example, Kotlin 1.3 infers String? as the type of the result variable:
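A small sketch of this situation (currentValue() is a hypothetical helper returning String?):

```kotlin
fun currentValue(): String? = null

val result = run {
    var str = currentValue()
    if (str == null) {
        str = "test"
    }
    str // last expression: smart-cast to String in 1.4, so 'result' is String
}

fun main() {
    // In Kotlin 1.3 this would require '!!' because 'result' was String?
    println(result.length) // 4
}
```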

In Kotlin 1.4, thanks to the new inference algorithm, the last expression inside a lambda is smart-cast, and this new, more precise type is used to infer the resulting lambda type. Thus, the type of the result variable becomes String.

In Kotlin 1.3, you often needed to add explicit casts (either !! or type casts like as String) to make such cases work, and now these casts have become unnecessary.

Smart casts for callable references

In Kotlin 1.3, you couldn’t access a member reference of a smart cast type. Now you can:
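A sketch of this with a small class hierarchy (the Cat/Dog names follow the explanation below; the bodies are illustrative):

```kotlin
sealed class Animal
class Cat : Animal() { fun meow() = "meow" }
class Dog : Animal() { fun woof() = "woof" }

fun perform(animal: Animal): String {
    // After the smart cast, member references on the specific subtype
    // become accessible.
    val sound: () -> String = when (animal) {
        is Cat -> animal::meow
        is Dog -> animal::woof
    }
    return sound()
}

fun main() {
    println(perform(Cat())) // meow
    println(perform(Dog())) // woof
}
```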

You can use different member references animal::meow and animal::woof after the animal variable has been smart cast to specific types Cat and Dog. After type checks, you can access member references corresponding to subtypes.

Better inference for callable references

Using callable references to functions with default argument values is now more convenient. For example, the callable reference to the following foo function can be interpreted both as taking one Int argument or taking no arguments:
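A sketch of what this looks like (the apply0/apply1 helpers are illustrative):

```kotlin
fun foo(i: Int = 0): String = "$i!"

fun apply0(func: () -> String): String = func()
fun apply1(func: (Int) -> String): String = func(42)

fun main() {
    // The same reference ::foo adapts to both expected function types.
    println(apply0(::foo)) // 0!  — the default argument value is used
    println(apply1(::foo)) // 42!
}
```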

Better inference for delegated properties

Previously, the type of a delegated property wasn’t taken into account while analyzing the delegate expression that follows the by keyword. For instance, the following code didn’t compile before, but now the compiler correctly infers the types of the old and new parameters as String?:
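A sketch using an observable delegate (the User class is illustrative):

```kotlin
import kotlin.properties.Delegates

class User {
    val log = mutableListOf<String>()

    // The property type String? now flows into the delegate expression,
    // so 'old' and 'new' are inferred as String? without explicit types.
    var name: String? by Delegates.observable(null) { _, old, new ->
        log += "$old -> $new"
    }
}

fun main() {
    val u = User()
    u.name = "abc"
    u.name = "xyz"
    println(u.log) // [null -> abc, abc -> xyz]
}
```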

Language changes

Most of the language changes have already been described in previous blog posts:

In this post, we’ll highlight some small improvements concerning contracts.

Contracts support

The syntax for defining custom contracts remains experimental, but we’ve added support for a couple of new cases where contracts can be helpful.

You can now use reified generic type parameters to define a contract. For instance, you can implement the following contract for the assertIsInstance function:
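A sketch of such a function (the assertion body is illustrative; as noted below, the kotlin.test variant will also take a message):

```kotlin
import kotlin.contracts.ExperimentalContracts
import kotlin.contracts.contract

@OptIn(ExperimentalContracts::class)
inline fun <reified T> assertIsInstance(value: Any?) {
    contract {
        // The reified type parameter can now be used inside the contract.
        returns() implies (value is T)
    }
    if (value !is T) throw AssertionError("Expected ${T::class.simpleName}")
}

fun main() {
    val obj: Any = "hello"
    assertIsInstance<String>(obj)
    println(obj.length) // smart-cast to String thanks to the contract
}
```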

Because the T type parameter is reified, you can check its type in the function body. This is now also possible inside the contract. A similar function with the assertion message will be added to the kotlin.test library later.

Also, you can now define custom contracts for final members. Previously, defining contracts for member functions was forbidden completely because defining contracts on some of the members in the hierarchy implies a hierarchy of the corresponding contracts, and this is still a question of design and discussion. However, if a member function is final and doesn’t override any other function, it’s safe to define a contract for it.
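A minimal sketch of a contract on a final member function (the Validator class and helper are illustrative):

```kotlin
import kotlin.contracts.ExperimentalContracts
import kotlin.contracts.contract

class Validator {
    // A final member that doesn't override anything, so a contract is allowed.
    @OptIn(ExperimentalContracts::class)
    fun validate(value: String?) {
        contract {
            returns() implies (value != null)
        }
        requireNotNull(value)
    }
}

fun maybeValue(flag: Boolean): String? = if (flag) "ok" else null

fun main() {
    val s = maybeValue(true)
    Validator().validate(s)
    println(s.length) // smart cast: the compiler knows s is not null here
}
```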

Standard library changes

Exclusion of the deprecated experimental coroutines

The kotlin.coroutines.experimental API was deprecated in favor of kotlin.coroutines in 1.3.0. In 1.4-M1, we’re completing the deprecation cycle for kotlin.coroutines.experimental by removing it from the standard library. For those who still use it on the JVM, we provide a compatibility artifact kotlin-coroutines-experimental-compat.jar with all the experimental coroutines APIs. We are going to publish it to Maven and include it in the Kotlin distribution alongside the standard library. Currently, we have published it to the Bintray repository together with the 1.4-M1 artifacts.

Removing deprecated mod operator

Another deprecated function is the mod operator on numeric types that calculates the remainder after a division operation. In Kotlin 1.1, this was replaced by the rem() function. Now, we’re completely removing it from the standard library.
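The remaining options, sketched briefly:

```kotlin
fun main() {
    // The deprecated mod() is gone; rem() and the % operator remain.
    println(7.rem(3)) // 1
    println(7 % 3)    // 1 — the % operator compiles to rem()
    println(-7 % 3)   // -1 — rem keeps the sign of the dividend
}
```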

Deprecation of conversions from floating types to Byte and Short

The standard library contains functions for converting floating-point numbers to integer types: toInt(), toShort(), toByte(). Conversions of floating-point numbers to Short and Byte could lead to unexpected results because of the narrow value range and smaller variable size. To avoid such problems, as of 1.4-M1 we’re deprecating functions toShort() and toByte() on Double and Float. If you still need to convert floating-point numbers to Byte or Short, use the two-step conversion: to Int and then to the target type.
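The recommended two-step conversion, sketched:

```kotlin
fun main() {
    val d = 3.99
    // Deprecated as of 1.4-M1: d.toShort(), d.toByte()
    // Instead, convert to Int first, then to the target type:
    val s: Short = d.toInt().toShort()
    val b: Byte = d.toInt().toByte()
    println(s) // 3
    println(b) // 3
}
```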

Common reflection API

We’ve revised the common reflection API. Now it contains only the members available on all three target platforms (JVM, JS, Native) so you can be sure that the same code works on any of them.

New contracts for use() and time measuring functions

We’re widening the use of contracts in the standard library. In 1.4-M1, we’ve added contracts declaring a single execution of a code block for the use() function and for the time measuring functions measureTimeMillis() and measureNanoTime().
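What this buys you in practice, sketched with measureTimeMillis():

```kotlin
import kotlin.system.measureTimeMillis

fun main() {
    val sum: Int
    val elapsed = measureTimeMillis {
        // The callsInPlace(..., EXACTLY_ONCE) contract lets the compiler
        // accept the assignment of a 'val' declared outside the block.
        sum = (1..100).sum()
    }
    println(sum)          // 5050
    println(elapsed >= 0) // true
}
```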

Proguard configurations for Kotlin reflection

Starting from 1.4-M1, we have embedded Proguard/R8 configurations for Kotlin Reflection in kotlin-reflect.jar. With this in place, most Android projects using R8 or Proguard should work with kotlin-reflect without additional configuration magic. You no longer need to copy paste the Proguard rules for kotlin-reflect internals. But note that you still need to list explicitly all APIs you’re going to reflect on.

Kotlin/JVM

Since version 1.3.70, Kotlin has been able to generate type annotations in the JVM bytecode (target version 1.8+), so that they become available at runtime. This feature had been requested by the community for some time because it makes using some existing Java libraries much easier and gives more power to authors of new libraries.

In the following example, the @Foo annotation on the String type can be emitted in the bytecode and then used by the library code:
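A sketch of such a type-annotated declaration (the @Foo annotation is illustrative):

```kotlin
@Target(AnnotationTarget.TYPE)
annotation class Foo

class Example {
    // With the -Xemit-jvm-type-annotations compiler option and a JVM target
    // of 1.8+, @Foo is written into the bytecode as a type annotation.
    fun foo(): @Foo String = "OK"
}

fun main() {
    println(Example().foo()) // OK
}
```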

For details on how to emit type annotations in the bytecode, see the corresponding section of the Kotlin 1.3.70 release blog post.

Kotlin/JS

For Kotlin/JS, this milestone includes some changes to the Gradle DSL, and it is the first version to include the new IR compiler backend, which enables optimizations and new features.

Gradle DSL changes

In the kotlin.js and multiplatform Gradle plugins, a new and important setting has been introduced. Inside the target block in your build.gradle.kts file, the setting produceExecutable() is now available and required if you want to generate .js artifacts during your build:
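A sketch of how this can look in build.gradle.kts (the browser target shown here is illustrative):

```kotlin
// build.gradle.kts
kotlin {
    js {
        browser {
        }
        // Required in 1.4-M1 if the build should emit .js artifacts:
        produceExecutable()
    }
}
```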

You can omit produceExecutable() if you are writing a Kotlin/JS library. When using the new IR compiler backend (more details on this below), omitting this setting means that no executable JS file will be generated (and, as such, the build process will run faster). Instead, a klib file is generated in the build/libs folder, which can be used from other Kotlin/JS projects or as a dependency in the same project. This is the default behavior if you don’t specify produceExecutable() explicitly.

Using produceExecutable() will generate code that’s executable from the JavaScript ecosystem – either with its own entry point or as a JavaScript library. This will generate the actual JavaScript files, which can run in a Node.js interpreter, be embedded in an HTML page and executed in the browser, or be used as dependencies from JavaScript projects.

Note that when targeting the new IR compiler backend (more details below), produceExecutable() will always generate a single, standalone .js file per target. Currently, there is no support for deduplication or splitting code between multiple generated artifacts. You can expect this behavior from produceExecutable() to change in subsequent milestones. The naming of this option is also subject to change in the future.

New backend

Kotlin 1.4-M1 is the first version to include the new IR compiler backend for the Kotlin/JS target. This backend is the foundation for vastly improved optimizations and the defining factor for some changes in the way Kotlin/JS interacts with JavaScript and TypeScript. The features highlighted below all target the new IR compiler backend. While it is not yet enabled by default, we encourage you to try it out with your projects, start preparing your libraries for the new backend, and of course give us feedback and log any issues as you encounter them.

Using the new backend

To start using the new backend, set the following flag in your gradle.properties file:
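The setting looks like this (the property name matches the "Both-mode" section below; ir selects the new backend):

```properties
# gradle.properties
kotlin.js.compiler=ir
```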

If you need to generate libraries for both the IR compiler backend and the default backend, you can alternatively set this flag to both. Exactly what this flag does is described in the “Both-mode” section of this blog post. The flag is necessary because the new and default compiler backends are not binary compatible.

No binary compatibility

A major change with the new IR compiler backend is the absence of binary compatibility with the default backend. A lack of such compatibility between the two backends for Kotlin/JS means that a library created with the new IR compiler backend can’t be used from the default backend, and vice versa.

If you want to use the IR compiler backend for your project, you need to update all Kotlin dependencies to versions that support this new backend. Libraries published by JetBrains for Kotlin 1.4-M1 targeting Kotlin/JS already contain all artifacts required for usage with the new IR compiler backend. When depending on such a library, the correct artifacts are automatically selected by Gradle (i.e. there is no need to specify an IR-specific coordinate). Please note that some libraries, such as kotlin-wrappers, have some issues with the new IR compiler backend because they rely on specific characteristics of the default backend. We are aware of this and are working on improving this functionality in the future.

If you are a library author looking to provide compatibility with the current compiler backend as well as the new IR compiler backend, additionally check out the “Both-mode” section of this blog post.

The next section will have a closer look at some of the benefits and differences that you can expect from the new compiler.

Optimized DCE

The new IR compiler backend is able to make much more aggressive optimizations compared to the default backend. The generated code works better together with static analyzers, and it is even possible to run the generated code from the new IR compiler backend through Google’s Closure Compiler and use its advanced mode optimizations (though please note that the Kotlin/JS Gradle plugin does not provide specific support for this).

The most visible change here is in the code size of the generated artifacts. An improved method of dead code elimination allows the artifacts to shrink drastically. For example, it reduces a “Hello, World!” Kotlin/JS program to just below 1.7 KiB. For more complex (demo) projects, such as this example project using kotlinx.coroutines, the numbers have also changed drastically and hopefully speak for themselves:

                    Default backend    IR backend
After compilation   3.9 MiB            1.1 MiB
After JS DCE        713 KiB            430 KiB
After bundle        329 KiB            184 KiB
After ZIP           74 KiB             40 KiB

If you’re not convinced yet, try it yourself. DCE and bundling are enabled by default for both backends in Kotlin 1.4-M1!

Exporting declarations to JavaScript

When using the IR compiler backend, declarations marked as public are no longer exported automatically (not even a name-mangled version). This is because the closed-world model of the IR compiler assumes that exported declarations are specifically annotated – one of the factors that helps with optimizations like the one mentioned above.

To make a top-level declaration available externally to JavaScript or TypeScript, use the @JsExport annotation. In the following example, we make KotlinGreeter (and its methods) and farewell() available from JavaScript, but keep secretGreeting() Kotlin-only:
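A sketch of how this could look (the method bodies and strings are illustrative; this annotation is specific to Kotlin/JS):

```kotlin
@JsExport
class KotlinGreeter(private val name: String = "World") {
    fun greet() = "Hello, $name!"
}

// Not annotated: stays Kotlin-internal (subject to name mangling).
fun secretGreeting() = "Sssh!"

@JsExport
fun farewell(name: String) = "Bye, $name!"
```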

Preview: TypeScript definitions

Another feature in the new Kotlin/JS IR compiler we’re excited to show off is the generation of TypeScript definitions from Kotlin code. These definitions can be used by JavaScript tools and IDEs when working on hybrid apps to provide autocompletion, support static analyzers, and make it easier to include Kotlin code in JS and TS projects.

For top-level declarations marked with @JsExport (see above) in a project configured to use produceExecutable(), a .d.ts file with the TypeScript definitions will be generated. For the snippet above, they look like this:
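For illustration, the generated definitions for an exported greeter class might look roughly like this (the names and exact module shape are assumptions, not actual compiler output):

```typescript
// sketch of a generated .d.ts file
export class KotlinGreeter {
    constructor(name?: string);
    greet(): string;
}
export function farewell(name: string): string;
```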

In Kotlin 1.4-M1, these declarations can be found in build/js/packages/<package_name>/kotlin alongside the corresponding, un-webpacked JavaScript code. Please note that since this is only a preview, they are not added to the distributions folder by default for now. You can expect this behavior to change in the future.

Both-mode

To make it easier for library maintainers to move to the new IR compiler backend, an additional setting for the kotlin.js.compiler flag in gradle.properties has been introduced:
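The setting, as described in the following paragraph:

```properties
# gradle.properties
kotlin.js.compiler=both
```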

When in both mode, the IR compiler backend and the default compiler backend are both used when building a library from your sources (hence the name). This means that both klib files with Kotlin IR and js files for the default compiler will be generated. When published under the same Maven coordinate, Gradle will automatically choose the right artifact depending on the use case – js for the old compiler, klib for the new one. This means that you can compile and publish your library with the new IR compiler backend for projects that have already upgraded to Kotlin 1.4-M1, regardless of which of the two compiler backends they use. It also helps ensure that you won’t break the experience for those of your users still on the default backend – provided that they have upgraded their project to 1.4-M1.

Please be advised that there is currently an issue that causes the IDE to not properly resolve library references when the dependency and your current project are built using both mode. We are aware of this problem and will fix it soon.

Kotlin/Native

Objective-C generics support by default

Previous versions of Kotlin provided experimental support for generics in Objective-C interop. To generate a framework header with generics from Kotlin code, you had to use the -Xobjc-generics compiler option. In 1.4-M1, this behavior becomes the default. In some cases, this may break existing Objective-C or Swift code calling Kotlin frameworks. To have the framework header written without generics, add the -Xno-objc-generics compiler option.
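If you need the option, one place to pass it is the framework binary configuration in build.gradle.kts (the target and framework setup here are illustrative):

```kotlin
// build.gradle.kts
kotlin {
    iosArm64 {
        binaries.framework {
            // Opt out of generics in the generated framework header:
            freeCompilerArgs += "-Xno-objc-generics"
        }
    }
}
```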

Please note that all specifics and limitations listed in the documentation are still valid.

Changes in exception handling in Objective-C/Swift interop

In 1.4, we will slightly change the Swift API generated from Kotlin with respect to the way exceptions are translated. There is a fundamental difference in error handling between Kotlin and Swift: all Kotlin exceptions are unchecked, while Swift has only checked errors. Thus, to make Swift code aware of expected exceptions, Kotlin functions should be marked with a @Throws annotation specifying a list of potential exception classes.
When compiling to a Swift or Objective-C framework, functions that have or inherit the @Throws annotation are represented as NSError*-producing methods in Objective-C and as throws methods in Swift.
Previously, any exceptions other than RuntimeException and Error were propagated as NSError. In 1.4-M1, we’ve changed this behavior: now NSError is thrown only for exceptions that are instances of the classes specified as parameters of the @Throws annotation (or their subclasses). Other Kotlin exceptions that reach Swift/Objective-C are considered unhandled and cause program termination.
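A minimal sketch of such an annotated function (the function and exception choice are illustrative; shown here in a JVM-runnable form):

```kotlin
import java.io.IOException

// Only IOException and its subclasses are translated to NSError in
// Objective-C (and to a thrown error in Swift); any other Kotlin exception
// reaching the framework boundary terminates the program.
@Throws(IOException::class)
fun readGreeting(path: String): String {
    if (path.isEmpty()) throw IOException("empty path")
    return "Hello from $path"
}

fun main() {
    println(readGreeting("greeting.txt")) // Hello from greeting.txt
}
```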

Performance improvements

We’re continuously working to improve the overall performance of Kotlin/Native compilation and execution. In 1.4-M1, we offer you the new object allocator that works up to two times faster on some benchmarks. Currently, the new allocator is experimental and is not used by default; you can switch to it using the -Xallocator=mimalloc compiler option.
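One way to pass the option is via the binary’s free compiler arguments in build.gradle.kts (the target name is illustrative):

```kotlin
// build.gradle.kts
kotlin {
    macosX64 {
        binaries.all {
            // Experimental: switch to the new object allocator
            freeCompilerArgs += "-Xallocator=mimalloc"
        }
    }
}
```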

Compatibility

Note that Kotlin 1.4 is not backward-compatible with 1.3 in some corner cases. All such cases were carefully reviewed by the language committee and will be listed in the “compatibility guide” (similar to this one). At the moment, you can find this list in YouTrack.

The overload resolution rules may change slightly. If you have several functions with the same name and different signatures, the one that gets called in Kotlin 1.4 might be different from the one that was chosen in Kotlin 1.3. However, this only happens in some corner cases, and we expect it to occur only extremely rarely in practice. We also assume that in practice overloaded functions behave similarly, eventually calling one another, so these changes shouldn’t affect program behavior. But please pay attention to this if you like to write tricky code with generics and many overloads on different levels. All cases of this sort will be listed in the compatibility guide mentioned above.

Pre-release notes

Note that the backward compatibility guarantees do not cover pre-release versions. The features and API can change in subsequent releases. When we reach a final RC, all binaries produced by pre-release versions will be outlawed by the compiler, and you will be required to recompile everything that was compiled by 1.4‑Mx.

How to try

As always, you can try Kotlin online at play.kotl.in.

In IntelliJ IDEA and Android Studio, you can update the Kotlin plugin to version 1.4-M1. See how to do this.

If you want to work on existing projects that were created before installing the preview version, you need to configure your build for the preview version in Gradle or Maven.

You can download the command-line compiler from the GitHub release page.

You can use the following versions of the libraries published together with this release:

The release details and the list of compatible libraries are also available here.

Share your feedback

We’ll be very thankful if you find and report bugs to our issue tracker, YouTrack. We’ll try to fix all the important issues before the final release, which means you won’t need to wait until the next Kotlin release for your issues to be addressed.

If you have any questions and want to participate in discussions, you are welcome to join the #eap channel in Kotlin Slack (get an invite here). In this channel, you can also get notifications about new preview builds.

Let’s Kotlin!


External Contributions

We want to especially thank Zac Sweers for his contribution of embedding Proguard configurations in kotlin-reflect.

We’d like to thank all our external contributors whose pull requests were included in this release:


15 Responses to Kotlin 1.4-M1 Released

  1. Yuriy says:

    March 23, 2020

    When Kotlin IDE performance improvements are expected? Often typing Kotlin code in IDEA gets super laggy especially when typing string interpolation expressions. Syntax highlighting can also be very sluggish.

    • Alexey Belkov says:

      March 24, 2020

      We are continuously working on improving IDE performance. For example, here are the relevant fixed issues in recent releases: https://youtrack.jetbrains.com/issues/KT?q=Target%20versions:%201.3.70,%201.4-M1%20%23%7BPerformance%20Problem%7D%20%23Fixed%20Subsystems:%20ide*

      If you are experiencing performance problems, please report them to http://kotl.in/issue according to https://intellij-support.jetbrains.com/hc/en-us/articles/207241235-Reporting-performance-problems

      • HGH says:

        April 2, 2020

        When I use Visual Studio 2019 – the fans never spin, even during compilation and with YouTube videos playing in Chrome and with a few other programs open. When I have Android Studio even just opened in the background the fans spin. I wonder if making a whole IDE no matter how great yet based on anything JVM can ever be as efficient as a software written in a native language like C/C++.

        Despite the conveniences of managed languages and all the claims – JVM is just slow and memory hungry beast.

        You should develop and test your software on 10 year old hardware at least primarily and if it runs great on it then it is good enough. After all it is an IDE not a game (which is supposed to use all of the resources).

        • HGH says:

          April 2, 2020

In my use case, Visual Studio – developing games – never gets in the way of running the game on the same machine.
          In the case of IDEA software – it eats everything up which is supposed to be shared with the developed software (the game).

          Except for performance (CPU/Memory) it is mostly a great product.
          For obvious reasons I don’t want to/can’t use 16 core CPU with 5GHz base clocks (I don’t think it exists yet) with 32Gb RAM. I prefer low power/low noise/battery sparing/light machines.

        • HGH says:

          April 2, 2020

          BTW In our office nearly everybody but the IOS guys are on low power laptops and many use VMs, so the performance of the IDE is really important.

  2. Raman Gupta says:

    March 23, 2020

    First, congratulations. Looking forward to 1.4.

    However, I’m disappointed there is no mention of frontend Kotlin plugin performance improvements? That’s the critical item to get done way before everything else on this list.

    • Andrey Mischenko says:

      March 23, 2020

      But the new compiler frontend was never planned for 1.4, at least this was said on KotlinConf announcement

      • HGH says:

        April 4, 2020

        Any idea when this is going to happen? Later in the 1.4 series or it will be in 1.5+ and later (e.g. not this year)?

  3. Sergey Shatunov says:

    March 24, 2020

    Could someone clarify what exactly js ir backend does under the hood?

    So if I publish klib for my library and use it in final application I got all code (library & application) compiled into single .js file or as previous bunch of .js (one for each library)? The same true for stdlib?

    • Anton Bannykh [JB] says:

      March 24, 2020

      TL;DR: yes, single .js file for now. Will support multiple .js file outputs in the future.

      Long version:

Here is how it works right now. For the final application you will get a single .js file for your application. It will contain your whole application module and the parts of dependencies that are needed for it to function.

      The stdlib is also distributed as a klib. The parts of stdlib that are used in your application will get compiled into the output in the same way as for any other library. So for a pure-Kotlin project the resulting .js file should be completely self-sufficient.

      Depending on the level of optimization, the “parts of dependencies” I’ve mentioned before will either be granular up to top-level declarations (without DCE), or up to a single method/property (with DCE).

      As for the future. We do realize that a single .js output does not cover everybody’s needs. We do plan to support multiple .js files as a final application output. That’s not hard, we just haven’t settled on which way (ways?) to do that (per-module/per-file/per-class/something else?). Some use cases, such as dynamic import/async loading, require some design on the language side for instance. The important difference from the old .js libraries will be that the yielded .js files will only be guaranteed to work with each other, you won’t be able to take some files from one compilation, some from the other and expect them to work with each other. This will allow us to optimize the code much better, streamline how the exported JS entities look like to JS, etc.

      • Sergey Shatunov says:

        March 24, 2020

        Good to know, thanks.

        One more question: when code split had been implemented will it support ecma modules or we stick with commonjs/etc for now?

        • Anton Bannykh [JB] says:

          March 30, 2020

          ECMAScript modules is an independent feature we intend to support. AFAIK there is no complete design for the code splitting, but I don’t see why not.

  4. HGH says:

    April 2, 2020

    I wonder why have you decided to go for global e.g. “mapOf” functions as in the examples above instead of static members like e.g. “Map.of”?

  5. Brian Stiles says:

    April 10, 2020

    Exciting stuff!

    Using the IR backend with multiplatform, will it be possible to export a common class/function? As far as I can tell, I can only apply @JsExport to something in my JavaScript tree of files (e.g., jsMain/…). The annotation is not available under the common tree.

  6. Waheed Akhtar says:

    April 19, 2020

    I am an Android Developer. Start learning Kotlin and now I am loving it. Things are easy and code compiles much faster. The syntax is very natural and no more Null exceptions.
