Deep Learning With Kotlin: Introducing KotlinDL-alpha
Today we would like to share with you the first preview of KotlinDL (v.0.1.0), a high-level Deep Learning framework written in Kotlin and inspired by Keras. It offers simple APIs for building, training, and deploying deep learning models in a JVM environment. High-level APIs and sensible defaults for many parameters make it easy to get started with KotlinDL. You can create and train your first simple neural network with only a few lines of Kotlin code:
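For illustration, a minimal network for classifying 28×28 grayscale digits might look like this. The layer names and training calls follow KotlinDL's Keras-style API, but the exact imports and dataset preparation are omitted here and should be treated as assumptions; see the quick start guide for the precise signatures:

```kotlin
// Sketch: define and train a small dense network on 28x28 images.
// Imports omitted; types come from the org.jetbrains.kotlinx.dl packages.
val model = Sequential.of(
    Input(28, 28, 1),   // grayscale 28x28 input images
    Flatten(),          // flatten to a 784-element feature vector
    Dense(300),         // hidden layers, relying on KotlinDL's defaults
    Dense(100),
    Dense(10)           // one output per digit class
)

fun main() {
    // 'train' and 'test' are assumed to be prepared Dataset instances.
    model.use {
        it.compile(
            optimizer = Adam(),
            loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
            metric = Metrics.ACCURACY
        )
        it.fit(dataset = train, epochs = 10, batchSize = 100)
        val accuracy = it.evaluate(dataset = test, batchSize = 100)
            .metrics[Metrics.ACCURACY]
        println("Accuracy: $accuracy")
    }
}
```

Note the `use` block: the model holds native TensorFlow resources, so it is closed like any other closeable resource when training and evaluation are done.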
Training deep learning models can be resource-heavy, and you may wish to accelerate the process by running it on a GPU. This is easily achievable with KotlinDL!
With just one additional dependency, you can run the above code without any modifications on an NVIDIA GPU device.
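As a sketch, the extra dependency could be declared in a Gradle build roughly like this. The artifact coordinates below are assumptions based on the TensorFlow Java artifacts KotlinDL builds on, so check the release notes for the exact versions:

```groovy
dependencies {
    // KotlinDL itself
    implementation 'org.jetbrains.kotlinx:kotlin-deeplearning-api:0.1.0'
    // GPU-enabled TensorFlow JNI binding: swapping this in moves training
    // onto an NVIDIA GPU (a compatible CUDA setup is assumed)
    implementation 'org.tensorflow:libtensorflow_jni_gpu:1.15.0'
}
```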
KotlinDL comes with all the necessary APIs for building and training feedforward neural networks, including Convolutional Neural Networks. It provides reasonable defaults for most hyperparameters and offers a wide range of optimizers, weight initializers, activation functions, and all the other necessary levers for you to tweak your model.
With KotlinDL, you can save the resulting model and import it for inference in your JVM backend application.
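Saving a trained model and loading it elsewhere for inference could look like the following sketch. The `InferenceModel` usage shown here is an assumption patterned on KotlinDL's examples rather than a definitive API reference:

```kotlin
// Persist the trained model to disk...
model.save(File("savedmodels/my_model"))

// ...and later, in the backend service, load it for prediction only.
InferenceModel.load(File("savedmodels/my_model")).use {
    it.reshape(28, 28, 1)          // declare the expected input shape
    val pixels = FloatArray(784)   // flattened input image (placeholder data)
    val label = it.predict(pixels)
    println("Predicted class: $label")
}
```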
Keras model import
Out of the box, KotlinDL offers APIs for building, training, saving deep learning models, and loading them to run inference. When importing a model for inference, you can use a model trained with KotlinDL, or you can import a model trained in Python with Keras (versions 2.*).
For models trained with KotlinDL or Keras, KotlinDL supports transfer learning methods that allow you to make use of an existing pre-trained model and fine-tune it to your task.
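Put together, importing a Keras model and fine-tuning it might look like this sketch: the architecture is loaded from the JSON configuration exported by Keras, pre-trained weights are read from an HDF5 file into frozen layers, and only the remaining layers are trained. Helper names such as `loadWeightsForFrozenLayers` and the `HdfFile` type (from the jhdf library KotlinDL's examples use) should be treated as assumptions:

```kotlin
// Sketch: transfer learning from a Keras-exported model.
// 'train' is assumed to be a prepared Dataset for the new task.
val model = Sequential.loadModelConfiguration(File("vgg16.json"))
model.use {
    it.compile(
        optimizer = Adam(),
        loss = Losses.SOFT_MAX_CROSS_ENTROPY_WITH_LOGITS,
        metric = Metrics.ACCURACY
    )
    // Copy pre-trained weights into the frozen (non-trainable) layers,
    // leaving the rest randomly initialized for fine-tuning.
    it.loadWeightsForFrozenLayers(HdfFile(File("vgg16_weights.h5")))
    it.fit(dataset = train, epochs = 3, batchSize = 32)
}
```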
In this first alpha release, only a limited number of layers are available: Input(), Flatten(), Dense(), Dropout(), Conv2D(), MaxPool2D(), and AvgPool2D(). This limitation means that not all Keras models are currently supported. You can import and fine-tune a pre-trained VGG-16 or VGG-19 model, but not, for example, a ResNet50 model. We are working hard to bring you more layers in the upcoming releases.
Another temporary limitation concerns deployment. You can deploy a model in a server-side JVM environment; inference on Android devices is not yet supported, but it is coming in later releases.
What’s under the hood?
KotlinDL is built on top of the TensorFlow Java API, which is actively developed by the open-source community.
Give it a try!
We’ve prepared some tutorials to help you get started with KotlinDL:
- Quick start guide
- Create your first neural network with KotlinDL
- Training a model
- Inference example
- Importing a Keras model
- Transfer learning example