Synced

Alan Turing Institute Releases ML Framework Written in Julia

UK-based national research organization The Alan Turing Institute has released a new machine learning toolbox, Machine Learning in Julia (MLJ), which provides a uniform interface enabling users to easily train, evaluate, and tune machine learning models. This open-source framework is written in the high-performance scientific programming language Julia.
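To give a sense of what a uniform train/evaluate/tune interface looks like in practice: MLJ itself is written in Julia, but for readers more familiar with Python, the same workflow can be sketched with scikit-learn (whose design MLJ’s interface is often compared to). The dataset and hyperparameter grid below are purely illustrative, not taken from MLJ’s documentation.

```python
# Illustrative sketch (scikit-learn, not MLJ): one interface to train,
# evaluate, and tune a model. The dataset and grid are arbitrary choices.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
model = DecisionTreeClassifier(random_state=0)

# Evaluate: 5-fold cross-validated accuracy of the untuned model.
scores = cross_val_score(model, X, y, cv=5)

# Tune + train: grid-search a hyperparameter, then refit on all data.
tuned = GridSearchCV(model, {"max_depth": [2, 3, 5]}, cv=5)
tuned.fit(X, y)

print(round(scores.mean(), 3), tuned.best_params_)
```

MLJ exposes the same three verbs (fit, evaluate, tune) through one consistent API across many model-providing packages, which is the point of the “uniform interface” claim above.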

Inspired by the Machine Learning in R (MLR) framework, the Alan Turing Institute began development of MLJ last December and released its official version 0.1.0 last week. With over 200 stars, the MLJ repository is the most popular of the Institute’s GitHub projects.

The major feature of MLJ is learning networks: a flexible model composition layer that combines machine learning models via techniques such as ensembling, stacking, and pipelining. Below is a schematic of a simple two-model stack viewed as a network.
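The two-model stack in the schematic can be sketched as follows. This is not MLJ’s learning-network syntax; it is an analogous Python example using scikit-learn’s `StackingRegressor`, with arbitrary base learners and a synthetic dataset, just to show the structure: two base models whose out-of-fold predictions feed a meta-learner.

```python
# Illustrative two-model stack (scikit-learn analogy, not MLJ syntax):
# predictions from two base learners become inputs to a meta-learner.
from sklearn.datasets import make_regression
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=5, noise=0.1,
                       random_state=0)

stack = StackingRegressor(
    estimators=[("tree", DecisionTreeRegressor(random_state=0)),
                ("knn", KNeighborsRegressor())],
    final_estimator=Ridge(),  # meta-learner combining base predictions
    cv=3,                     # base predictions generated out-of-fold
)
stack.fit(X, y)
print(round(stack.score(X, y), 3))
```

In MLJ, the same composite is built as a learning network, where each node in the graph is a trainable model and the whole network can itself be trained and tuned as a single model.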

Other features are listed on the project’s GitHub page.

The Alan Turing Institute believes MLJ’s features and functionality make it a compelling alternative to ScikitLearn.jl, a Julia wrapper for the popular Python library scikit-learn. Visit the MLJ GitHub page for more detailed information.


Journalist: Tony Peng | Editor: Michael Sarazen
