
Releases: JuliaAI/MLJ.jl

v0.22.0

20 Nov 20:09
d94a3cd


MLJ v0.22.0

Diff since v0.21.0

  • (mildly breaking) The behaviour of levels and unique on CategoricalArrays has changed. (Such arrays are created in MLJ by, for example, coerce(array, Multiclass) or coerce(array, OrderedFactor).) These methods now return a CategoricalVector, where previously they returned a vector of "raw" values; the old return value of levels(array) is recovered with CategoricalArrays.unwrap.(levels(array)). The new behaviour is the result of breaking changes in CategoricalArrays.jl, on which MLJ.jl depends (#1172)
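The change might be sketched like this (assumes MLJ ≥ 0.22 with CategoricalArrays.jl installed; the string data is illustrative only):

```julia
using MLJ
import CategoricalArrays

# coerce produces a CategoricalVector:
v = coerce(["low", "high", "low"], OrderedFactor)

levels(v)  # now itself a CategoricalVector, not a Vector{String}

# To recover the previous behaviour (a vector of raw values):
CategoricalArrays.unwrap.(levels(v))
```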

Merged pull requests:

Closed issues:

  • Reinstate CatBoost integration test (#1092)
  • Re-instate integration tests for scikit-learn models (#1119)
  • Reinstate integration tests for SymbolicRegression? (#1152)
  • Reinstate Outlier detection models (#1153)
  • Error when loading ContinuousEncoder (#1181)

v0.21.0

10 Sep 07:33
b6541e5


MLJ v0.21.0

Diff since v0.20.9

  • (new models) Add the following models from MLJTransforms.jl and make them immediately available to the MLJ user (no @load call is required): OrdinalEncoder, FrequencyEncoder, TargetEncoder, ContrastEncoder, CardinalityReducer, MissingnessEncoder.
  • (mildly breaking) Have MLJTransforms.jl, instead of MLJModels.jl, provide the following built-in models, whose behaviour is unchanged: ContinuousEncoder, FillImputer, InteractionTransformer, OneHotEncoder, Standardizer, UnivariateBoxCoxTransformer, UnivariateDiscretizer, UnivariateFillImputer, UnivariateTimeTypeToContinuous.

Guide for possible source of breakage: It was never necessary to use @load for the models in the last list (assuming you have first run using MLJ), but users frequently do not realise this, and one sees calls like @load OneHotEncoder pkg=MLJModels, which this release will break. If such a call is preceded by using MLJ or using MLJTransforms, you can remove the loading command altogether (OneHotEncoder() already works); alternatively, use @load OneHotEncoder pkg=MLJTransforms instead.
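A minimal sketch of the two options above (assuming MLJ ≥ 0.21 is installed; not a definitive recipe):

```julia
using MLJ

# Option 1: the built-in transformers work without any @load call:
hot = OneHotEncoder()

# Option 2: if you prefer an explicit load, point pkg at MLJTransforms:
Encoder = @load OneHotEncoder pkg=MLJTransforms
hot2 = Encoder()
```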

Merged pull requests:

  • Make updates to reflect code reorganisation around addition of MLJTransforms.jl (#1177) (@ablaom)
  • For a 0.21 release (#1180) (@ablaom)

Closed issues:

  • Decision trees from ScikitLearn.jl not available (#545)
  • Document RecursiveFeatureElimination (#1162)

v0.20.9

14 Jul 20:39
93bd5e7


MLJ v0.20.9

Diff since v0.20.8

Merged pull requests:

Closed issues:

  • Brainstorming: API for meta-data/side-information on the features (#480)
  • likely wrong link (#1170)

v0.20.8

03 Jun 10:21
12cd043


MLJ v0.20.8

Diff since v0.20.7

Merged pull requests:

Closed issues:

  • Reexport CompactPerformanceEvaluation and InSample (#1111)
  • [tracking] Add default logger to MLJ (#1124)
  • Add Missingness Encoder Transformer (#1133)
  • Failed to use TunedModel with precomputed-SVM (#1141)
  • Error with RecursiveFeatureElimination + EvoTreeClassifier (#1145)
  • Dump mention of version number in cheatsheet (#1154)
  • Remove PartialLeastSquaresRegressor from the docs (#1157)
  • Regarding issue in recognition of UUID 5ae90465-5518-4432-b9d2-8a1def2f0cab in a registry (#1158)

v0.20.7

19 Jul 02:14
f7befce


MLJ v0.20.7

Diff since v0.20.6

Merged pull requests:

Closed issues:

  • Recursive Feature Elimination RFE - Feature Request? (#426)
  • For 0.17 release (#864)
  • Transformers that need to see target (eg, recursive feature elimination) (#874)
  • MLJ API for Missing Imputation ? (#950)
  • [Tracking issue] Add raw_training_scores accessor function (#960)
  • Extract probability for a thresholded model (#981)
  • Load data that support the Tables.jl interface (#988)
  • Add new sk-learn models to the docs (#1066)
  • Improve documentation by additional hierarchy (#1094)
  • Link in examples on CV Recursive Feature Elimination into the manual or in the planned tutorial interface. (#1129)
  • broken link for UnivariateFinite doc string (#1130)
  • Add pipeline support for Unsupervised models that have a target in fit (#1134)
  • InteractionTransformer is missing from the "Transformers and Other..." manual page (#1135)

v0.20.6

06 Jun 02:34
7b3b12c


MLJ v0.20.6

Diff since v0.20.5

  • (new functionality) Add RecursiveFeatureElimination model wrapper.
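A hedged sketch of the new wrapper in use (the n_features keyword and the DecisionTree atomic model are assumptions for illustration; consult the RecursiveFeatureElimination docstring for the authoritative signature):

```julia
using MLJ

X, y = @load_iris  # small built-in demo dataset

# Any supervised model reporting feature importances should work:
Tree = @load DecisionTreeClassifier pkg=DecisionTree
rfe = RecursiveFeatureElimination(Tree(); n_features=2)

mach = machine(rfe, X, y)
fit!(mach)
report(mach)  # expected to include the selected features
```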

Merged pull requests:

  • CompatHelper: bump compat for MLJFlow to 0.5 (#1122) (@github-actions[bot])
  • Add model wrappers to the Model Browser (#1127) (@ablaom)
  • For a 0.20.6 release (#1128) (@ablaom)

Closed issues:

  • Requesting better exposure to MLJFlux in the model browser (#1110)
  • Remove info(rms) from the cheatsheet (#1117)
  • Enable entry of model wrappers into the MLJ Model Registry (#1125)

v0.20.5

22 May 04:58
a0d7a08


MLJ v0.20.5

Diff since v0.20.4

Merged pull requests:

v0.20.4

20 May 09:33
61f12f9


MLJ v0.20.4

Diff since v0.20.3

  • Bump the requirement for MLJFlow to 0.4.2. This is technically breaking (but not marked as such, because MLJFlow integration is considered experimental). With the latest version of MLFlowClient installed, where previously you would define logger=MLJFlow.Logger("http://127.0.0.1:5000/"), you must now write logger=MLJFlow.Logger("http://127.0.0.1:5000/api") or similar; see also https://github.com/JuliaAI/MLFlowClient.jl/releases/tag/v0.5.1.
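Under the new requirement, logger construction might look like this (the host, port, and path are illustrative and assume a local MLflow tracking server):

```julia
using MLJ, MLJFlow

# Note the trailing "/api" now required in the tracking URI:
logger = MLJFlow.Logger("http://127.0.0.1:5000/api")
```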

Merged pull requests:

Closed issues:

  • Curated list of models (#716)
  • Migrate MLJ from alan-turing-institute to JuliaAI? (#829)
  • Update the binder demo for MLJ (#851)
  • Add wrappers for clustering to get uniform interface (#982)
  • Confusing Julia code in adding_models_for_general_use.md (#1061)
  • feature_importances for Pipeline including XGBoost don't work (#1100)
  • Current performance evaluation objects, recently added to TunedModel histories, are too big (#1105)
  • Update cheat sheet instance of deprecated @from_network code (#1108)

v0.20.3

08 Mar 06:49
dab6152


MLJ v0.20.3

Diff since v0.20.2

  • Bump compat for MLJFlow to 0.4, to pick up the MLJBase.save method-ambiguity fix (shipped in MLJFlow 0.4.1).

Merged pull requests:

Closed issues:

  • Meta issue: issues for possible collaboration with UCL (#673)
  • Integration test failures: Classifiers (#939)
  • Oversample undersample (#983)
  • Add AutoEncoderMLJ model (part of BetaML) (#1074)
  • Add new model descriptors to fix doc-generation fail (#1084)
  • Update list of BetaML models (#1088)
  • Update ROADMAP.md (#1093)
  • Deserialisation fails for wrappers like TunedModel when atomic model overloads save/restore (#1099)

v0.20.2

21 Nov 03:37
2a41b9b


MLJ v0.20.2

Diff since v0.20.1

  • Replace MLFlowLogger with MLJFlow.Logger. A logger instance is now created with using MLJFlow; logger = MLJFlow.Logger(baseuri). This is technically breaking but not tagged as such, because MLFlow integration is still experimental.
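A sketch of the renamed logger in use (the commented evaluate call, its logger keyword, and the URI are assumptions for illustration, and would require a running MLflow service):

```julia
using MLJ, MLJFlow

logger = MLJFlow.Logger("http://127.0.0.1:5000")

# Hypothetical usage: pass the logger when evaluating a model so that
# results are recorded to MLflow (assumes `evaluate` accepts `logger`):
# evaluate(model, X, y; resampling=CV(), measure=log_loss, logger=logger)
```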

Merged pull requests:

  • Fix MLJTuning.jl links (#1068) (@jd-foster)
  • CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#1070) (@github-actions[bot])
  • Bump compat: MLJFlow 0.3 (#1072) (@ablaom)
  • For a 0.20.2 release (#1073) (@ablaom)

Closed issues:

  • Export the name MLJFlow (#1067)
  • evaluate errors (#1069)