tune 0.1.2

@topepo released this 17 Nov 15:06

Bug Fixes

  • last_fit() and workflows::fit() will now give identical results for the same workflow when the underlying model uses random number generation (#300); see the sketch after this list.

  • Fixed an issue where recipe tuning parameters could be mismatched to the tuning grid at random (#316).

  • last_fit() no longer accidentally adjusts the random seed (#264).

  • Fixed two bugs in the acquisition function calculations used during Bayesian optimization.
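
A minimal sketch of the reproducibility fixes above, assuming the ranger engine is installed; the data and model are illustrative choices, not from the release notes:

    library(tidymodels)

    set.seed(1)
    split <- initial_split(mtcars)

    # A random forest uses random number generation while fitting
    rf_wf <- workflow() %>%
      add_formula(mpg ~ .) %>%
      add_model(rand_forest(mode = "regression") %>% set_engine("ranger"))

    set.seed(123)
    via_last_fit <- last_fit(rf_wf, split)

    set.seed(123)
    via_fit <- fit(rf_wf, data = training(split))

    # With the fixes, both paths fit the same model from the same seed
    identical(
      predict(via_last_fit$.workflow[[1]], testing(split)),
      predict(via_fit, testing(split))
    )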

Other Changes

  • A new parallel_over control argument adjusts the parallel processing strategy that tune uses (sketched after this list).

  • The .config column that appears in the tibble returned from tuning and resample fitting has changed slightly; it is now always of the form "Preprocessor<i>_Model<j>".

  • predict() can now be called on the workflow returned from last_fit() (#294, #295, #296); an example follows this list.

  • tune now supports setting the event_level option from yardstick through the control objects (e.g., control_grid(event_level = "second")) (#240, #249); see the sketch after this list.

  • tune now supports workflows created with the new workflows::add_variables() preprocessor (example after this list).

  • Better control of the random number streams used for parallel processing in tune_grid() and fit_resamples() (#11).

  • The ... argument of tune_bayes() now passes options through to GPfit::GP_fit() (sketched after this list).

  • Additional checks are done for the initial grid that is given to tune_bayes(). If the initial grid is small relative to the number of model terms, a warning is issued. If the grid is a single point, an error occurs (#269).

  • Messages created by tune_bayes() now respect the available width and wrap lines using the new message_wrap() function.

  • tune functions (tune_grid(), tune_bayes(), etc.) will now error if a model specification or a model workflow is given as the first argument (the soft deprecation period is over).

  • An augment() method was added for objects generated by tune_*(), fit_resamples(), and last_fit(); see the final sketch after this list.
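
A minimal sketch of the new parallel_over option, assuming a parallel backend has been registered and the rpart package is installed; the data, model, and grid size are illustrative:

    library(tidymodels)

    folds <- vfold_cv(mtcars, v = 3)

    tree_wf <- workflow() %>%
      add_formula(mpg ~ .) %>%
      add_model(
        decision_tree(cost_complexity = tune(), mode = "regression") %>%
          set_engine("rpart")
      )

    # "resamples" makes one parallel task per resample; "everything"
    # also parallelizes across grid candidates within each resample
    ctrl <- control_grid(parallel_over = "everything")

    res <- tune_grid(tree_wf, resamples = folds, grid = 5, control = ctrl)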
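
A sketch of predicting with the workflow returned by last_fit(); pulling it from the .workflow list column matches the structure of last_fit() results at this release, and the linear model is an illustrative choice:

    library(tidymodels)

    split <- initial_split(mtcars)

    lm_wf <- workflow() %>%
      add_formula(mpg ~ .) %>%
      add_model(linear_reg() %>% set_engine("lm"))

    res <- last_fit(lm_wf, split)

    # The fitted workflow is stored in the .workflow list column
    fitted_wf <- res$.workflow[[1]]
    predict(fitted_wf, new_data = testing(split))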
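
A sketch of setting event_level through a control object, assuming the modeldata package for a two-class example data set:

    library(tidymodels)

    data(two_class_dat, package = "modeldata")
    folds <- vfold_cv(two_class_dat, v = 3)

    glm_wf <- workflow() %>%
      add_formula(Class ~ A + B) %>%
      add_model(logistic_reg() %>% set_engine("glm"))

    # Treat the second factor level as the event for metrics such as ROC AUC
    ctrl <- control_resamples(event_level = "second")

    res <- fit_resamples(glm_wf, resamples = folds, control = ctrl)
    collect_metrics(res)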
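
A sketch of resampling a workflow built with the new workflows::add_variables() preprocessor; the data and model are illustrative:

    library(tidymodels)

    wf <- workflow() %>%
      # Tidyselect-style specification instead of a formula or recipe
      add_variables(outcomes = mpg, predictors = c(cyl, disp, hp)) %>%
      add_model(linear_reg() %>% set_engine("lm"))

    folds <- vfold_cv(mtcars, v = 3)
    fit_resamples(wf, resamples = folds)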
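
A sketch of forwarding an option through tune_bayes() to GPfit::GP_fit(); the corr value follows GPfit's documented correlation structures, and the model, data, and iteration counts are illustrative (the kernlab package is assumed):

    library(tidymodels)

    folds <- vfold_cv(mtcars, v = 3)

    svm_wf <- workflow() %>%
      add_formula(mpg ~ .) %>%
      add_model(
        svm_rbf(cost = tune(), rbf_sigma = tune(), mode = "regression") %>%
          set_engine("kernlab")
      )

    # corr is passed through to GPfit::GP_fit(), swapping the default
    # power-exponential correlation for a Matern kernel
    res <- tune_bayes(
      svm_wf,
      resamples = folds,
      initial = 5,
      iter = 3,
      corr = list(type = "matern", nu = 5/2)
    )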
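
A sketch of the new augment() method on a resampling result; saving the held-out predictions via the control object is assumed so there is something to merge back onto the rows:

    library(tidymodels)

    folds <- vfold_cv(mtcars, v = 3)

    lm_wf <- workflow() %>%
      add_formula(mpg ~ .) %>%
      add_model(linear_reg() %>% set_engine("lm"))

    res <- fit_resamples(
      lm_wf,
      resamples = folds,
      control = control_resamples(save_pred = TRUE)
    )

    # Adds each row's averaged held-out prediction to the original data
    augment(res)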