Releases: OverLordGoldDragon/see-rnn
See RNN v1.13.5
FEATURES:
- `detect_nans` now detects `np.inf` and `-np.inf` via `include_inf=True` (default), and prints as e.g. `25% NaN, 10% Inf`
- `detect_nans` now returns text without a newline, e.g. `50% NaN` instead of `50%\nNaNs`
BREAKING:
- `visuals_rnn` now passes `include_inf=True` to `detect_nans`
MISC:
- Fixed possible redundant newline in `_print_nans`
- `detect_nans` now names `NaN` instead of `NaNs`
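The release notes don't show `detect_nans`' implementation; below is a minimal, hypothetical sketch of the described behavior (the function name suffix, exact rounding, and formatting are assumptions, not see-rnn's actual code):

```python
import numpy as np

def detect_nans_sketch(data, include_inf=True):
    """Return a summary like '25% NaN, 25% Inf' (no newline), or None if clean.

    Illustrative re-implementation only; see-rnn's detect_nans may differ.
    """
    data = np.asarray(data)
    total = data.size
    parts = []
    nan_frac = np.isnan(data).sum() / total
    if nan_frac:
        parts.append(f"{int(round(100 * nan_frac))}% NaN")
    if include_inf:
        # np.isinf catches both np.inf and -np.inf
        inf_frac = np.isinf(data).sum() / total
        if inf_frac:
            parts.append(f"{int(round(100 * inf_frac))}% Inf")
    return ", ".join(parts) or None

x = np.array([np.nan, np.inf, 1., 2.])
print(detect_nans_sketch(x))  # "25% NaN, 25% Inf"
```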
See RNN v1.13.4
FEATURE: in `visuals_gen` and `visuals_rnn`, `configs` now keeps values from the default dicts unless the same keys are specified. E.g.:
```python
defaults = {'1': dict(a=1, b=2), '2': dict(c=3, d=4)}
configs  = {'1': dict(a=3, g=5)}

kw = {'1': {'a': 3, 'g': 5, 'b': 2}, '2': {'c': 3, 'd': 4}}  # AFTER
kw = {'1': {'a': 3, 'g': 5},         '2': {'c': 3, 'd': 4}}  # BEFORE
```
Also applies `deepcopy(configs)` so the external dict is not affected.
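A minimal sketch of the merge logic described above, assuming a hypothetical helper name (this is illustrative, not the library's actual code):

```python
from copy import deepcopy

def merge_configs(configs, defaults):
    # Start from a deep copy of the defaults, then overlay user configs
    # key-by-key; deepcopy ensures the caller's dicts are never mutated.
    kw = deepcopy(defaults)
    for key, cfg in deepcopy(configs).items():
        kw.setdefault(key, {}).update(cfg)
    return kw

defaults = {'1': dict(a=1, b=2), '2': dict(c=3, d=4)}
configs  = {'1': dict(a=3, g=5)}
kw = merge_configs(configs, defaults)
# kw == {'1': {'a': 3, 'b': 2, 'g': 5}, '2': {'c': 3, 'd': 4}}
```

Note that `'b': 2` survives from the defaults because the user's `configs['1']` only overrides `'a'` and adds `'g'`.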
See RNN v1.13.3
FEATURE: Add `pad_xticks` kwarg to `features_hist`
See RNN v1.13.2
BUGFIXES:
- `get_weights(mode='weights', as_dict=True)` would incorrectly handle data packing, getting layer names instead of weight names, thus omitting and mis-labeling weights
- `get_weights()` would ignore the `learning_phase` argument, always defaulting to `1` via `_get_grads()`
See RNN v1.13.1
BUGFIX: Fix `sample_weights` handling in `get_gradients` for the case `sample_weights is not None`
See RNN v1.13
FEATURES:
- Added `get_weight_penalties`, `weight_loss` to `inspect_gen`
- Added `as_tensors` kwarg to `get_weights`
- Added `share_xy`, `center_zero` kwargs to `features_hist` and `features_hist_v2`
- Deprecated `equate_axes` in `visuals_gen` in favor of `share_xy`, which directly sets `sharex` and `sharey`
- Improved `detect_nans` performance
BREAKING:
- `_detect_nans` -> `detect_nans` (made public)
MISC:
- Added keys check to `configs`, ensuring keys serve functionality (e.g. `'subplots'` does nothing; `'subplot'` is correct)
- Improved `get_gradients` docstring
- Changed `get_gradients` arg ordering
- Updated README to account for changes
See RNN v1.12
FEATURES:
- Added `hist_clipped`
BREAKING:
- `features_hist` and `features_hist_v2` now use `hist_clipped()` instead of `.hist()`
FIXES:
- Moved `set_xlim()` inside the axis loop for `features_hist`, which makes a difference if `sharex=False`
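`hist_clipped`'s implementation isn't shown in these notes; the sketch below illustrates one plausible reading of the clipping idea with numpy only (the function name, percentile bounds, and return values are assumptions for illustration):

```python
import numpy as np

def hist_clipped_counts(data, bins=10):
    # Illustrative sketch (not see-rnn's actual hist_clipped): clip extreme
    # values into the 1st-99th percentile range so a few outliers don't
    # stretch the histogram's x-axis, then bin as usual.
    data = np.asarray(data).ravel()
    lo, hi = np.percentile(data, [1, 99])
    counts, edges = np.histogram(np.clip(data, lo, hi), bins=bins)
    return counts, edges

np.random.seed(0)
data = np.concatenate([np.random.randn(1000), [1e6]])  # one huge outlier
counts, edges = hist_clipped_counts(data)
# edges now span roughly the 1st-99th percentile, not up to 1e6
```

With a plain `.hist()`, the single `1e6` outlier would force nearly all mass into one bin; clipping keeps the bulk of the distribution visible.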
See RNN v1.11
FEATURES:
- `_id='*'` now fetches all layers (but input) (`get_weights`, `get_gradients`, `get_outputs`, `weights_norm`)
- `get_weights` now supports `omit_names`
BREAKING:
- `omit_weight_names` -> `omit_names` (`weights_norm`)
See RNN v1.1
FEATURES:
- Universal layer specifier, `_id`, that is a layer name, layer index, or a list of either or of both
- Enabled full customization of various plot aspects via `configs` dict; see docstrings & code
- `pip install` support
- Figure saving support on all visuals
- Added `weights_norm` for computing & tracking layer weight norm statistics
- Added `features_hist_v2` for multi-hist grid visualization
- Added `features_hist` for grid hist visualization
- `rnn_histogram` and `rnn_heatmap` now plot bias with kernel & recurrent kernel in one figure
- All plots now return `fig` and `axes` objects
- All methods now `deepcopy` passed lists and dicts when mutating internally, to not affect originals
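The statistics `weights_norm` computes aren't specified in these notes; a minimal numpy sketch of one natural choice, a per-weight l2 norm, follows (the helper name and the choice of norm are assumptions, and the actual function likely supports more options):

```python
import numpy as np

def l2_norm_stats(weights):
    # weights: dict of weight name -> array.
    # Returns name -> l2 norm of all entries; illustrative only.
    return {name: float(np.sqrt(np.sum(w ** 2)))
            for name, w in weights.items()}

weights = {'lstm/kernel': np.ones((3, 4)), 'lstm/bias': np.zeros(4)}
norms = l2_norm_stats(weights)
# norms['lstm/kernel'] == sqrt(12)
```

Tracking such norms across training iterations is a cheap way to spot exploding or vanishing weights per layer.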
BREAKING:
- Several positional & keyword arguments deprecated in favor of `configs`
- `layer_name` & `layer_idx` deprecated; `_id` now takes the place of both, and can be a list containing both
- `label_channels` in `features_1D` deprecated in favor of customizable `annotations`
- `show_features_` -> `features_` (0D, 1D, 2D)
- `get_layer_outputs` -> `get_outputs`
- `get_layer_weights` -> `get_weights`
- `get_layer_gradients` -> `get_gradients`
- `scale_width` -> `w`
- `scale_height` -> `h`
MISC:
- Created `_backend.py` to hold repeated declarations and env vars
- Improved docstrings
- Improved code readability
- Added tests
- `get_full_name` added to `inspect_gen.py` for retrieving a full layer name given `_id`
- `_filter_duplicates_by_keys` added to `utils.py`
- `_save_rnn_figs` added to `utils.py`
- `.flatten()` -> `.ravel()`, since the former copies the array (thus slower)
- (Bugfix) `codecs.open` -> `open`, as the former's formatting of `README.md` in `setup.py` was overflowing package build metadata
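The `.flatten()` -> `.ravel()` change above rests on a standard numpy fact: `flatten` always returns a copy, while `ravel` returns a view whenever possible (e.g. on contiguous arrays), avoiding the allocation:

```python
import numpy as np

a = np.arange(6).reshape(2, 3)

v = a.ravel()    # view for this contiguous array: no copy made
c = a.flatten()  # always a fresh copy

v[0] = 99
print(a[0, 0])   # 99 -- ravel's view shares memory with `a`
print(c[0])      # 0  -- flatten's copy is unaffected
```

When the caller only reads the flattened data, as in norm or NaN checks, the view is strictly cheaper.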
See RNN v1.05
New features:
- `show_features_1D` and `show_features_2D` now support `tight` and `borderwidth` specifications, useful when the number of subplots is very large
Bug fixes:
- `rnn_heatmap` now handles IndRNN's `recurrent_kernel` as a vector, instead of a 2D array