- examples now run on CPU by default
- added brainstorm.tools.split to help with data preparation
- SquaredDifference layer now outputs a loss for each dimension instead of summing over features.
- SquaredDifference layer no longer scales by one half.
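The behaviour change can be illustrated numerically. A minimal NumPy sketch of the old versus new output (illustrative arrays, not brainstorm's implementation):

```python
import numpy as np

y = np.array([[1.0, 2.0], [3.0, 4.0]])  # network outputs, shape (batch, features)
t = np.array([[0.0, 2.5], [2.0, 4.5]])  # targets, same shape

# old behaviour: half the squared difference, summed over features -> shape (batch,)
old = 0.5 * np.sum((y - t) ** 2, axis=1)

# new behaviour: one loss per dimension, no 1/2 scaling -> shape (batch, features)
new = (y - t) ** 2

print(old)  # [0.625 0.625]
print(new)  # [[1.   0.25] [1.   0.25]]
```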
- Added a SquaredLoss layer that computes half the squared difference and has an interface that is compatible with the
- Output probabilities renamed to predictions in
- added a use_conv option to
- added criterion option to
- added brainstorm.tools.get_network_info function that returns information about the network as a string
- added brainstorm.tools.extract function that applies a network to some data and saves a set of requested buffers.
- brainstorm.layers.mask layer now supports masking individual features
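Per-feature masking means individual entries can be zeroed rather than only whole samples. A minimal NumPy sketch of the effect (illustrative only, not the layer's actual API):

```python
import numpy as np

x = np.arange(6, dtype=float).reshape(2, 3)   # (batch, features)
mask = np.array([[1.0, 0.0, 1.0],
                 [0.0, 1.0, 1.0]])            # per-feature mask

masked = x * mask  # individual features are zeroed, not whole rows
print(masked)  # [[0. 0. 2.] [0. 4. 5.]]
```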
- EarlyStopper now works for any timescale and interval
- Recurrent, Lstm, Clockwork, and ClockworkLstm layers now accept inputs of arbitrary shape by implicitly flattening them.
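The implicit flattening amounts to collapsing all trailing feature dimensions into one before the recurrent computation. A minimal NumPy sketch, assuming a time-major (time, batch, ...) input layout:

```python
import numpy as np

x = np.zeros((7, 4, 3, 5))  # (time, batch, 3, 5): input with an arbitrary feature shape

# collapse everything after the time and batch axes into a single feature axis
flat = x.reshape(x.shape[0], x.shape[1], -1)
print(flat.shape)  # (7, 4, 15)
```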
- several fixes to make building the docs easier
- some performance improvements of NumpyHandler operations
- sped up tests
- several improvements to installation scripts
- fixed the sqrt operation for PyCudaHandler. This should fix problems with BatchNormalization on GPU.
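BatchNormalization relies on an elementwise sqrt when dividing by the standard deviation, which is why a broken sqrt kernel breaks it. A minimal NumPy sketch of that step (illustrative, not the PyCudaHandler code):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3))   # hypothetical (batch, features) activations

mean = x.mean(axis=0)
var = x.var(axis=0)
eps = 1e-5  # illustrative epsilon for numerical stability

# normalization divides by sqrt(var + eps) - the operation the fix concerns
x_hat = (x - mean) / np.sqrt(var + eps)
print(np.allclose(x_hat.mean(axis=0), 0.0, atol=1e-7))  # True
```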
- fixed a bug for task_type=’regression’ in
- removed defunct name argument from input layer
- fixed a crash when applying brainstorm.hooks.SaveBestNetwork to rolling_training loss
- various minor fixes of the
- fixed a problem with
- fixed a blocksize problem in convolutional and pooling operations in
- First release on PyPI.