python - I want to implement a regularization technique "Shakeout" in CNTK


The paper "Shakeout: A New Approach to Regularized Deep Neural Network Training" can be found here: http://ieeexplore.ieee.org/abstract/document/7920425/

The paper introduces a new regularization technique that can replace dropout layers in a more functional way. I am working on a deep learning problem and want to implement the "Shakeout" technique, but the problem is that I do not understand the actual pipeline of the paper. There is some mathematics I am still struggling to understand.
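For reference, here is my current understanding of the Shakeout rule from the paper (my own summary, so it may be wrong): with drop probability tau and shakeout constant c, each input unit's effective weight during training becomes

\tilde{W}_{ij} =
\begin{cases}
\dfrac{W_{ij}}{1-\tau} + \dfrac{c\,\tau}{1-\tau}\,\mathrm{sign}(W_{ij}), & \text{input unit } i \text{ kept (probability } 1-\tau) \\[4pt]
-\,c\,\mathrm{sign}(W_{ij}), & \text{input unit } i \text{ dropped (probability } \tau)
\end{cases}

If this reading is right, setting c = 0 should recover standard (inverted) dropout, and tau = 0 should give back a plain dense layer.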

So far, I have seen one open-source implementation, which is based on Caffe. I am a new practitioner of deep learning and am still learning to use CNTK, so it is not feasible for me to start working in Caffe. Has anyone implemented "Shakeout" in CNTK? Or can anyone provide pseudo-code for Shakeout? The Shakeout implementation on Caffe: https://github.com/kgl-prml/shakeout-for-caffe

GitHub issue: https://github.com/kgl-prml/shakeout-for-caffe/issues/1

From a quick look at the paper, a dense layer combined with a Shakeout layer would be the following:

import cntk as C

def DenseWithShakeout(rate, c, outputs):
    weights = C.parameter((C.InferredDimension, outputs), init=C.glorot_uniform())
    bias = C.parameter(outputs)
    def shakeout(x):
        r = C.dropout(x, rate)            # inverted dropout: kept units are scaled by 1/(1-rate)
        signs = weights / C.abs(weights)  # one day CNTK should add an actual sign operation
        return C.times(r, weights) + c * C.times(r - x, signs) + bias
    return shakeout
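To see why this matches the rule in the paper, note that C.dropout already applies inverted-dropout scaling, i.e. r = mask * x / (1 - rate). Here is a small NumPy sketch of the same computation (the shapes and variable names are my own, purely illustrative):

import numpy as np

tau, c = 0.2, 1.0               # drop rate and the shakeout constant c
x = np.random.randn(5)          # one sample with 5 input units
W = np.random.randn(5, 3)       # weights mapping 5 inputs to 3 outputs
b = np.zeros(3)

mask = (np.random.rand(5) > tau).astype(x.dtype)
r = mask * x / (1.0 - tau)      # what C.dropout(x, tau) computes during training
signs = np.sign(W)

out = r @ W + c * (r - x) @ signs + b
# Dropped unit j: r[j] = 0, so its contribution is -c * x[j] * sign(W[j, :]).
# Kept unit j: contribution is x[j] * (W[j, :]/(1-tau) + c*tau/(1-tau)*sign(W[j, :])).

With c = 0 the second term vanishes and the layer reduces to an ordinary dense layer with dropout.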

This DenseWithShakeout factory can be used inside a C.layers.Sequential() statement, e.g.

model = C.layers.Sequential([DenseWithShakeout(0.2, 1, 100), C.layers.Dense(10)])

This will create a two-layer network with the Shakeout layer in the middle. Note, I have not tried this on a real problem.
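If you want a quick smoke test, something like the following should run (an untested sketch; the input/output sizes and the CNTK 2.x training boilerplate are my own assumptions):

import numpy as np
import cntk as C

x = C.input_variable(784)                 # hypothetical feature size
y = C.input_variable(10)                  # hypothetical number of classes

model = C.layers.Sequential([DenseWithShakeout(0.2, 1, 100), C.layers.Dense(10)])
z = model(x)

loss = C.cross_entropy_with_softmax(z, y)
learner = C.sgd(z.parameters, lr=C.learning_rate_schedule(0.01, C.UnitType.minibatch))
trainer = C.Trainer(z, (loss, None), [learner])

features = np.random.randn(32, 784).astype(np.float32)
labels = np.eye(10, dtype=np.float32)[np.random.randint(0, 10, size=32)]
trainer.train_minibatch({x: features, y: labels})

Remember that C.dropout (and therefore the Shakeout noise) is only active during training; at evaluation time the layer behaves deterministically.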

