Purpose of 'givens' variables in Theano.function
I am reading the code for the logistic regression example given at http://deeplearning.net/tutorial/logreg.html. I am confused about the difference between the inputs and givens variables of a function. The functions that compute the mistakes made by the model on a minibatch are:
    test_model = theano.function(
        inputs=[index],
        outputs=classifier.errors(y),
        givens={
            x: test_set_x[index * batch_size: (index + 1) * batch_size],
            y: test_set_y[index * batch_size: (index + 1) * batch_size]})

    validate_model = theano.function(
        inputs=[index],
        outputs=classifier.errors(y),
        givens={
            x: valid_set_x[index * batch_size: (index + 1) * batch_size],
            y: valid_set_y[index * batch_size: (index + 1) * batch_size]})
Why couldn't/wouldn't one simply make x and y shared input variables, and let them be defined when the actual model instance is created?
The givens parameter allows you to separate the description of the model from the exact definition of its inputs. What givens does is modify the graph before compiling it: in other words, it substitutes, in the graph, each key of the givens dictionary with its associated value.
In the deep learning tutorial, a normal Theano variable is used to build the model, and givens is used for speed on the GPU. If you keep the dataset on the CPU, you have to transfer a mini-batch to the GPU at each function call; over many iterations over the dataset, you end up transferring the dataset multiple times to the GPU. If the dataset is small enough to fit on the GPU, you can put it in a shared variable so that it is transferred to the GPU once, if one is available (or stays on the CPU if the GPU is disabled). Then, when compiling the function, givens swaps the input for the slice corresponding to the mini-batch of the dataset to use. The input of the Theano function is then just the index of the mini-batch you want.