Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others


in Technique[技术] by (71.8m points)

tensorflow - Getting the current learning rate from a tf.train.AdamOptimizer

I'd like to print out the learning rate for each training step of my NN.

I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization in TensorBoard)?



1 Answer

by (71.8m points)

All of the optimizers have a private attribute that holds the learning rate.

In AdagradOptimizer and GradientDescentOptimizer it is called self._learning_rate; in AdamOptimizer it is self._lr.

So you just need to print sess.run(optimizer._lr) to get this value. sess.run is needed when the learning rate is a tensor (e.g. one produced by a decay schedule); if you passed a plain Python float to the constructor, self._lr is just that float, and the tensor version the optimizer builds internally is self._lr_t. Note that this is the base learning rate: Adam's effective per-parameter step size additionally depends on its moment estimates, so it is not directly exposed as a single scalar.
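As a minimal sketch of the above (assuming TF 1.x graph-mode APIs, accessed via tf.compat.v1 on TF 2.x; optimizer._lr_t is a private attribute created by the optimizer's internal _prepare() step, so this relies on implementation details that are not part of the public API):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A trivial model: minimize x^2 with Adam.
x = tf.Variable(1.0)
loss = tf.square(x)

optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
train_op = optimizer.minimize(loss)  # _prepare() runs here, creating _lr_t

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(3):
        # _lr_t is the tensor version of the base learning rate,
        # so it can be fetched alongside the training op.
        _, lr = sess.run([train_op, optimizer._lr_t])
        print("step", step, "base learning rate", lr)
```

For TensorBoard, the same tensor can be logged with tf.summary.scalar("learning_rate", optimizer._lr_t) and written out via a summary writer; with a constant learning rate the plotted line will simply be flat, since only the internal per-parameter step adapts.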

