Issue with AdamOptimizer
I'm using a simple network and I'm trying to use AdamOptimizer to minimize the loss in a Q-learning context. Here is the code:

### DATASET IMPORT
from DataSet import *

### NETWORK
In TF2, the loss parameter of the optimizer's minimize method must be a Python callable. Thus, you can change your loss definition to:
def loss():
    return tf.square(target_Q - curent_Q)
and use it without converting it to a Tensor:
self.optimizer.minimize(loss, var_list=self.model.trainable_variables)
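To make the idea concrete, here is a minimal, self-contained sketch of the callable-loss pattern. The variable, target values, and learning rate below are hypothetical stand-ins (the question's real target_Q and curent_Q come from its Q-learning network), chosen only to show that minimize accepts a zero-argument callable and takes one optimization step:

```python
import tensorflow as tf

# Hypothetical trainable variable and target, for illustration only.
w = tf.Variable([1.0, 2.0])
target_Q = tf.constant([0.5, 1.5])

def loss():
    # The Q-value must be recomputed inside the callable so that
    # minimize() can record the computation and take gradients.
    current_Q = w * 1.0
    return tf.reduce_sum(tf.square(target_Q - current_Q))

opt = tf.keras.optimizers.Adam(learning_rate=0.1)

before = float(loss())
opt.minimize(loss, var_list=[w])   # loss is passed as a callable, not a Tensor
after = float(loss())
print(before, after)
```

Note that calling loss() yourself and passing the resulting Tensor would raise an error in eager mode, because the optimizer could no longer trace the computation back to the variables.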