Binary Classification With Softmax Activation Always Outputs 1
Sorry for the quality of the question, but I'm a beginner here. I was just trying my luck with the Titanic dataset, but the model always predicts that the passenger died. I'll try to explain the code below.
Solution 1:
tf.nn.softmax always returns an array whose elements sum to 1. Since your output layer has a single unit, softmax normalizes that one value against itself, so it always becomes exactly 1, no matter what the input is:
for value in [.2, .999, .0001, 100., -100.]:
print(tf.nn.softmax([value]))
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
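If you did want to keep softmax, you would need one output unit per class (two for Titanic survival). A minimal sketch with hypothetical logits, just to show that softmax only behaves sensibly across multiple units:

```python
import tensorflow as tf

# Two output units (one per class): softmax now distributes
# probability across the classes instead of collapsing to 1.
logits = tf.constant([2.0, -1.0])  # illustrative values, not from a real model
probs = tf.nn.softmax(logits)
print(probs)  # two probabilities that sum to 1
```

With this layout the matching loss would be a categorical crossentropy over the two classes, but for a binary problem the single sigmoid unit below is simpler.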
What you're looking for is tf.nn.sigmoid, which maps any real value to a probability between 0 and 1:
for value in [.2, .999, .0001, 100., -100.]:
print(tf.nn.sigmoid([value]))
tf.Tensor([0.549834], shape=(1,), dtype=float32)
tf.Tensor([0.7308619], shape=(1,), dtype=float32)
tf.Tensor([0.500025], shape=(1,), dtype=float32)
tf.Tensor([1.], shape=(1,), dtype=float32)
tf.Tensor([0.], shape=(1,), dtype=float32)
losses.BinaryCrossentropy(from_logits=True) applies the sigmoid internally (it is the sigmoid-crossentropy loss), so if you use it, your final layer should have no activation and output the raw logit.
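Putting that together, a binary head looks like this. This is a minimal sketch: the hidden-layer size and input shape are illustrative, not taken from your Titanic code.

```python
import tensorflow as tf

# A binary classifier head: one linear output unit (a raw logit)
# paired with BinaryCrossentropy(from_logits=True), which applies
# the sigmoid inside the loss for better numerical stability.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),              # illustrative feature count
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),                # no activation: outputs a logit
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```

The alternative is Dense(1, activation="sigmoid") with the default BinaryCrossentropy(); just don't mix the two (a sigmoid activation plus from_logits=True would apply the sigmoid twice).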
If you want to round the probabilities to a hard 0 or 1, use tf.round:
tf.round(tf.nn.sigmoid([.1]))
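For a whole batch of predictions, the same idea looks like this. The logit values here are made up for illustration; in practice they would come from model.predict:

```python
import tensorflow as tf

# Hypothetical raw logits from a model's linear output layer.
logits = tf.constant([[2.3], [-0.7], [0.1]])
probs = tf.nn.sigmoid(logits)              # probabilities in (0, 1)
labels = tf.cast(probs > 0.5, tf.int32)    # 0 = died, 1 = survived
print(labels.numpy().ravel())              # [1 0 1]
```

Using a comparison against 0.5 instead of tf.round also lets you move the decision threshold if, say, you care more about recall than precision.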