Note: you can match this behavior for binary cross entropy by using BCEWithLogitsLoss, which applies the sigmoid internally in a numerically stable way. For single-label categorical outputs you also usually want a softmax activation applied to the outputs, but PyTorch's CrossEntropyLoss applies this (as log-softmax) for you automatically. With plain BCELoss, by contrast, you must apply the sigmoid activation to the outputs yourself before computing the loss, to ensure the values lie in the range (0, 1). In both cases the quantity being minimized is the negative log-likelihood, i.e. the cross entropy.
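To make the equivalence concrete, here is a minimal sketch (with made-up logits and targets, purely for illustration) showing that BCEWithLogitsLoss on raw logits matches BCELoss applied after an explicit sigmoid:

```python
import torch
import torch.nn as nn

# Hypothetical raw model outputs (logits) and binary targets
logits = torch.tensor([0.8, -1.2, 2.5])
targets = torch.tensor([1.0, 0.0, 1.0])

# BCEWithLogitsLoss takes raw logits and applies sigmoid internally
loss_with_logits = nn.BCEWithLogitsLoss()(logits, targets)

# BCELoss expects probabilities, so apply the sigmoid explicitly first
loss_manual = nn.BCELoss()(torch.sigmoid(logits), targets)

print(loss_with_logits.item(), loss_manual.item())  # the two values agree
```

Prefer the fused BCEWithLogitsLoss in practice: combining the sigmoid and the log inside one op avoids the overflow/underflow that can occur when the sigmoid saturates.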