What is Cross-Entropy?
It is a measure of the dissimilarity between two probability distributions.
Cross-entropy is represented by the expression below:

H(p, q) = -∑x p(x) log q(x)

where 'x' ranges over the possible classes, p(x) is the true probability distribution given by the labeled training examples, and q(x) is the probability distribution predicted by the model.
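The formula above can be sketched in a few lines of numpy. This is a minimal illustration, assuming a one-hot true label and a predicted probability vector; the function name and the clipping epsilon are choices made here, not part of any particular library:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum over x of p(x) * log(q(x)), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Clip predicted probabilities so log(0) never occurs
    return float(-np.sum(p * np.log(np.clip(q, eps, 1.0))))

# True one-hot label: the example belongs to class 1
p = [0.0, 1.0, 0.0]
# Probabilities predicted by the model
q = [0.1, 0.7, 0.2]
print(round(cross_entropy(p, q), 4))  # -log(0.7) ≈ 0.3567
```

Because p is one-hot, the sum reduces to -log of the probability the model assigned to the true class, which is why confident wrong predictions are penalized heavily.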
There are two common types of cross-entropy: binary cross-entropy and categorical cross-entropy. Binary cross-entropy is used when there are two output classes, and categorical cross-entropy is used when there are more than two output classes.