Cross-entropy loss, $-\sum_i y_i \log(\hat{p}_i)$, estimates the amount of information needed to encode $y$ with an optimal (e.g. Huffman-style) code built from the estimated probabilities $\hat{p}$. One could therefore argue that it measures an amount of information, for example a number of bits.
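To make that interpretation explicit (a sketch of the standard source-coding argument, as I understand it): the ideal code length for outcome $i$ under a code built from $\hat{p}$ is $-\log \hat{p}_i$, so when outcomes are actually distributed according to $y$, the expected code length is

$$
\mathbb{E}_{i \sim y}\!\left[-\log \hat{p}_i\right] = -\sum_i y_i \log \hat{p}_i .
$$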
Depending on the base of the $\log$, this information is measured in binary bits (base 2), decimal digits (base 10), or what I'm calling "Euler-bits" (base $e$), which is the typical case since $\ln$ is almost always used. Is there a popular or official name or unit for these so-called Euler-bits? Is it OK to think of the unit of cross-entropy loss this way?
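A minimal numeric sketch of what I mean (a toy one-hot label and made-up estimated probabilities; variable names are just illustrative): the same cross-entropy computed with $\ln$ and with $\log_2$ differs only by the constant factor $\ln 2$, so the choice of base only changes the unit.

```python
import numpy as np

# Toy example: true one-hot label y and the model's estimated probabilities p_hat.
y = np.array([0.0, 1.0, 0.0])
p_hat = np.array([0.2, 0.7, 0.1])

# Cross-entropy with the natural log (the usual ML convention), in base-e units.
ce_natural = -np.sum(y * np.log(p_hat))
# Cross-entropy with log base 2, in classic information-theoretic bits.
ce_bits = -np.sum(y * np.log2(p_hat))

print(ce_natural)            # ~0.357
print(ce_bits)               # ~0.515
print(ce_bits * np.log(2))   # equals ce_natural: changing the base only rescales
```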
Obviously, the loss is used for optimization and most people don't care about the exact unit, but I'm curious and would like to use the correct terminology when explaining ML to others.