| 2020 | | Action | Title |
|---|---|---|---|
| Dec 23 | | awarded | Student |
| Dec 22 | | comment | Scoring metric for recommendation system — thanks for your response @Erwan. From what I read, Kullback–Leibler divergence requires both vectors to follow a valid probability distribution, i.e., to sum to 1. Here, my vectors v do follow a valid probability distribution and sum to 1, but the target x does not; its sum need not be 1. Would it be safe to normalize vector x so that its sum is 1 and the ratios are retained, and then apply KL? |
| Dec 22 | | asked | Scoring metric for recommendation system |
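The normalization step asked about in the comment above can be sketched as follows. This is a minimal illustration, not code from the thread: the vectors `x` and `v` are hypothetical examples, and the `kl_divergence` helper is an assumed straightforward implementation of discrete KL divergence.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions given as 1-D arrays summing to 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Clip to avoid log(0); eps is an arbitrary small constant.
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical target vector x: non-negative entries that do not sum to 1.
x = np.array([2.0, 1.0, 1.0])
# Normalizing by the sum makes the entries sum to 1 while preserving ratios.
x_norm = x / x.sum()

# Hypothetical vector v, already a valid probability distribution.
v = np.array([0.5, 0.3, 0.2])

print(kl_divergence(x_norm, v))
```

Dividing by the sum is the standard way to turn a non-negative score vector into a probability distribution, so KL can then be applied; whether that normalization is meaningful for the scoring task is the modeling question posed in the thread.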