Timeline for users' percentile similarity measure
Current License: CC BY-SA 3.0
8 events
| when | what | action | by | license | comment |
|---|---|---|---|---|---|
| Jul 10, 2018 at 12:17 | vote | accept | Yuval Shachaf | ||
| Jul 10, 2018 at 12:17 | |||||
| Aug 29, 2015 at 9:19 | comment | added | Yuval Shachaf | | Disregard the issue with unique positions. Regarding the weighting, so far I don't see a reason to apply it, but that is a great point to keep in mind. |
| Aug 26, 2015 at 19:58 | history | edited | image_doctor | CC BY-SA 3.0 | added 11 characters in body |
| Aug 26, 2015 at 19:54 | comment | added | image_doctor | | Good point about forgetting to add Abs! What is unique about the positions of the vector elements? If you wanted a weighted sum of similarities, something would have to determine that weighting. Is there any information on that? Or would you want to apply a machine learning approach to the weighting scheme? |
| Aug 26, 2015 at 19:24 | comment | added | Yuval Shachaf | | This solution has two issues that I can see. First, since the difference is not squared as in a sum of squared errors, errors in opposite directions will cancel each other out. In addition, correct me if I'm wrong, but this solution does not take into account the unique positions of the vector elements, in case that is important (see the sketch below this table). |
| Aug 26, 2015 at 11:43 | history | edited | image_doctor | CC BY-SA 3.0 | added 154 characters in body |
| Aug 26, 2015 at 10:58 | history | edited | image_doctor | CC BY-SA 3.0 | added 154 characters in body |
| Aug 26, 2015 at 10:34 | history | answered | image_doctor | CC BY-SA 3.0 | |
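
The cancellation issue raised in the Aug 26, 2015 comments can be shown in a few lines. The sketch below is not the answer's exact formula: the percentile vectors and the weights are made-up examples, and it simply assumes each user is represented by a vector of percentile ranks compared element by element.

```python
# A minimal sketch of the point raised in the comments: summing signed
# percentile differences lets opposite-direction differences cancel, while
# absolute (or squared) differences do not. The vectors and weights below
# are hypothetical examples, not values from the question.

user_a = [0.90, 0.10, 0.50]  # percentile ranks of user A on three items
user_b = [0.10, 0.90, 0.50]  # user B differs strongly on the first two items

signed_sum = sum(a - b for a, b in zip(user_a, user_b))          # 0.0  -> users look identical
abs_sum = sum(abs(a - b) for a, b in zip(user_a, user_b))        # 1.6  -> difference is visible
squared_sum = sum((a - b) ** 2 for a, b in zip(user_a, user_b))  # 1.28 -> also visible

# Optional per-position weighting, if some vector elements matter more.
# As image_doctor notes, something external (domain knowledge or a learned
# scheme) would have to determine these weights; the ones here are arbitrary.
weights = [0.5, 0.3, 0.2]
weighted_abs = sum(w * abs(a - b) for w, a, b in zip(weights, user_a, user_b))  # 0.64

print(signed_sum, abs_sum, squared_sum, weighted_abs)
```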