Timeline for Why improvising your own Hash function out of existing hash functions is so bad
Current License: CC BY-SA 3.0
8 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Jun 9, 2022 at 6:59 | comment added | supercat | | @mtraceur: Few things are necessarily inevitable. My point was that a combination of two KDFs, each run for less time than a single KDF would have been, may be stronger than that single KDF in some plausible scenarios. Not that such combinations were always beneficial, much less worth the effort to set them up, but that they could in some cases be beneficial. |
| Jun 9, 2022 at 6:16 | comment added | mtraceur | | @supercat in other words, when I look at less_popular_kdf(Argon2id(...)) (see the sketch after the table), I don't immediately see it as necessarily inevitable that attackers investing in Argon2id speedups would be slowed by less_popular_kdf - without looking inside them both, I can't rule out the possibility that less_popular_kdf and Argon2id share so much logic that the same investment in speedup covers both. |
| Jun 9, 2022 at 6:10 | comment added | mtraceur | | @supercat yes, but "algorithms" aren't monolithic atoms of functionality which can only either be sped up with targeted investment or not at all - they are ultimately compositions of smaller, simpler algorithms/primitives/operations, and much of the speedup from investment in cracking lies in optimizing those little pieces. Now I don't know how many little innards are the same between KDF foo vs KDF bar, but I imagine that the most recommended and strong KDF either uses all of the best pieces, or is strongest against speedup despite not using all of them because the pieces it does use are. |
| Jun 8, 2022 at 19:18 | comment added | supercat | | @mtraceur: Whether a cryptographic algorithm gets sped up two-fold or 100-fold will depend in significant measure upon the amount of money spent on the effort. If one uses only popular algorithms, then money will likely get spent improving the performance of all of them. If a program spends 20% of its time on an algorithm that is less popular, attracts less investment, and is thus only sped up two-fold, then no amount of investment in the other algorithms could speed it up more than ten-fold (the bound is worked through after the table). |
| Jun 8, 2022 at 16:05 | comment added | mtraceur | | @supercat but of course smart cryptographers presumably already thought about that, and deliberately designed each of these algorithms to combine different pieces which are unlikely to be sped up in the same ways. When you combine them, the default assumption should not be "I am combining two differently secure techniques"; it should be "the only thing I could change here is to inefficiently increase the ratio of uninteresting glue to the already-combined, differently secure techniques". |
| Sep 10, 2016 at 2:47 | comment added | Pacerier | | Yes, another issue is economies of scale. Mentioned here: security.stackexchange.com/questions/33531/… |
| Nov 11, 2014 at 0:22 | comment added | supercat | | Most algorithms are likely to become faster with time, though not necessarily by equal amounts. Suppose there's a 1% chance that method X will get sped up 100-fold, and a 10% chance it will get sped up two-fold; likewise for method Y. Unless I'm missing something, I would think that running both algorithms at half-strength would mean that a massive speedup would only be possible for the composite if one was possible for both constituent algorithms. |
| Apr 1, 2013 at 17:58 | answered | Thomas Pornin | CC BY-SA 3.0 | |
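
The composition mtraceur writes as `less_popular_kdf(Argon2id(...))` can be made concrete with a minimal sketch. This is a hypothetical illustration of the shape being debated, not a recommendation: Python's standard library ships neither Argon2id nor any "less popular KDF", so scrypt and PBKDF2-HMAC-SHA256 stand in for the inner and outer stages, and the cost parameters are placeholders.

```python
import hashlib

def chained_kdf(password: bytes, salt: bytes) -> bytes:
    """Hypothetical two-stage KDF of the shape less_popular_kdf(Argon2id(...))."""
    # Inner stage: scrypt, a memory-hard KDF, standing in for Argon2id.
    inner = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)
    # Outer stage: PBKDF2-HMAC-SHA256, standing in for the "less popular" KDF.
    # Whether this buys extra resistance depends on how little internal logic
    # the two stages share -- exactly the question raised in the comments above.
    return hashlib.pbkdf2_hmac("sha256", inner, salt, 100_000, dklen=32)
```

In practice one well-reviewed KDF with generous parameters is the usual advice; the sketch only makes the structure of the debate concrete.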
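
supercat's ten-fold bound is Amdahl's law applied to the attacker's cracking loop. A few lines of arithmetic, using the figures assumed in the Jun 8, 2022 comment, make the bound explicit:

```python
# Assumed figures from supercat's Jun 8, 2022 comment.
fraction_slow = 0.20   # share of runtime spent in the less-popular KDF
slow_speedup = 2.0     # best speedup attackers achieve on that KDF

# Even if the remaining 80% of the runtime is optimized away entirely,
# fraction_slow / slow_speedup of the original time must still be spent.
residual_time = fraction_slow / slow_speedup   # 0.10 of the original time
max_overall_speedup = 1.0 / residual_time      # 10.0
print(f"overall speedup is capped at {max_overall_speedup:.0f}x")
```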