So I've just had an SD card ruined & while it's not very expensive, I'd rather have the next one last me a much, much longer time. As a precaution going forward, I'd like to be able to detect right away if there are too many bad sectors, or if a flash/memory card is about to give up the ghost, just like what happened to my SD card. That requires good methodology on the part of the detection software.
One thing that concerns me about the detection is this:
As far as I know, many tools use a single fixed test block that is written repeatedly & contiguously across the partition being checked, and later read back & verified (in the read-speed phase).
What if the software starts another pass without clearing what was written in the pass before, e.g. because it was aborted midway? Now suppose a sector that was successfully written in that earlier pass has since gone bad & can no longer be written. On the new pass, the write to that sector fails silently, but the read-back verification still succeeds, because the sector simply kept its old content, which happens to be the very same fixed pattern the tool is checking for.
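To make the worry concrete, here's a toy simulation of that false pass (pure Python, all names are mine, the "stuck" sector is an idealization of a flash cell whose write path has failed while reads still work):

```python
FIXED_PATTERN = b"\xAA" * 512  # the same test block on every pass

class StuckSector:
    """Simulated flash sector whose write circuitry has died:
    writes are silently dropped, reads return the old contents."""
    def __init__(self, initial: bytes):
        self.data = initial

    def write(self, buf: bytes) -> None:
        pass  # write fails silently -- old content is kept

    def read(self) -> bytes:
        return self.data

# Pass 1 succeeded before the sector died, so it still holds the pattern.
sector = StuckSector(FIXED_PATTERN)

# Pass 2: write the same fixed pattern, then read back & verify.
sector.write(FIXED_PATTERN)
if sector.read() == FIXED_PATTERN:
    print("verification passed -- the bad sector went undetected")
```

The verification step has no way to tell "write succeeded" apart from "write failed but the stale data matches anyway".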
I want to obtain test software that is free of these problems (as I see them; correct me if I'm wrong). One that uses a randomly generated test block (maybe a fresh one for every block), or one that keeps a varying offset between the test block & the target block, as necessary, just so that the same test block & target block never line up with each other again across passes.

