Running Binarize (or EdgeDetect) on a filled black-and-white image results in an all-black image.
The image has the following color distribution, with most pixel values falling in either the zero or the one bucket:
    image = Import["http://imgur.com/HhsMeDc.png"];
    ImageLevels[image, 10]

    {{{0., 640000}, {0.1, 0}, {0.2, 0}, {0.3, 0}, {0.4, 0}, {0.5, 0}, {0.6, 0}, {0.7, 0}, {0.8, 0}, {0.9, 0}},
     {{0., 640000}, {0.1, 0}, {0.2, 0}, {0.3, 0}, {0.4, 0}, {0.5, 0}, {0.6, 0}, {0.7, 0}, {0.8, 0}, {0.9, 0}},
     {{0., 640000}, {0.1, 0}, {0.2, 0}, {0.3, 0}, {0.4, 0}, {0.5, 0}, {0.6, 0}, {0.7, 0}, {0.8, 0}, {0.9, 0}},
     {{0., 370003}, {0.1, 809}, {0.2, 605}, {0.3, 555}, {0.4, 612}, {0.5, 550}, {0.6, 580}, {0.7, 616}, {0.8, 764}, {0.9, 264906}}}
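Reading these levels: the first three sublists are the R, G, and B channels, and in each one all 640000 pixels sit in the zero bucket, so the color data is uniformly black. The fourth sublist is an alpha channel with a bimodal distribution (roughly 370003 transparent pixels and 264906 opaque ones), so the visible shape appears to live entirely in the alpha channel. A sketch of one way to check this, using the built-in AlphaChannel to pull the opacity channel out as a grayscale image:

    alpha = AlphaChannel[image];   (* grayscale image of the opacity values *)
    ImageLevels[alpha, 10]         (* expect the bimodal distribution seen in the fourth sublist *)
    Binarize[alpha]                (* threshold the shape directly, bypassing the black RGB data *)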
This Binarize attempt produces an all-black image, with the levels showing only zero values:

    b = Binarize[image];
    ImageLevels[b]

    {{0, 640000}, {1, 0}}

I tried various thresholds, including the default and some explicit values (e.g. 0.5, 0.9, 0.1, ...), but still get all-zero levels. Why?


ColorNegate@Binarize[ColorNegate[image]] seems to do the job.
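This trick likely works because ColorNegate inverts the all-black color channels to white while the opaque region of the alpha channel still marks the shape, so the subsequent Binarize sees a nonzero intensity there; the outer ColorNegate then restores the original black-on-white polarity. An alternative sketch (assuming, as the levels above suggest, that the shape is carried by the alpha channel) is to composite the image onto a white background first with RemoveAlphaChannel and then threshold:

    flat = RemoveAlphaChannel[image, White];  (* black shape composited onto a white background *)
    Binarize[flat]                            (* now thresholds on real intensity differences *)

This makes the intent explicit: the alpha channel is flattened away before any thresholding, so Binarize operates on ordinary grayscale intensities.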