What I'm trying to do is convert a string's bytes into hexadecimal format.
Based on this answer (and many others consistent with it) I've tried the following code:
```cpp
#include <sstream>
#include <iomanip>
#include <iostream>

int main ()
{
    std::string inputText = u8"A7°";
    std::stringstream ss;

    // print every char of the string as hex on 2 values
    for (unsigned int i = 0; i < inputText.size (); ++i)
    {
        ss << std::hex << std::setfill ('0') << std::setw (2) << (int) inputText[i];
    }

    std::cout << ss.str() << std::endl;
}
```

but with some characters coded in UTF-8 it doesn't work.
For instance, for strings containing the degree symbol ( ° ) encoded in UTF-8, the result is ffffffc2ffffffb0 instead of c2b0.
Now I would expect the algorithm to work on the individual bytes regardless of their contents, and furthermore the result seems to ignore the setw(2) parameter.
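Here is a minimal snippet isolating what I observe, using just the two UTF-8 bytes of the degree symbol (I'm assuming plain char is signed on my platform, which seems to be the case):

```cpp
#include <iostream>
#include <iomanip>
#include <string>

int main ()
{
    std::string degree = "\xC2\xB0";   // the two UTF-8 bytes of '°'

    // I suppose these bytes end up as negative char values here,
    // and the cast to int carries that over before printing.
    std::cout << std::hex << std::setfill ('0') << std::setw (2)
              << (int) degree[0] << std::endl;   // prints ffffffc2 here, not c2
}
```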
Why do I get such a result?
(run test program here)
fs.