My problem is shown in the following minimal example:
#include <iostream>
#include <string>
#include <iomanip>

int main()
{
    int width = 15;
    std::cout << std::left;
    std::cout << std::setw(width) << "Prints well" << std::setw(width) << "This too" << '\n';
    std::cout << std::setw(width) << "\u221E" << std::setw(width) << "This not?" << '\n';
    std::cout << std::setw(width + 2) << "\u221E" << std::setw(width) << "This is good" << '\n';
}

Compiled with g++, it prints:
Prints well    This too
∞            This not?
∞              This is good

So it seems the Unicode symbol consumes three of the setw columns instead of one. Is there a simple way to fix this without knowing beforehand whether a string will contain a Unicode character?
operator<< uses the equivalent of strlen(string) to get a byte count, then pads with spaces until it reaches setw(width) bytes. It has no idea how the console will render those bytes: ∞ (U+221E) is three bytes in UTF-8 but occupies only one column on screen, so the padding comes up two columns short.