
At the end of the article here: http://www.learncpp.com/cpp-tutorial/45-enumerated-types/, it mentions the following:

Finally, as with constant variables, enumerated types show up in the debugger, making them more useful than #defined values in this regard.

How is the behavior described in that sentence (shown in bold in the original article) achieved?

Thanks.

4 Comments
  • I edited the title and added one tag, hope you wouldn't mind. :-) Commented Jan 22, 2011 at 12:46
  • @Nawaz. That's totally fine, don't worry about it. Commented Jan 22, 2011 at 17:25
  • possible duplicate of "static const" vs "#define" in c Commented Sep 10, 2011 at 14:32
  • @MatthieuM. Certainly not. The rules of C++ are different enough that it deserves a different answer. Commented Aug 9, 2019 at 23:58

6 Answers


Consider this code,

    #define WIDTH 300
    enum econst { eWidth = 300 };
    const int Width = 300;

    struct sample {};

    int main()
    {
        sample s;
        int x = eWidth * s; // error 1
        int y = WIDTH * s;  // error 2
        int z = Width * s;  // error 3
        return 0;
    }

Obviously each multiplication results in compilation-error, but see how the GCC generates the messages for each multiplication error:

prog.cpp:19: error: no match for ‘operator*’ in ‘eWidth * s’
prog.cpp:20: error: no match for ‘operator*’ in ‘300 * s’
prog.cpp:21: error: no match for ‘operator*’ in ‘Width * s’

In the error messages, you don't see the macro WIDTH which you #defined, right? That is because by the time GCC attempts to compile the line corresponding to the second error, it doesn't see WIDTH; all it sees is 300, since the preprocessor had already replaced WIDTH with 300 before GCC compiled the line. On the other hand, nothing of the sort happens with enum eWidth and const Width.

See the error yourself here : http://www.ideone.com/naZ3P


Also, read Item 2 : Prefer consts, enums, and inlines to #defines from Effective C++ by Scott Meyers.


13 Comments

Admittedly, Clang would probably produce a better error message by showing the line that caused the error and thus include the macro (along with its expansion). Though I agree that macros should not be used for constants, I think a better treatment of the question would not use gcc's error messages as the corner stone of the reasoning.
@MatthieuM.: Clang might produce a better error message, but those who use GCC to compile their code are not going to recompile their code with Clang so as to see a better error message. {contd}
{contd}...So such reasoning is very much compiler-dependent, and therefore I think taking Clang's error message as the cornerstone of the reasoning wouldn't be any better either, because someone might come along and say exactly what you said, replacing "Clang" with "GCC, MSVC" and "better" with "not-so-better", ending his comment with "I think a better treatment of the question would not use Clang's error messages as the cornerstone of the reasoning". Hope you got my point. :-)
I understand what you are trying to get at, but I disagree. I think that using compiler error messages as an argument for enums rather than macros is weak, in the sense that it depends on how the compiler builds those error messages, which changes from compiler to compiler and version to version. But then the question itself is rather weird, since it is based on the present behavior of debuggers... so I don't care much.
@MatthieuM.: I don't see any better argument for preferring const over #define, if this argument is weak. How good (and practical) an argument is can only be judged in the light of all the other arguments. Compiler behavior, along with the language, determines what one should use. Otherwise, what else decides?

enum is a compile-time constant that carries debug info, with no storage allocation.

const is allocated storage, unless the compiler optimizes it away through constant propagation.

#define involves no storage allocation.

2 Comments

enum would take storage space whenever a variable of that enum type is declared in your program, wouldn't it? I believe your remark about enum covers the case when a variable of that enum type is never declared in the program. But why would someone declare an enum and not use it?
I'd +1 if you could clarify the fact that "no storage allocation" does not mean "doesn't use memory". All of these require the data to be stored somewhere, but the difference between "storage allocation" and "no storage allocation" is where.

#define values are replaced by the preprocessor with the value they are declared as, so the debugger only sees the value, not the #defined name. For example, if you have #define NUMBER_OF_CATS 10, in the debugger you'll see only 10 (since the preprocessor has replaced every instance of NUMBER_OF_CATS in your code with 10).

An enumerated type is a type in itself and the values are constant instances of this type, and so the pre-processor leaves it alone and you'll see the symbolic description of the value in the debugger.



The compiler stores enum information in the binary when the program is compiled with certain options.

When a variable is of an enum type, a debugger can show the enum name. This is best shown with an example:

    enum E {
        ONE_E = 1,
    };

    int main(void)
    {
        enum E e = 1;
        return 0;
    }

If you compile that with gcc -g you can try out the following in gdb:

    Reading symbols from test...done.
    (gdb) b main
    Breakpoint 1 at 0x804839a: file test.c, line 8.
    (gdb) run
    Starting program: test

    Breakpoint 1, main () at test.c:7
    7           enum E e = 1;
    (gdb) next
    9           return 0;
    (gdb) print e
    $1 = ONE_E
    (gdb)

If you used a define, you would not have a proper type to give e, and would have to use an integer. In that case, the debugger would print 1 instead of ONE_E.

The -g flag tells GCC to include debugging information in the binary. You can even see that it is there by issuing:

    xxd test | grep ONE_E

I don't think that this will work in all architectures, though.

2 Comments

Thanks for your reply. What do you mean by: "The compiler stores enum information in the binary"? Thanks.
Literally, the binary has that information in it, in a format that gdb understands. From the gcc manual: -g Produce debugging information in the operating system's native format (...). GDB can work with this debugging information.

At least for Visual Studio 2008 which I currently have at hand, this sentence is correct. If you have

    #define X 3
    enum MyEnum {
        MyX = 3
    };

    int main(int argc, char* argv[])
    {
        int i = X;
        int j = (int)MyX;
        return 0;
    }

and you set a breakpoint in main, you can hover your mouse over "MyX" and see that it evaluates to 3. You do not see anything useful if you hover over X.

But this is not a language property; rather, it is IDE behavior. The next version might do it differently, as may other IDEs. So just check your own IDE to see whether this sentence applies in your case.

2 Comments

It is a language property, in that X is not even seen by the compiler (only the preprocessed value 3), therefore the debugger has absolutely no way of using X as a symbolic name, thus it will show you only the plain value 3. This is true of all compilers and IDEs. OTOH showing symbols like MyX in the debugger is indeed an IDE-specific feature, but it is so common I doubt there is any modern IDE which does not offer this.
@PéterTörök There's nothing to stop someone creating a compiler that also does preprocessing or creating a way of placing X in the symbol table.

I am answering late, but I feel I can add something: enum vs. const vs. #define.

enum -

  1. Does not require assigning values (if you just want sequential values 0, 1, 2, ...), whereas with #defines you need to manage the values manually, which can sometimes cause human error
  2. It works just like a variable, so during online debugging the value of an enum can be watched in the watch window
  3. You can have a variable of the enum type, to which you can assign an enum value

    typedef enum {
        DEFAULT,
        CASE_TRUE,
        CASE_OTHER,
    } numbers;

    int main(void)
    {
        numbers number = CASE_TRUE;
        return 0;
    }

const -

  1. It is a constant stored in a read-only area of memory, but it can be accessed through its address, which is not possible with a #define
  2. You get type checking if you use const rather than #define
  3. #defines are preprocessing directives, whereas const is handled at compile time. For example:

    const char *name = "vikas";

You can take name's address and use it to index into the string, e.g. name[3] to read 'a'.

#defines - are dumb preprocessor directives that do textual replacement

