15

I would like to convert a char to its ASCII int value.

I could fill an array with all possible values and compare to that, but it doesn't seem right to me. I would like something like

char mychar = "k" public int ASCItranslate(char c) return c ASCItranslate(k) // >> Should return 107 as that is the ASCII value of 'k'. 

The point is that atoi() won't work here, as it only handles readable numbers (strings of digits).

It won't do anything with spaces (ASCII 32).

2 Comments
  • Did you try anything? Like, for example, return c;? Commented Apr 14, 2013 at 12:57
  • char mychar = "k": Oh heavens, have you tried this? "k" is a null-terminated string, not a char. Also, C++ statements have to end with a semicolon. Surely you know this? Commented Apr 14, 2013 at 13:04

7 Answers

24

Just do this:

int(k) 

You're just converting the char to an int directly here, no need for a function call.
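For instance, a minimal sketch (assuming a char variable k, as in the question):

#include <iostream>

int main() {
    char k = 'k';
    int value = int(k);          // functional-style cast: the char becomes its numeric code
    std::cout << value << '\n';  // prints 107 on an ASCII system
    return 0;
}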


8

A char is already a number. It doesn't require any conversion, since ASCII is just a mapping from numbers to character representations.

You could use it directly as a number if you wish, or cast it.
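A minimal sketch of both options (the variable name is just illustrative):

#include <iostream>

int main() {
    char c = 'k';
    std::cout << c + 0 << '\n';   // arithmetic promotes the char to int, so this prints 107
    std::cout << (int)c << '\n';  // or cast it, with the same result
    return 0;
}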


5

In C++, you could also use static_cast<int>(k) to make the conversion explicit.
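For example, a short sketch (again assuming a char variable k):

#include <iostream>

int main() {
    char k = 'k';
    int value = static_cast<int>(k);  // conversion made explicit with static_cast
    std::cout << value << '\n';       // 107 on an ASCII system
    return 0;
}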

4 Comments

Casting is always explicit. Using a cast makes a conversion explicit. Of course, in almost all cases the compiler will do the conversion anyway, and the cast is just noise.
I meant "explicit to the programmer", not to the compiler.
Nevertheless, it's the conversion that is made explicit. A cast is something you write in your source code to tell the compiler to do a conversion; casts are always explicit.
OK, got it :) I have fixed my sentence.
1

Do this:

char mychar = 'k';
//and then
int k = (int)mychar;

3 Comments

Won't work on machines using EBCDIC. If portability is desired, something more complicated is required.
EBCDIC isn't ASCII; EBCDIC is older and was only used by IBM in their earlier systems. I remember even the MSX had ASCII, so this must be really old. I use it on an Arduino, which understands it.
Why is this upvoted/accepted? It's not even valid code.
1

To convert from an ASCII character to its ASCII value:

char c = 'A';
cout << int(c);

To convert from an ASCII value to its ASCII character:

int a = 67;
cout << char(a);
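A self-contained sketch combining both directions (adding the main() and <iostream> setup the snippets above assume):

#include <iostream>
using namespace std;

int main() {
    char c = 'A';
    cout << int(c) << '\n';   // 65: character to its code

    int a = 67;
    cout << char(a) << '\n';  // C: code back to character
    return 0;
}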


0

#include <iostream>

char mychar = 'k';

int ASCIItranslate(char ch) {
    return ch;
}

int main() {
    std::cout << ASCIItranslate(mychar);
    return 0;
}

That's your original code with the various syntax errors fixed. Assuming you're using a compiler that uses ASCII (which is pretty much every one these days), it works. Why do you think it's wrong?


0

Just typecast it to an integer type; that will automatically convert it to its corresponding ASCII value.

#include <cstdio>

int main() {
    char c = 'a';
    printf("%d", int(c));  // prints the character's code, e.g. 97 for 'a' in ASCII
    return 0;
}
