15

Really simple question: I need to read a Unicode text file in a Java program.

I am used to reading plain ASCII text with a BufferedReader/FileReader combo, which is obviously not working here :(

I know that I can read a String in the 'traditional' way using a BufferedReader and then convert it using something like:

temp = new String(temp.getBytes(), "UTF-16"); 

But is there a way to wrap the Reader in a 'Converter'?

EDIT: the file starts with FF FE

7 Answers

18

You wouldn't wrap the Reader; instead, you would wrap the stream using an InputStreamReader. You could then wrap that with the BufferedReader you currently use:

BufferedReader in = new BufferedReader(new InputStreamReader(stream, encoding)); 
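For the asker's file, which starts with FF FE (a UTF-16 little-endian byte order mark), a minimal sketch could look like this; the file name unicode.txt is only an example:

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;

public class ReadUnicodeFile {
    public static void main(String[] args) throws IOException {
        // "UTF-16" honours the byte order mark, so FF FE is read as little-endian
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(new FileInputStream("unicode.txt"), "UTF-16"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}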

3 Comments

I want to read Hebrew letters; what should I replace "encoding" with?
to answer my own question, it's "UTF-8"
'The constructor BufferedReader(InputStreamReader) is undefined'?
10

Check https://docs.oracle.com/javase/1.5.0/docs/api/java/io/InputStreamReader.html.

I would read the source file with something like:

Reader in = new InputStreamReader(new FileInputStream("file"), "UTF-8"); 


7

Some notes:

  • the "UTF-16" encoding can read either little- or big-endian encoded files marked with a BOM; see here for a list of Java 6 encodings; it is not explicitly stated which endianness will be used when writing with "UTF-16" - it appears to be big-endian - so you might want to use "UnicodeLittle" when saving the data (see the sketch after these notes)
  • be careful when using String class encode/decode methods, especially with a marked variable-width encoding like UTF-16 - use them only on whole data
  • as others have said, it is often best to read character data by wrapping your InputStream with an InputStreamReader; you can concatenate your input into a single String using a StringBuilder or similar buffer.
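A rough sketch of the first note, writing a file with a little-endian BOM via the "UnicodeLittle" alias; the output file name and the sample Hebrew string are made up, and whether the alias is available depends on the JRE:

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;

public class WriteLittleEndian {
    public static void main(String[] args) throws IOException {
        // "UnicodeLittle" writes a FF FE byte order mark followed by UTF-16LE data,
        // matching the format described in the question's edit
        try (Writer out = new OutputStreamWriter(new FileOutputStream("out.txt"), "UnicodeLittle")) {
            out.write("shalom \u05E9\u05DC\u05D5\u05DD");
        }
    }
}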

1 Comment

Thanks for the link to the encoding types. I found the right one for me.
2

I would recommend using UnicodeReader from the Google Data API; see this answer to a similar question. It will automatically detect the encoding from the byte order mark (BOM).

You may also consider BOMInputStream in Apache Commons IO, which does basically the same but does not cover all the alternative BOM variants.
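A rough sketch using the classic BOMInputStream constructor, assuming Commons IO is on the classpath; the file name and the UTF-8 fallback are only examples:

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import org.apache.commons.io.ByteOrderMark;
import org.apache.commons.io.input.BOMInputStream;

public class ReadWithBomDetection {
    public static void main(String[] args) throws IOException {
        try (BOMInputStream bomIn = new BOMInputStream(new FileInputStream("unicode.txt"),
                ByteOrderMark.UTF_8, ByteOrderMark.UTF_16LE, ByteOrderMark.UTF_16BE)) {
            // fall back to UTF-8 if no BOM was found; the BOM itself is stripped from the stream by default
            String charset = bomIn.hasBOM() ? bomIn.getBOMCharsetName() : "UTF-8";
            try (BufferedReader in = new BufferedReader(new InputStreamReader(bomIn, charset))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}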


0

I just had to add "UTF-8" to the creation of the InputStreamReader and the special characters could be seen immediately.

InputStreamReader istreamReader = new InputStreamReader(inputStream, "UTF-8");
BufferedReader bufferedReader = new BufferedReader(istreamReader);


-1
Scanner scan = new Scanner(new File("C:\\Users\\daniel\\Desktop\\Corpus.txt"));
while (scan.hasNext()) {
    System.out.println(scan.nextLine());
}
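For what it's worth, Scanner can also be given a charset name; a sketch of that variant, assuming the question's UTF-16 file (the path is just the one from the code above):

import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class ScanUnicode {
    public static void main(String[] args) throws FileNotFoundException {
        // Scanner(File, String charsetName) decodes the file with the given encoding
        try (Scanner scan = new Scanner(new File("C:\\Users\\daniel\\Desktop\\Corpus.txt"), "UTF-16")) {
            while (scan.hasNextLine()) {
                System.out.println(scan.nextLine());
            }
        }
    }
}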

1 Comment

Is the Scanner class specific to Unicode? Just reading the code (and not being aware of such things), it is difficult to ascertain if this actually answers the question. For issues where the OP may need some conceptual understanding as well as code, it is useful to include a short text description of why the code works in your answer. Such a description would be beneficial here. Also, I have edited your post to put the code in "Code Markup". Please do the same in the future as it makes it much easier to read. Welcome to Stack Overflow!
-1
String s = new String(Files.readAllBytes(Paths.get("file.txt")), "UTF-8"); 
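A variant of the same idea, assuming Java 7+: using the StandardCharsets constant avoids the checked UnsupportedEncodingException of the String(byte[], String) form (file.txt is just the example name from above):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadAllAtOnce {
    public static void main(String[] args) throws IOException {
        // read every byte, then decode the whole array as UTF-8 in one step
        String s = new String(Files.readAllBytes(Paths.get("file.txt")), StandardCharsets.UTF_8);
        System.out.println(s);
    }
}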

