0

I have a file "a.txt" which contains the following lines:

    14,15,16,17
    13,16,15,14
    15,17,12,13
    ...

I know that each line will always have 4 columns.

I have to read this file, split each line on the delimiter (here it is ","), and write the value of each column to its corresponding file, i.e. if a column's value is 14 then it has to be dumped/written to 14.txt, if it is 15 then it will be written to 15.txt, and so on.

Here is what I have done till now:

    Map<Integer, String> filesMap = new HashMap<Integer, String>();
    for (int i = 0; i < 4; i++) {
        filesMap.put(i, i + ".txt");
    }
    File f = new File("a.txt");
    BufferedReader reader = new BufferedReader(new FileReader(f));
    String line = null;
    String[] cols = {};
    while ((line = reader.readLine()) != null) {
        cols = line.split(",");
        for (int i = 0; i < 4; i++) {
            File f1 = new File(filesMap.get(cols[i]));
            PrintWriter pw = new PrintWriter(new BufferedWriter(new FileWriter(f1)));
            pw.println(cols[i]);
            pw.close();
        }
    }

So for line number 1 of file "a.txt", I have to open, write to, and close the files 14.txt, 15.txt, 16.txt and 17.txt.

Again for line number 2, I have to open, write to, and close 14.txt, 15.txt and 16.txt, plus a new file 13.txt.

So is there a better option in which I don't have to reopen and re-close a file that has already been opened earlier?

At the end of the complete operation I will close all the opened files.

6
  • 2
    Look into appending to files, or just do one massive open / close at the end. Commented Feb 5, 2013 at 6:14
  • 1
    The input file "a.txt" will have 112500 lines. So is it a wise option to store the complete data in memory? Commented Feb 5, 2013 at 6:17
  • "i.e. if value in a column is 14 then it has to be dumped/written in 14.txt..." So at the end of this, file 14.txt will contain a bunch of lines with the value 14, etc.? That makes no sense, and more importantly conflicts with your code, which seems to write all the first-column values into 1.txt, second-column values into 2.txt, etc. Which is it? Commented Feb 5, 2013 at 6:42
  • @pst (1) Why not? Because one day, maybe today, it won't fit into memory. (2) Why? If you can process the data a line at a time, why wouldn't you do that? Commented Feb 5, 2013 at 6:55
  • Opening and closing files is very expensive. I would avoid it at all costs. If you have to, you can cache all the files you have opened but a better solution would be to redesign your index so you are not using multiple files at all. Commented Feb 5, 2013 at 8:40
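The append-mode suggestion in the first comment can be sketched as follows (the class name AppendDemo is illustrative; the file name follows the question). Opening a `FileWriter` with the second constructor argument set to `true` appends rather than truncates, so repeatedly opening and closing the same file does not lose earlier lines:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class AppendDemo {
    public static void main(String[] args) throws IOException {
        // FileWriter(name, true) opens in append mode: each run adds
        // a new line to 14.txt instead of overwriting its contents.
        try (PrintWriter pw = new PrintWriter(new FileWriter("14.txt", true))) {
            pw.println("14");
        }
    }
}
```

This still pays the cost of an open/close per write; it only avoids data loss, not the overhead the commenters warn about.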

3 Answers

2

Something like this should work:

    Map<String, PrintWriter> filesMap = new HashMap<>();
    ...
    if (!filesMap.containsKey(cols[i])) {
        // add a new PrintWriter for this value
    } else {
        // use the existing one
    }
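Filled out, that caching idea might look like the following sketch (the class name SplitColumns is illustrative; the input and output file names follow the question, and error handling is kept minimal). Each distinct value gets exactly one PrintWriter, opened on first sight and closed once at the end:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.HashMap;
import java.util.Map;

public class SplitColumns {
    public static void main(String[] args) throws IOException {
        // One writer per distinct column value, so each output file
        // is opened exactly once for the whole run.
        Map<String, PrintWriter> writers = new HashMap<>();
        try (BufferedReader reader = new BufferedReader(new FileReader("a.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                for (String col : line.split(",")) {
                    PrintWriter pw = writers.get(col);
                    if (pw == null) {
                        // First time we see this value: open its file once.
                        pw = new PrintWriter(new BufferedWriter(new FileWriter(col + ".txt")));
                        writers.put(col, pw);
                    }
                    pw.println(col);
                }
            }
        } finally {
            // Close every writer exactly once at the end of the run.
            for (PrintWriter pw : writers.values()) {
                pw.close();
            }
        }
    }
}
```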

1 Comment

+1 Yes, this is what I wanted to answer. It will be efficient since you won't be creating files/writers every time; at the end you can just loop through the map and close them all. Commented Feb 5, 2013
0

try

    Set<String> s = new HashSet<>();
    Scanner sc = new Scanner(new File("a.txt")).useDelimiter("[\n\r,]+");
    while (sc.hasNext()) {
        String n = sc.next();
        if (s.add(n)) {
            FileWriter w = new FileWriter(n + ".txt");
            w.write(n);
            w.close();
        }
    }
    sc.close();


0
    public static void main(String[] args) throws Exception {
        FileReader fr = new FileReader("a.txt");
        BufferedReader reader = new BufferedReader(fr);
        String line = "";
        while ((line = reader.readLine()) != null) {
            String[] cols = line.split(",");
            for (int i = 0; i < 4; i++) {
                // true opens the file in append mode, so earlier data is kept.
                FileWriter fstream = new FileWriter(cols[i] + ".txt", true);
                BufferedWriter fbw = new BufferedWriter(fstream);
                fbw.write(cols[i] + "\n");
                fbw.close();
            }
        }
    }

Try this. I think this is the way you want to do it.

