
I am using the following R script to write a data.table into a CSV file in my working directory, for opening in Excel. However, the file is several GB in size, as the table has 50 million+ rows. Upon opening the file, I just see a blank grey screen and nothing else.

How can I see the contents in the file?

The first lines are just for illustration purposes.

    library(data.table)
    Final1 <- rbindlist(rep(list(iris), 1000000))  # stack copies of the rows
    fwrite(Final1, "data2.csv")
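One way to inspect such a large CSV without opening it in Excel is to read only a slice of it back into R. A minimal sketch using data.table's fread (the small sample file written here stands in for the real multi-GB data2.csv):

```r
library(data.table)

# For illustration, write a smaller copy first (the real file is several GB):
fwrite(rbindlist(rep(list(iris), 100)), "data2.csv")

# Read only the first 100 rows -- enough to sanity-check the contents:
preview <- fread("data2.csv", nrows = 100)
print(head(preview))

# Read a single column to count rows without loading the whole file:
n_rows <- nrow(fread("data2.csv", select = 1L))
```

Excel itself cannot help here: a worksheet holds at most 1,048,576 rows, so a 50M-row file can never be displayed in full.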


  • If you are dealing with datasets on the order of gigabytes, may I suggest that you consider using a relational database instead of Excel? Commented Jan 14, 2018 at 6:54
  • @TimBiegeleisen, thanks a lot for replying. If I understood you right, do you suggest importing the Excel file into an RDBMS like SAP HANA? Commented Jan 14, 2018 at 7:17
  • What is the intended use of this file? Excel can only hold just over a million rows (in a single sheet) Commented Jan 14, 2018 at 7:19
  • @TimWilliams, thanks for replying. The file is basically a report holding a data frame computed in R, which will run into millions of rows. Kindly suggest a good approach, as I need to have all the data in one place. Commented Jan 14, 2018 at 7:24
  • In one place to do what? Commented Jan 14, 2018 at 7:41

1 Answer


You mention that this is part of a report. I would be willing to bet good money that whoever reads this report will not check all, or even most, of the values by hand. In that case, you don't need an easily browsable format such as xlsx or even csv. If this is indeed the case, you might want to try a (relational) database. If you do not have anything centralized, give SQLite a try: everything is saved into one file, which acts as the database. There are R packages that handle this interaction, such as sqldf or RSQLite.
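A minimal sketch of the RSQLite route, assuming the DBI, RSQLite, and data.table packages are installed; the file name report.sqlite and table name report are illustrative, and a small stand-in table replaces the real 50M-row data:

```r
library(DBI)
library(RSQLite)
library(data.table)

# Small stand-in for the real 50M-row table:
Final1 <- rbindlist(rep(list(iris), 100))

# One file on disk acts as the whole database:
con <- dbConnect(RSQLite::SQLite(), "report.sqlite")
dbWriteTable(con, "report", Final1, overwrite = TRUE)

# Query only the slice you need instead of opening everything:
first_rows <- dbGetQuery(con, "SELECT * FROM report LIMIT 10")
total      <- dbGetQuery(con, "SELECT COUNT(*) AS n FROM report")$n

dbDisconnect(con)
```

This keeps all the data "in one place" while letting the report's consumers pull out just the rows or aggregates they actually need via SQL.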
