Timeline for Import large csv files

Current License: CC BY-SA 3.0

16 events
when | what | by | license | comment
Oct 12, 2016 at 4:27 history tweeted twitter.com/StackProgrammer/status/786060734279086080
Oct 11, 2016 at 17:38 vote accept Rhodes73
Oct 7, 2016 at 11:16 answer added Phill W. timeline score: 0
Oct 6, 2016 at 18:26 answer added YSharp timeline score: 4
Oct 6, 2016 at 15:36 comment added paparazzo Excel for 2 GB of data because you don't have a database? 2 GB is the limit on 32-bit. There are many free databases that can handle this.
Oct 6, 2016 at 15:10 answer added radarbob timeline score: 7
Oct 6, 2016 at 13:42 answer added Jon Raynor timeline score: 4
Oct 6, 2016 at 10:59 review Close votes (completed Oct 12, 2016 at 3:03)
Oct 6, 2016 at 10:23 answer added Arseni Mourzenko timeline score: 3
Oct 6, 2016 at 8:49 history edited CodesInChaos CC BY-SA 3.0 (added 8 characters in body)
Oct 6, 2016 at 8:42 comment added CodesInChaos (When you write "gb", you mean "GB", right?)
Oct 6, 2016 at 8:41 comment added CodesInChaos If you have a decent computer, loading everything into RAM should still work with these file sizes.
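The load-everything-into-RAM approach CodesInChaos describes can be sketched in Python; the column names and sample data below are illustrative assumptions, not taken from the question:

```python
import csv
import io

# Hypothetical sample standing in for the asker's 1-2 GB CSV files.
sample = io.StringIO("id,name,amount\n1,alice,10\n2,bob,20\n")

# Read every row into memory once; files of this size fit on a machine
# with enough RAM, and all later lookups are pure in-memory work.
rows = list(csv.DictReader(sample))

# Build an index keyed on the column you search by, so each lookup
# is a dict hit instead of a rescan of the whole file.
by_id = {row["id"]: row for row in rows}
print(by_id["2"]["name"])  # bob
```

Note that `csv` leaves every field as a string, so numeric columns would still need explicit conversion before any arithmetic.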
Oct 6, 2016 at 8:17 comment added thorsten müller Seriously, why do you accept such limitations? No database for two one-gig files with these specs? Tell them to do the job themselves. In any case, this sounds like reinventing the wheel; use a data analytics tool like Knime, RapidMiner, or even IBM's Watson Analytics.
Oct 6, 2016 at 8:08 comment added Ewan You 'can't use a database', but your approach is 'program my own database engine'. What's the real restriction? You can't install a database? Just use an in-memory one like SQLite.
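The in-memory SQLite route Ewan suggests needs nothing installed beyond the standard library. A minimal sketch in Python, where the table layout and sample rows are assumptions for illustration:

```python
import csv
import io
import sqlite3

# Hypothetical sample standing in for the asker's large CSV files.
sample = io.StringIO("id,name,amount\n1,alice,10\n2,bob,20\n3,alice,5\n")

conn = sqlite3.connect(":memory:")  # in-memory database, nothing to install
conn.execute("CREATE TABLE rows (id INTEGER, name TEXT, amount REAL)")

# executemany streams the CSV reader row by row into the table;
# DictReader yields dicts that match the named placeholders.
conn.executemany(
    "INSERT INTO rows VALUES (:id, :name, :amount)",
    csv.DictReader(sample),
)

# Ordinary SQL then replaces a hand-rolled lookup/aggregation engine.
total_by_name = conn.execute(
    "SELECT name, SUM(amount) FROM rows GROUP BY name ORDER BY name"
).fetchall()
print(total_by_name)  # [('alice', 15.0), ('bob', 20.0)]
```

SQLite's type affinity converts the string fields from `csv` into numbers for the `INTEGER` and `REAL` columns, which is why `SUM` works here without explicit casts.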
Oct 6, 2016 at 7:50 review First posts (completed Oct 25, 2016 at 18:19)
Oct 6, 2016 at 7:49 history asked Rhodes73 CC BY-SA 3.0