
I know that I can import a .csv file into a pre-existing table in a SQLite database with:

.import filename.csv tablename 

However, is there a method/library that can automatically create the table (and its schema), so that I don't have to manually define: column1 = string, column2 = int, etc.?

Or, maybe we can import everything as strings. To my limited understanding, sqlite3 seems to treat all fields as strings anyway?
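(Editor's note: SQLite actually uses dynamic typing with "type affinity" rather than treating everything as a string. A minimal sketch using Python's built-in sqlite3 module, assuming an in-memory database:)

```python
import sqlite3

# SQLite uses dynamic typing: a declared column type only sets an
# "affinity" that stored values are coerced toward, and a column with
# no declared type keeps whatever type you actually store in it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b)")   # b has no declared type
conn.execute("INSERT INTO t VALUES (1, 1)")
print(conn.execute("SELECT typeof(a), typeof(b) FROM t").fetchone())
# -> ('text', 'integer'): the TEXT column coerced 1 to '1',
#    while the untyped column kept it as an integer
```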

Edit: The names of the columns are not so important here (assume we can get them from the first row of the CSV file, or they could be arbitrary names). The key is to identify the value type of each column.

2 Comments

  • How is it supposed to know the names of the fields? Are these in the .csv? Commented Nov 22, 2009 at 11:14
  • From the first row (like most CSV files). But the names are rather trivial in this case. The key problem is to identify the field type. Commented Nov 22, 2009 at 18:53

3 Answers


This seems to work just fine for me (in sqlite3 version 3.8.4):

$ echo '.mode csv
.import data_with_header.csv some_table' | sqlite3 db

It creates the table some_table with field names taken from the first row of the data_with_header.csv file. All fields are of type TEXT.


2 Comments

If "The key is to identify the value types of each column" then it doesn't help to import the *.csv file, because every column becomes the same TEXT type. I have never seen a *.csv including the correct type for every column.
@PeterCo my answer was submitted before the edit to the original question

You said yourself in the comment that it's a non-trivial problem to determine the types of columns. (Imagine a million rows that all look like numbers, but one of those rows has a Z in it. Now that column has to be typed "string".)

Though non-trivial, it's also pretty easy to get the 90% scenario working. I would just write a little Python script to do this. Python has a very nice library for parsing CSV files and its interface to sqlite is simple enough.

Just load the CSV, guess the column types by checking each value, build a CREATE TABLE statement that encapsulates that information, then emit your INSERT INTOs. I can't imagine this taking up more than 20 lines of Python.
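(Editor's note: a sketch of the script described above, using only the standard-library csv and sqlite3 modules. The function names and the INTEGER/REAL/TEXT fallback order are illustrative choices, not from the original answer:)

```python
import csv
import sqlite3

def guess_type(values):
    """Guess the narrowest SQLite type that fits every value in a column.
    Falls back from INTEGER to REAL to TEXT as soon as a value fails to parse."""
    kind = "INTEGER"
    for v in values:
        if v == "":
            continue  # treat empty cells as NULL; don't let them force TEXT
        try:
            int(v)
        except ValueError:
            try:
                float(v)
                kind = "REAL"
            except ValueError:
                return "TEXT"
    return kind

def csv_to_sqlite(csv_path, db_path, table):
    """Create `table` in `db_path` with guessed column types, then load the rows."""
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    types = [guess_type(col) for col in zip(*data)]  # transpose to columns
    cols = ", ".join(f'"{name}" {t}' for name, t in zip(header, types))
    conn = sqlite3.connect(db_path)
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', data)
    conn.commit()
    conn.close()
```

For example, a CSV with columns `1,2.5,foo` would come out as INTEGER, REAL, TEXT. SQLite will still accept mistyped values later (types are affinities, not constraints), but the guessed schema covers the 90% case.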



This is a little off-topic, but it might help to use a tool that gives you all the SQL functionality on an individual CSV file without actually using SQLite directly.

Take a look at TextQL, a utility that allows querying of CSV files directly using the SQLite engine in memory: https://github.com/dinedal/textql

textql -header -sql "select * from tbl" -source some_file.csv

