CSV to Database Import Script using Google Sheets or Excel

Dulaj Rajitha
3 min read · Mar 25, 2021
Programming or formula?

Populating a database often requires converting data into JSON or SQL format. But working with JSON is rarely the fastest option for a non-technical person, so most of the time the data arrives as a CSV/Excel file and has to be converted into a JSON or SQL script before it can be migrated into a DB such as Elasticsearch or a SQL database.

This article shows how to use a formula directly in the sheet to generate a DB insert/update query, without any programming.

The way I used to do this was to write a script that reads the CSV file and maps it to a DB population script based on the DB type. But if the data keeps changing in the document, it’s an extra step to export the CSV file and run the program/script again to regenerate the DB import script.

But there is a simpler way to do this without any programming: a string template formula in Excel (or Google Sheets) makes the process much simpler.

All we have to do is add a column for the DB import query in the same sheet and apply the formula to every row. Then, whenever the data is updated, the formula regenerates the DB insertion script for each row without having to run any additional program/script.

{
  "id": "1234",
  "name": "The Name",
  "quantity": 1,
  "metaData": {
    "id": "1234",
    "description": "",
    "values": ["A", "B", "C"]
  }
}

Even if your application’s DB has a complex document schema like the one above, with nested objects and arrays for every row of the dataset, we can still use a template to generate the data import script.

This also applies if it’s a SQL DB and each row needs to insert/update multiple tables.
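For example, a formula like the following could emit one SQL statement per row. The table and column names here are hypothetical — adjust them to your own schema — and it assumes A2, B2 and C2 hold the id, name and quantity values:

="INSERT INTO products (id, name, quantity) VALUES ('"&A2&"', '"&B2&"', "&C2&");"

One caveat: if a text value can contain a single quote, wrap it with something like SUBSTITUTE(B2,"'","''") so the generated SQL stays valid.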

Then we don’t have to re-run a program whenever the data changes, since the formula is already applied to the data. We can simply execute the generated script to insert/update the data in the DB.

Here is a simple example.

Consider the following CSV file format, from which you want to create JSON to insert into Elasticsearch — so we need to generate a bulk import script.

[image: sample data]

Example Formula

Note that the Elasticsearch bulk API expects the action line and the document on separate lines, so CHAR(10) is used to insert a newline between them:

="{""index"":{""_index"":""my-index"",""_id"":"""&A2&"""}}"&CHAR(10)&"{""id"":"""&A2&""",""name"":"""&B2&""",""quantity"":"&C2&",""metaData"":{""id"":"""&D2&""",""description"":"""&E2&""",""values"":["""&F2&""","""&G2&""","""&H2&"""]}}"
generated bulk import script
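Using the sample values from the JSON document above (id 1234, name "The Name", quantity 1, empty description, values A, B and C), the generated bulk payload for one row looks like this — the action line and the document must be on separate lines for the bulk API:

{"index":{"_index":"my-index","_id":"1234"}}
{"id":"1234","name":"The Name","quantity":1,"metaData":{"id":"1234","description":"","values":["A","B","C"]}}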

We can use the same approach to create SQL statements as well if the DB is a SQL one.

The next step is to execute the generated script in Kibana.
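In Kibana’s Dev Tools console, that means pasting the generated lines under a bulk request, roughly like this (the index name my-index comes from the formula above, and the values are the sample ones; the request body must end with a newline):

POST _bulk
{"index":{"_index":"my-index","_id":"1234"}}
{"id":"1234","name":"The Name","quantity":1,"metaData":{"id":"1234","description":"","values":["A","B","C"]}}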

Hope this saves you some time if you come across a similar use case.
