Using Google Sheets as database

28.3.2019 | 8 minutes of reading time

(Feature image by Vincent Botta on Unsplash)


Sometimes you just need a simple database and also the ability to have a really simple way to edit the data in a quick and straightforward way, without compromising on integrity or accountability. This is where Google Sheets offers a totally valid solution!

Google Sheets provides us with an already familiar interface to create, edit, and view all our data in columns. We can search, order, and even do bulk operations. Also, Google provides an API to use these sheets in a programmatic way, which we are going to use for this article.

Create the database

First, you need to create a new spreadsheet. I trust you know how to do that. Then create your first table (aka worksheet) and a couple of columns. In my example, I created a recipes table with columns for a unique identifier (id), the name, a description (desc), and two date fields (createdAt, updatedAt).

Sheet setup

The two date fields should be auto-filled by our “database”. So how are we going to do that? Simple: we attach a script to our spreadsheet. Go to Tools -> Script editor; this opens a new window with an empty script.

function onEdit(e) {
  const columns = e.source
    .getActiveSheet()
    .getRange('1:1')
    .getValues()[0];

  const createdIdx = columns.indexOf('createdAt');
  const updatedIdx = columns.indexOf('updatedAt');

  const isNewRow = !e.source
    .getActiveSheet()
    .getRange(e.range.getRow(), createdIdx + 1)
    .getValue();

  if (createdIdx >= 0 && isNewRow) {
    e.source
      .getActiveSheet()
      .getRange(e.range.getRow(), createdIdx + 1)
      .setValue(new Date().toISOString());
  }

  if (updatedIdx >= 0) {
    e.source
      .getActiveSheet()
      .getRange(e.range.getRow(), updatedIdx + 1)
      .setValue(new Date().toISOString());
  }
}

onEdit is a callback that is invoked every time a cell is edited by a user. You can read about onEdit and the event argument in Google's Apps Script documentation.

In the script above, we first check whether the respective date column exists. For createdAt we also check whether it is already filled; if not, we populate the field with the current date. updatedAt is set to the current timestamp after every edit a user makes to the row.

Now we have our very basic database table set up and ready to be accessed by an application.

Creating the API

We are going to implement the database API in Node.js. Google provides a library to communicate with its APIs called googleapis, which is available as an npm module. Create a new project and install this library. We are also going to use TypeScript because we are not cavemen, and LokiJS as an in-memory database so we don't have to constantly call the spreadsheet API.

$> mkdir sheet-api
$> cd sheet-api
$> yarn init
question name (sheet-api):
$> yarn add googleapis google-auth-library \
    typescript lokijs \
    @types/node @types/lokijs
[1/4] 🔍  Resolving packages...
✨  Done in 13.37s.

Service account

To connect to our database we need to authenticate with the Google API. For this we will create a service account. Navigate to the Google Cloud Platform Console, IAM & admin -> Service accounts. You should already have a GCP project based on the sheet script we created earlier; if not, create a new project. Create a new service account (I named mine sheet-bot). You don't need to assign any roles, but you do need to create a key. Choose JSON as the key type, and your browser will download a credentials file containing all the information needed to authenticate as this service account in the next steps. Save this file to your project folder as sheet-api/credentials.json. Also note down the email address of the service account, because we are going to need it later.
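For orientation, the downloaded credentials file has roughly this shape (values are shortened placeholders here; the field names are the standard service-account keys, and our auth code only relies on client_email and private_key):

```json
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "sheet-bot@my-project.iam.gserviceaccount.com",
  "client_id": "…"
}
```

Treat this file like a password: keep it out of version control.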

GCP service account


To make use of our newly created service account, we are going to use google-auth-library's utilities to create a JWT and authorize our script. We will encapsulate this as a function in api/auth.ts:

import { readFile } from 'fs';
import { resolve } from 'path';
import { promisify } from 'util';
import { JWT } from 'google-auth-library';

const promisedFile = promisify(readFile);

export async function auth() {
  const credentials = JSON.parse(
    await promisedFile(resolve(__dirname, '../credentials.json'), 'utf-8'),
  );
  const client = new JWT({
    email: credentials.client_email,
    key: credentials.private_key,
    scopes: ['https://www.googleapis.com/auth/spreadsheets'],
  });
  await client.authorize();
  return client;
}

Accessing the sheet

We will wrap the googleapis methods that operate on sheets with a helper function. First, because we don't need all the functionality the Sheets API provides, and second, we can slap some typings onto it and gain some confidence when we later deal with the data flowing to and from the sheets.

import { google } from 'googleapis';
import { auth } from './auth';

const sheetsApi = google.sheets({ version: 'v4' });

export async function readSheet<T>(
  spreadsheetId: string,
  range: string,
  firstRowAsKeys?: true,
): Promise<T[]>;
export async function readSheet<T>(
  spreadsheetId: string,
  range: string,
  firstRowAsKeys: boolean = true,
): Promise<T[] | string[][]> {
  const {
    data: {
      values: [keys, ...values],
    },
  } = await sheetsApi.spreadsheets.values.get({
    auth: await auth(),
    spreadsheetId,
    range,
    valueRenderOption: 'UNFORMATTED_VALUE',
  });
  return firstRowAsKeys
    ? values.map(columns =>
        keys.reduce(
          (acc, key, idx) => ({
            ...acc,
            [key]: columns[idx],
          }),
          {} as T,
        ),
      )
    : [keys, ...values];
}

The code itself is pretty straightforward: we await the data using the auth helper detailed above. Then we parse the sheet data, which comes as a two-dimensional array. If we want to treat the first row as keys, we create an array of objects whose properties are the mapped columns of each row. Note the valueRenderOption: 'UNFORMATTED_VALUE': it ensures that data formatted in a special way (e.g. currencies) arrives as a raw value in our application (e.g. without the currency sign).
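To make the key-mapping step concrete, here is a standalone sketch with made-up data (no Sheets API involved):

```typescript
// Standalone sketch of the row-to-object mapping used in readSheet,
// using made-up data instead of a real Sheets API response.
const keys = ['id', 'name', 'desc'];
const values = [
  ['1', 'Pancakes', 'Fluffy breakfast classic'],
  ['2', 'Ramen', 'Rich broth with noodles'],
];

// Each data row becomes an object keyed by the header row.
const records = values.map(columns =>
  keys.reduce(
    (acc, key, idx) => ({ ...acc, [key]: columns[idx] }),
    {} as Record<string, string>,
  ),
);

console.log(records[0]);
// → { id: '1', name: 'Pancakes', desc: 'Fluffy breakfast classic' }
```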

One last thing to do is to add the service account to our database sheet as a collaborator so it can access the sheet. Go to the sheet and click the Share button in the top-right corner. Paste the email address you saved earlier and add it with read/write access.

Google sheet share settings

Creating a consumer

Now that we have our basic API wrapper finished, we can start creating our business model.

export type Recipe = {
  id: string;
  name: string;
  desc: string;
  createdAt: string;
  updatedAt: string;
};

This is just our DTO with all columns/properties we created in our sheet.

import * as Loki from 'lokijs';
import { readSheet } from '../api/sheets';
import { Recipe } from './Recipe';

const sheetId = '1o0VAQ4f2QafBjLUd53yCWEtdwKonu5wPM33CttxBTXI';
const sheetRange = 'Recipes!A:E';

const db = new Loki('recipes.json');
const collection = db.addCollection('recipes', { indices: ['id'] });

export async function setup() {
  const data = await readSheet<Recipe>(sheetId, sheetRange);
  collection.insert(data);
}

export async function refresh() {
  const data = await readSheet<Recipe>(sheetId, sheetRange);
  const ids = data.map(d => d.id);
  collection.findAndUpdate(
    obj =>
      ids.includes(obj.id) &&
      new Date(data.find(d => d.id === obj.id).updatedAt).getTime() >
        new Date(obj.updatedAt).getTime(),
    obj => Object.assign(obj, data.find(d => d.id === obj.id)),
  );
  collection.findAndRemove({ id: { $not: { $in: ids } } });
  collection.insert(data.filter(d => !collection.findOne({ id: d.id })));
}

export { collection };

As mentioned above, we utilize LokiJS as an in-memory database to provide some basics that come in handy (querying, sorting, indices, etc.), so we don't have to make as many API calls to query our sheet.

We load all the data in our sheet into memory on setup. This should be fine for every use case where Google Sheets is a viable option (hint: if you have more data than memory can reasonably hold, use a real database). You could even modify the setup to load the database in a build step and never query the Google API for data in production. This is what I did at a company I worked for, by the way.
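If you want to go the build-step route, a minimal sketch could look like this (the snapshot file name and the trimmed-down Recipe shape here are assumptions for illustration, not part of the project above):

```typescript
import { writeFileSync, readFileSync, existsSync } from 'fs';
import { tmpdir } from 'os';
import { join } from 'path';

// Minimal shape for illustration; the real project would reuse its Recipe DTO.
type Recipe = { id: string; name: string };

// Build step: persist the sheet contents (fetched once, e.g. via readSheet)
// to disk as plain JSON.
function writeSnapshot(path: string, data: Recipe[]): void {
  writeFileSync(path, JSON.stringify(data), 'utf-8');
}

// Runtime: prefer the snapshot; return null so the caller can fall back
// to the live Google API when no snapshot exists.
function loadSnapshot(path: string): Recipe[] | null {
  return existsSync(path)
    ? (JSON.parse(readFileSync(path, 'utf-8')) as Recipe[])
    : null;
}

const file = join(tmpdir(), 'recipes-snapshot.json');
writeSnapshot(file, [{ id: '1', name: 'Pancakes' }]);
const cached = loadSnapshot(file);
```

With this in place, production servers never need Google API credentials at all.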

Next, we have a refresh function that will update our in-memory database to the current state of the sheet (insert, update, and remove entities).

And lastly, we export the collection for further use in our application:

import { createServer } from 'http';
import { collection, setup, refresh } from './store/recipes';

const port = +process.env.PORT || 8000;

setup().then(() => {
  createServer(async (req, res) => {
    try {
      const data = collection.find();
      res.statusCode = 200;
      res.end(JSON.stringify(data), 'utf8');
    } catch (err) {
      res.statusCode = 500;
      res.end(JSON.stringify(err));
    }
  }).listen(port, () => {
    console.log(`Server listening on port ${port}!`);
    setInterval(refresh, 30000);
  });
});

This code starts a basic HTTP server that will serve all our recipes as a JSON array, and update with new data every 30 seconds.


Conclusion

Google Sheets provides an easy-to-set-up, easy-to-manage solution for some very basic database needs. You can automate scripts in the sheet to emulate calculated fields or validity checks when edits are made. Combined with an in-memory database, we can achieve reasonable performance, especially considering the costs (basically zero).

As I mentioned above, I implemented this kind of database at a company I worked for. We had a catalog of cities, airports, and countries, plus some relations (an airport belongs to a city, etc.), which I validated on change with scripts and fancy drop-down menus. The benefit was that even non-technical personnel could easily access this data and make modifications (tag airports or cities, check boxes for cities that should or should not appear in certain features, etc.) without us coding a complex UI or buying third-party software. We then bundled all the data in a build step and ran some basic verification. All in all, it worked seamlessly.


The complete project can be found on GitHub:
