In a previous post, I talked about saving time-series data to TimescaleDB, which is an extension on top of PostgreSQL. The post used an Azure Function with an Event Hub trigger to save the data in TimescaleDB with a regular INSERT INTO statement.
The Function App used the Windows runtime, which gave me networking errors (ECONNRESET) when connecting to PostgreSQL. I often encounter such issues with the Windows runtime, so for Node.js I try to stick to the Linux runtime whenever possible. In this post, we will run the same code in a Function App that uses the Linux runtime in a Consumption Plan.
Make sure Azure CLI is installed and that you are logged in. First, create a Storage Account:
az storage account create --name gebafuncstore --location westeurope --resource-group funclinux --sku Standard_LRS
Next, create the Function App. It references the storage account you created above:
az functionapp create --resource-group funclinux --name funclinux --os-type Linux --runtime node --consumption-plan-location westeurope --storage-account gebafuncstore
You can also use a script to achieve the same result. For an example, see this script.
Now, in the Function App, set the following Application Settings. These settings will be used in the code we will deploy later.
- host: hostname of the PostgreSQL server (e.g. servername.postgres.database.azure.com)
- user: user name (e.g. user@servername)
- password: password of the PostgreSQL user
- database: name of the PostgreSQL database
- EH: connection string to the Event Hub interface of your IoT Hub; if you are unsure how to set this, see this post
You can set the above values from the Azure Portal:
The function uses the first four Application Settings in the function code via process.env:
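The original function code is not reproduced here, but a minimal sketch of reading those settings could look like this. The setting names follow the list above; the port and ssl values are assumptions based on Azure Database for PostgreSQL defaults:

```javascript
// Minimal sketch: read the Application Settings via process.env and
// build a connection config for the pg module. The setting names
// (host, user, password, database) follow the list above; port and
// ssl are assumptions for Azure Database for PostgreSQL.
function buildPgConfig(env) {
  return {
    host: env.host,         // e.g. servername.postgres.database.azure.com
    user: env.user,         // e.g. user@servername
    password: env.password,
    database: env.database,
    port: 5432,             // default PostgreSQL port
    ssl: true               // Azure enforces SSL by default
  };
}

// In the function, this config would be passed to new pg.Pool(...)
const config = buildPgConfig(process.env);
```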
The application setting EH is used to reference the Event Hub in function.json:
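A function.json for an Event Hub trigger that references the EH setting might look roughly like this. The eventHubName value is an assumption; the important part is that the connection property names the EH Application Setting rather than containing the connection string itself:

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "eventHubMessages",
      "eventHubName": "hub-name",
      "connection": "EH",
      "cardinality": "many",
      "consumerGroup": "$Default"
    }
  ]
}
```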
git clone https://github.com/gbaeke/pgfunc.git
cd pgfunc
npm install
az login
az account show
The npm install command installs the pg module as defined in package.json. The last two commands log you in and show the active subscription. Make sure that subscription contains the Function App you deployed above.
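To illustrate how the pg module is typically used for this, a parameterized INSERT could be built like this. The hypertable and column names (conditions, time, device, temperature) are assumptions for the sketch; the actual code in the repo may differ:

```javascript
// Sketch: build the parameterized INSERT INTO statement the function
// could pass to pg's pool.query(). Table and column names are
// assumptions, not taken from the repo.
function buildInsert(row) {
  return {
    text: 'INSERT INTO conditions(time, device, temperature) VALUES($1, $2, $3)',
    values: [row.time, row.device, row.temperature]
  };
}

// Usage inside the function, where pool is a pg.Pool:
//   await pool.query(buildInsert({ time: new Date(), device: 'dev1', temperature: 21.5 }));
```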
Now issue the following command to package and deploy the function to the Function App we created earlier:
func azure functionapp publish funclinux
This should result in the following feedback:
You should now see the function in the Function App:
To verify that the function works as expected, I started my IoT Simulator with 100 devices that send data every 5 seconds. I also deleted all the existing data from the TimescaleDB hypertable. The Live Metrics stream shows the results. In this case, the function is running smoothly without connection reset errors. The consumption plan spun up 4 servers: