IoT Hub and Azure Time Series Insights

Azure Time Series Insights is a new service that makes it very easy to store and visualize time series data. In this blog post, we will create a dashboard that looks like the one below (click to enlarge):

image

The dashboard has four sections:

  • Query1: a heat map of events per device; in this case there are 20 devices sending data every 2 seconds
  • Query2: a line graph with random “temperature” data
  • Query3: a line graph with both “temperature” and “humidity” data
  • Query4: a line graph with “humidity” data

The events are sent to an IoT Hub using the following JSON shape: {temperature: x, humidity: y} where x and y are randomized floating point numbers, generated by an IoT device simulator.

Step 1: Create IoT Hub

Install Azure CLI 2.0, and then use az login to log in. Use az account list to list your subscriptions and az account set --subscription name_or_id to set the default subscription. Next, issue the following commands to create a resource group and an IoT Hub (set the location to your preference):

az group create --name resource_group_name --location westeurope
az iot hub create --sku F1 --name iot_hub_name --resource-group resource_group_name

As a best practice, create a separate consumer group on the Events endpoint. In the Azure Portal, in the properties of the IoT Hub, click Endpoints. Then click Events and add a consumer group underneath $Default. Click Save.

Record the Connection String – primary key setting of the device or iothubowner shared access policy. Click Shared Access Policies and then click device (or iothubowner) to find this connection string. It will be in the form of:

HostName=iot_hub_name.azure-devices.net;SharedAccessKeyName=keyname;SharedAccessKey=b5dARuGPhL6wdgHboUIhEC6LlcFalIjfEdh4aXYa1WI=

You will need this connection string later to configure the IoT Simulator.

Step 2: Create Time Series Insights Environment

In the Azure Portal, click the green + and navigate to Internet of Things. Click Time Series Insights and follow the on-screen instructions. You will end up with:

image

I selected one unit of the S1 tier which is more than enough for this example.

Step 3: Set Data Access Policy

Even though you created the Time Series Insights Environment, you still need to grant yourself access to the data. Click Data Access Policies and add your user or group and a role of Contributor.

image

Step 4: Add Event Source

We will add the IoT Hub we created earlier as an event source. Click Event Sources and then click Add. Give the event source a name and set the source to IoT Hub. Then select an IoT Hub from your available subscriptions and do not forget to set the consumer group to the one you created in step 1. If your event data has a timestamp, you can enter the timestamp property name. If you do not specify the timestamp, the event enqueue time set by the IoT Hub will be used.

Note that Azure Time Series Insights also supports Event Hubs as an event source.

Step 5: Configure the IoT simulator

Head over to https://github.com/gbaeke/iot-simulator/releases/tag/v0.3 and download iot-simulator.exe to a folder of your choice. In the same folder add a file called config.json with the following contents:

{
     "Interval":5,
     "IoTHubs":["iot_hub_name.azure-devices.net”],
     "SasTokens":["SharedAccessSignature sr=..."],
     "DevGroups":[
        {"Prefix":"ts","DeviceNum":20,"Firmware":"1.0","IoTHub": 0}
     ]
}

In the SasTokens array, replace SharedAccessSignature sr=… with a SAS token that has the necessary rights to submit events to the IoT Hub. One way of doing so is with Device Explorer. Once installed, paste the connection string from step 1 into the connection string box and click Generate SAS. Copy the SAS token into the config.json file.

image

With config.json correctly configured, start iot-simulator.exe from a command prompt. It will connect to the IoT Hub, create the devices and start sending data every 5 seconds from every device. In the sample config file, you can set the interval in seconds (Interval) and the number of devices (DeviceNum). To clean up the devices, run iot-simulator.exe -r.

Step 6: Visualize the data

Now go to https://insights.timeseries.azure.com and log in with the credentials you used in step 3. You will get a screen to select data. I selected Last 60 Mins from the quick times dropdown and then clicked the search icon:

image

In the following screen, click Heatmap and then configure the box at the left with a descriptive title. Also select a split by deviceid to get an idea of the number of events per time window per device and to spot devices that have stopped sending data.

image

Now, in the top right corner, click the circle with the four squares. You end up with:

image

Now click the + in the top right section. Select a time range again and then, at the left, change the measure from Events to Temperature. The temperature will automatically be averaged over the interval size. Change the term (Term 1) to Temperature and click the circle with the four squares again.

The temperature line graph has been added and you can now click the copy icon and create the same visualization for humidity.

image

Now it’s easy to create the other panel with both temperature and humidity. Give it a go and try out other visualizations. When you are finished, you can click the Save icon and save this perspective. Yep, these visualizations are called perspectives!

It's still early days for the service and many features will be added in the near future. If you are already working with event data coming into an Event Hub or IoT Hub, it should be easy to add a new consumer group and start analyzing the data with this service.

Communication between microservices in Kubernetes with Go Micro

In this post, we continue the story we started in two earlier posts.

In the previous post, I described a very simple service written with the help of Go Micro. It exposes an RPC call Get that retrieves a device from a list of devices. Now we want a simple data service we can call via a RESTful interface like so: http://name_or_ip/data/device1. Note that no actual data is returned by the call. We just return true if the device exists and false if it does not.

The code for the “data” service can be found here: https://github.com/gbaeke/go-data/blob/master/main.go. The code is again very simple. To expose the RESTful interface, I used Gorilla. In the handler for the route /data/{device}, we call the Device service using a Go Micro client. Because the client is configured to use Kubernetes as the registry, it will look up where the Device service lives and call it. Let's take a look at the code to call the Device service.

It starts with declaring a variable of type device.DevSvcClient, which is defined in the code generated by protoc (see the code for the device service here):

// cl is the client for the device service
var (
	cl device.DevSvcClient
)

In the init() function, the client is created and configured to call the go.micro.srv.device service:

func init() {
	// make sure flags are processed
	cmd.Init()

	// initialise a default client for device service
	cl = device.NewDevSvcClient("go.micro.srv.device", client.DefaultClient)

}

In the route handler, the device name is extracted from the URL and then we call another function that returns true if the device exists and is active.

deviceActive(&device.DeviceName{Name: deviceName})

The deviceActive function looks like:

func deviceActive(d *device.DeviceName) bool {
	//call Get method from devSvcClient to obtain the device
	fmt.Println("Getting device", d.Name)
	rsp, err := cl.Get(context.TODO(), d)
	if err != nil {
		fmt.Println(err)
		return false
	}

	return rsp.Active
}

The above function expects a pointer to a DeviceName struct, which is again defined by the protoc-generated code used by the Device service. As you can see, calling the Get method is trivial. Behind the scenes, the client code locates the Device service in Kubernetes and does all the serialization/deserialization work to and from a binary format.

After the service is deployed in Kubernetes (see this post), we can check if it works using:

curl http://ip_of_loadbalancer/data/device1

The above should return the following:

Device active:  true
Oh and, no data for you!

I told you the service returned no data! 🙂

We now have two services that communicate using RPC in a Kubernetes cluster. Writing RESTful APIs and putting them in front of the RPC services is easy enough, but something is off: we don't want to deploy many services that each offer a RESTful API and then expose them using multiple external IPs, because that's just cumbersome. What we do want is to use the API Gateway pattern. In a future post, I will describe how to use Go Micro's API gateway and an API service that exposes the device service to the outside world using a RESTful interface. Quite the mouthful… Stay tuned!

Microservices on Kubernetes: a simple example in Go

In the previous post, Getting started with Kubernetes on Azure, we talked about creating a Kubernetes cluster and deploying a couple of services. There are basically two services:

  • Data: a service that exposes an endpoint to pick up data for an IoT device; you call it with http://service_endpoint:8080/data/devicename
  • Device: a service that can be used by the Data API to check if a device exists; if the device exists you will see that in the response

When you call the Data service, it calls the Device service using gRPC, with HTTP as the transport protocol. You define the service using Protocol Buffers. gRPC works across languages and platforms, so I could have implemented each service in a different language, such as Go for the Device service and Node.js for the Data service. In this example, I decided to use Go in both cases together with Go Micro, a pluggable RPC framework for microservices. Go Micro uses gRPC and protocol buffers under the hood, with changes specific to Go Micro.

OK, enough with the talk, let's take a look at how it is done. The Device service is kept extremely simple for an obvious reason: I just started with Go Micro, and it is best to start with something simple. I do expect you to know a bit of Go from here on out. All the code can be found at https://github.com/gbaeke/go-device.

Let's start with the definition of the Protocol Buffers, found in proto/device.proto:

syntax = "proto3";

service DevSvc {
    rpc Get(DeviceName) returns (Device) {}
}

message DeviceName {
    string name = 1;
}

message Device {
    string name = 1;
    bool active = 2;
}

We define one RPC call here that expects a DeviceName message as input and returns a Device message. Simple enough, but this does not get us very far. To actually use this in Go (or another supported language), we will generate some code from the above definition. You need a couple of things to do that though:

  • protoc compiler: download it from GitHub for your platform
  • protobuf plugins for code generation for Go Micro: run go get github.com/micro/protobuf/{proto,protoc-gen-go} (if you have issues, use two separate go get commands, one for proto and one for protoc-gen-go)

To actually compile the proto file, use the following command:

protoc --go_out=plugins=micro:. device.proto

That compiles device.proto to device.pb.go with help from the micro plugin. You can check the generated code here. Among other things, there are Go structs for the DeviceName and Device messages, plus several methods you can call on these structs such as Reset() and String().

Now for main.go! You’ll need several imports: for the generated code but also for the dependencies to build the service with Go Micro. If you check the code, you will also find the following import:

_ "github.com/micro/go-plugins/registry/kubernetes"

As stated above, Go Micro is a pluggable RPC framework. Out of the box, a microservice written with Go Micro will try to register itself with Consul on localhost for service discovery and configuration. We could run the Consul service in Kubernetes, but Kubernetes supports service registration natively. Kubernetes support is something you add with the import above. That is not enough though! You still need to tell Go Micro to use Kubernetes as the registry, either with the --registry command line parameter or with the MICRO_REGISTRY environment variable. Check the https://github.com/gbaeke/go-device/blob/master/go-device-dep.yaml file, where that environment variable is set. Besides Consul and Kubernetes, there are other alternatives. One of them is multicast DNS (mdns), which is handy when you are testing services on your local machine and you don't have something like Consul running.

If you want to check the information that is registered, you can do the following (after running kubectl proxy --port=8080):

curl http://localhost:8080/api/v1/pods | grep micro

Each pod will have an annotation with key micro.mu/service-<servicename> with information about the service such as its name, IP address, port, and much more.

Now really over to main.go, which is pretty self-explanatory. There's a struct called DevSvc with a field called devs that holds a map of strings to Device structs. The DevSvc struct actually defines the service, and you write the RPC calls as methods of that struct. Check out the following code snippet:

// DevSvc defines the service
type DevSvc struct {
	devs map[string]*device.Device
}
func (d *DevSvc) Get(ctx context.Context, req *device.DeviceName, rsp *device.Device) error {
	device, ok := d.devs[req.Name]
	if !ok {
		fmt.Println("Device does not exist")
		return nil
	}

	fmt.Println("Will respond with ", device)

	// copy the device fields into the response
	rsp.Name = device.Name
	rsp.Active = device.Active

	return nil
}

The Get function implements what was defined in the .proto file earlier and takes a pointer to a DeviceName struct as input and a pointer to a Device struct as output. The code itself is of course trivial: it just looks up a device in the map and returns it via rsp.

Of course, this handler needs to be registered and this happens in the main() function (besides setting up the service and implementing a custom flag):

// register handler and initialise devs map with a list of devices
device.RegisterDevSvcHandler(service.Server(), &DevSvc{devs: LoadDevices()})

If you want to test the service and call it (e.g. on the local machine), clone the repository (or fetch it with go get) and run the server as follows:

go run main.go --registry=mdns

In another terminal, run:

go run main.go --registry=mdns --run_client

When you run the code with the run_client option, the runClient function is called which looks like:

func runClient(service micro.Service) {
	// Create new client to call DevSvc service
	DevClient := device.NewDevSvcClient("go.micro.srv.device", service.Client())

	// Call Get to get a device
	rsp, err := DevClient.Get(context.TODO(), &device.DeviceName{Name: "device2"})
	if err != nil {
		fmt.Println(err)
		return
	}

	// Print response
	fmt.Println("Response: ", rsp)
}

This again shows the power of using a framework like Go Micro: you create a client for the DevSvc service and then simply perform the remote procedure call with the Get method, passing in a DeviceName struct with the Name field set to the device you want to check. The client uses the service registry to know where and how to connect. All the serialization and deserialization is handled for you as well using protocol buffers.

So great, you now have a little bit more information about the Device service and you know how to deploy it to Kubernetes. In another post, we’ll see how the Data service works and explore some other options to write that service.

Temboo, Twilio and Nexmo: SMS and voice messages from your IoT device

In this post, I will provide an overview of how to use Twilio and Nexmo to send SMSs and voice messages directly from your device. I will use a Particle Photon, but this also works from an Arduino, a Raspberry Pi or basically any other system. The reason for this is that I will also use Temboo, an easy-to-use service that provides a uniform way to call a wide variety of APIs and even helps you with a code builder.

I will use the same basic sketch from earlier examples. This means there is a photoresistor which measures the amount of light, but also a button that triggers the calls to Temboo to send an SMS and a voice message with the current sensor value from the photoresistor.

Let's get started, shall we? You will first need accounts for all three services, so go ahead and sign up. They all have free accounts to get started, but remember they are all paid services. It's up to you to decide how useful you find them.

For Temboo, you will need to provide the account name, app key name and app key. Sadly, in the free Temboo tier, this key is only valid for a month and you will need to change it manually. I added these values as #defines in a header file called TembooAccount.h. Be sure to use #include "TembooAccount.h" in your .ino file. The contents of TembooAccount.h:

image
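
It is just a set of #defines. A minimal sketch of what such a header could look like (the macro names here are placeholders; use whatever names your Temboo-generated code expects):

// TembooAccount.h - placeholder values, taken from your Temboo account page
#define TEMBOO_ACCOUNT      "your_account_name"    // Temboo account name
#define TEMBOO_APP_KEY_NAME "your_app_key_name"    // app key name
#define TEMBOO_APP_KEY      "your_app_key"         // app key (expires monthly on the free tier)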

In your .ino file, we’ll create two functions:

  • void runSendSMS(String body)
  • void runSendVoice(String body)

When you want to send an SMS or a voice message, you call the appropriate function with the message to send or the text you want converted to speech.

The contents of these functions are easy to write, because you don't have to write them yourself: Temboo provides a code generator. When you are logged in, just go to https://temboo.com/library/ and select the Choreo you want to use. For the SMS, select Twilio / SMSMessages / SendSMS. You will now be asked for the Choreo's input parameters:

image

After providing all the inputs, the generated code is shown and you can pick and choose what you need. You can find an example for SMS and voice in the following gist: https://gist.github.com/gbaeke/15596e3e2d185eb11720c965ab33e179. The voice Choreo uses Nexmo / Voice / TextToSpeech. Tip: Nexmo can also take input from your phone (like pressing '1' to turn on the sprinklers) and send it back to your device!

To actually fire off the SMS and voice message, we’ll do that when the button is pressed:

image
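
That part of the loop could be sketched as follows, assuming the button is on D0 and the photoresistor on A0 (the pins and the crude debounce are assumptions, adjust them to your wiring):

void loop() {
    int sensorValue = analogRead(A0);             // current photoresistor reading
    if (digitalRead(D0) == HIGH) {                // Grove button pressed
        String body = "Sensor value: " + String(sensorValue);
        runSendSMS(body);                         // SMS via the Temboo/Twilio Choreo
        runSendVoice(body);                       // voice message via the Temboo/Nexmo Choreo
        delay(2000);                              // crude debounce so one press sends one message
    }
}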

As you can see, Temboo and the APIs it exposes as Choreos make it really easy to work with all sorts of APIs. I have only used Twilio and Nexmo here, but there are many others. Again, these are all paid services and the lowest Temboo tier is quite pricey for home users. If you find it a bit too pricey, you can always use the Particle IFTTT integration to achieve similar results.

Controlling Sonos from a Particle Photon

Now for something fun! Let’s control a Sonos from a Particle Photon and a connected button. I connected a Grove Button to the Particle with simple male-to-female wires. The SIG line on the button should go to a digital port (D0 in my case). When the button is pressed, the port will read HIGH and otherwise LOW.

Controlling Sonos is another matter though. Sonos should really make simple APIs available and/or provide access through IFTTT and similar services. Until they do, you will need to control Sonos the hard way: by connecting to it directly from the Particle and sending commands over its HTTP interface. Luckily, the people from Hover Labs have some code on GitHub that you can build upon. I simply copied their code into my Particle app and removed the references to the Hover device. By the way, the Hover is a cool device in its own right that you should definitely check out as well!

image

In the above snippet, you see part of the loop() code that checks for a button press. Since we want to toggle between Sonos PLAY and PAUSE, there's some code for that (a rough sketch follows below). The hard work is done by the sonos() function, which takes commands like PLAY, PAUSE, NEXT and PREVIOUS. You can check out the full code in the following gist: https://gist.github.com/gbaeke/240fb221204ff828dec06150014ec5fd. Note that the code also contains the LED and photoresistor code from earlier examples. The Sonos control is also very basic as it only implements PLAY and PAUSE, so you need something in the queue. But at least you have a start to create more complex interactions.
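
The toggle logic could be sketched like this, assuming the button is on D0 and a sonos() function that takes the command as a string (as in the Hover Labs code):

bool playing = false;                     // assumed state: is Sonos currently playing?

void loop() {
    if (digitalRead(D0) == HIGH) {        // button pressed
        if (playing) {
            sonos("PAUSE");               // pause playback
        } else {
            sonos("PLAY");                // resume playback (needs something in the queue)
        }
        playing = !playing;               // toggle the state
        delay(500);                       // simple debounce
    }
}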

You could also create a Particle Function that executes the Sonos code which would enable you to control your Sonos from the cloud and even connect this with other services via IFTTT. For instance, you could start playing your Sonos when you are arriving home.

Have fun controlling Sonos from your Particle!!!

Particle and Azure IoT Hub: forward events for storage and analysis

In a previous post about publishing Particle events, you have seen how to publish custom events to the Particle Cloud. Other devices or applications can subscribe to these events and act upon them. What if you want to do more and connect these events to custom applications? In that case, Particle has a couple of integrations that might help:

image

In this post, I will take a look at the Azure IoT Hub integration which, at the moment of writing, is still in beta. Note that this integration works with events you publish from your device with Particle.publish, not with Particle Variables or Functions. Remember that in the post about events, we published a lights on and a lights out event. For simplicity, we will build upon those events here.

To configure the IoT Hub integration, you will need a few things:

  • An Azure Subscription so you can logon to the portal at https://portal.azure.com (see https://azure.microsoft.com/en-us/free/ to get started)
  • An IoT Hub that you create from the portal; to get started, use the free tier which allows you to publish 8000 events per day (give or take; depends on message size as well); in the portal, use the + button

An IoT Hub has a name and works with shared access policies and access keys to control the IoT Hub and send messages. To get to the policies, just click Shared Access Policies.

image

Although considered bad practice, I will use the iothubowner policy which has all required rights. Click iothubowner to view the access keys and note the primary access key. You will need that key in a moment.

In the Particle Console, click the Integrations icon and then click new integration. In the configuration screen, you will see:

image

It's pretty self-explanatory once you have your IoT Hub created in Azure. Just fill in the required information and note that the event name is the name you gave the event in the call to Particle.publish. My events are called lights on and lights out and I will use lights as the Event Name. This will catch both events!

To test this, the photoresistor was given enough light to fire the events. This is the result when you click on the integration after it was created:

image

When you click on one of the log entries, you will see more details:

image

You see the event payload that was sent to IoT Hub plus details about the call to IoT Hub using HTTP POST.

In IoT Hub, you will see a couple of things as well. First of all, the events:

image

In the list of devices, you will find a device with the id of the Particle Photon:

image

Note: Azure IoT Hub requires devices to authenticate, but this is taken care of automatically by the Particle Cloud.

What you do now with these messages is up to you. You can use the new endpoints and routes feature of IoT Hub to forward events to Event Hubs or Service Bus. Or you could connect Stream Analytics to IoT Hub and save your events to Azure Storage, Data Lake, SQL, Document DB or stream the data to a real-time Power BI dashboard.

Note that although an Azure subscription is free, not all services have free tiers. For instance, IoT Hub has a free tier but Stream Analytics does not. And although IoT Hub's free tier is great to get started, it can only process a limited number of events. It's up to you to control the rate of events sent from your devices. For home use or small PoCs you should not run into issues though!

IoT with Particle and Porter

In an earlier article, we took a look at Particle Functions and Variables. We wrote a simple application that can blink an LED with a function and read the value of a photoresistor with a variable.

Although you can easily call the function or read the variable with the Particle CLI or with a REST call (using cURL for instance), you might want an easy web-based experience to work with your device. Porter (http://porterapp.com/) might be the answer!

I’ll quickly describe how Porter works. It’s so easy to use though that it doesn’t need much describing. After signing up and linking to Particle, you can add your devices. In the screenshot below, you can see my device:

image

The cool thing is that Porter automatically finds all your functions and variables and exposes them to you. Using the Customize option, you have some control over the UI elements. In the above screen, I changed the Led function to use on and off buttons instead of the default text input field where you need to type the parameter to the function (on or off). The variable is exposed as well and you can obtain the most recent value with the refresh icon.

Porter also has a mobile app that exposes the same functionality:

file-3

You can also work with Particle events. We discussed events in a previous post where we published events based on a threshold of 2000 for the photoresistor value. The events will show up in the Events tab (Web UI shown below):

image

Based on these events, you can define all sorts of Actions:

image

In the above screen, an action is defined that sends a notification when the lights on event is received. This notification works together with the Porter app on your phone to notify you of the event. Other actions are:

  • Web Request: HTTP PUT or GET with variable request data using tokens such as [data], [time], [device_name] and so on
  • Send an e-mail
  • Send an SMS

Note that Porter is a paid service and that e-mails and SMSs require a specific plan. They have a 30-day trial.

As you can see, it's very easy to use Porter, and for quick access to and control of your prototypes it's a great service. It's not very difficult to build a quick web UI for your device yourself, but it all comes down to getting off the ground quickly and focusing on what matters in the early stages.

IoT with Particle: publishing events

In the two previous posts, we discussed setup and talked about triggering actions and reading sensor data. Particle also allows you to publish events. You can subscribe to these events or pass them to other systems such as Azure IoT Hub.

Let's build on the previous example with the LED and the photoresistor. When we read a high value from the photoresistor (yes, more light), we will publish a lights on event including the value we have read. When we read a low value, we will publish a lights out event.

In code, this is easily done. The setup part:

image

This is not very different from the earlier post. I added a boolean (true/false) variable called bright to maintain the state (is it bright or not?), and the variable is initialised depending on the amount of light we measure at the start.
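
As a rough sketch (the pins and the 2000 threshold follow the description; the Particle Function registration from the earlier post would go here as well):

bool bright;                              // true when it is currently bright

void setup() {
    pinMode(D1, OUTPUT);                  // the LED from the earlier examples
    bright = (analogRead(A0) >= 2000);    // initialise the state from the current light level
}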

In the loop() part:

image

Above you see Particle.publish in action. We read the brightness every second. When it was not bright and the brightness is at or above 2000, we send an event to the Particle Cloud. This way, you only publish the event when the state changes (a rough sketch of the full loop follows the list below). Particle.publish takes four parameters:

  • The name of the event
  • The data you want to send along; here it's the brightness value, converted to a string with the built-in String class, whose constructor can take an integer and return it as a string
  • 60 is the TTL (default and cannot be changed for now)
  • PRIVATE: this is a private event that only authorized subscribers can subscribe to
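
Putting that together, the loop could look roughly like this (a sketch based on the description above; the pin and threshold are assumptions):

void loop() {
    int brightness = analogRead(A0);
    if (!bright && brightness >= 2000) {
        bright = true;
        Particle.publish("lights on", String(brightness), 60, PRIVATE);
    } else if (bright && brightness < 2000) {
        bright = false;
        Particle.publish("lights out", String(brightness), 60, PRIVATE);
    }
    delay(1000);                          // read the brightness every second
}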

Lastly, we still implement the Particle Function to turn the LED on or off remotely:

image

The events can be tracked from the Particle Console:

image

The question of course is, what can you do with published events? One course of action is to use these events for communication between your IoT devices. Another Particle device can use Particle.subscribe to subscribe to the events published by other devices. Using Particle.subscribe is very simple and somewhat analogous to a Particle Function. You can find out more about it here: https://docs.particle.io/reference/firmware/photon/#particle-subscribe-
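
On the subscribing device, a minimal sketch could look like this (the handler name is made up; the "lights" prefix matches both events):

// handler runs whenever a matching event arrives
void lightsHandler(const char *event, const char *data) {
    // event is "lights on" or "lights out", data contains the brightness value
    Serial.print(event);
    Serial.print(": ");
    Serial.println(data);
}

void setup() {
    Serial.begin(9600);
    // MY_DEVICES limits the subscription to private events from your own devices
    Particle.subscribe("lights", lightsHandler, MY_DEVICES);
}

void loop() {
}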

Another course of action is to use Particle's IFTTT integration to tap into IFTTT's rich ecosystem of connected services. Particle is one of these services, so just provide IFTTT with credentials to Particle and you are set!

Do know that the published events are not stored by Particle. If you want to do that, one way of achieving this is with the Azure IoT Hub integration. In a later post, I’ll talk more about that.

IoT with Particle: Functions and Variables using the Build IDE

In yesterday's post, I talked a bit about the setup process and initial configuration of a Particle Photon. To start quickly, we used the Tinker firmware and the Particle iOS app to light up an LED: with digitalWrite at full brightness (the full 3.3V), but also with analogWrite to vary the brightness depending on the value you write (between 0 and 255, using the PWM port D0).

Today, we'll add a photoresistor and an LED, with the LED positioned above the photoresistor. We'll turn on the LED from the Particle Cloud using a Particle Function and we'll read out the photoresistor value using a Particle Variable.

For the photoresistor, I only had a Grove Light Sensor lying around. If you don't know the Grove system, it's a collection of sensors with simple four-wire connectors that typically work with an add-on board for these connectors. For the Photon, there is such a solution as well. To get started easily you could go for the starter kit: https://www.seeedstudio.com/Grove-Starter-Kit-for-Photon-p-2179.html. Since I do not have the Grove add-on board for the Photon, I connected the sensor using three male-to-female wires (two for power and ground, one for the signal) and connected the signal pin to port A0 on the Photon. The photoresistor outputs a value that you read using analogRead; the value rises with increasing brightness.

So how do we turn on the LED from the Particle Cloud? That’s where Particle Functions come in. Particle Functions make it extremely easy to control your device from anywhere. In fact, it’s one of the easiest solutions I have found to date. But first, you have to know something about the integrated development environment called Build.

Particle’s web-based IDE: Build

You access the IDE from https://build.particle.io. The screenshot below shows the IDE with a new app ready to be coded:

image

If you are used to Arduino, it all looks pretty similar here, but beware: there are many subtle differences. The cool thing is that you can code your app here and flash the device from the web using the flash icon in the top left. Let's write a simple app to flash an LED on port D1:

image
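
Something along these lines (a minimal sketch; the screenshot may differ in the details):

void setup() {
    pinMode(D1, OUTPUT);          // LED on digital port D1
}

void loop() {
    digitalWrite(D1, HIGH);       // LED on
    delay(1000);
    digitalWrite(D1, LOW);        // LED off
    delay(1000);
}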

Okay, cool but not very interesting. Let’s put this LED under cloud control with a function.

Particle Functions

Let’s write a Particle Function that can turn the LED on or off remotely.

image
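
In code, that looks roughly like this (a sketch; the exact return values are assumptions):

int ledToggler(String command) {
    if (command == "on") {
        digitalWrite(D1, HIGH);
        return 1;
    }
    if (command == "off") {
        digitalWrite(D1, LOW);
        return 0;
    }
    return -1;                                // unknown command
}

void setup() {
    pinMode(D1, OUTPUT);
    Particle.function("led", ledToggler);     // register the cloud function "led"
}

void loop() {
}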

With the simple code above you have registered a Particle Function, led, that you can call remotely (with the proper credentials of course). When you call the led function and pass a parameter (always a string), the function ledToggler is executed on the device. Great, but how do you call the led function? There are several options:

  • use the Particle CLI
  • send an HTTP POST to a Particle API endpoint

The CLI is easy. After installing it (see https://docs.particle.io/guide/tools-and-features/cli/photon/), just execute the following command to see your devices and their functions: particle list (note: use particle login first to log in with your Particle account)

image

Now call the function using particle call:

image

Above you see two calls to turn the LED on and off. After each call, you also see the return value of the function.

To use the HTTP POST method, there's a myriad of tools and frameworks to choose from. From the command line, you can use cURL, but you could also use Postman. I use cURL on Windows, which is part of Git Bash. You can also try https://curl.haxx.se/download.html. With cURL, you need to supply your device ID plus an access token you can get from the IDE:

image

Above you see the same two calls to turn the LED on or off. The HTTP POST returns some JSON with the return_value from the function.

Particle Variables

Functions are great to trigger actions on your devices, but how do we read data from a sensor like the photoresistor in our case? That’s surprisingly easy again: just use a Particle Variable. Modify the code as follows:

image
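
A sketch of what that code could look like (the pin alias follows the description below; the variable name and other details are assumptions):

int PR = A0;                                  // photoresistor pin
int brightness = 0;

void setup() {
    pinMode(PR, INPUT);
    // expose the latest reading as the cloud variable "brightness"
    Particle.variable("brightness", brightness);
    // the led function from before can be registered here as well
}

void loop() {
    brightness = analogRead(PR);              // read the photoresistor
    delay(1000);                              // once per second
}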

Above, the A0 pin (called PR) is set up for reading values. In the loop, we keep reading the brightness from the photoresistor using analogRead, followed by a delay of 1 second. A Particle Variable is defined that you can read from the cloud using the CLI or an HTTP GET. With the CLI:

image

Without the LED above the photoresistor, inside the house, we get 1485 as a brightness value. With the LED turned on right above it, the value is 2303. Great!!! By the way, the LED is not too bright because I used a 1000 Ohm resistor.

Using cURL:

image

Wrap Up

You have now seen how easy it is to trigger actions with Particle Functions and read sensor data using Particle Variables. This functionality automatically comes with your Particle device at no extra cost and is completely driven from code. There is no need to use other services to post sensor data, which keeps things simple. And I like simple, don't you?

In a subsequent post, we’ll take a look at publishing event data using Particle Publish! Stay tuned!

IoT with Particle: a smooth experience

At ThingTank, the IoT brand of Xylos, we make our own IoT hardware, which can be quite complex if you need to connect multiple sensors efficiently, or even multiple MCUs where each MCU has its own set of sensors. Most people who want to start with IoT (typically at home or for a small company proof of concept) use either an Arduino or a Raspberry Pi with one or two sensors connected. Both solutions are great in their own right, but there are others! One such solution is Particle, a combination of hardware, software and cloud. Let's take a look at what they offer from a hardware and configuration perspective. Future posts will discuss their cloud offering and how to connect to other systems such as Azure IoT Hub.

Hardware

Particle sells its own hardware (like the Photon and Electron) but also works with other hardware such as a Raspberry Pi. I bought a Photon from https://www.antratek.be/photon. It costs around €25, which is not as cheap as some alternatives but still well worth the money.

image

After unpacking, I mounted it on a breadboard and gave it power from a wall socket using an adapter I had lying around that I used in the past to power a Raspberry Pi. Although you can, you do not have to connect the Photon to a computer to configure it. Yes, you heard that right! You can configure the Photon using a mobile app and you can flash new firmware OTA (over the air) right from a web-based IDE called Build. Let’s see how initial configuration works…

Configuration

The Photon comes with WiFi only, unlike the Electron, which comes with 2G/3G and a global SIM card. To connect the Photon to WiFi (one of five networks the device can remember), use the Particle app for iOS or Android (the easiest method):

file9

To configure a new device, the app guides you through the whole process. The Photon will create its own WiFi network. After connecting your phone to that network, you can configure the network you want the Photon to connect to:

file10

When the process is finished, the device can be seen in the Particle Console:

image

As part of the configuration process, you can give the device a name. The device name (or device id) can later be used in HTTP calls or from the Particle CLI.

Tinker

This post will not discuss how to flash the device with a custom firmware (that’s for a later post). But even without a custom firmware, you can still start exploring the device and do useful things with the digital and analog ports using the mobile app and the out-of-the-box Tinker firmware. The Tinker firmware can always be flashed back to the device if needed.

In the mobile app, after selecting the device, you will see the port layout of the device:

file-4

 

Without going into details here, know that there is an onboard LED connected to digital port D7. When you select D7, you will be asked what you want to do:

file-5

In this case, we want to turn on the onboard LED, so we obviously want to write to the port. After selecting digitalWrite, you can select D7 to set the port HIGH (3.3V) or LOW (GND). When the port is set to HIGH, the onboard LED will light up in blue. Cool, no? Although not very useful, you have now configured the Photon to connect to the Particle back-end in the cloud and you can use their app to control the ports from the cloud as well.

Tinker works just as well with analogWrite. If you have an LED connected to D0, and a resistor from the other side of the LED to GND, you can send a value between 0 and 255 to the port which will light up the LED with varying brightness:

file-6

image

Note: D0-D7 are digital ports, but D0-D3 may also be used as PWM outputs (PWM = pulse width modulation); that's why you can send values ranging from 0 to 255 to those ports as well (analogWrite).

Summary

Particle has gone out of its way to make it as easy as possible to get started. Setting up the device is super simple and getting started with the built-in Tinker firmware makes it easier for beginners to understand how to use the ports without having to start coding. In follow-up posts, we’ll have a look at some of the cloud functionality and we’ll connect some more useful sensors like a photoresistor and a PIR sensor… Stay tuned!!!