Internet Reliability Log – using Functions and Application Insights

Just posted over on GitHub my experiences using Azure Functions, PowerShell, Task Scheduler and more to log internet reliability data to Application Insights.

Check out the full article and code here.

Using Azure Functions, PowerShell, Task Scheduler, Table Storage and Application Insights

Jordan Knight, Feb 20, 2017

Note: You will need a Microsoft Azure account for this. If you don't have one, you may be eligible for a free trial account.

Since getting an upgrade recently, my home internet has been very unstable, with up to 39 dropouts a day.

I did the usual thing and rang my provider – only to be told that “it’s currently working” and there is not much they can do. Of course, it was working when I called.

So I called when it wasn't. The tech came out a couple of days later: "Oh, it's working fine." He tinkered with it and left.

It’s still dropping out to the point that I’m having to tether my phone to my home PC.

So I figured I’d collect some data. Lots of data.

I of course had a look around to see if something could do it – the solutions I found were non-intuitive or cost money. Nope – CUSTOM BUILD TIME. There, justified.

I had a think around how I might go about this – I didn’t want to spend too much time on something that was already costing me time. How to throw together a monitoring system without spending too much time?

The system I came up with uses Azure Functions, Table Storage, Application Insights, PowerShell and Windows Task Scheduler. I threw it together in a couple of hours at most.

Basic Flow

The process starts with a PowerShell script that is fired by Task Scheduler on Windows every minute.

This script calls the UptimeLogger Azure Function which logs the data to an Azure Storage Table.

I then have a Processor Azure Function that runs every minute to check to see if there are any new entries in the Azure Table. If not – then we know that there has been some downtime.

This processor function sends the results about up-time to Application Insights.

In Depth

Table Storage

Set up an Azure Storage Account.

  • Click here to get started.

  • Enter a name for your new storage account (I called mine internetuptime – all lower case!).

  • Create a new Resource Group called internetmonitoring, which will allow you to keep all your bits for this project in the same place.

  • Once that is created you should be able to install Azure Storage Explorer and browse to it. This will be a good way to debug your service later.

Azure Functions

If you know about functions, grab the function code from here and set it up.
There are a couple of inputs and outputs to think about, and some project.json files to set up for the NuGet packages, FYI.

I know about functions, skip me to the next bit

Next you need to set up the Azure Functions. These provide an endpoint for your pings, as well as the background processing that does the actual up/down calculation and sends data to Application Insights.

  • They are super easy to get going – click here to get started.

Select the Consumption plan if you're not sure what to select there.
Use your new Resource Group called internetmonitoring so you can group all the services for this project in the one place.


  • Next go into the editor so you can start editing your new functions. If you can't locate it, look under the App Services section.

  • Add a new function called InternetUpLogger.

  • Filter by C# and API & WebHooks, then select the HttpTrigger-CSharp option. Enter the name InternetUpLogger and click Create.

This will create a new function that will act as a web endpoint that you can call.

Video: Create the Function (YouTube)

You will see a place to drop some code. This is a basic Azure Function.
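If you haven't seen one before, the generated HttpTrigger-CSharp code looks roughly like this (recalled from the 2017 portal template, so treat it as approximate rather than exact):

using System.Net;

// The default template: reads a "name" from the query string and echoes a greeting.
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");

    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}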

Before you can edit the code you need to add some outputs.

Azure Functions can look after some inputs and outputs for you, so you don't have to write a lot of code and config to, say, read from a database and write to a table.

  • Click on Integrate, then click New Output. Select Azure Table Storage from the list and click Select.

Next you’ll need to set up the connection to your table storage if you’ve not done so already in this Function.

  • Click New next to Storage account connection and select your storage account from the list.

You may want to change the output table name here to something like pingTable. You will need to remember this for later, when we use this new table as the input to another function.

  • Once that is completed, click Save.

You can expand the documentation to see some examples of how to use the new output.

Video: Add the Output (YouTube)

Now you can paste in the function code from here.

Some points of interest

Note the ICollector that is passed in to Run automatically. This is the output parameter you configured. In my video it's called outTable, so you may need to change the name to match yours – OOPS!

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, ICollector<UptimePing> outputTable, TraceWriter log)
{
    // ...
}
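For reference, UptimePing is just a small entity whose shape you can infer from the snippets here. A sketch (the real definition is in the linked code):

// Sketch of the entity the function writes to Table Storage. Property
// names follow the snippets in this article; the real definition is in
// the linked GitHub code.
public class UptimePing
{
    public string PartitionKey { get; set; } // machine name – groups pings per machine
    public string RowKey { get; set; }       // inverted ticks, so newer rows sort first
    public DateTime PingTime { get; set; }   // UTC time the ping arrived
}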

The next interesting thing is the RowKey I’m setting.

var dtMax = (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString("d19");

var ping = new UptimePing{
    RowKey = dtMax,
    PartitionKey = data.MachineName,
    PingTime = DateTime.UtcNow
};

outputTable.Add(ping);

You can't sort Azure Table rows server-side using LINQ etc. – rows come back in ascending RowKey order. So we make sure that the newer the row, the smaller its key, by subtracting the current ticks from DateTime.MaxValue.Ticks; newer rows then always sort first. This will be handy later when we want to get out the latest pings to analyse recent data.

Once that is done we pop it into the ICollector, which will go and add it to the table for us! Too easy!
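If the inverted-key trick feels abstract, this stand-alone snippet shows it in action – the later of two timestamps produces the lexicographically smaller key, so it sorts first:

// Two pings a minute apart: the later one gets the smaller key.
var t1 = new DateTime(2017, 2, 20, 10, 0, 0, DateTimeKind.Utc);
var t2 = t1.AddMinutes(1);

var key1 = (DateTime.MaxValue.Ticks - t1.Ticks).ToString("d19");
var key2 = (DateTime.MaxValue.Ticks - t2.Ticks).ToString("d19");

// Table Storage returns rows in ascending RowKey order, so the newer
// ping (key2) comes back first.
Console.WriteLine(string.CompareOrdinal(key2, key1) < 0); // True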

PowerShell

The next step is to set up the PowerShell script that calls the function on a schedule.

  • Copy the URL of your function from just above the code bit on the Develop tab – you’ll need this in a second.

  • Grab the PS1 file and copy it somewhere on your machine (or just run it from your GitHub checkout).

  • Edit it to insert your function URL into the indicated spot. (The script POSTs a small JSON payload to that URL, including the machine name – that's where data.MachineName in the function comes from.)

Ping PowerShell Script

  • Jump into PowerShell and try it out (hint: go to the directory and type powershell in the Explorer address bar at the top).
.\pinger.ps1

Make sure that it prints out something saying that it worked 😛

  • Next create a new Scheduler Job – from Start Menu search for Task Scheduler.

Video: Add the Task (YouTube)

  • For the trigger, select any start time (in the past works best) and then have it repeat every 1 minute.

  • For the action, have it call powershell.exe and pass in the arguments -ExecutionPolicy Bypass -File followed by the full path to pinger.ps1.

Now you can check it's working by going back into the Azure Function and watching the log. You can also check that your table was created by browsing to it with Azure Storage Explorer.

Background task to process the data

In order to know whether we're up or down, and then do stuff based on that, we need something to process our data.

Azure Functions can be called via HTTP (as we do above) – but they can also be triggered in many other ways, including on a schedule.

  • Create a new function called InternetUpProcessor that is a TimerTrigger-CSharp.

Video: Create the processor function (YouTube)

  • Set the CRON expression to every minute (Azure Functions uses a six-field expression, with seconds first):
0 */1 * * * *
  • You’ll also need to pass the table that you created as the output in the first function to the input of this function. In the YouTube video I called it outTable, but you may have renamed it to pingTable or something.

  • Next you need to add another output table to store the actual up-time/down-time results.

  • Create a new output to an Azure Table called uptimeTable. This will be passed in to the function.

  • At the same time you'll need to create a table input that also points to uptimeTable… this is so we can check whether the system was already down and do extra processing.

Video: Create uptime inputs and outputs (YouTube)

  • Now you can copy in the code for the function from here.
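Before you paste it in, here's roughly what the processor does each minute. This is a sketch rather than the exact linked code – it assumes the ping table is bound as an IQueryable<UptimePing> input named pingTable, with names following this article:

// Sketch of the processor's core decision (the real code is in the linked repo).
// Because RowKey is inverted, the first row back is the newest ping.
var newestPing = pingTable.Take(1).ToList().FirstOrDefault();

bool isUp = newestPing != null
    && (DateTime.UtcNow - newestPing.PingTime) < TimeSpan.FromMinutes(2);

// The uptimeTable input tells us what the state was last minute; comparing
// the two detects transitions (up -> down is a new outage), and the
// uptimeTable output records the current state for this minute.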

You may notice that the function does not build. That's because it uses some NuGet packages that are not available by default.

To add NuGet packages you first need to add a new file to your function called project.json.

Video: Add NuGets (YouTube)

  • Click on View Files and add a new file called project.json. Paste in the content from here and save.
  • You should see packages restoring when you view the log.

Application Insights

Next we need to create a new Application Insights app in Azure.

  • Click here to create a new Application Insights app.
  • Leave the values default and choose the resource group you used for your other things.
  • Once it has been created, you can collect your instrumentation key from the Properties tab of the new Application Insights resource.
  • Paste that key into the indicated spot in the function code.
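For context, the telemetry calls in the function boil down to a few lines against the Application Insights SDK. A sketch only – the event/metric names here are illustrative (InternetDown matches the event discussed below), and isUp/wasUpLastTime come from the table checks described above:

// Sketch: assumes the Microsoft.ApplicationInsights package restored via project.json.
var telemetry = new TelemetryClient { InstrumentationKey = "<your key here>" };

// Record the connection state each run; the "state" property is what the
// true/false search filter below keys off.
telemetry.TrackEvent("PingState",
    new Dictionary<string, string> { { "state", isUp ? "true" : "false" } });

if (!isUp)
{
    telemetry.TrackMetric("Downtime", 1);     // one sampled minute of downtime
    if (wasUpLastTime)
        telemetry.TrackEvent("InternetDown"); // once per transition to down
}

telemetry.Flush();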


Once you're receiving telemetry you can do a couple of searches in the Application Insights interface to visualise your internet connection stability.


I went in and added a couple of metric graphs.

  • Click on Metrics Explorer. If there is not already a graph to edit, click to add one.

I added two.

The first graph shows downtime in minutes, so you can see over time how many minutes your connection was out.

The second shows the state (1 or 0) of the connection over time. Sometimes it will show a value between 1 and 0 – this is the average of the state during the measurement window.

If you want to see actual downtime events you can add a new search.

  • Click on Search from the Overview panel of Application Insights.


  • Click on filters and search for state. Select false.

This will filter down to events that have a state of false… i.e. INTERNET DOWN. You could also look for the InternetDown event, which shows the times the internet went down, as opposed to the time ranges it was down for.

Note that a count of, say, 96 results here doesn't mean the internet went down 96 times – it means it was down during 96 sampling periods. The InternetDown event shows the number of times it actually went down.

That's pretty much it! You're done.

Extra Credit – SpeedTest

I added a speed test using this same project for s&g’s.

  • There is another function here that you can install.

  • Then grab the code from here.

  • Edit Upload.cs and paste in your new SpeedTest function URL.
  • Build it and create a new Scheduled Task for every 15 mins (or whatever).
  • In Application Insights metrics explorer, add a new graph of UploadSpeed_MachineName and DownloadSpeed_MachineName (same graph, they can overlay).
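For reference, those metric names embed the machine name. A sketch of how the results could be reported so they line up with the graphs above (downloadMbps/uploadMbps are assumed to come from the speed-test code):

// Sketch: report the speed-test results to Application Insights using
// metric names of the form DownloadSpeed_<MachineName> / UploadSpeed_<MachineName>.
var telemetry = new TelemetryClient { InstrumentationKey = "<your key here>" };
telemetry.TrackMetric("DownloadSpeed_" + Environment.MachineName, downloadMbps);
telemetry.TrackMetric("UploadSpeed_" + Environment.MachineName, uploadMbps);
telemetry.Flush();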

Extra Credit – Push

I've set my system up to send me push notifications.

I did this by creating a new Maker URL callback channel on IFTTT, which passes the value through to a push notification. This then sends the push to the IFTTT app on my phone, without me needing to write an app just to receive a push.

It's outside the scope of this article to go through that, but you can see the remnants of it in the InternetUpProcessor function.


If you get stuck, ping me – I’d be happy to expand this article to include it later.

Cheers,

Jordan.

DigiGirlz, Deep Zoom and Azure

A few weeks ago I was tasked with coming up with something for the attendees at the first ever DigiGirlz event in Australia to play with. Something to get them a little excited about technology. Catherine Eibner came to me with some ideas that she thought would make for a compelling exercise and we came up with a cool Azure based Deep Zoom app!

We decided to take the Eventr (Codeplex | Blog | Blog) open source project I worked on last year for ReMIX Australia 09 and make it more accessible and dynamic (and brand it a little for the event).

What came out the other side was a dynamic Azure based DeepZoom creation application with auto updating Silverlight front end!

What does it do?

Firstly, check out the running application here!

The Silverlight application shows a DeepZoom composition of all the photos that were taken at the DigiGirlz event.

The photos were all added to a Flickr Group, which the Eventr application automatically scans. When new photos are added, they are downloaded into Azure and processed into a new DeepZoom composition.

The Silverlight client is notified that new images are available – it then highlights the new images and automatically reloads the DeepZoom composition from the server.

The system is configurable to scan a Flickr Group, perform a Flickr full text search or perform a Flickr tag search (or a combination of these).

The system has been designed to be “multi-tenanted”, which means it can host more than one group of images… i.e. it could do DigiGirlz and another collection – all the images would be separated depending on the URL entered.

How does it work?

The application is hosted in Windows Azure and utilises a lot of what Azure has to offer.

  • It uses a WebRole to host the Silverlight application and services (which read from the DataBase).
  • All data is stored in a SQL Azure database. The data stored includes information about which images have been downloaded from Flickr, their metadata (title, description etc.) and their processed state (have they been included in the DeepZoom composition yet?).
  • Linq to SQL is used to access the SQL Azure database. You can access SQL Azure in the exact same way that you would access a normal SQL database. In fact when in development mode (local) I used SQL Express, and during deployment I changed my config to point to a SQL Azure instance… too easy.
  • WCF RIA Services was used to communicate from Silverlight to the server (where Linq to SQL is used to then go up to SQL Azure). The Silverlight client is very easily then able to get the metadata (title, description) from the database, as well as send data back (like view counts etc). All in all, it was a very simple task to get data from SQL Azure into Silverlight.
  • Azure Blob Storage is used to store the generated DeepZoom files. Blob storage is great because you can access the files directly using a URL, as well as get programmatic access to CRUD (create, read, update, delete) them.
  • To fire commands based on events from the user, the system uses Azure Queue Storage. Queues allow you to add an item in one place and read it in another (only one consumer can read a given item)… so it's great to fire a command once, and pick it up once to execute it. An example command in this system is “Clear” – admins can clear out all the images/collections and data and start again (see the sketch after this list).
  • To build the Deep Zoom collection, the system utilises an Azure Worker Role. The worker role polls the database every few seconds to get out the Flickr search configuration (i.e. which group or text search to scan). It then performs this search against Flickr. New images are downloaded and added to the SQL Azure database for later processing.
  • When the Azure Worker Role downloads an image, it is stored in a Windows Azure Drive. Once stored, it is processed using DeepZoomTools (a part of Deep Zoom Composer).

    Windows Azure Drive is important as DeepZoomTools doesn't work with streams (which is how Azure Blobs work)… it needs a drive letter. Azure Drive is great as it provides drive-letter access to a special type of blob storage – meaning it will work in Azure!

    Once the composition has been built it is uploaded to Azure Blob storage for access from Silverlight (you cannot access Azure Drive files from outside the Windows Azure hosting platform – they are a special type of storage).
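To make the queue-command pattern from the list above concrete, here's a sketch using the storage client library of the era – API names from memory, so treat it as approximate rather than the project's real code:

// Fire-once command pattern over Azure Queue Storage.
var account = CloudStorageAccount.Parse("<storage connection string>");
var queue = account.CreateCloudQueueClient().GetQueueReference("commands");
queue.CreateIfNotExist();

// Admin side: fire the command once.
queue.AddMessage(new CloudQueueMessage("Clear"));

// Worker role side: pick it up once and execute it.
var msg = queue.GetMessage();
if (msg != null && msg.AsString == "Clear")
{
    // ... clear out the images, collections and data ...
    queue.DeleteMessage(msg); // remove it so it's only processed once
}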

Can I test it out?

Unfortunately at this stage I cannot release the code for this stuff, but you can have a go at using this project yourself…

Head over to the test Flickr group here. Add a photo to the group (KEEP IT CLEAN PLEASE!!).

Then head over to the test URL and keep watching… your new images will show up in no time (under 2 mins)!

Resources

Azure Team Blog: http://blogs.msdn.com/windowsazure/
Azure Storage Intro: http://msdn.microsoft.com/en-us/azure/cc994380.aspx
SQL Azure Overview: http://www.microsoft.com/windowsazure/sqlazure/
Windows Azure Overview: http://www.microsoft.com/windowsazure/windowsazure/
WCF RIA Services Overview: http://www.silverlight.net/getstarted/riaservices/
Get Started with Silverlight: http://www.silverlight.net/getstarted/
Eventr DigiGirlz Demo: http://jak.cloudapp.net/Default.aspx?guid=764fbcd0-c15f-45f3-bda5-de3ed9081ce8
Eventr Codeplex project: http://eventr.codeplex.com