Upload data to blob storage with Azure Functions
How to store data sent to an Azure Function in blob storage.
Some time ago I used a third party product which accepted data from client applications via an HTTP WCF service and saved this data as files on the local disk. A Windows service would then periodically poll for new files and load the data into a SQL Server database. This worked, as long as both the HTTP server and the loader service were on the same computer/network. As this wasn't suitable for my needs, the software vendor provided me with the source code for the WCF service and I modified this to store the data in Azure blob storage. Those blobs were then periodically downloaded by our unassuming Azure Container Echo program, from where the loader service would pick them up.
Although this sounds somewhat convoluted, it does mean all the parts were happily independent and could be located anywhere, with the availability of one part having no effect on the others. I decided that was a good pattern to use for the other ad-hoc information I want to collect, except this time the final destination is going to be RavenDB.
However, I didn't want yet another website to maintain, nor do I want to bolt anything else onto a creaky cyotek.com. So I settled on using Azure Functions, where the only thing I need to worry about is the code required to do the data processing, and, excluding initial setup (custom domains, SSL, etc.), everything else is handled without me having to lift a finger.
Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure. Use Azure Functions to run a script or piece of code in response to a variety of events. (source: Microsoft)
I started writing an overview of functions and how to create them, but then the post was in danger of turning from something focused into a sprawling mass. I'm therefore going to assume the reader is familiar with Azure Functions, their creation, and basic use.
To get started, this article assumes that you have created an HTTP trigger function using the C# language. I've replaced the default code with the following placeholder.
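A minimal sketch of such a placeholder (the method signature is the one used by the default C# HTTP trigger template):

```csharp
using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
  // Placeholder: reject every request until the real processing is added
  return req.CreateResponse(HttpStatusCode.BadRequest);
}
```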
This will reject all requests with a 400 status code.
As the only data type I'm going to work with is JSON, it makes a little bit of sense to check the content type and reject the request if it doesn't match.
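A sketch of such a check (it assumes well-behaved clients set the Content-Type header):

```csharp
// Reject anything that doesn't declare a JSON payload
if (!string.Equals(req.Content.Headers.ContentType?.MediaType, "application/json", StringComparison.OrdinalIgnoreCase))
{
  return req.CreateResponse(HttpStatusCode.BadRequest);
}
```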
The Test tool that is part of the function editor seems to automatically include a JSON content type header if one hasn't been explicitly defined.
Although I'm not demonstrating it in this example, there are other checks you may wish to perform. For example, in my versions of this function I check for the presence of a non-standard version header. If it's not set, or isn't a value I'm expecting, I perform no further work on the request. This should allow me to use the same URI for different versions of the data if I later choose to expand them.
You could also try validating that the body is actually a block of valid JSON in the format you're expecting in case a badly behaved application is sending corrupt data (or someone randomly hits the endpoints if they are open for anonymous access).
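A rough sketch of such a check using Json.NET (add #r "Newtonsoft.Json" at the top of the script if it isn't already referenced):

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// ...

string body = await req.Content.ReadAsStringAsync();

try
{
  JToken.Parse(body); // throws if the body isn't well-formed JSON
}
catch (JsonReaderException)
{
  return req.CreateResponse(HttpStatusCode.BadRequest);
}
```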
Although I suspect it was more to do with HTTPS not being a given than with filtering out bad data, the third party software I mentioned at the start of the article encrypted all the information before sending it to the WCF service, where it then had to be decrypted before being put into blob storage.
The default function only has access to standard framework assemblies. However, the C# code you write is not entirely C# - it's a scripting variant, and one of the features this variant supports is the #r directive for referencing external assemblies.
Adding the following line to the top of the script will add a reference to the library we need for working with blob storage.
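```csharp
#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
```

The two using statements aren't part of the reference itself, they just save fully qualifying the storage types in the snippets that follow.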
To connect to Azure storage we need the connection string of our storage account. We could hard code the entire string, or just bits of it (put the pitchforks down, there's a follow-up!).
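For example (the account name and key below are placeholders):

```csharp
// Hard coded account details - see below for a better approach
string accountName = "<storage-account-name>";
string accessKey = "<storage-access-key>";
string connectionString = $"DefaultEndpointsProtocol=https;AccountName={accountName};AccountKey={accessKey}";
```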
Simple enough, except for hard coding the account details as this means you need to edit the function if they change, or worse edit it in multiple places if you have similar functions.
The Function App that you have created actually seems to be a disguised ASP.NET website and provides access to many of the things you would define in web.config, including application settings. You can access these by clicking Application settings from the overview page of the function app.
In the Application settings group, click Add new setting then fill in the row. I'll use this feature to define the access key and storage account name. As the Azure platform injects its own settings in here as well, I opted to prefix mine to avoid any potential clashes.
Remember to hit the Save button at the top of the page!
Now we can go back and change our function to use the new settings instead.
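Assuming settings named myapp_storage_account_name and myapp_storage_access_key (illustrative names; use whatever prefix you chose), something like this will pick them up, as application settings surface as environment variables at runtime:

```csharp
string accountName = Environment.GetEnvironmentVariable("myapp_storage_account_name");
string accessKey = Environment.GetEnvironmentVariable("myapp_storage_access_key");
string connectionString = $"DefaultEndpointsProtocol=https;AccountName={accountName};AccountKey={accessKey}";
```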
With account information in hand, it's time to work with blob storage. First we need to get the container that we want to put our blob into - you can think of this as the equivalent of a directory on your local file system, with the blob as a file.
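A sketch (the container name "mydata" is just an example):

```csharp
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mydata");
```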
Unfortunately it doesn't seem to be possible to store application settings / secrets on a per function basis. However, as it's unlikely I'd have multiple functions writing to the same container, I'm happy enough to hard code the container name.
Next, try and create the container if it doesn't exist. Alternatively, you could create the container up front and not bother with this code at all.
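```csharp
// Creates the container only if it isn't already present
await container.CreateIfNotExistsAsync();
```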
With container housekeeping performed, we can now create our blob (or file) in the container.
I auto-generate a filename using a GUID, as I don't want the request to be able to specify its own filename, and, just like with a normal file system, blob names need to be unique.
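Something along these lines (the .json extension is purely cosmetic - blob names are just strings):

```csharp
string blobName = Guid.NewGuid().ToString("N") + ".json";
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
```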
As well as basic properties such as ContentType, ContentEncoding and ContentLanguage that you can set yourself, you can also define and create your own metadata key-value pairs. In this case I'm only setting the content type.
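```csharp
// Record that this blob contains JSON data
blob.Properties.ContentType = "application/json";
```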
With our blob reference ready, we can now upload our data.
It would have been slightly better to just use the request's input stream directly, instead of converting the body content I'd previously read back into a stream, but then I would lose the ability to perform any further validation.
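A sketch of the upload, wrapping the body text read earlier back into a stream:

```csharp
using System.Text; // for Encoding

// ...

using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(body)))
{
  await blob.UploadFromStreamAsync(stream);
}
```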
With the above code I now have a fully functional function that I can post data to and have it placed into blob storage. Happily, the function editor even includes a small testing tool so you can test it directly from your browser window.
By default, a function will respond to all standard HTTP verbs. If your function is used to upload data only, then you'll probably want to disable all verbs bar POST. You can do this by selecting the Integrate option listed below the function name in the sidebar.
You may wish to change the authorisation level from the default Function (which requires an access key) to Anonymous.
Finally, if you bind your own custom domain to the function app then you may also wish to change the default route to a custom one - that way you may find it easier to migrate from functions to a self-hosted solution in the future, or to manage multiple functions in the same Function App.
This is the complete final version of the function, reassembled as a sketch from the snippets above - remember the setting names, container name and response code are illustrative choices to adapt to your own setup.
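```csharp
#r "Microsoft.WindowsAzure.Storage"

using System.Net;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
  // Only accept JSON payloads
  if (!string.Equals(req.Content.Headers.ContentType?.MediaType, "application/json", StringComparison.OrdinalIgnoreCase))
  {
    return req.CreateResponse(HttpStatusCode.BadRequest);
  }

  string body = await req.Content.ReadAsStringAsync();

  if (string.IsNullOrWhiteSpace(body))
  {
    return req.CreateResponse(HttpStatusCode.BadRequest);
  }

  // Build the connection string from application settings
  // (the setting names are placeholders; use your own)
  string accountName = Environment.GetEnvironmentVariable("myapp_storage_account_name");
  string accessKey = Environment.GetEnvironmentVariable("myapp_storage_access_key");
  string connectionString = $"DefaultEndpointsProtocol=https;AccountName={accountName};AccountKey={accessKey}";

  // Get a reference to the container, creating it if required
  CloudStorageAccount storageAccount = CloudStorageAccount.Parse(connectionString);
  CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
  CloudBlobContainer container = blobClient.GetContainerReference("mydata");

  await container.CreateIfNotExistsAsync();

  // Create a uniquely named blob and describe its contents
  CloudBlockBlob blob = container.GetBlockBlobReference(Guid.NewGuid().ToString("N") + ".json");
  blob.Properties.ContentType = "application/json";

  // Convert the body back into a stream and upload it
  using (Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(body)))
  {
    await blob.UploadFromStreamAsync(stream);
  }

  // 201 Created seems a reasonable response; adjust to taste
  return req.CreateResponse(HttpStatusCode.Created);
}
```

I hope you find this useful as a starting point for your own adventures in Azure!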
Like what you're reading? Perhaps you'd like to buy us a coffee?