In this article, you’ll find out how serverless computing can be implemented with Microsoft’s cloud technologies, and how it can simplify application development and deployment. To show how a serverless architecture works in practice, let’s look at a real-life scenario – implementing a data sharing service.
Implementing a data sharing service – the old approach
As recently as ten years ago, if an IT specialist was asked to create a solution that allowed file sharing with customers, it would take a few standard steps. First, you had to buy and prepare a server with sufficient disk space. The next step was to implement secure data-sharing software, lock down the whole network and, after confirming there were no security holes, carefully expose the new service to the outside world. But that’s just part of the process. To make everything work, a suitable client application had to be made available, the staff needed to be trained to use the new system, and the customers had to be provided with a user manual. All this turned into a huge project, and you had to wait a long time to see the results. What about now? Has the introduction of cloud computing, with execution models such as serverless, really changed how you do things?
The modern approach
The idea behind serverless computing is simple – instead of managing server infrastructure and other application-related resources on your own, you delegate these tasks to a cloud provider. As a result, you no longer have to worry about infrastructure maintenance; the system is fully scalable (or elastic, as some people call it) and secure (most cloud-based services come with top-notch security solutions), and the whole implementation is incomparably quicker and easier. Individual application functions can also be moved to the cloud (a concept known as Function as a Service, or FaaS), further simplifying the development process.
So, what’s the best way to implement a serverless architecture in our real-life scenario, data sharing? When introducing the serverless model, it’s a good idea to take advantage of the resources and knowledge your company already has. This reduces the time and cost of purchasing new applications and devices, and makes adoption much easier because the need for additional staff training is eliminated.
For the purpose of this example, I’ll assume that the company already uses Microsoft Azure, Office 365 and the associated services such as SharePoint. Here’s what the implementation of data sharing can look like if you combine these resources with the serverless cloud technologies that come with Microsoft Azure:
- In SharePoint Online, you create a new folder for the files you want to share.
- In the Azure cloud, you prepare storage space using a service like Azure Blob Storage, along with a suitable retention policy (e.g. removing files after 7 days).
- Using Azure Logic Apps, you create an automatic process that:
  - Activates when a new file appears in the SharePoint Online folder specified earlier.
  - Downloads the new file.
  - Places the file in the predefined location in Azure Blob Storage.
  - Removes the file from SharePoint Online.
  - Grants access to the file via a Shared Access Signature (SAS). Access can also be limited to specific IP addresses, time ranges, permissions, etc.
  - Sends an email to the owner (the user who originally uploaded the file to SharePoint Online) with a link to the file, using an email template addressed to the end client.
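The 7-day retention mentioned above can be handled by Azure Blob Storage lifecycle management, which accepts a JSON policy applied to the storage account (for example via the Azure portal or `az storage account management-policy create`). Below is a rough sketch of such a policy, expressed as a Python dict mirroring the JSON; the rule name and the `shared-files/` prefix are placeholders for this example:

```python
import json

# Sketch of an Azure Blob Storage lifecycle management policy that deletes
# blobs 7 days after their last modification. The rule name and prefix are
# placeholders; the real policy is applied at the storage-account level.
retention_policy = {
    "rules": [
        {
            "enabled": True,
            "name": "delete-shared-files-after-7-days",
            "type": "Lifecycle",
            "definition": {
                "actions": {
                    "baseBlob": {
                        "delete": {"daysAfterModificationGreaterThan": 7}
                    }
                },
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["shared-files/"],  # hypothetical container/prefix
                },
            },
        }
    ]
}

print(json.dumps(retention_policy, indent=2))
```

Changing the retention period later is then a one-line edit to the policy rather than a change to the Logic App itself.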
In the Azure Logic Apps designer, this processing flow appears as a SharePoint trigger followed by the actions listed above.
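As a purely illustrative sketch of the same pipeline logic, the steps can be written out in plain Python. The dictionaries below are in-memory stand-ins for SharePoint Online and Blob Storage, and all names (files, URLs, addresses) are made up for the example:

```python
from datetime import datetime, timedelta, timezone

# In-memory stand-ins for the real services -- purely illustrative.
sharepoint_folder = {"report.xlsx": b"...file bytes..."}
blob_storage = {}
outbox = []  # collected (recipient, link) emails

def make_sas_link(blob_name, valid_days=7):
    """Stand-in for a SAS URL; in Azure, a signed token would be appended."""
    expiry = (datetime.now(timezone.utc) + timedelta(days=valid_days)).date()
    return f"https://example.blob.core.windows.net/shared-files/{blob_name}?se={expiry}"

def process_new_files(owner_email):
    """Mirrors the Logic Apps flow: detect, copy, delete, share, notify."""
    for name in list(sharepoint_folder):
        content = sharepoint_folder[name]   # download the new file
        blob_storage[name] = content        # place it in Blob Storage
        del sharepoint_folder[name]         # remove it from SharePoint
        link = make_sas_link(name)          # grant access via a SAS-style link
        outbox.append((owner_email, link))  # email the owner the link

process_new_files("owner@example.com")
print(outbox)
```

In the real Logic App, each of these lines corresponds to one connector action, and the loop is replaced by the SharePoint "when a file is created" trigger.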
In a relatively short time, you get a simple solution that can be further used by different teams. The solution is immediately available and doesn’t generate costs if it isn’t used. With this approach, files are shared individually; if you want to share multiple files, archive them into a single .zip file before uploading. For safety reasons, files are only visible to those who uploaded them (no other team members can access them) and those who have permission to download them.
Configuration in detail
Let’s now take a closer look at the possible configuration:
- Specify the polling interval, i.e. how often you want to check for changes in the SharePoint library.
- Configure the content download.
- When configuring the file upload to Azure, it’s a good idea to generate a unique name for each file, or for the folder that serves as the storage location.
- Configure how files are removed from SharePoint Online.
- Create the Shared Access Signature (SAS).
- Finally, configure the email sent to the person who uploaded the file to SharePoint Online.
The bottom line
As you can see, it was fairly quick to implement a data sharing service using the features available in the Azure cloud. The solution automatically scales and adapts to increasing loads, so you don’t have to worry about running out of space. What’s more, it doesn’t require any upfront budget (assuming you already use the Microsoft solutions described) and you pay only for actual usage. In other words, if the service isn’t used, you don’t pay (consumption-based billing).
Does this mean that all projects can be handled this way? Of course not. The serverless model has its own flaws and limitations: cold starts after periods of inactivity, costs that grow with usage, and difficulties with debugging. Other limitations may depend on the cloud provider of your choice.
Still, even at this early stage, the serverless approach to computing is certainly worth considering and can bring real benefits in specific scenarios. And with the rapid development of cloud technologies, it seems that the demise of the traditional server-based architecture might be closer than ever.
If you want to dive deeper, here are some resources related to the Microsoft technologies mentioned in the article: