Building and Integrating LUIS-enabled Chatbots with Slack, using Azure Bot Service, Bot Builder SDK, and Cosmos DB
Posted by Gary A. Stafford in Azure, Cloud, Continuous Delivery, JavaScript, Software Development on August 27, 2018
Introduction
In this post, we will explore the development of a machine learning-based, LUIS-enabled chatbot using the Azure Bot Service and the Bot Builder SDK. We will enhance the chatbot’s functionality with Azure Cloud services, including Cosmos DB and Blob Storage. Once built, we will integrate our chatbot across multiple channels, including Web Chat and Slack.
If you want to compare Azure’s current chatbot technologies with those of AWS and Google, in addition to this post, please read my previous two posts in this series, Building Serverless Actions for Google Assistant with Google Cloud Functions, Cloud Datastore, and Cloud Storage and Building Asynchronous, Serverless Alexa Skills with AWS Lambda, DynamoDB, S3, and Node.js. All three of the article’s demonstrations are written in Node.js, all three leverage their cloud platform’s machine learning-based Natural Language Understanding services, and all three take advantage of NoSQL database and storage services available on their respective cloud platforms.
Technology Stack
Here is a brief overview of the key Microsoft technologies we will incorporate into our bot’s architecture.
LUIS
The machine learning-based Language Understanding Intelligent Service (LUIS) is part of Azure’s Cognitive Services, used to build Natural Language Understanding (NLU) into apps, bots, and IoT devices. According to Microsoft, LUIS allows you to quickly create enterprise-ready, custom machine learning models that continuously improve.
Designed to identify valuable information in conversations, Language Understanding interprets user goals (intents) and distills valuable information from sentences (entities), for a high quality, nuanced language model. Language Understanding integrates seamlessly with the Speech service for instant Speech-to-Intent processing, and with the Azure Bot Service, making it easy to create a sophisticated bot. A LUIS bot contains a domain-specific natural language model, which you design.
Azure Bot Service
The Azure Bot Service provides an integrated environment that is purpose-built for bot development, enabling you to build, connect, test, deploy, and manage intelligent bots, all from one place. Bot Service leverages the Bot Builder SDK.
Bot Builder SDK
The Bot Builder SDK allows you to build, connect, deploy, and manage bots, which interact with users across multiple channels, from your app or website to Facebook Messenger, Kik, Skype, Slack, Microsoft Teams, Telegram, SMS (Twilio), and Cortana. Currently, the SDK is available for C# and Node.js. For this post, we will use the current Bot Builder Node.js SDK v3 release to write our chatbot.
Cosmos DB
According to Microsoft, Cosmos DB is a globally distributed, multi-model database-as-a-service, designed for low-latency, scalable applications anywhere in the world. Cosmos DB supports multiple data models, including document, columnar, and graph. Cosmos DB also exposes several database APIs, including MongoDB, Cassandra, and Gremlin. We will use the MongoDB API to store the documents used by our chatbot in Cosmos DB.
Azure Blob Storage
According to Microsoft, Azure’s storage-as-a-service, Blob Storage, provides massively scalable object storage for any type of unstructured data: images, videos, audio, documents, and more. We will be using Blob Storage to store the publicly accessible images used by our chatbot.
Azure Application Insights
According to Microsoft, Azure’s Application Insights provides comprehensive, actionable insights through application performance management (APM) and instant analytics. It lets you quickly analyze application telemetry to detect anomalies, application failures, and performance changes. Application Insights will enable us to monitor our chatbot’s key metrics.
High-Level Architecture
A chatbot user interacts with the chatbot through a number of available channels, such as the Web, Slack, and Skype. The channels communicate with the Web App Bot, part of Azure Bot Service, and running on Azure’s App Service, the fully-managed platform for cloud apps. LUIS integration allows the chatbot to learn and understand the user’s intent based on our own domain-specific natural language model.
Through Azure’s App Service platform, our chatbot is able to retrieve data from Cosmos DB and images from Blob Storage. Our chatbot’s telemetry is available through Azure’s Application Insights.
Azure Resources
Another way to understand our chatbot architecture is by examining the Azure resources necessary to build the chatbot. Below is an example of all the Azure resources that will be created as a result of building a LUIS-enabled bot, which has been integrated with Cosmos DB, Blob Storage, and Application Insights.
Chatbot Demonstration
As a demonstration, we will build an informational chatbot, the Azure Tech Facts Chatbot. The bot will respond to the user with interesting facts about Azure, Microsoft’s Cloud computing platform. Note this is not intended to be an official Microsoft bot and is only used for demonstration purposes.
Source Code
All open-sourced code for this post can be found on GitHub. The code samples in this post are displayed as GitHub Gists, which may not display correctly on some mobile and social media browsers. Links to the gists are also provided.
Development Process
This post will focus on the development and integration of a chatbot with LUIS, Azure platform services, and channels such as Web Chat and Slack. The post is not intended to be a general how-to article on developing Azure chatbots or the use of the Azure Cloud Platform.
Building the chatbot will involve the following steps.
- Design the chatbot’s conversation flow;
- Provision a Cosmos DB instance and import the Azure Facts documents;
- Provision Azure Storage and upload the images as blobs into Azure Storage;
- Create the new LUIS-enabled Web App Bot with Azure’s Bot Service;
- Define the chatbot’s Intents, Entities, and Utterances with LUIS;
- Train and publish the LUIS app;
- Deploy and test the chatbot;
- Integrate the chatbot with Web Chat and Slack Channels;
The post assumes you have an existing Azure account and a working knowledge of Azure. Let’s explore each step in more detail.
Cost of Azure Bots!
Be aware, you will be charged for Azure Cloud services when building this bot. Unlike an Alexa Custom Skill or an Action for Google Assistant, an Azure chatbot is not a serverless application. A common feature of serverless platforms is that you only pay for the compute time you consume; there is typically no charge when your code is not running. This means that, unlike on AWS and Google Cloud Platform, you will pay for the Azure resources you provision, whether or not you use them.
Developing this demo’s chatbot on the Azure platform, with little or no activity most of the time, cost me about $5/day. On AWS or GCP, a similar project would cost pennies per day or less (like, $0). Currently, in my opinion, Azure does not have a very competitive pricing model for building bots, or for serverless computing in general beyond Azure Functions, when compared to Google and AWS.
Conversational Flow
The first step in developing a chatbot is designing the conversation flow between the user and the bot. Defining the conversation flow is essential to developing the bot’s programmatic logic and training the domain-specific natural language model for the machine learning-based services the bot is integrated with, in this case, LUIS. What are all the ways the user might explicitly invoke our chatbot? What are all the ways the user might implicitly invoke our chatbot and provide intent to the bot? Taking the time to map out the possible conversational interactions is essential.
With more advanced bots, like Alexa, Actions for Google Assistant, and Azure Bots, we also have to consider the visual design of the conversational interfaces. In addition to simple voice and text responses, these bots are capable of responding with a rich array of UX elements, including what are generically known as ‘Cards’. Cards come in varying complexity and may contain elements such as a title, sub-titles, text, images, video, audio, buttons, and links. Azure Bot Service offers several different cards for specific use cases.
Channel Design
Another layer of complexity with bots is designing for the channels into which they integrate. There is a substantial visual difference between a conversational exchange displayed on Facebook Messenger, as compared to Slack, Skype, Microsoft Teams, GroupMe, or a web browser. Producing an effective conversational flow presentation across multiple channels is a design challenge.
We will be focusing on two channels for delivery of our bot: the Web Chat and Slack channels. We will want to design the conversational flow and visual elements to be effective and engaging across both channels. An added complexity is that both channels have mobile and web-based interfaces. We will ensure our design works within the compact real estate of an average-sized mobile device screen, as well as on an average-sized laptop screen.
Web Chat Channel Design
Below are two views of our chatbot, delivered through the Web Chat channel. To the left, is an example of the bot responding with ThumbnailCard UX elements. The ThumbnailCards contain a title, sub-title, text, small image, and a button with a link. Below and to the right is an example of the bot responding with a HeroCard. The HeroCard contains the same elements as the ThumbnailCard but takes up about twice the space with a significantly larger image.
Slack Channel Design
Below are three views of our chatbot, delivered through the Slack channel, in this case, the mobile iOS version of the Slack app. Even here on a larger iPhone 8s, there is not a lot of real estate. On the left is the same HeroCard as we saw above in the Web Chat channel. In the middle are the same ThumbnailCards. On the right is a simple text-only response. Although the text-only bot responses are not as rich as the cards, you are able to display more of the conversational flow on a single mobile screen.
Lastly, below we see our chatbot delivered through the Slack for Mac desktop app. Within the single view, we see an example of a HeroCard (top), ThumbnailCard (center), and a text-only response (bottom). Notice how the larger UI of the desktop Slack app changes the look and feel of the chatbot conversational flow.
In my opinion, the ThumbnailCards work well in the Web Chat channel and Slack channel’s desktop app, while the text-only responses seem to work best with the smaller footprint of the Slack channel’s mobile client. To work across a number of channels, our final bot will contain a mix of ThumbnailCards and text-only responses.
Cosmos DB
As an alternative to Microsoft’s Cognitive Service, QnA Maker, we will use Cosmos DB to house the responses to users’ requests for facts about Azure. When a user asks our informational chatbot for a fact about Azure, the bot will query Cosmos DB, passing a single unique string value, the fact the user is requesting. In response, Cosmos DB will return a JSON document containing field-and-value pairs with the fact’s title, image name, and textual information.
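For reference, a fact document returned from Cosmos DB has the following shape (this example is taken from the import script shown later in the post):

```javascript
// Example fact document, as stored in and returned from Cosmos DB.
// The 'fact' field is the unique string value the chatbot queries on.
const exampleDocument = {
    "fact": "kubernetes",
    "title": "Azure Kubernetes Service (AKS)",
    "image": "image-18.png",
    "response": "According to Microsoft, Azure Kubernetes Service (AKS) is a fully managed Kubernetes container orchestration service."
};

console.log(exampleDocument.title); // Azure Kubernetes Service (AKS)
```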
There are a few ways to create the new Cosmos DB database and collection that will hold our documents; we will use the Azure CLI. According to Microsoft, the Azure CLI 2.0 is Microsoft’s cross-platform command-line interface (CLI) for managing Azure resources. You can use it in your browser with Azure Cloud Shell, or install it on macOS, Linux, or Windows and run it from the command line (gist).
#!/usr/bin/env sh

# author: Gary A. Stafford
# site: https://programmaticponderings.com
# license: MIT License

set -ex

LOCATION="<value_goes_here>"
RESOURCE_GROUP="<value_goes_here>"
ACCOUNT_NAME="<value_goes_here>"
COSMOS_ACCOUNT_KEY="<value_goes_here>"
DB_NAME="<value_goes_here>"
COLLECTION_NAME="<value_goes_here>"

az login

az cosmosdb create \
  --name ${ACCOUNT_NAME} \
  --resource-group ${RESOURCE_GROUP} \
  --location ${LOCATION} \
  --kind MongoDB

az cosmosdb database create \
  --name ${ACCOUNT_NAME} \
  --db-name ${DB_NAME} \
  --resource-group-name ${RESOURCE_GROUP}

az cosmosdb collection create \
  --collection-name ${COLLECTION_NAME} \
  --name ${ACCOUNT_NAME} \
  --db-name ${DB_NAME} \
  --resource-group-name ${RESOURCE_GROUP} \
  --throughput 400 \
  --key ${COSMOS_ACCOUNT_KEY} \
  --verbose --debug
There are a few ways for us to get our Azure facts documents into Cosmos DB. Since we are writing our chatbot in Node.js, I also chose to write a Cosmos DB facts import script in Node.js, cosmos-db-data.js. Since we are using Cosmos DB as a MongoDB datastore, all the script requires is the official MongoDB driver for Node.js. Using the MongoDB driver’s db.collection.insertMany() method, we can upload an entire array of Azure fact document objects with one call. For security, we have set the Cosmos DB connection string as an environment variable, which the script expects to find at runtime (gist).
// author: Gary A. Stafford
// site: https://programmaticponderings.com
// license: MIT License
// Reference: https://docs.microsoft.com/en-us/azure/cosmos-db/mongodb-samples

// Insert Azure Facts into Cosmos DB Collection

'use strict';

/* CONSTANTS */
const mongoClient = require('mongodb').MongoClient;
const assert = require('assert');

const COSMOS_DB_CONN_STR = process.env.COSMOS_DB_CONN_STR;
const DB_COLLECTION = "azuretechfacts";

const azureFacts = [
    {
        "fact": "certifications",
        "title": "Azure Certifications",
        "image": "image-06.png",
        "response": "As of June 2018, Microsoft offered ten Azure certification exams, allowing IT professionals the ability to differentiate themselves and validate their knowledge and skills."
    },
    {
        "fact": "cognitive",
        "title": "Cognitive Services",
        "image": "image-09.png",
        "response": "Azure's intelligent algorithms allow apps to see, hear, speak, understand and interpret user needs through natural methods of communication."
    },
    {
        "fact": "competition",
        "title": "Azure's Competition",
        "image": "image-05.png",
        "response": "According to the G2 Crowd website, Azure's Cloud competitors include Amazon Web Services (AWS), Digital Ocean, Google Compute Engine (GCE), and Rackspace."
    },
    {
        "fact": "compliance",
        "title": "Compliance",
        "image": "image-06.png",
        "response": "Microsoft provides the most comprehensive set of compliance offerings (including certifications and attestations) of any cloud service provider."
    },
    {
        "fact": "cosmos",
        "title": "Azure Cosmos DB",
        "image": "image-17.png",
        "response": "According to Microsoft, Cosmos DB is a globally distributed, multi-model database service, designed for low latency and scalable applications anywhere in the world, with native support for NoSQL."
    },
    {
        "fact": "kubernetes",
        "title": "Azure Kubernetes Service (AKS)",
        "image": "image-18.png",
        "response": "According to Microsoft, Azure Kubernetes Service (AKS) is a fully managed Kubernetes container orchestration service, which simplifies Kubernetes management, deployment, and operations."
    }
];

const insertDocuments = function (db, callback) {
    db.collection(DB_COLLECTION).insertMany(azureFacts, function (err, result) {
        assert.equal(err, null);
        console.log(`Inserted documents into the ${DB_COLLECTION} collection.`);
        callback();
    });
};

const findDocuments = function (db, callback) {
    const cursor = db.collection(DB_COLLECTION).find();
    cursor.each(function (err, doc) {
        assert.equal(err, null);
        if (doc != null) {
            console.dir(doc);
        } else {
            callback();
        }
    });
};

const deleteDocuments = function (db, callback) {
    db.collection(DB_COLLECTION).deleteMany(
        {},
        function (err, results) {
            console.log(results);
            callback();
        }
    );
};

mongoClient.connect(COSMOS_DB_CONN_STR, function (err, client) {
    assert.equal(null, err);
    const db = client.db(DB_COLLECTION);

    deleteDocuments(db, function () {
        insertDocuments(db, function () {
            findDocuments(db, function () {
                client.close();
            });
        });
    });
});
Azure Blob Storage
When a user asks our informational chatbot for a fact about Azure, the bot will query Cosmos DB. One of the values returned is an image name. The image itself is stored on Azure Blob Storage.
The image, actually an Azure icon available from Microsoft, is then displayed in the ThumbnailCard or HeroCard shown earlier.
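Since the container allows public blob access, each image is addressable by a simple URL. The bot code later in this post builds that URL by joining the ICON_STORAGE_URL application setting with the image name from the Cosmos DB document. A minimal sketch (the storage account and container names here are placeholders, not the actual values used in this project):

```javascript
// Sketch: composing the public blob URL for a card image.
// ICON_STORAGE_URL would normally come from process.env; this value
// is a hypothetical example of an Azure Blob Storage container URL.
const ICON_STORAGE_URL = 'https://mystorageaccount.blob.core.windows.net/images';

function imageUrl(imageName) {
    return `${ICON_STORAGE_URL}/${imageName}`;
}

console.log(imageUrl('image-18.png'));
// https://mystorageaccount.blob.core.windows.net/images/image-18.png
```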
According to Microsoft, an Azure storage account provides a unique namespace in the cloud to store and access your data objects in Azure Storage. A storage account contains any blobs, files, queues, tables, and disks that you create under that account. A container organizes a set of blobs, similar to a folder in a file system. All blobs reside within a container. Similar to Cosmos DB, there are a few ways to create a new Azure Storage account and a blob storage container, which will hold our images. Once again, we will use the Azure CLI (gist).
#!/usr/bin/env sh

# author: Gary A. Stafford
# site: https://programmaticponderings.com
# license: MIT License

set -ex

## CONSTANTS ##
ACCOUNT_KEY="<value_goes_here>"
LOCATION="<value_goes_here>"
RESOURCE_GROUP="<value_goes_here>"
STORAGE_ACCOUNT="<value_goes_here>"
STORAGE_CONTAINER="<value_goes_here>"

az login

az storage account create \
  --name ${STORAGE_ACCOUNT} \
  --default-action Allow \
  --kind Storage \
  --location ${LOCATION} \
  --resource-group ${RESOURCE_GROUP} \
  --sku Standard_LRS

az storage container create \
  --name ${STORAGE_CONTAINER} \
  --fail-on-exist \
  --public-access blob \
  --account-name ${STORAGE_ACCOUNT} \
  --account-key ${ACCOUNT_KEY}
Once the storage account and container are created using the Azure CLI, we can upload the images, included with the GitHub project, using the Azure CLI’s storage blob upload-batch command (gist).
#!/usr/bin/env sh

# author: Gary A. Stafford
# site: https://programmaticponderings.com
# license: MIT License

set -ex

## CONSTANTS ##
ACCOUNT_KEY="<value_goes_here>"
STORAGE_ACCOUNT="<value_goes_here>"
STORAGE_CONTAINER="<value_goes_here>"

az storage blob upload-batch \
  --account-name ${STORAGE_ACCOUNT} \
  --account-key ${ACCOUNT_KEY} \
  --destination ${STORAGE_CONTAINER} \
  --source ../azure-tech-facts-google/pics/ \
  --pattern "image-*.png"
# --dryrun
Web App Chatbot
To create the LUIS-enabled chatbot, we can use the Azure Bot Service, available on the Azure Portal. A Web App Bot is one of a variety of bots available from Azure’s Bot Service, which is part of Azure’s larger and quickly growing suite of AI and Machine Learning Cognitive Services. A Web App Bot is an Azure Bot Service Bot deployed to an Azure App Service Web App. An App Service Web App is a fully managed platform that lets you build, deploy, and scale enterprise-grade web apps.
To create a LUIS-enabled chatbot, choose the Language Understanding Bot template, from the Node.js SDK Language options. This will provide a complete project and boilerplate bot template, written in Node.js, for you to start developing with. I chose to use the SDK v3, as v4 is still in preview and subject to change.
Azure Resource Manager
A great DevOps feature of the Azure platform is its ability to generate Azure Resource Manager (ARM) templates and the associated automation scripts in PowerShell, .NET, Ruby, and the CLI. This allows engineers to programmatically build and provision services on the Azure platform, without having to write the scripts themselves.
To build our chatbot, you can continue from the Azure Portal as I did, or download the ARM template and scripts, and run them locally. Once you have created the chatbot, you will have the option to download the source code as a ZIP file from the Bot Management Build console. I prefer to use the JetBrains WebStorm IDE to develop my Node.js-based bots, and GitHub to store my source code.
Application Settings
As part of developing the chatbot, you will need to add two additional application settings to the Azure App Service the chatbot is running within. The Cosmos DB connection string (COSMOS_DB_CONN_STR) and the URL of your blob storage container (ICON_STORAGE_URL) will both be referenced from within our bot, as environment variables. You can manually add the key/value pairs from the Azure Portal (shown below), or programmatically.
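Because the bot fails in non-obvious ways if either setting is missing, a small guard at startup can help. This is a sketch of one possible approach, not part of Azure’s boilerplate template; the requireEnv helper is hypothetical:

```javascript
// Hypothetical startup guard: fail fast if a required application
// setting (surfaced to Node.js as an environment variable) is missing.
function requireEnv(name) {
    const value = process.env[name];
    if (!value) {
        throw new Error(`Missing required application setting: ${name}`);
    }
    return value;
}

// Usage (commented out so this sketch runs standalone):
// const COSMOS_DB_CONN_STR = requireEnv('COSMOS_DB_CONN_STR');
// const ICON_STORAGE_URL = requireEnv('ICON_STORAGE_URL');
```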
The chatbot’s code, in the app.js file, is divided into three sections: Constants and Global Variables, Intent Handlers, and Helper Functions. Let’s look at each section and its functionality.
Constants
Below are the constants used by the chatbot. I have preserved Azure’s boilerplate template comments in the app.js file. The comments are helpful in understanding the detailed inner workings of the chatbot code (gist).
// author: Gary A. Stafford
// site: https://programmaticponderings.com
// license: MIT License
// description: Azure Tech Facts LUIS-enabled Chatbot

'use strict';

/*-----------------------------------------------------------------------------
A simple Language Understanding (LUIS) bot for the Microsoft Bot Framework.
-----------------------------------------------------------------------------*/

/* CONSTANTS AND GLOBAL VARIABLES */
const restify = require('restify');
const builder = require('botbuilder');
const botbuilder_azure = require("botbuilder-azure");
const mongoClient = require('mongodb').MongoClient;

const COSMOS_DB_CONN_STR = process.env.COSMOS_DB_CONN_STR;
const DB_COLLECTION = "azuretechfacts";
const ICON_STORAGE_URL = process.env.ICON_STORAGE_URL;

// Setup Restify Server
const server = restify.createServer();
server.listen(process.env.port || process.env.PORT || 3978, function () {
    console.log('%s listening to %s', server.name, server.url);
});

// Create chat connector for communicating with the Bot Framework Service
const connector = new builder.ChatConnector({
    appId: process.env.MicrosoftAppId,
    appPassword: process.env.MicrosoftAppPassword,
    openIdMetadata: process.env.BotOpenIdMetadata
});

// Listen for messages from users
server.post('/api/messages', connector.listen());

/*----------------------------------------------------------------------------------------
* Bot Storage: This is a great spot to register the private state storage for your bot.
* We provide adapters for Azure Table, CosmosDb, SQL Azure, or you can implement your own!
* For samples and documentation, see: https://github.com/Microsoft/BotBuilder-Azure
* ---------------------------------------------------------------------------------------- */
const tableName = 'botdata';
const azureTableClient = new botbuilder_azure.AzureTableClient(tableName, process.env['AzureWebJobsStorage']);
const tableStorage = new botbuilder_azure.AzureBotStorage({gzipData: false}, azureTableClient);

// Create your bot with a function to receive messages from the user
// This default message handler is invoked if the user's utterance doesn't
// match any intents handled by other dialogs.
const bot = new builder.UniversalBot(connector, function (session, args) {
    const DEFAULT_RESPONSE = `Sorry, I didn't understand: _'${session.message.text}'_.`;
    session.send(DEFAULT_RESPONSE).endDialog();
});

bot.set('storage', tableStorage);

// Make sure you add code to validate these fields
const luisAppId = process.env.LuisAppId;
const luisAPIKey = process.env.LuisAPIKey;
const luisAPIHostName = process.env.LuisAPIHostName || 'westus.api.cognitive.microsoft.com';
const LuisModelUrl = 'https://' + luisAPIHostName + '/luis/v2.0/apps/' + luisAppId + '?subscription-key=' + luisAPIKey;

// Create a recognizer that gets intents from LUIS, and add it to the bot
const recognizer = new builder.LuisRecognizer(LuisModelUrl);
bot.recognizer(recognizer);
Notice that using the LUIS-enabled Language Understanding template, Azure has provisioned a LUIS.ai app and integrated it with our chatbot. More about LUIS, next.
Intent Handlers
The next part of our chatbot’s code handles intents. Our chatbot’s intents include Greeting, Help, Cancel, and AzureFacts. The Greeting intent handler defines how the bot greets a new user who makes an explicit invocation of the chatbot (without intent). The Help intent handler defines how the chatbot handles a request for help. The Cancel intent handler defines how the bot handles a user’s desire to quit, or an unknown error with our bot. The AzureFacts intent handler handles implicit invocations of the chatbot (with intent), by returning the requested Azure fact. We will use LUIS to train the AzureFacts intent in the next part of this post.
Each intent handler can return a different type of response to the user. For this demo, we will have the Greeting, Help, and AzureFacts handlers return a ThumbnailCard, while the Cancel handler simply returns a text message (gist).
/* INTENT HANDLERS */

// Add a dialog for each intent that the LUIS app recognizes.
// See https://docs.microsoft.com/en-us/bot-framework/nodejs/bot-builder-nodejs-recognize-intent-luis
bot.dialog('GreetingDialog',
    (session) => {
        const WELCOME_TEXT_LONG = `You can say things like: \n` +
            `_'Tell me about Azure certifications.'_ \n` +
            `_'When was Azure released?'_ \n` +
            `_'Give me a random fact.'_`;

        let botResponse = {
            title: 'What would you like to know about Microsoft Azure?',
            response: WELCOME_TEXT_LONG,
            image: 'image-16.png'
        };

        let card = createThumbnailCard(session, botResponse);
        let msg = new builder.Message(session).addAttachment(card);
        // let msg = botResponse.response;
        session.send(msg).endDialog();
    }
).triggerAction({
    matches: 'Greeting'
});

bot.dialog('HelpDialog',
    (session) => {
        const FACTS_LIST = "Certifications, Cognitive Services, Competition, Compliance, First Offering, Functions, " +
            "Geographies, Global Infrastructure, Platforms, Categories, Products, Regions, and Release Date";

        const botResponse = {
            title: 'Need a little help?',
            response: `Current facts include: ${FACTS_LIST}.`,
            image: 'image-15.png'
        };

        let card = createThumbnailCard(session, botResponse);
        let msg = new builder.Message(session).addAttachment(card);
        session.send(msg).endDialog();
    }
).triggerAction({
    matches: 'Help'
});

bot.dialog('CancelDialog',
    (session) => {
        const CANCEL_RESPONSE = 'Goodbye.';
        session.send(CANCEL_RESPONSE).endDialog();
    }
).triggerAction({
    matches: 'Cancel'
});

bot.dialog('AzureFactsDialog',
    (session, args) => {
        let query;
        let entity = args.intent.entities[0];
        let msg = new builder.Message(session);

        if (entity === undefined) { // unknown Facts entity was requested
            msg = 'Sorry, you requested an unknown fact.';
            console.log(msg);
            session.send(msg).endDialog();
        } else {
            query = entity.resolution.values[0];
            console.log(`Entity: ${JSON.stringify(entity)}`);
            buildFactResponse(query, function (document) {
                if (!document) {
                    msg = `Sorry, seems we are missing the fact, '${query}'.`;
                    console.log(msg);
                } else {
                    let card = createThumbnailCard(session, document);
                    msg = new builder.Message(session).addAttachment(card);
                }
                session.send(msg).endDialog();
            });
        }
    }
).triggerAction({
    matches: 'AzureFacts'
});
Helper Functions
The last part of our chatbot’s code contains the helper functions the intent handlers call. The functions include selectRandomFact(), which returns a random fact if the user requests one. Two functions, createHeroCard(session, botResponse) and createThumbnailCard(session, botResponse), return a HeroCard or a ThumbnailCard, depending on the request.

The buildFactResponse(factToQuery, callback) function is called by the AzureFacts intent handler. This function passes the fact requested by the user (e.g., certifications) and a callback to the findFact(factToQuery, callback) function. The findFact function handles calling Cosmos DB, using the MongoDB Node.js driver’s db.collection().findOne() method, and returns the result through the callback (gist).
/* HELPER FUNCTIONS */

function selectRandomFact() {
    const FACTS_ARRAY = ['description', 'released', 'global', 'regions',
        'geographies', 'platforms', 'categories', 'products', 'cognitive',
        'compliance', 'first', 'certifications', 'competition', 'functions'];
    return FACTS_ARRAY[Math.floor(Math.random() * FACTS_ARRAY.length)];
}

function buildFactResponse(factToQuery, callback) {
    if (factToQuery.toString().trim() === 'random') {
        factToQuery = selectRandomFact();
    }

    mongoClient.connect(COSMOS_DB_CONN_STR, function (err, client) {
        const db = client.db(DB_COLLECTION);
        findFact(db, factToQuery, function (document) {
            client.close();
            if (!document) {
                console.log(`No document returned for value of ${factToQuery}?`);
            }
            return callback(document);
        });
    });
}

function createHeroCard(session, botResponse) {
    return new builder.HeroCard(session)
        .title('Azure Tech Facts')
        .subtitle(botResponse.title)
        .text(botResponse.response)
        .images([
            builder.CardImage.create(session, `${ICON_STORAGE_URL}/${botResponse.image}`)
        ])
        .buttons([
            builder.CardAction.openUrl(session, 'https://azure.microsoft.com', 'Learn more...')
        ]);
}

function createThumbnailCard(session, botResponse) {
    return new builder.ThumbnailCard(session)
        .title('Azure Tech Facts')
        .subtitle(botResponse.title)
        .text(botResponse.response)
        .images([
            builder.CardImage.create(session, `${ICON_STORAGE_URL}/${botResponse.image}`)
        ])
        .buttons([
            builder.CardAction.openUrl(session, 'https://azure.microsoft.com', 'Learn more...')
        ]);
}

function findFact(db, factToQuery, callback) {
    console.log(`fact to query: ${factToQuery}`);
    db.collection(DB_COLLECTION).findOne({"fact": factToQuery})
        .then(function (document) {
            if (!document) {
                console.log(`No document found for fact '${factToQuery}'`);
            }
            console.log(`Document found: ${JSON.stringify(document)}`);
            return callback(document);
        });
}
LUIS
We will use LUIS to add a perceived degree of intelligence to our chatbot, helping it understand our bot’s domain-specific natural language model. If you have built Alexa Skills or Actions for Google Assistant, LUIS apps work almost identically. The concepts of Intents, Intent Handlers, Entities, and Utterances are universal to all three platforms.
Intents are how LUIS determines what a user wants to do. LUIS will parse user utterances, understand the user’s intent, and pass that intent on to our chatbot, to be handled by the proper intent handler. The bot will then respond accordingly to that intent — with a greeting, with the requested fact, or by providing help.
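To make this concrete, the prediction LUIS returns to the Bot Builder recognizer looks roughly like the following. This is a simplified, hand-written example, not a captured payload; an actual LUIS v2 response contains additional fields and per-intent scores:

```javascript
// Approximate shape of a LUIS v2 prediction result (simplified).
const luisResult = {
    query: 'tell me about azure certifications',
    topScoringIntent: {intent: 'AzureFacts', score: 0.97},
    entities: [
        {
            entity: 'certifications',
            type: 'Facts',
            resolution: {values: ['certifications']}
        }
    ]
};

// The Bot Builder recognizer routes on the top-scoring intent's name.
console.log(luisResult.topScoringIntent.intent); // AzureFacts
```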
Entities
LUIS describes an entity as being like a variable, used to capture and pass important information. We will start by defining our AzureFacts Intent’s Facts Entities. The Facts entities represent each type of fact a user might request. The requested fact is extracted from the user’s utterances and passed to the chatbot. LUIS allows us to import entities as JSON. I have included a set of Facts entities to import, in the azure-facts-entities.json file, within the project on GitHub (gist).
[
    {
        "canonicalForm": "certifications",
        "list": [
            "certification",
            "certification exam",
            "certification exams"
        ]
    },
    {
        "canonicalForm": "cognitive",
        "list": [
            "cognitive services",
            "cognitive service"
        ]
    },
    {
        "canonicalForm": "competition",
        "list": [
            "competitors",
            "competitor"
        ]
    },
    {
        "canonicalForm": "cosmos",
        "list": [
            "cosmosdb",
            "cosmos db",
            "nosql",
            "mongodb"
        ]
    },
    {
        "canonicalForm": "kubernetes",
        "list": [
            "aks",
            "kubernetes service",
            "docker",
            "containers"
        ]
    }
]
Each entity includes a primary canonical form, as well as possible synonyms the user may utter in their invocation. If the user utters a synonym, LUIS will understand the intent and pass the canonical form of the entity to the chatbot. For example, if we said ‘tell me about Azure AKS,’ LUIS understands the phrasing and identifies the correct intent, AzureFacts. It substitutes the synonym, ‘AKS’, with the canonical form of the Facts entity, ‘kubernetes’, and passes both the top-scoring intent and the value ‘kubernetes’ to our bot. The bot would then query for the document associated with ‘kubernetes’ in Cosmos DB and return a response.
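The synonym-to-canonical-form behavior just described can be illustrated with a short, framework-free sketch. The entities array below mirrors the azure-facts-entities.json format shown above; the lookup function is a conceptual model of LUIS’s closed-list resolution, not LUIS itself.

```javascript
// Conceptual sketch of closed-list entity resolution: a synonym uttered by the
// user resolves to the entity's canonical form, as LUIS does for the chatbot.
const factEntities = [
  { canonicalForm: 'kubernetes', list: ['aks', 'kubernetes service', 'docker', 'containers'] },
  { canonicalForm: 'cosmos', list: ['cosmosdb', 'cosmos db', 'nosql', 'mongodb'] }
];

function resolveCanonicalForm(term) {
  const lower = term.toLowerCase();
  const match = factEntities.find(
    (entity) => entity.canonicalForm === lower || entity.list.includes(lower)
  );
  return match ? match.canonicalForm : null;
}

console.log(resolveCanonicalForm('AKS'));   // 'kubernetes'
console.log(resolveCanonicalForm('NoSQL')); // 'cosmos'
```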
Utterances
Once we have created and associated our Facts entities with our AzureFacts intent, we need to input a few possible phrases a user may utter to invoke our chatbot. Microsoft actually recommends not coding too many utterance examples, as part of their best practices. Below you see an elementary list of possible phrases associated with the AzureFacts intent. You also see the blue highlighted word, ‘Facts’ in each utterance, a reference to the Facts entities. LUIS understands that this position in the phrasing represents a Facts entity value.
Patterns
Patterns, according to Microsoft, are designed to improve the accuracy of LUIS, when several utterances are very similar. By providing a pattern for the utterances, LUIS can have higher confidence in the predictions.
The topic of training your LUIS app is well beyond the scope of this post. Microsoft has an excellent series of articles that I strongly suggest reading. They will greatly assist in improving the accuracy of your LUIS chatbot.
Once you have completed building and training your intents, entities, phrases, and other items, you must publish your LUIS app for it to be accessed from your chatbot. Publishing exposes LUIS through an HTTP endpoint. The LUIS interface enables you to publish both a Staging and a Production copy of the app. For brevity, I published directly to Production. If you recall our chatbot’s application settings, earlier, the settings include a luisAppId, luisAppKey, and a luisAppIdHostName. Together these form the HTTP endpoint, LuisModelUrl, through which the chatbot will access the LUIS app.
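How the three settings combine into the endpoint can be sketched as a small helper. The path below follows the LUIS v2.0 endpoint convention; treat the exact format (and the placeholder values) as assumptions, and confirm it against the URL LUIS displays when you publish your app.

```javascript
// Hypothetical composition of the LuisModelUrl from the three app settings.
function buildLuisModelUrl({ luisAppId, luisAppKey, luisAppIdHostName }) {
  return `https://${luisAppIdHostName}/luis/v2.0/apps/${luisAppId}` +
    `?subscription-key=${luisAppKey}&verbose=true&timezoneOffset=0`;
}

const url = buildLuisModelUrl({
  luisAppId: 'your-app-id',                                // placeholder values
  luisAppKey: 'your-app-key',
  luisAppIdHostName: 'westus.api.cognitive.microsoft.com'
});
console.log(url);
```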
Using the LUIS API endpoint, we can test our LUIS app, independent of our chatbot. This can be very useful for troubleshooting bot issues. Below, we see an example utterance of ‘tell me about Azure Functions.’ LUIS has correctly deduced the user intent, assigning the AzureFacts intent with the highest prediction score. LUIS also identified the correct Entity, ‘functions,’ which it would typically return to the chatbot.
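When testing the endpoint this way, the bot ultimately cares about two pieces of the JSON response: the top-scoring intent and the recognized entity. The sample object below mirrors the general shape of a LUIS v2 response for the utterance above; the scores and values are illustrative, not actual LUIS output.

```javascript
// Sketch of inspecting a LUIS query response, independent of the bot.
const sampleResponse = {
  query: 'tell me about azure functions',
  topScoringIntent: { intent: 'AzureFacts', score: 0.98 },
  entities: [{ entity: 'functions', type: 'Facts', score: 0.95 }]
};

function extractPrediction(response) {
  return {
    intent: response.topScoringIntent.intent,
    // The first recognized entity, or null when LUIS found none
    fact: response.entities.length > 0 ? response.entities[0].entity : null
  };
}

console.log(extractPrediction(sampleResponse)); // { intent: 'AzureFacts', fact: 'functions' }
```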
Deployment
With our chatbot developed and our LUIS app built, trained, and published, we can deploy our bot to the Azure platform. There are a few ways to deploy our chatbot, depending on your platform and language choice. I chose to use the DevOps practice of Continuous Deployment, offered in the Azure Bot Management console.
With Continuous Deployment, every time we commit a change to GitHub, a webhook fires, and my chatbot is deployed to the Azure platform. If you have high confidence in your changes through testing, you could choose to commit and deploy directly.
Alternately, you might choose a safer approach, using feature branches or pull requests, in which case your chatbot will be deployed upon a successful merge of the feature branch or pull request to master.
Manual Testing
Azure provides the ability to test your bot from the Azure portal. Using the Bot Management Test in Web Chat console, you can test your bot using the Web Chat channel. We will talk about different kinds of channels later in the post. This is an easy and quick way to manually test your chatbot.
For more sophisticated, automated testing of your chatbot, there are a handful of frameworks available, including bot-tester, which integrates with mocha and chai. Stuart Harrison published an excellent article on testing with the bot-tester framework, Building a test-driven chatbot for the Microsoft Bot Framework in Node.js.
Log Streaming
As part of testing and troubleshooting our chatbot in Production, we will need to review application logs occasionally. Azure offers its Log streaming feature for this. To access log streams, you must turn on application logging and choose a log level in the Azure Monitoring Diagnostic logs console; application logging is off by default.
Once Application logging is active, you may review logs in the Monitoring Log stream console. Log streaming automatically turns off after 12 hours and times out after 30 minutes of inactivity. I personally find application logging and access to logs more difficult and far less intuitive on the Azure platform than on AWS or Google Cloud.
Metrics
As part of testing and eventually monitoring our chatbot in Production, the Azure App Service Overview console provides basic telemetry about the App Service, on which the bot is running. With Azure Application Insights, we can drill down into finer-grain application telemetry.
Web Integration
With our chatbot built, trained, deployed, and tested, we can integrate it with multiple channels. Channels represent all the ways our users might interact with our chatbot: Web Chat, Slack, Skype, Facebook Messenger, and so forth. Microsoft describes channels as a connection between the Bot Framework and communication apps. For this post, we will look at two channels, Web Chat and Slack.
Enabling Web Chat is probably the easiest of the channels. When you create a bot with Bot Service, the Web Chat channel is automatically configured for you. You used it to test your bot, earlier in the post. Displaying your chatbot through the Web Chat channel, embedded in your website, requires a secret, for which you have two options. Option one is to keep your secret hidden and exchange your secret for a token when generating the embed code. Option two, according to Microsoft, is to embed the web chat control in your website using the secret directly. This method allows other developers to easily embed your bot into their websites.
Embedding Web Chat in your website allows your website users to interact directly with your chatbot. Shown below, I have embedded our chatbot’s Web Chat channel in my website. It will enable a user to interact with the bot, independent of the website’s primary content. Here, a user could ask for more information on a particular topic they found of interest in the article, such as Azure Kubernetes Service (AKS). The Web Chat window is collapsible when not in use.
The Web Chat is embedded using an HTML iframe tag. The HTML code to embed the Web Chat channel is included in the Web Chat Channel Configuration console, shown above. I found an effective piece of JavaScript code by Anthony Cu, in his post, Embed the Bot Framework Web Chat as a Collapsible Window. I modified Anthony’s code to fit my own website, moving the collapsible Web Chat iframe to the bottom right corner and adjusting the dimensions of the frame, so as not to intrude on the site’s central content area. I’ve included the code and a simulated HTML page in the GitHub project.
Slack Integration
To integrate your chatbot with the Slack channel, I will assume you have an existing Slack Workspace with sufficient admin rights to create and deploy Slack Apps to that Workspace.
To integrate our chatbot with Slack, we need to create a new Bot App for Slack. The Slack App will be configured to interact with our deployed bot on Azure securely. Microsoft has an excellent set of easy-to-follow documentation on connecting a bot to Slack. I am only going to summarize the steps here.
Once your Slack App is created, the Slack App contains a set of credentials.
Those Slack App credentials are then shared with the chatbot, in the Azure Slack Channel integration console. This ensures secure communication between your chatbot and your Slack App.
Part of creating your Slack App is configuring Event Subscriptions. The Microsoft documentation outlines six bot events that need to be configured. By subscribing to bot events, your app will be notified of user activities at the URL you specify.
You also configure a Bot User. Adding a Bot User allows you to assign a username for your bot and choose whether it is always shown as online. This is the username you will see within your Slack app.
Once the Slack App is built, credentials are exchanged, events are configured, and the Bot User is created, you finish by authorizing your Slack App to interact with users within your Slack Workspace. Below I am authorizing the Azure Tech Facts Bot Slack App to interact with users in my Programmatic Ponderings Slack Workspace.
Below we see the Azure Tech Facts Bot Slack App deployed to my Programmatic Ponderings Slack Workspace. The Slack App is shown in the Slack for Mac desktop app.
Similarly, we see the same Azure Tech Facts Bot Slack App being used in the Slack iOS App.
Conclusion
In this brief post, we saw how to develop a machine learning-based, LUIS-enabled chatbot using the Azure Bot Service and the BotBuilder SDK. We enhanced the bot’s functionality with two of Azure’s Cloud services, Cosmos DB and Blob Storage. Once built, we integrated our chatbot with the Web Chat and Slack channels.
This was a very simple demonstration of a LUIS chatbot. The true power of intelligent bots comes from further integrating bots with Azure AI and Machine Learning Services, such as Azure’s Cognitive Services. Additionally, the Azure cloud platform offers other, more traditional cloud services, beyond Cosmos DB and Blob Storage, to extend the features and functionality of bots, such as messaging, IoT, serverless Azure Functions, and AKS-based microservices.
Azure is a trademark of Microsoft
The image in the background of Azure icon copyright: kran77 / 123RF Stock Photo
All opinions expressed in this post are my own and not necessarily the views of my current or past employers, their clients, or Microsoft.
Developing Applications for the Cloud with Azure App Services and MongoDB Atlas
Posted by Gary A. Stafford in .NET Development, Azure, Cloud, Software Development on November 1, 2017
Shift Left Cloud
The continued growth of compute services from leading Cloud Service Providers (CSPs), like Microsoft, Amazon, and Google, is transforming the architecture of modern software applications, as well as the software development lifecycle (SDLC). Self-service access to fully-managed, reasonably-priced, secure compute has significantly increased developer productivity. At the same time, cloud-based access to cutting-edge technologies, like Artificial Intelligence (AI), Internet of Things (IoT), Machine Learning, and Data Analytics, has accelerated the capabilities of modern applications. Finally, as CSPs become increasingly platform agnostic, Developers are no longer limited to a single technology stack or operating system. Today, Developers are solving complex problems with multi-cloud, multi-OS polyglot solutions.
Developers now leverage the Cloud from the very start of the software development process; shift left Cloud, if you will*. Developers are no longer limited to building and testing software on local workstations or on-premise servers, then throwing it over the wall to Operations for deployment to the Cloud. Developers using Azure, AWS, and GCP, develop, build, test, and deploy their code directly to the Cloud. Existing organizations are rapidly moving development environments from on-premise to the Cloud. New organizations are born in the Cloud, without the burden of legacy on-premise data-centers and servers under desks to manage.
Example Application
To demonstrate the ease of developing a modern application for the Cloud, let’s explore a simple API-based, NoSQL-backed web application. The application, The .NET Diner, simulates a rudimentary restaurant menu ordering interface. It consists of a single-page application (SPA) and two microservices backed by MongoDB. For simplicity, there is no API Gateway between the UI and the two services, as there normally would be. An earlier version of this application was used in two previous posts, including Cloud-based Continuous Integration and Deployment for .NET Development.
The original restaurant order application was written with jQuery and RESTful .NET WCF Services. The new application, used in this post, has been completely re-written and modernized. The web-based user interface (UI) is written with Google’s Angular 4 framework using TypeScript. The UI relies on a microservices-based API, built with C# using Microsoft’s Web API 2 and .NET 4.7. The services rely on MongoDB for data persistence.
All code for this project is available on GitHub within two projects, one for the Angular UI and another for the C# services. The entire application can easily be built and run locally on Windows using MongoDB Community Edition. Alternately, to run the application in the Cloud, you will require an Azure and MongoDB Atlas account.
This post is primarily about the development experience. For brevity, the post will not delve into security, DevOps practices for CI/CD, and the complexities of staging and releasing code to Production.
Cross-Platform Development
The API, consisting of a set of C# microservices, was developed with Microsoft Visual Studio Community 2017 on Windows 10. Visual Studio touts itself as a full-featured Integrated Development Environment (IDE) for Android, iOS, Windows, web, and cloud. Visual Studio is easily integrated with Azure, AWS, and Google, through the use of Extensions. Visual Studio is an ideal IDE for cloud-centric application development.
To simulate a typical mixed-platform development environment, the Angular UI front-end was developed with JetBrains WebStorm 2017 on Mac OS X. WebStorm touts itself as a powerful IDE for modern JavaScript development.
Other tools used to develop the application include Git and GitHub for source code, MongoDB Community Edition for local database development, and Postman for API development and testing, both locally and on Azure. All the development tools used in the post are cross-platform. Versions of WebStorm, Visual Studio, MongoDB, Postman, Git, Node.js, npm, and Bash are all available for Mac, Windows, and Linux. Cross-platform flexibility is key when developing modern multi-OS polyglot applications.
Postman
Postman was used to build, test, and document the application’s API. Postman is an excellent choice for developing RESTful APIs. With Postman, you define Collections of HTTP requests for each of your APIs. You then define Environments, such as Development, Test, and Production, against which you will execute the Collections of HTTP requests. Each environment consists of environment-specific variables. Those variables can be used to define API URLs and as request parameters.
Not only can you define static variables, Postman’s runtime, built on Node.js, is scriptable using JavaScript, allowing you to programmatically set dynamic variables, based on the results of HTTP requests and responses, as demonstrated below.
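As a concrete illustration of that scripting, Postman runs JavaScript from the Tests tab against each response, exposing a pm object. The sketch below stubs the relevant slice of that API so the pattern is runnable here; in Postman itself, pm is provided and the stub is unnecessary. The response body and variable name are illustrative, not the application’s actual API.

```javascript
// Stub of the slice of Postman's pm API used for variable chaining.
const envStore = {};
const pm = {
  environment: {
    set: (key, value) => { envStore[key] = value; },
    get: (key) => envStore[key]
  },
  response: { json: () => ({ id: 'order-123', total: 9.99 }) } // simulated response
};

// The script as it might appear in Postman's Tests tab: capture a value from
// this response for use as a {{orderId}} variable in a subsequent request.
const body = pm.response.json();
pm.environment.set('orderId', body.id);

console.log(pm.environment.get('orderId')); // 'order-123'
```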
Postman also allows you to write and run automated API integration tests, as well as perform load testing, as shown below.
Azure App Services
The Angular browser-based UI and the C# microservices will be deployed to Azure using the Azure App Service. Azure App Service is nearly identical to AWS Elastic Beanstalk and Google App Engine. According to Microsoft, Azure App Service allows Developers to quickly build, deploy, and scale enterprise-grade web, mobile, and API apps, running on Windows or Linux, using .NET, .NET Core, Java, Ruby, Node.js, PHP, and Python.
App Service is a fully-managed, turn-key platform. Azure takes care of infrastructure maintenance and load balancing. App Service easily integrates with other Azure services, such as API Management, Queue Storage, Azure Active Directory (AD), Cosmos DB, and Application Insights. Microsoft suggests evaluating the following four criteria when considering Azure App Services:
- You want to deploy a web application that’s accessible through the Internet.
- You want to automatically scale your web application according to demand without needing to redeploy.
- You don’t want to maintain server infrastructure (including software updates).
- You don’t need any machine-level customizations on the servers that host your web application.
There are currently four types of Azure App Services, which are Web Apps, Web Apps for Containers, Mobile Apps, and API Apps. The application in this post will use the Azure Web Apps for the Angular browser-based UI and Azure API Apps for the C# microservices.
MongoDB Atlas
Each of the C# microservices has a separate MongoDB database. In the Cloud, the services use MongoDB Atlas, a secure, highly-available, and scalable cloud-hosted MongoDB service. Cloud-based databases, like Atlas, are often referred to as Database as a Service (DBaaS). Atlas is a Cloud-based alternative to traditional on-premise databases, as well as to equivalent CSP-based solutions, such as Amazon DynamoDB, GCP Cloud Bigtable, and Azure Cosmos DB.
Atlas is an example of a SaaS provider that offers a single service or a small set of closely related services, as an alternative to the big CSPs’ equivalent services. Similar providers in this category include CloudAMQP (RabbitMQ as a Service), ClearDB (MySQL DBaaS), Akamai (Content Delivery Network), and Oracle Database Cloud Service (Oracle Database, RAC, and Exadata as a Service). Many of these providers, such as Atlas, are themselves hosted on AWS or other CSPs.
There are three pricing levels for MongoDB Atlas: Free, Essential, and Professional. To follow along with this post, the Free level is sufficient. According to MongoDB, with the Free account level, you get 512 MB of storage with shared RAM, a highly-available 3-node replica set, end-to-end encryption, secure authentication, fully managed upgrades, monitoring and alerts, and a management API. Atlas provides the ability to upgrade your account and CSP specifics at any time.
Once you register for an Atlas account, you will be able to log into Atlas, set up your users, whitelist your IP addresses for security, and obtain necessary connection information. You will need this connection information in the next section to configure the Azure API Apps.
With the Free Atlas tier, you can view detailed Metrics about database cluster activity. However, with the free tier, you do not get access to Real-Time data insights or the ability to use the Data Explorer to view your data through the Atlas UI.
Azure API Apps
The example application’s API consists of two RESTful microservices built with C#, the RestaurantMenu service and the RestaurantOrder service. Both services are deployed as Azure API Apps. API Apps is a fully-managed platform. Azure performs OS patching, capacity provisioning, server management, and load balancing.
Microsoft Visual Studio has done an excellent job providing Extensions to make cloud integration a breeze. I will be using Visual Studio Tools for Azure in this post. Similar to how you create a Publish Profile for deploying applications to Internet Information Services (IIS), you create a Publish Profile for Azure App Services. Using the step-by-step user interface, you create a Microsoft Azure App Service Web Deploy Publish Profile for each service. To create a new Profile, choose the Microsoft Azure App Service Target.
You must be connected to your Azure account to create the Publish Profile. Give the service an App Name, choose your Subscription, and select or create a Resource Group and an App Service Plan.
The App Service Plan defines the Location and Size for your API App container; these will determine the cost of the compute. I suggest putting the two API Apps and the Web App in the same location, in this case, East US.
The Publish Profile is now available for deploying the services to Azure. No command line interaction is required. The services can be built and published to Azure with a single click from within Visual Studio.
Configuration Management
Azure App Services is highly configurable. For example, each API App requires a different configuration, in each environment, to connect to different instances of MongoDB Atlas databases. For security, sensitive Atlas credentials are not stored in the source code. The Atlas URL and sensitive credentials are stored in App Settings on Azure. For this post, the settings were input directly into the Azure UI, as shown below. You will need to input your own Atlas URL and credentials.
The compiled C# services expect certain environment variables to be present at runtime to connect to MongoDB Atlas. These are provided through Azure’s App Settings. Access to the App Settings in Azure should be tightly controlled through Azure AD and fine-grained Azure Role-Based Access Control (RBAC) service.
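Conceptually, the runtime configuration works like the sketch below, written in Node.js for consistency with the first post’s code rather than the C# the services actually use. The variable names and connection-string format are my own assumptions, not the project’s actual settings; the point is that credentials never appear in source control and are injected from App Settings at runtime.

```javascript
// Hypothetical sketch: assemble a MongoDB Atlas connection string from
// environment variables that Azure App Settings surface at runtime.
function buildAtlasConnectionString(env) {
  const { ATLAS_USER, ATLAS_PASSWORD, ATLAS_HOST, ATLAS_DATABASE } = env;
  if (!ATLAS_USER || !ATLAS_PASSWORD || !ATLAS_HOST || !ATLAS_DATABASE) {
    // Fail fast when a required setting is missing from the environment
    throw new Error('Missing MongoDB Atlas settings in App Settings');
  }
  return `mongodb://${ATLAS_USER}:${ATLAS_PASSWORD}@${ATLAS_HOST}/${ATLAS_DATABASE}?ssl=true`;
}

const conn = buildAtlasConnectionString({
  ATLAS_USER: 'dbuser',                       // placeholder values
  ATLAS_PASSWORD: 'secret',
  ATLAS_HOST: 'cluster0.example.mongodb.net',
  ATLAS_DATABASE: 'RestaurantMenu'
});
console.log(conn);
```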
CORS
If you want to deploy the application from this post to Azure, there is one code change you will need to make to each service, which deals with Cross-Origin Resource Sharing (CORS). The services are currently configured to only accept traffic from my temporary Angular UI App Service’s URL. You will need to adjust the CORS configuration in the \App_Start\WebApiConfig.cs file in each service to match your own App Service’s new URL.
Angular UI Web App
The Angular UI application will be deployed as an Azure Web App, one of four types of Azure App Services, mentioned previously. According to Microsoft, Web Apps allow Developers to program in their favorite languages, including .NET, Java, Node.js, PHP, and Python on Windows or .NET Core, Node.js, PHP or Ruby on Linux. Web Apps is a fully-managed platform. Azure performs OS patching, capacity provisioning, server management, and load balancing.
Using the Azure Portal, setting up a new Web App for the Angular UI is simple.
Provide an App Name, Subscription, Resource Group, OS Type, and select whether or not you want Application Insights enabled for the Web App.
Although an amazing IDE for web development, WebStorm lacks some of the direct integrations with Azure, AWS, and Google available with other IDEs, like Visual Studio. Since the Angular application was developed in WebStorm on Mac, we will take advantage of Azure App Service’s Continuous Deployment feature.
Azure Web Apps can be deployed automatically from most common source code management platforms, including Visual Studio Team Services (VSTS), GitHub, Bitbucket, OneDrive, and local Git repositories.
For this post, I chose GitHub. To configure deployment from GitHub, select the GitHub Account, Organization, Project, and Branch from which Azure will deploy the Angular Web App.
When you configure GitHub in the Azure Portal, Azure becomes an Authorized OAuth App on the GitHub account. Azure creates a Webhook, which fires each time files are pushed (git push) to the dist branch of the GitHub project’s repository.
Using the ng build --dev --env=prod command, the Angular UI application must first be transpiled from TypeScript to JavaScript and bundled for deployment. The ng build command can be run from within WebStorm or from the command line.
The --env=prod flag ensures that the Production environment configuration, containing the correct Azure API endpoints, is transpiled into the build. This configuration is stored in the \src\environments\environment.prod.ts file, shown below. You will need to update these two endpoints to your own endpoints from the two API Apps you previously deployed to Azure.
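For orientation, the configuration has roughly the shape sketched below, shown here as plain JavaScript rather than the actual TypeScript file. The property names and endpoint URLs are placeholders of my own; substitute the real URLs of your two deployed API Apps in the actual environment.prod.ts file.

```javascript
// Illustrative shape of a Production environment configuration object.
const environment = {
  production: true,
  apiUrlMenu: 'https://your-menu-service.azurewebsites.net/api',   // placeholder
  apiUrlOrder: 'https://your-order-service.azurewebsites.net/api'  // placeholder
};

console.log(environment.production); // true
```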
Optionally, the code should be optimized for Production by replacing the --dev flag with the --prod flag. Amongst other optimizations, the Production version of the code is uglified using UglifyJS. Note the difference in the build files shown below for Production, as compared to the files above for Development.
Since I chose GitHub for deployment to Azure, I used Git to manually push the local build files to the dist branch on GitHub.
Every time the webhook fires, Azure pulls and deploys the new build, overwriting the previously deployed version, as shown below.
To stage new code and not immediately overwrite running code, Azure has a feature called Deployment slots. According to Microsoft, Deployment slots allow Developers to deploy different versions of Web Apps to different URLs. You can test a certain version and then swap content and configuration between slots. This is likely how you would eventually deploy your code to Production.
Up and Running
Below, the three Azure App Services are shown in the Azure Portal, successfully deployed and running. Note their Status, App Type, Location, and Subscription.
Before exploring the deployed UI, the two Azure API Apps should be tested using Postman. Once the API is confirmed to be working properly, the database is populated by making an HTTP POST request to the menuitems API, provided by the RestaurantMenuService Azure API App. When the HTTP POST request is made, the service stores a set of menu items in the RestaurantMenu Atlas MongoDB database, in the menus collection.
The Angular UI, the RestaurantWeb Azure Web App, is viewed using the URL provided in the Web App’s Overview tab. The menu items displayed in the drop-down are supplied by an HTTP GET request to the menuitems API, provided by the RestaurantMenuService Azure API App.
Your order is placed through an HTTP POST request to the orders API, provided by the RestaurantOrderService Azure API App. The RestaurantOrderService stores the order in the RestaurantOrder Atlas MongoDB database, in the orders collection. The order details are returned in the response body and displayed in the UI.
Once you have the development version of the application successfully up and running on Atlas and Azure, you can start to configure, test, and deploy additional application versions, as App Services, into higher Azure environments, such as Test, Performance, and eventually, Production.
Monitoring
Azure provides in-depth monitoring and performance analytics capabilities for your deployed applications with services like Application Insights. With Azure’s monitoring resources, you can monitor the live performance of your application and set up alerts for application errors and performance issues. Real-time monitoring is useful when conducting performance tests. Developers can analyze the response time of each API method and optimize the application, the Azure configuration, and the MongoDB databases before deploying to Production.
Conclusion
This post demonstrated how the Cloud has shifted application development to a Cloud-first model. Future posts will demonstrate how an application, such as the app featured in this post, is secured, and how it is continuously built, tested, and deployed, using DevOps practices.
All opinions in this post are my own, and not necessarily the views of my current or past employers or their clients.