Chapter 6 Connect to and consume Azure services and third-party services

Nowadays, companies use different systems for tasks that are usually performed by different departments. Although each of these systems solves a specific need, they usually act as independent actors in a bigger scenario. These independent actors manage information about the company that may be duplicated by other systems.

When a company realizes that these independent actors are managing its data, it usually tries to make all of them work together and share information. This is true whether the company uses cloud services or on-premises services. To make the independent actors work together, you need to connect each actor or service that needs to communicate with the others.

You can use different services and techniques to achieve this interconnection. Azure provides several useful services that allow separate systems to work together without requiring big changes to them.

Skills covered in this chapter:

Skill 6.1: Develop an App Service Logic App

Skill 6.2: Integrate Azure Search within solutions

Skill 6.3: Establish API Gateways

Skill 6.4: Develop event-based solutions

Skill 6.5: Develop message-based solutions

Skill 6.1: Develop an App Service Logic App

Exchanging information between different applications is a goal for most companies. Sharing information enriches internal processes and creates more insight into the information itself. By using App Service Logic Apps, you can create workflows that interconnect different systems based on conditions and rules, easing the process of sharing information between them. You can also take advantage of the Logic Apps features to implement business process workflows.

This skill covers how to:

  • Create a Logic App
  • Create a custom connector for Logic Apps
  • Create a custom template for Logic Apps

Create a Logic App

Before you can interconnect two separate services, you need to fully understand which information you need to share between the services. Sometimes the information needs to undergo some transformations before it can be consumed by a service. You could write code for making this interconnection, but this is a time-consuming and error-prone task.

Azure offers the App Service Logic Apps that allows the interconnection of two or more services that share information between them. This interconnection between different services is defined by a business process. Azure Logic Apps allows you to build complex interconnection scenarios by using some elements that ease the work:

Workflows Define the source and destination of the information. Workflows connect to different services by using connectors. A workflow defines the steps or actions that the information needs to take to be delivered from the source to the correct destination. You use a graphical language to visualize, design, build, automate, and deploy a business process.

Managed Connectors A connector is an object that allows your workflow to access data, services, and systems. Microsoft provides prebuilt connectors for Microsoft services. These connectors are managed by Microsoft and provide the needed triggers and action objects to work with those services.

Triggers Triggers are events that fire when certain conditions are met. You use a trigger as the entry or starting point of a workflow. For example, when a new message arrives at your company’s purchases mailbox, it can start a workflow that can access information from the subject and body of the message and create a new entry in the ERP system.

Actions Actions are each of the steps that you configure in your workflow. Actions happen only when the workflow is executed. The workflow starts executing when a new trigger fires.

Enterprise Integration Pack If you need to perform more advanced integrations, the Enterprise Integration Pack provides you with BizTalk Server capabilities.
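
These elements come together in the workflow definition itself, which Azure Logic Apps stores as a JSON document written in the Workflow Definition Language. The following fragment is a minimal sketch rather than a complete definition; the recurrence trigger and the HTTP action with the example.com URI are illustrative assumptions, not part of any real workflow:

    {
      "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "triggers": {
          "Recurrence": {
            "type": "Recurrence",
            "recurrence": { "frequency": "Hour", "interval": 1 }
          }
        },
        "actions": {
          "Call_example_endpoint": {
            "type": "Http",
            "inputs": { "method": "GET", "uri": "https://example.com/api/status" },
            "runAfter": {}
          }
        },
        "outputs": {}
      }
    }

Every workflow that you build in the graphical designer is persisted in this format, which is why the same definition can later be exported as part of an ARM template.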

Note: Azure Logic Apps, Azure Functions, Azure App Service WebJobs, and Microsoft Flow

If you need to implement workflows, Microsoft provides several products that you can use for that task. Although there is some overlap in the features provided by Logic Apps, Functions, WebJobs, and Flow, they are designed for different scenarios. You can review more details about the differences between these services at https://docs.microsoft.com/en-us/azure/azure-functions/functions-compare-logic-apps-ms-flow-webjobs.

You can use Azure Logic Apps for different purposes. The following procedure shows how to create an Azure Logic App workflow that writes a message in the Microsoft Teams app when a new build completes in Azure DevOps. For this procedure, you need an Azure DevOps account with a configured project that you can build. You also need a Microsoft Office 365 subscription with access to the Microsoft Teams application. Let’s start by creating and configuring the Azure Logic App:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the navigation menu on the left side of the Azure Portal.
  3. On the New blade, on the Azure Marketplace list at the left side of the blade, click Integration.
  4. In the Featured column on the right side of the blade, click Logic App.
  5. On the Create Logic App blade, type a name in the Name text box.
  6. In the Resource group text box, type a name for a new resource group. Alternatively, you can use an existing resource group by selecting it from the drop-down menu.
  7. Select a location from the Location drop-down menu.
  8. Click the Create button at the bottom of the blade.
  9. A dialog box appears in the top-right corner of the Azure Portal once the Azure Logic App is created. Click the Go To Resource button in the dialog box.
  10. On the Logic Apps blade, on the Logic App Designer, click the Blank Logic App in the Templates section.
  11. On the Logic Apps Designer, in the Search Connectors And Triggers text box, type Azure DevOps.
  12. Click the Azure DevOps icon on the Results panel.
  13. On the Triggers tab, select the trigger named When A Build Completes.
  14. Click the Sign In button on the Azure DevOps element. At this point, you connect your Azure subscription with your Azure DevOps account. If you don’t already have an Azure DevOps account, you can configure one now by using the guide at https://docs.microsoft.com/en-us/azure/devops/user-guide/sign-up-invite-teammates?view=azure-devops.
  15. On the panel for the When A Build Completes trigger, shown in Figure 6-1, select your Azure DevOps account’s name from the Account Name drop-down menu.
  16. Figure 6-1 Configuring an Azure Logic Apps trigger
  17. On the panel for the When A Build Completes trigger, shown in Figure 6-1, select a project in your Azure DevOps account from the Project Name drop-down menu. If you don’t have a project in Azure DevOps, you can review the article at https://docs.microsoft.com/en-us/azure/devops/organizations/projects/create-project?view=azure-devops.
  18. Click the New Step button below the trigger panel.
  19. On the Choose An Action panel, type Teams in the Search Connectors And Actions text box.
  20. On the Actions tab, click the Post A Message (V3) action.
  21. On the Post A Message (V3) action panel, select a team from the Team drop-down menu.
  22. Select General from the Channel drop-down menu.
  23. In the Message text area, type The.
  24. In the Dynamic Content dialog, shown in Figure 6-2, on the right side of the Message text area, click the See More link. This link shows the list of dynamic attributes that you can add to your message.
  25. Figure 6-2 Dynamic content from a connector trigger
  26. Choose the Build Definition Name value.
  27. Type build finished with status in the message area next to the dynamic attribute.
  28. Click the Status value.
  29. Click the Save button in the top-left corner of the Azure Logic Apps Designer blade.
  30. Click the Run button in the top-left corner of the Azure Logic Apps Designer blade.
  31. At this point, your Azure Logic App starts listening for the configured trigger in Azure DevOps. When a new build finishes in Azure DevOps, Azure Logic Apps sends a message to the General channel of your configured team in your Microsoft Teams account. Now you are going to create a pipeline in Azure DevOps to build an example project. Once the build finishes, you will receive a new message in the Microsoft Teams channel:
  32. Navigate to the GitHub example at https://github.com/MicrosoftDocs/pipelines-dotnet-core, and sign in to your account. If you don’t have a GitHub account, you can create a new one for free by visiting https://github.com/join.
  33. At the top-right corner of the project’s page, click the Fork button.
  34. Open your Azure DevOps account (https://dev.azure.com).
  35. Click the name of the project that you selected in step 17.
  36. Click the Pipelines element on the navigation bar on the left side of the project’s page.
  37. Click the New Pipeline button.
  38. On the New Pipeline window, click GitHub.
  39. Sign in to your GitHub account.
  40. On the Authorize Azure Pipelines (OAuth) window, click Authorize AzurePipelines at the bottom of the page.
  41. In your Azure DevOps account, on the Select A Repository page, click the pipelines-dotnet-core project.
  42. On the GitHub Approve & Install Azure Pipelines page, click the Approve & Install button at the bottom of the page.
  43. In your Azure DevOps account, on the Configure Your Pipeline page, click ASP.NET Core.
  44. Click the Save And Run button in the top-right corner of the page.
  45. On the Save and Run panel, click the Save And Run button on the bottom of the panel.

At this point, the Azure DevOps agent is building the sample project. Once the build finishes, you will receive a new message in your Microsoft Teams channel, as shown in Figure 6-3.

Figure 6-3 A message in Microsoft Teams from Azure DevOps

Need More Review? Control Workflow

In a regular workflow, you need to run different actions based on the input values or based on certain conditions. You can run these different actions by using control workflow actions. To learn more about control flow and other connectors that you can use in Azure Logic Apps, see https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-control-flow-conditional-statement.
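
To give a flavor of what control workflow looks like under the covers, the following fragment is a hedged sketch of a Condition action in the workflow definition JSON; the action name, the status property, and the notification URI are hypothetical:

    "Condition": {
      "type": "If",
      "expression": {
        "and": [
          { "equals": [ "@triggerBody()?['status']", "succeeded" ] }
        ]
      },
      "actions": {
        "Post_success_notification": {
          "type": "Http",
          "inputs": { "method": "POST", "uri": "https://example.com/notify" },
          "runAfter": {}
        }
      },
      "else": { "actions": {} },
      "runAfter": {}
    }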

Create a custom connector for Logic Apps

Microsoft provides more than 200 built-in connectors that you can use in your Azure Logic Apps workflows. Despite this number of connectors, there are occasions when you need some specific feature that is not provided by the built-in connectors, or you may want to create a connector for your company’s application.

You can create custom connectors for Microsoft Flow, PowerApps, and Azure Logic Apps. Although you cannot share custom connectors between Azure Logic Apps, Microsoft Flow, and PowerApps, the principle for creating them is the same for all three platforms. A custom connector is basically a wrapper for a REST or SOAP API. This wrapper allows Azure Logic Apps to interact with the API of the application. The application that you want to expose through the custom connector can be a public service, such as Amazon Web Services or Google Calendar, or the API of your own application published to the Internet. Using the on-premises data gateway, you can also connect the custom connector to an on-premises application deployed in your data center. Every custom connector has the following lifecycle:

  1. Build your API You can wrap any REST or SOAP API in a custom connector. If you are creating your API, you should consider using Azure Functions, Azure Web Apps, or Azure API Apps.
  2. Secure your API You need to authenticate the access to your API. If you implement your application by using Azure Functions, Azure Web Apps, or Azure API Apps, you can enable Azure Active Directory authentication for your application in the Azure Portal. You can also enforce authentication directly in your API’s code. You can use any of the following authentication mechanisms:

    Generic OAuth 2.0

    OAuth 2.0 for specific services, like Azure Active Directory, Dropbox, GitHub, or SalesForce

    Basic Authentication

    API Key

  3. Describe the API and define the custom connector You need to provide a description of your API’s endpoints. Azure Logic Apps supports two language-agnostic and machine-readable document formats that you can use for this description: OpenAPI (formerly known as Swagger) and Postman collections. You can create a custom connector from the OpenAPI definition or the Postman collection documentation.
  4. Use the connector in an Azure Logic App Once you have created the custom connector, you can use it as you would use a regular managed connector in your workflow. You need to create a connection to your API using your custom connector. Then you can use the triggers and actions that you configured in your custom connector.
  5. Share your connector Once you have created your custom connector, you can share it with other users in your organization. This step is optional.
  6. Certify your connector If you want to share your custom connector with users outside your organization, you need to send the custom connector to Microsoft. Microsoft reviews your custom connector to ensure that it works correctly. Once the connector is reviewed and validated, Microsoft certifies it, and you can share it with users outside your organization.
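
To make the describe step more concrete, the following fragment is a minimal OpenAPI 2.0 sketch for a single POST operation; the host, path, operationId, and schema shown here are illustrative assumptions, not the actual definition of any Microsoft API:

    {
      "swagger": "2.0",
      "info": { "title": "Example API", "version": "1.0" },
      "host": "example-api.azurewebsites.net",
      "basePath": "/",
      "schemes": [ "https" ],
      "paths": {
        "/sentiment": {
          "post": {
            "operationId": "GetSentiment",
            "summary": "Returns a numeric score representing the detected sentiment",
            "consumes": [ "application/json" ],
            "produces": [ "application/json" ],
            "parameters": [
              { "in": "body", "name": "body", "schema": { "type": "object" } }
            ],
            "responses": { "200": { "description": "OK" } }
          }
        }
      }
    }

The summary of each operation is what the Logic Apps Designer shows as the name of the corresponding action or trigger, so it pays to keep these descriptions short and meaningful.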

The following steps show how to create a custom connector following the previous lifecycle. For the sake of brevity, we are using an Azure Cognitive Services API for creating this custom connector. For this example, you need to create an Azure Cognitive Services account that uses the Text Analytics API. You can sign up for the Text Analytics API at https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-signup.

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the top-left corner of the Azure Portal.
  3. Type logic apps in the Search The Marketplace text box.
  4. Click Logic Apps Custom Connector in the results list.
  5. Click the Create button.
  6. On the Create Logic Apps Custom Connector panel, type a Name for your custom connector.
  7. Select a Subscription and Location from their respective drop-down menus.
  8. Type a name for the new Resource Group. Alternatively, you can click the Use Existing control and select an existing resource group from the drop-down menu.
  9. Click the Create button at the bottom of the panel.
  10. Type the name of your Azure Text Analytics API in the search text box at the top of the Azure Portal.
  11. Click the name of your Azure Text Analytics API in the results list.
  12. On the Overview blade of your Text Analytics account, copy the Endpoint value. You will need this value in a later step.
  13. On the Overview blade, click the Show Access Keys link.
  14. On the Keys blade, copy one of the two available keys. You will need this value later, when you create a workflow that uses this custom connector.
  15. Download the OpenAPI definition of Azure Cognitive Services at https://procsi.blob.core.windows.net/docs/SentimentDemo.openapi_defin. You will need this file in step 21.
  16. Type the name of your Azure Logic Apps Custom Connector in the search text box at the top of the Azure Portal.
  17. Click the name of your Azure Logic Apps Custom Connector in the results list.
  18. Click the Edit button on the Custom Connector’s Overview blade.
  19. On the Edit Logic Apps Custom Connector blade, click the Import button on the How Do You Want To Create Your Connector panel, shown in Figure 6-4.
  20. Figure 6-4 Importing the OpenAPI definition of a REST API.
  21. Choose the JSON file that you downloaded in step 15.
  22. In the General Information section, below the Custom Connectors section, change the value of the Host parameter to the hostname of your Azure Text Analytics endpoint. You copied this value in step 12. You only need to use the host part of the URL.
  23. Note: Base URL

    In this example, the endpoint definition already contains the correct base URL in the endpoint definition. If you change the default Base URL property in the General Information section in your Azure Logic Apps Custom Connector, you will receive a 404 error every time that you try to use your custom connector in an Azure Logic Apps workflow.

  24. Click the Security link at the top of the page. (You can also find a Security link at the bottom-left corner of the page; either link takes you to the Security section.)
  25. Review the options on the Security page. For this custom connector, you are using API Key authentication, which means that you need to provide an API Key when you use this connector in a workflow. Leave all the settings on this page as they are.
  26. Click the Definition link at the top of the page. (You can also find a Definition link at the bottom-left corner of the page; either link takes you to the Definition section.)
  27. Review the settings on the Definition page. On this page, you will find all the endpoints that are defined in the OpenAPI JSON file that you imported in step 19. These endpoints translate into Actions or Triggers, depending on the definition in the JSON file. You can also manually add new Actions and Triggers as needed on this page. Leave all the settings on this page unchanged.
  28. Click the Update Connector link at the top-right corner of the Edit Logic Apps Custom Connector blade.

At this point, you have successfully created your Azure Logic Apps Custom Connector. In the following steps, you are going to create a workflow for testing your new custom connector. This workflow gets the content from files in an Azure Blob Storage account, sends the content of the files to the Azure Cognitive Services, and sends the sentiment score of the content to a Microsoft Teams channel:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the top-left corner of the Azure Portal.
  3. Type logic apps in the Search The Marketplace text box.
  4. Click Logic App in the results list.
  5. Click the Create button.
  6. On the Create Logic App panel, type a Name for your Azure Logic App.
  7. Select the appropriate Subscription and Location from their respective drop-down menus.
  8. In the Resource Group text box, type the name of the new resource group. Alternatively, you can use an existing resource group by clicking the Use Existing option and selecting a resource group from the dropdown menu.
  9. Click the Create button at the bottom of the panel.
  10. Navigate to the newly created Azure Logic App.
  11. On the Logic Apps Designer blade, choose the Blank Logic App template. If you don’t see the Logic Apps Designer blade as soon as you open your Azure Logic App, click Logic App Designer on the navigation menu on the left side of your Azure Logic App.
  12. In the Search Connectors and Triggers text box, type Azure Blob.
  13. In the Triggers section, click the When A Blob Is Added Or Modified (Properties Only) trigger.
  14. On the When A Blob Is Added Or Modified (Properties Only) panel, type a name for the connection with the Azure Blob Storage service. If you don’t have an Azure Blob Storage account, you can learn how to create a new Azure Blob Storage account and a new container in the Azure Storage documentation.
  15. Select a storage account from the available list of storage accounts.
  16. On the When A Blob Is Added Or Modified (Properties Only) panel, click the folder icon next to the Select A Container text box, shown in Figure 6-5, and select the container that contains the files used in this example.
  17. Figure 6-5 Selecting a container for an Azure Blob Storage connector trigger.
  18. Click the New Step button below the trigger dialog box.
  19. On the Choose An Action panel, click the Azure Blob Storage icon in the Recent section.
  20. Click the Get Blob Content action in the Actions section.
  21. Click the Specify The Blob text box.
  22. In the dialog box with the Dynamic Content list, click List Of Files Id. Leave the Infer Content Type setting as is.
  23. Click the New Step button.
  24. On the Choose An Action panel, click Custom. Your Azure Logic Apps Custom Connector should appear in this section.
  25. Click your Azure Logic Apps Custom Connector.
  26. Click the Returns A Numeric Score Representing The Sentiment Detected action.
  27. On the Returns A Numeric Score Representing The Sentiment Detected action panel, type a Connection Name for the connection with your Azure Cognitive Services Text Analytics API.
  28. In the API Key text box, paste the value of the Text Analytics API key that you copied in step 14 of the procedure for creating the Azure Logic Apps Custom Connector.
  29. Click the Create button.
  30. Click inside the Documents ID – 1 text box.
  31. In the list of Dynamic Content, select List Of Files Id.
  32. Click inside the Documents Text – 1 text box.
  33. In the list of Dynamic Content, select File Content from the Get Blob Content section.
  34. Click the New Step button.
  35. Type Microsoft Teams in the Search Connectors And Actions text box on the Choose An Action panel.
  36. Click the Microsoft Teams icon.
  37. Click the Post A Message (V3) action.
  38. Grant access to your Microsoft Teams account.
  39. On the Post A Message (V3) action panel, select an existing team from the Team drop-down menu.
  40. Select the General channel in the Add Teams Channel ID drop-down menu.
  41. Click inside the Message text box.
  42. In the Dynamic Content dialog box, click the See More link in the Get Blob Content section.
  43. In the Get Blob Content section, click File Content.
  44. In the Message text area, type Sentiment score: . (Note that you need to add a space after the colon.)
  45. In the Dynamic Content dialog box, click the See More link in the Returns A Numeric Score Representing The Sentiment section.
  46. In the Returns A Numeric Score Representing The Sentiment section, click the Documents Score item.
  47. Click the Save button in the top-left corner of the Logic Apps Designer blade.
  48. Click the Run button.

At this point, you should be ready to test your new Azure Logic Apps Custom Connector. Now you can simply create one text file containing a positive sentence, such as Today is a great day. Then upload that file to the Azure Blob Storage container that you configured previously. If everything went well, you should get a new message in the selected team’s General channel, and the output of the workflow should look similar to Figure 6-6.

Figure 6-6 Workflow execution result

Exam Tip

You can create custom connectors for Azure Logic Apps, Microsoft Flow, and Microsoft PowerApps. You cannot reuse a connector created for Azure Logic Apps in Microsoft Flow or PowerApps (or vice-versa). You can use the same OpenAPI definition to create a custom connector for these three services.

Need More Review? Custom Connector

You can learn more about custom connectors at https://docs.microsoft.com/en-us/connectors/custom-connectors/.

Create a custom template for Azure Logic Apps

Once you have created an Azure Logic App, you can reuse it in other Azure subscriptions or share it with colleagues. You can create a template from your working Azure Logic App to automate deployment processes. When you create a template, you are converting the definition of your Azure Logic App into an Azure Resource Manager (ARM) template. Using ARM templates allows you to take advantage of the flexibility of the ARM platform by separating the definition of the Azure Logic App from the values used in the logic app. When you deploy a new Azure Logic App from a template, you can provide a parameters file in the same way that you do with other ARM templates. Azure also provides some prebuilt Logic App templates that you can use as a base for creating your own templates.

You can download an Azure Logic Apps template using several mechanisms:

Azure Portal You can use the Export Template option of your Azure Logic App in the Azure Portal to download the ARM template.

Visual Studio You can use the Azure Logic Apps Tools extension for Visual Studio to connect to your Azure subscription and download a template from your Azure Logic Apps.

PowerShell You can use the LogicAppTemplate PowerShell module to download a template from your Azure Logic App.

A Logic App template is a JSON file that comprises three main areas:

Logic App resource This section contains basic information about the Logic App itself. This information includes the location of the resource, the pricing plan, and the workflow definition.

Workflow definition This section contains the description of the workflow, including the triggers and actions in your workflow. This section also describes how the Logic App runs these triggers and actions.

Connections This section stores the information about the connectors that you use in the workflow.
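
The following heavily trimmed sketch shows how these three areas typically map onto the ARM template JSON; the parameter name and property values are placeholders, not a definitive template:

    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {
        "logicAppName": { "type": "string" }
      },
      "resources": [
        {
          "type": "Microsoft.Logic/workflows",
          "apiVersion": "2017-07-01",
          "name": "[parameters('logicAppName')]",
          "location": "[resourceGroup().location]",
          "properties": {
            "definition": { "triggers": {}, "actions": {} },
            "parameters": { "$connections": { "value": {} } }
          }
        }
      ]
    }

The Logic App resource is the Microsoft.Logic/workflows element, the workflow definition lives under properties.definition, and the connections are passed in through the $connections parameter.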

Use the following procedure to create a template from your Azure Logic App using Visual Studio:

  1. Download the Azure Logic Apps Tools extension for Visual Studio 2019 at https://aka.ms/download-azure-logic-apps-tools-visual-studio-2019.
  2. Install the Azure Logic Apps Tools extension.
  3. Open Visual Studio 2019.
  4. In the Visual Studio 2019 welcome window, click the Continue Without Code link below the Get Started section.
  5. In the Visual Studio window, click View > Cloud Explorer.
  6. In the Cloud Explorer window, shown in Figure 6-7, click the user icon to open the Account Manager.
  7. Figure 6-7 Cloud Explorer window.
  8. Click the Manage Accounts link.
  9. In the All Accounts section, click the Sign In link.
  10. Sign in with an account that has privileges to access your Azure subscription.
  11. Ensure that your Azure subscription appears in the list of subscriptions in the Cloud Explorer window.
  12. Click the Apply button.
  13. In the Cloud Explorer tree control, navigate to Your Subscription > Logic Apps.
  14. Right-click the Logic App that you want to convert to a template.
  15. In the contextual menu shown in Figure 6-8, click Open With Logic App Editor.
  16. Figure 6-8 Logic App tool contextual menu.
  17. On the Logic App Editor tab, click the Download button.
  18. Select a location to which you want to download the JSON file.

At this point, you can edit and customize your template. Once you are done with the modifications to your template, you can create a parameters file for deploying this template.

Need More Review? Logic App Templates

You can learn more by reading the following articles about Logic App templates:

Create Logic App templates https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-create-deploy-azure-resource-manager-templates

Deploy Logic App templates https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-create-deploy-azure-resource-manager-templates

Skill 6.2: Integrate Azure Search within solutions

If you want to add the ability to search for information in your application, indexing that information can be difficult if your app manages a lot of data. Fortunately, Microsoft offers the Azure Search service. Azure Search is a Search-as-a-Service (SaaS) platform that allows you to index information from different data sources, and it integrates with your application using a REST API or a .NET SDK. Azure Search provides the infrastructure needed for storing the indexes and all the other components required by the search engine.

This skill covers how to:

  • Create an Azure Search index
  • Import searchable data
  • Query the Azure Search index

Create an Azure Search index

The Azure Search service provides an API that you can use to add search capabilities to your solution. You can access the Azure Search service by using a REST API or a .NET SDK. The API and the SDK abstract away the details and complexity of retrieving information from the indexes.

You can think of an index as a database that contains the information that you want to be available for searching. That information is stored in the index as a document, which is a searchable entity or unit. For example, if your application manages data from devices, a document is the structure that represents each of the devices managed by your application. Another example of a document would be each of the articles of a news company. Indexes and documents are conceptually equivalent to tables and rows in a database.
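
For example, in an index of hotels, each hotel is represented by a JSON document similar to the following sketch (the field names and values here are purely illustrative):

    {
      "HotelId": "12",
      "HotelName": "Example Resort and Spa",
      "Category": "Luxury",
      "Rating": 4.5,
      "Tags": [ "pool", "free wifi" ]
    }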

Azure Search creates physical structures to store the indexes that you upload or add to the service. These physical structures depend on the index schema that you provide. Before you can create an Azure Search index, you need to create an Azure Search service instance. Use the following procedure to create an Azure Search service instance:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource on the navigation menu at the left side of the portal.
  3. Click Web in the New blade’s Azure Marketplace column.
  4. Click Azure Search in the New blade’s Featured column.
  5. On the New Search Service blade, select a resource group from the Resource Group drop-down menu. Alternatively, you can create a new resource group by clicking the Create New link below the Resource Group drop-down menu.
  6. In the Instance Details section, type a name for your service instance in the URL text box.
  7. Click the Change Pricing Tier link.
  8. Select the Free offering in the Select Pricing Tier panel.
  9. Click the Select button at the bottom of the panel.
  10. On the New Search Service blade, click the Review + Create button at the bottom of the blade.
  11. Once the validation of your settings finishes successfully, click the Create button at the bottom of the blade.

Once you have created an Azure Search service instance, you can create indexes and import data. You can create indexes by using the Azure Portal, or you can create them programmatically using C#, PowerShell, Postman, or Python. Creating an index is usually an iterative process that requires several iterations before you get the appropriate index definition for your application. Because of this iterative process and the fact that the index schema is tightly coupled with the physical storage, each time you need to make significant changes to an existing field definition, you need to rebuild the entire index. During the development stage of your index, you should use the following recommended workflow for creating a basic index:

  1. Create an initial index in the portal You can create the first initial version of your index by using the Azure Portal. You should also review whether any of your data sources are compatible with the indexer tool. If you are using supported data sources, you can speed up the prototyping and loading steps by using the Import Data wizard.
  2. Download the index schema Once you have created the initial definition of your index in the portal, you can download the index definition by using the Get Index REST API method. You can use any web testing tool, such as Postman, to complete this step. In this step, you get a JSON representation of your index definition that you can use to make modifications and to fine-tune without needing to create the entire definition in the portal on every iteration.
  3. Load your modified index If you made any modifications to your index definition, you can upload the new definition in this step. Whether or not you modified the definition, you should also upload some data to your index in this step. You can upload data to your index using JSON documents in the payload of a REST API request.
  4. Query your index Now it’s time to check your index by querying the Azure Search API. You can use Postman or Search Explorer for this purpose.
  5. Use code for the following iterations If you need to make modifications to your index definition, you should use the modification-by-code approach because you cannot modify an existing index without rebuilding it entirely. Using the Azure Portal, this rebuild process means that you need to re-create all the fields in your index every time the index needs to be modified. Using the modification-by-code approach is much more effective because you only need to modify an existing JSON definition that you can upload on every rebuild.

Note: Size of the Dataset

During the development of an index, you should consider using only a subset of your entire dataset because you usually need to make modifications to the index definition. Every time that you modify the index definition, you need to rebuild the entire index. Using a subset of your dataset makes this rebuilding process faster because you aren’t loading as much data in your index.

Use the following procedure to create your index by using the Azure Portal. (Once you create the first definition of your index, we will review how to make modifications to an existing index using some code.)

  1. Open the Azure Portal (https://portal.azure.com).
  2. In the Azure Portal search text box, type the name of your Azure Search service instance.
  3. Click the name of your Azure Search service instance in the results list.
  4. On the Overview blade of your Azure Search service instance, click the Add Index button in the top-left corner of the blade.
  5. On the Add Index blade, shown in Figure 6-9, type a name for the index. This name must start and end with alphanumeric characters and may contain only lowercase letters, digits, or dashes.
  6. Figure 6-9 Creating a new index.
  7. Click the Add Field button.
  8. Type a name for the new field.
  9. In the Type drop-down menu, select the appropriate type for the field.
  10. Select the appropriate field attributes. For example, you can create a field to store customer surnames:
  11. Field Name This should be the name of the field; in this example, it should be surname.

    Type Because this field stores customer surnames, the type should be Edm.String.

    Retrievable Checked. You should include this field in the search results.

    Filterable Not checked. When you search strings by using filters, you look for exact matches by using Boolean operations. Filters are case-sensitive and stricter than search queries.

    Sortable Checked. You may want to get your results ordered by this field.

    Facetable Not checked. If your application does not use navigation trees for constructing filters, you should not check this attribute.

    Searchable Checked. You want to search for people by their surnames. Using this option, you get the full power of the full-text search engine.

  12. Click the Create button at the bottom of the blade.
  13. Click Keys in the navigation menu on the left side of your Azure Search service instance.
  14. On the Keys blade, copy the Primary Admin Key. You will need this value in a later step.
  15. Open Postman or your favorite web analysis tool. (You can download Postman from https://www.getpostman.com.)
  16. On the Postman welcome window, click Request in the Building Blocks section.
  17. In the Save Request dialog box, type a name for the request.
  18. Click the Create Collection link.
  19. Type a name for your new collection and press Enter.
  20. Click the Save button on the bottom-right corner of the dialog box.
  21. In the Requests Definition window, use the following URL for accessing your index: https://[service name].search.windows.net/indexes/[index name]?api-version=2019-05-06
  22. [service name] This should be the name of the Azure Search service instance that you created at the beginning of this section.

    [index name] This should be the name of the index that you provided in step 5 of this procedure.

  23. In the Request Definition window, click the Headers tab, below the request’s URL text box.
  24. Add the following headers:

    Content-type application/json

  25. Add the api-key header, and paste in the Primary Admin Key value that you copied in step 14.

  26. Click the Send button beside the request’s URL. At this point, you should get the JSON definition of your index, as shown in Figure 6-10. You can download this JSON definition by clicking the Download button in the top-right corner of the results window.
  27. Figure 6-10 Downloading an index definition using Postman

At this point, you can modify your index definition by editing the JSON file that you downloaded in the previous procedure. When you create and update the definition of your index and its fields, there are some concepts that you should bear in mind:

Fields Your index is composed of a list of fields that define the data contained in the index. The definition of the fields is grouped as a collection in the JSON file. Each field has the following properties that you need to provide:

Name This is the name of the field.

Type This sets the kind of data that the field stores.

Attributes These define the behavior of the field in the index. For example, you can assign the searchable attribute to a field, which marks the field as full-text searchable. Another important attribute that you need to consider is the Key attribute. Every index that you define must have one—and only one—field configured with the Key attribute. This field must be of type Edm.String. Table 6-1 shows a list of the available attributes that you can configure for your fields.

Suggesters You use this section to define which fields are used for the autocomplete feature, which presents suggestions for search terms as the user types.

Scoring Profiles By assigning scores to the items, you can influence which items should appear higher in the search results. You define scoring profiles that are made up of functions and fields so the scores can be assigned automatically to the items in the search results. The scoring profiles are transparent to the users.

Analyzers Configure the language analyzer for a field. When you configure a field as searchable, you need to set the language that the field contains so the appropriate analysis can be made.

CORS You configure which JavaScript client-side code can make API calls to the Search API. By default, CORS is disabled, so your solutions cannot query the Search API from the client side.

Encryption key You can configure your Azure Search service instance to encrypt the indexes by using customer-managed keys instead of the default Microsoft-managed keys.
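
Putting these concepts together, a trimmed index definition might look like the following sketch; the field list, suggester, and CORS settings are illustrative choices rather than a recommended configuration:

    {
      "name": "hotels",
      "fields": [
        { "name": "HotelId", "type": "Edm.String", "key": true, "filterable": true },
        { "name": "HotelName", "type": "Edm.String", "searchable": true, "sortable": true },
        { "name": "Category", "type": "Edm.String", "searchable": true, "filterable": true, "facetable": true }
      ],
      "suggesters": [
        { "name": "sg", "searchMode": "analyzingInfixMatching", "sourceFields": [ "HotelName" ] }
      ],
      "corsOptions": { "allowedOrigins": [ "*" ] }
    }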

Once you are happy with the first definition of your index, you can upload data to your index and make queries to ensure that the index is performing as you need. You can repeat these steps as many times as you need to achieve the best results. In the following sections, we review how to upload data to your index and how to query your indexes.

Exam Tip

You cannot make modifications or edits to an existing index using the Azure Portal. Because the definition of an index is an iterative process, you should make the first index definition using the Azure Portal and then download the index definition using any web analysis tool, such as Postman. Once you have the JSON definition, you can make updates to the index definition and upload it to your Azure Search service instance.

Need More Review? Creating an Index

Creating an Azure Search index is a complex task that you need to review carefully. You can review more details about creating an index, such as how to work with analyzers, scoring profiles, or suggesters, or the storage implications of the definition of your fields by reading the article at https://docs.microsoft.com/en-in/azure/search/search-what-is-an-index.

Azure Portal is not the only way to create new indexes. You can also use other methods, such as C#, Postman, Python, or PowerShell. Learn how to create an index using C# at https://docs.microsoft.com/en-in/azure/search/search-get-started-dotnet.

Import searchable data

Once you have made a definition of your index, you need to import data to it before you can get results from the API. Importing data to the index depends on the type of data source that you are using for your data. You can define empty search indexes, but you cannot query those empty indexes until you fill them with some data.

There are two ways to import data into a search index:

Push Using this method, you upload data directly to your index. You provide JSON documents and programmatically upload their content to the index. The advantage of this import method is that it is very flexible; there are no restrictions on the type of data source that you use for your data, as long as you convert the data to JSON documents that you upload to the search index.

Pull The data is not stored in the index itself; instead, it is stored in one of the supported data sources. Using the indexers that you can configure for your search index, you can connect to data stored in the following Azure services: Azure Blob storage, Azure Table storage, Azure Cosmos DB, Azure SQL Database, and SQL Server deployed on Azure VMs. The Azure Search service connects your search index with one of the supported data sources through an indexer. The indexer maps the fields defined in your index to the fields stored in the documents in the data source and automatically converts the data stored in the data source into JSON documents that can be used in the search index.

Because the push mechanism stores the information directly in the search index, it provides the lowest latency, making this method the most appropriate for applications that are sensitive to latency.
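
If you choose the pull model, you define a data source and an indexer. As an illustration, the following two request bodies are a hedged sketch of what you might POST to the datasources and indexers endpoints of the REST API; the names, the connection string placeholder, and the two-hour schedule are assumptions for this example:

    {
      "name": "hotels-datasource",
      "type": "azuresql",
      "credentials": { "connectionString": "<your_connection_string>" },
      "container": { "name": "Hotels" }
    }

    {
      "name": "hotels-indexer",
      "dataSourceName": "hotels-datasource",
      "targetIndexName": "hotels",
      "schedule": { "interval": "PT2H" }
    }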

Note: Sample Dataset

The remaining examples that we review in this skill are based on the sample data provided by Microsoft. You can review and download this sample data at https://azure.microsoft.com/en-us/resources/samples/azure-search-sample-data/.

The following procedure shows how to upload content to your index by using Postman. Before you can start uploading data to the Azure Search service, you need to create the hotels index:

  1. Open the Postman desktop application.
  2. Create a new PUT request. When you use the PUT or POST methods, the Azure Search API creates the index if it doesn’t exist. If the index already exists, the index is updated with the new definition.
  3. Use the following URL to create the hotels index: https://[service name].search.windows.net/indexes/hotels?api-version=2019-05-06.
  4. Click the Headers tab below the request’s URL.
  5. Create the following headers:

    Content-type application/json

  6. Add the api-key header. You can find this value in the Keys blade of your Azure Search service instance.

  7. Click the Body tab below the request’s URL.
  8. Select the raw option below the Body tab. Ensure that the JSON (application/json) option is selected in the drop-down menu.
  9. Open the index JSON definition. You can find this definition in the hotels/Hotels_IndexDefinition.json file in the Azure Search sample dataset.
  10. Copy the content of the JSON file into the text area on the Body tab of your request in Postman, as shown in Figure 6-11.
  11. Figure 6-11 Creating a new index using Postman.
  12. Click the Send button next to the request’s URL.

At this point, you have created a new index called hotels in your Azure Search service instance. The next step is to import data into the new index. In this section, we review how to import data by using push and pull methods.

You upload (or push) data to your Azure Search service instance by using the REST API or by using the .NET SDK. Using the REST API is quite similar to the procedure that we previously used for creating an index. You need to use the POST HTTP method with the URL https://[service name].search.windows.net/indexes/[index name]/docs/index?api-version=2019-05-06. For this example, the index name should be hotels. You need to use the content of the HotelsData_toAzureSearch.json file to upload the information to your Azure Search service instance.

When you are pushing data to your Azure Search service instance, you can control each action you want to perform on each document in the index. By setting the @search.action property in the JSON file, you can control the action performed on each document. The following list shows the available operations you can apply to the documents:

upload If the document doesn’t exist in the index, this action uploads the new document. If the document already exists, it updates the document. When updating a document, any existing field that is not specified in the request is automatically set to null.

merge This action updates an existing document with the values of the fields specified in the request. The value set in the request for a field replaces the value of that field in the existing document in the index. If the document doesn’t exist in the index, the request fails.

mergeOrUpload This action is similar to the merge action, but if the document doesn’t exist in the index, the action behaves like the upload action and creates a new document.

delete This action removes the specified document from the index.
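
For illustration, the body of a push request is a JSON document with a value array in which each element carries its own @search.action. The following is a minimal sketch based on the hotels index; the key values and fields shown are illustrative:

    {
      "value": [
        { "@search.action": "upload", "HotelId": "1", "HotelName": "Example Hotel", "Category": "Budget" },
        { "@search.action": "merge", "HotelId": "2", "Rating": 4.2 },
        { "@search.action": "delete", "HotelId": "3" }
      ]
    }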

For each action type, you need to provide at least a value for the key field to locate the correct document in the index. The other fields are optional, depending on the type of action that you want to perform on the document. Complete the following steps to push data to your index by using the .NET SDK:

  1. Open Visual Studio 2019.
  2. On the welcome window, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click the Next button at the bottom-right corner of the Create A New Project window.
  5. In the Configure Your New Project window, type a name for your project.
  6. Select a Location for your solution.
  7. Click the Create button in the bottom-right corner of the window.
  8. Click Tools > NuGet Package Manager > Manage NuGet Packages For Solution.
  9. In the NuGet – Solution tab, click Browse.
  10. In the Search text box, type Microsoft.Azure.Search.
  11. Click Microsoft.Azure.Search in the results list.
  12. On the right side of the NuGet – Solution tab, click the name of your project.
  13. Click the Install button.
  14. In the Preview Changes window, click the OK button.
  15. In the License Acceptance window, click the I Accept button.
  16. Repeat steps 10 to 15 for one more NuGet package.
  17. This time, search for and install the Microsoft.Extensions.Configuration.Json NuGet package.

  18. In the Solution Explorer window, right-click your project’s name.
  19. In the contextual menu, click Add > New Item.
  20. In the Add New Item dialog box, type json in the Search text box.
  21. Click the JSON File template.
  22. Type appsettings.json in the Name text box.
  23. Click the Add button in the bottom-right corner of the window.
  24. In the Solution Explorer window, click the appsettings.json file.
  25. In the properties window, set the Copy To Output Directory setting to Copy Always.
  26. Open the appsettings.json file and replace the content of the file with the content of Listing 6-1. You can get the admin key from the Keys blade in your Azure Search service instance.
  27. Listing 6-1 appsettings.json file

    {
      "JSONDocumentsFile": "HotelsData_toAzureSearch.json",
      "SearchIndexName": "hotels",
      "SearchServiceAdminApiKey": "<Your_admin_key>",
      "SearchServiceName": "<Your_search_service_name>"
    }
    
  28. Copy the HotelsData_toAzureSearch.json file from the sample data folder to your solution’s folder.
  29. In the Solution Explorer, right-click your project’s name.
  30. In the contextual menu, click Add > Existing Item.
  31. In your solution’s folder, click the HotelsData_toAzureSearch.json file and click the Add button.
  32. In the Solution Explorer, click the HotelsData_toAzureSearch.json file.
  33. In the file’s properties window, set the Copy To Output Directory setting to Copy Always.
  34. Open the HotelsData_toAzureSearch.json file.
  35. Remove the value property at the beginning of the file. Ensure that the JSON file starts with the [ character.
  36. Remove the last line of the file. Ensure that the JSON file ends with the ] character.
  37. Add a new empty C# class file and name it Hotel.cs.
  38. Replace the content of the Hotel.cs file with the content of Listing 6-2.
  39. Listing 6-2 Hotel.cs file

    // C# .NET
    using Microsoft.Azure.Search;
    using Microsoft.Azure.Search.Models;
    using System;
    using System.Collections.Generic;
    using System.Text;
    using Newtonsoft.Json;

    namespace <your_project_name>
    {
        public partial class Hotel
        {
            [System.ComponentModel.DataAnnotations.Key]
            [IsFilterable]
            public string HotelId { get; set; }

            [IsSearchable, IsSortable]
            public string HotelName { get; set; }

            [IsSearchable]
            [Analyzer(AnalyzerName.AsString.EnMicrosoft)]
            public string Description { get; set; }

            [IsSearchable]
            [Analyzer(AnalyzerName.AsString.FrLucene)]
            [JsonProperty("Description_fr")]
            public string DescriptionFr { get; set; }

            [IsSearchable, IsFilterable, IsSortable, IsFacetable]
            public string Category { get; set; }

            [IsSearchable, IsFilterable, IsFacetable]
            public string[] Tags { get; set; }

            [IsFilterable, IsSortable, IsFacetable]
            public bool? ParkingIncluded { get; set; }

            [IsFilterable, IsSortable, IsFacetable]
            public DateTimeOffset? LastRenovationDate { get; set; }

            [IsFilterable, IsSortable, IsFacetable]
            public double? Rating { get; set; }

            public Address Address { get; set; }
        }
    }
    

    The Hotel class represents a document stored in the JSON file. As you can see in Listing 6-2, each of the properties has property attributes assigned, such as IsFilterable, IsSortable, IsFacetable, and IsSearchable, which match the attributes in the definition of a search index field. When you work with the .NET Azure Search SDK, you need to explicitly add the property attributes to the properties that define your document; you cannot depend only on the definition of the index fields. This is different from using the REST API, where you don’t need to add that explicit definition to the JSON document. Also, pay attention to the HotelId property. This property also has the System.ComponentModel.DataAnnotations.Key property attribute assigned, which means it is the Key attribute for this document. You also use property attributes for configuring the Analyzer for properties that have the IsSearchable attribute.

  40. Add a new empty C# class file and name it Hotel.Methods.cs.
  41. Replace the content of the Hotel.Methods.cs file with the content of Listing 6-3.
  42. Listing 6-3 Hotel.Methods.cs file

    // C# .NET
    using System;
    using System.Collections.Generic;
    using System.Text;

    namespace <your_project_name>
    {
        public partial class Hotel
        {
            public override string ToString()
            {
                var builder = new StringBuilder();
                if (!String.IsNullOrEmpty(HotelId))
                {
                    builder.AppendFormat("HotelId: {0}\n", HotelId);
                }
                if (!String.IsNullOrEmpty(HotelName))
                {
                    builder.AppendFormat("Name: {0}\n", HotelName);
                }
                if (!String.IsNullOrEmpty(Description))
                {
                    builder.AppendFormat("Description: {0}\n", Description);
                }
                if (!String.IsNullOrEmpty(DescriptionFr))
                {
                    builder.AppendFormat("Description (French): {0}\n", DescriptionFr);
                }
                if (!String.IsNullOrEmpty(Category))
                {
                    builder.AppendFormat("Category: {0}\n", Category);
                }
                if (Tags != null && Tags.Length > 0)
                {
                    builder.AppendFormat("Tags: [ {0} ]\n", String.Join(", ", Tags));
                }
                if (ParkingIncluded.HasValue)
                {
                    builder.AppendFormat("Parking included: {0}\n", ParkingIncluded.Value ? "yes" : "no");
                }
                if (LastRenovationDate.HasValue)
                {
                    builder.AppendFormat("Last renovated on: {0}\n", LastRenovationDate);
                }
                if (Rating.HasValue)
                {
                    builder.AppendFormat("Rating: {0}\n", Rating);
                }
                if (Address != null && !Address.IsEmpty)
                {
                    builder.AppendFormat("Address: \n{0}\n", Address.ToString());
                }
                return builder.ToString();
            }
        }
    }
    
  43. Add a new empty C# class file and name it Address.cs.
  44. Replace the content of the Address.cs file with the content of Listing 6-4.
  45. Listing 6-4 Address.cs file

    // C# .NET
    using System;
    using Microsoft.Azure.Search;
    using Microsoft.Azure.Search.Models;
    using Newtonsoft.Json;

    namespace <your_project_name>
    {
        public partial class Address
        {
            [IsSearchable]
            public string StreetAddress { get; set; }

            [IsSearchable, IsFilterable, IsSortable, IsFacetable]
            public string City { get; set; }

            [IsSearchable, IsFilterable, IsSortable, IsFacetable]
            public string StateProvince { get; set; }

            [IsSearchable, IsFilterable, IsSortable, IsFacetable]
            public string PostalCode { get; set; }

            [IsSearchable, IsFilterable, IsSortable, IsFacetable]
            public string Country { get; set; }
        }
    }
    
  46. Add a new empty C# class file and name it Address.Methods.cs.
  47. Replace the content of the Address.Methods.cs file with the content of Listing 6-5.
  48. Listing 6-5 Address.Methods.cs file

    // C# .NET
    using System;
    using System.Collections.Generic;
    using System.Text;
    using Newtonsoft.Json;

    namespace <your_project_name>
    {
        public partial class Address
        {
            public override string ToString() =>
                IsEmpty ?
                string.Empty :
                $"{StreetAddress}\n{City}, {StateProvince} {PostalCode}\n{Country}";

            [JsonIgnore]
            public bool IsEmpty => String.IsNullOrEmpty(StreetAddress) &&
                String.IsNullOrEmpty(City) &&
                String.IsNullOrEmpty(StateProvince) &&
                String.IsNullOrEmpty(PostalCode) &&
                String.IsNullOrEmpty(Country);
        }
    }
    
  49. Open the Program.cs file and add the following using statements:

    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using Microsoft.Azure.Search;
    using Microsoft.Azure.Search.Models;
    using Microsoft.Extensions.Configuration;
    using Newtonsoft.Json;

  50. Replace the content of the Main method with the content in Listing 6-6.
  51. Listing 6-6 Program.cs Main method

    // C# .NET
    IConfigurationBuilder builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
    IConfigurationRoot configuration = builder.Build();

    string searchServiceName = configuration["SearchServiceName"];
    string indexName = configuration["SearchIndexName"];
    string adminApiKey = configuration["SearchServiceAdminApiKey"];
    string jsonFilename = configuration["JSONDocumentsFile"];

    SearchServiceClient serviceClient = new SearchServiceClient(searchServiceName,
        new SearchCredentials(adminApiKey));
    ISearchIndexClient indexClient = serviceClient.Indexes.GetClient(indexName);

    // Batch documents import.
    // Reading the documents from the JSON file.
    List<Hotel> actions;
    using (StreamReader file = File.OpenText(jsonFilename))
    {
        string json = file.ReadToEnd();
        actions = JsonConvert.DeserializeObject<List<Hotel>>(json);
    }

    // Create a batch object.
    var batchActions = new List<IndexAction<Hotel>>();
    foreach (var hotel in actions)
    {
        var indexAction = new IndexAction<Hotel>(hotel);
        batchActions.Add(indexAction);
    }
    var batch = IndexBatch.New(batchActions.ToArray());

    // Push the documents to the Azure Search service instance.
    try
    {
        indexClient.Documents.Index(batch);
    }
    catch (IndexBatchException ex)
    {
        Console.WriteLine($"Failed to index some documents: {String.Join(", ",
            ex.IndexingResults.Where(r => !r.Succeeded).Select(r => r.Key))}");
    }
    

Now press F5 to execute the code. You can check whether the documents have been correctly loaded into your index by reviewing the Overview blade in your Azure Search service instance, as shown in Figure 6-12. Depending on the size of your dataset, it can take several minutes to show the updated summary information in the Overview blade.

Figure 6-12 List of indexes in an Azure Search service instance

Pushing data to your search index is one way of importing data. The other way is to pull data from a supported data source. By using indexers, the Azure Search service can connect to Azure Blobs, Azure Tables, Azure Cosmos DB, Azure SQL Database, or databases in SQL Server deployed on Azure VMs, and it can extract information from those data sources. The indexer connects to a table, view, or equivalent structure in the data source, and it maps the columns or fields in the data source structure to the fields defined in the search index. Then the indexer converts the rowset into a JSON document that is loaded into the index. You can schedule the indexer to check for changes in the data source at regular intervals. You can configure an indexer by using the Azure Portal, the REST API, or the .NET Azure Search SDK. The following procedure shows how to import data from a Cosmos DB database to an index by using the Azure Portal:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the navigation menu on the left side of the Azure Portal.
  3. Click Azure Cosmos DB on the Popular column in the New blade.
  4. On the Create Azure Cosmos DB Account blade, select a Subscription from the drop-down menu.
  5. Select an existing resource group from the Resource Group drop-down menu. Alternatively, you can create a new resource group by clicking the Create New link below the drop-down menu.
  6. Type an Account Name in the Enter Account Name text box.
  7. Ensure that the Core (SQL) value is selected in the API drop-down menu.
  8. Leave other options as is and click the Review + Create button at the bottom of the blade.
  9. On the Review + Create tab, click the Create button at the bottom of the blade.
  10. On the Deployment Overview blade, click the Go To Resource button. This button appears when the new Cosmos DB is deployed successfully.
  11. Click Data Explorer in the navigation bar on the left side of your newly created Azure Cosmos DB Account blade.
  12. On the Data Explorer blade, click the New Container button.
  13. On the Add Container panel, shown in Figure 6-13, type hoteldb as the Database id.

    Figure 6-13 Creating a Cosmos DB database for Azure Search.
    Screenshot_71

  14. Type HotelId as the Container Id.
  15. Type Category as the Partition Key. You don't need to add the / character at the beginning of the Partition Key because the Azure Portal adds it automatically for you.
  16. Click the OK button at the bottom of the panel.
  17. On the Data Explorer blade, in the tree control on the left side of the blade, choose hoteldb > HotelId > Items.
  18. Click the Upload Item icon at the top of the Items tab.
  19. On the Upload Items panel, click the folder icon next to the text box.
  20. In the Open file dialog box, look for the JSON file that contains the sample data. Remember that you can get this file from the Azure Search sample dataset, which you can download from https://azure.microsoft.com/en-us/resources/samples/azure-search-sample-data/
  21. Click the Upload button at the bottom of the panel.
  22. Click Keys on the navigation menu on the left side of the Azure Cosmos DB Account.
  23. Copy the Primary Connection String. Ensure that the connection string contains the AccountEndpoint and AccountKey parameters. You need this value in a later step.
  24. In the Search Resources text box at the top of the Azure Portal, type the name of your Azure Search service instance.
  25. Click your Azure Search service instance's name in the results list.
  26. Click the Import Data button on the Overview blade.
  27. On the Import Data blade, select Cosmos DB in the Data Source drop-down menu control.
  28. Type a Name for the data source.
  29. Paste the connection string that you copied in step 23 into the Cosmos DB Account text box.
  30. Select the hoteldb value in the Database drop-down menu.
  31. In the Collection drop-down menu, select HotelId. Your blade should look like Figure 6-14.

    Figure 6-14 Creating an Azure Search data source.
    Screenshot_72

  32. Click the Add Cognitive Search (Optional) button at the bottom of the blade. When you click this button, the Import Data wizard automatically creates the configured data source.
  33. On the Add Cognitive Search (Optional) tab, click the Skip To: Customize Target Index option at the bottom of the blade.
  34. On the Customize Target Index tab, select HotelId in the Key drop-down menu.
  35. On the fields definition table, click the checkbox in the Retrievable column, which selects this attribute for all fields.
  36. Click the checkbox in the Searchable column, which selects this attribute for all fields of the String type.
  37. In the Description field, select the English – Microsoft value in the Analyzer column.
  38. In the Description_fr field, select the French – Microsoft value in the Analyzer column. Your index definition should look like Figure 6-15.

    Figure 6-15 Creating an Azure Search search index.
    Screenshot_73

  39. Click the Next: Create An Indexer button at the bottom of the blade.
  40. On the Create An Indexer tab, leave the Name as is. If you prefer, you can provide a different indexer name.
  41. In the Schedule setting, click Hourly. This setting configures how often the indexer checks for changes in the data source.
  42. Click the Submit button.
  43. On the Overview blade of your Azure Search service instance, you can view the newly created index. Importing the new data can take several minutes.

When you are using the Import Data wizard in the Azure Portal, you can only associate data sources with new indexes created during the import process. If you need to associate a data source with an existing search index, you need to use the REST API or the .NET Azure Search SDK.
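If you need to do this from code, the fragment below is a minimal sketch using the same Microsoft.Azure.Search SDK and the serviceClient object from Listing 6-6; the data source and indexer names are hypothetical placeholders, and the exact factory signatures may differ slightly between SDK versions:

    // C# .NET
    // Sketch: attach a Cosmos DB data source to an existing index and run an indexer.
    // Assumes the serviceClient object created in Listing 6-6; names are placeholders.
    DataSource dataSource = DataSource.CosmosDb(
        "hotels-cosmosdb",
        "<your_cosmos_db_connection_string>",
        "HotelId");
    serviceClient.DataSources.CreateOrUpdate(dataSource);

    // The indexer maps the data source fields to the fields of the existing index.
    Indexer indexer = new Indexer("hotels-indexer", dataSource.Name, "hotels");
    serviceClient.Indexers.CreateOrUpdate(indexer);

    // Trigger an on-demand run instead of waiting for the schedule.
    serviceClient.Indexers.Run(indexer.Name);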

Exam Tip

Using the Azure Portal to work with Azure Search indexes offers a limited set of features. The Azure Portal is the best tool for creating the initial definition of fields, data sources, and indexes. If you need to import data into an existing index or modify the definition of existing indexes, you should use the REST API or the .NET Azure Search SDK.

Need More Review? Loading Data Into Azure Search

You learn more about complex scenarios by reviewing the following articles:

Load Data https://docs.microsoft.com/en-in/azure/search/search-what-is-data-import

Load Data with Indexers https://docs.microsoft.com/en-in/azure/search/search-indexer-overview

Indexer Operations Using the Azure Search Service REST API https://docs.microsoft.com/en-us/rest/api/searchservice/indexer-operations

Need More Review? Using Cognitive Services with Azure Search Service

You can take advantage of several Azure Cognitive Services by integrating your Azure Search Service with Computer Vision (for image analysis) or Text Analytics (for entity recognition). This integration enriches the features available in Azure Search. You can review how to make these integrations at https://docs.microsoft.com/en-in/azure/search/cognitive-search-attach-cognitive-services

Query the Azure Search index

Once you have defined your index and populated it with your dataset, you need to query the index to take advantage of the Azure Search service. You can make queries to your index by using different tools:

Search Explorer You can use the Search Explorer tool integrated into the Azure Portal for querying your index.

Web testing tools Use your favorite web testing tool, such as Fiddler or Postman, to make REST API calls.

.NET SDK Using the .NET Azure Search SDK, you can abstract away the details of implementing the REST API calls to the Azure Search service. You need to use the SearchIndexClient class for querying the index.

REST API You can make REST API calls using your favorite language, using the GET or POST methods on your index.

Whatever tool you decide to use for making queries to your index, you need to choose between two different query types—simple and full.

Simple The simple query type is used for typical queries. The behavior of Azure Search when you use the simple query type is similar to other search engines, like Google or Bing. This query type is faster and more effective for free-form text queries. The simple query syntax contains operators like AND, OR, NOT, phrase, suffix, and precedence operators.

Full The full query type extends the features of the simple query type. While the simple query type uses the Simple Query Parser, the full query type uses the Lucene Query Parser. The Azure Search service is based on the Apache Lucene high-performance search engine developed by the Apache Software Foundation. Using the full query type, you can construct more complex queries by using regular expressions, proximity searches, or fuzzy and wildcard searches, among others. Using queryType=full in your request instructs Azure Search to use the Lucene Query Parser instead of the default Simple Query Parser.
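For illustration, the following fragment is a minimal sketch of a full-type query using the SearchParameters and indexClient objects from the examples in this section; the misspelled search term is a hypothetical example of a fuzzy search:

    // C# .NET
    // Sketch: the full query type enables Lucene features such as fuzzy search.
    SearchParameters fullParameters = new SearchParameters()
    {
        QueryType = QueryType.Full  // use the Lucene Query Parser
    };
    // The ~ operator requests a fuzzy match, so the misspelled
    // term "hotle" still matches documents containing "hotel".
    DocumentSearchResult<Hotel> fuzzyResults =
        indexClient.Documents.Search<Hotel>("hotle~", fullParameters);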

When constructing a query, you need to bear in mind the definition of your index. Depending on the attributes that you assigned to the fields of your index, you can use a particular field for sorting or filtering the results, or for searching. The definition of a field also determines whether the field is included in the results of the query. For example, in the previous hotels index example, in which you imported the information from an Azure Cosmos DB database, you cannot sort by any field because you didn't configure any field as sortable.

Another important distinction that you should bear in mind when querying your index is the difference between filtering and searching.

Searching Searching uses full-text search to look for a value in a string. During the lookup process in a full-text search, the Lucene engine removes stopwords (such as "the" or "and"), reduces the query term to the root of the word, lowercases the query term, and breaks composite words into their parts. Then the engine returns the list of matches.

Filtering On the other hand, filtering uses a Boolean expression that must evaluate to true or false. You can use filtering with strings, but in those situations, you only get the results that exactly match your filter expression, including word casing.

Need More Review? The Lucene Full Text Search Engine

In most situations, the Lucene search engine returns the expected results, but there could be some situations in which you get unexpected results. In those situations, it is helpful to understand how the full-text search engine works. You can learn about the Lucene engine in the Azure Search service at https://docs.microsoft.com/en-in/azure/search/search-lucene-query-architecture.

The following procedure shows how to perform a search using the .NET Azure Search SDK. It is based on the procedure that you followed in "Import Searchable Data," earlier in this chapter, in which you learned how to push data into an Azure Search index. Ensure that you have completed the procedure in Listings 6-1 to 6-6 before starting the following procedure:

  1. Open the solution for the example you created in the “Import Searchable Data” section earlier in this chapter.
  2. Open the Program.cs file.
  3. Add the content of Listing 6-7 to the end of the Main method.

    Listing 6-7 Program.cs Main method

    	
// C# .NET
//Querying the index.
SearchParameters parameters;
DocumentSearchResult<Hotel> results;
//Looking for hotels in Atlanta.
Console.WriteLine("Query 1: Search for term 'Atlanta'");
parameters = new SearchParameters();
results = indexClient.Documents.Search<Hotel>("Atlanta", parameters);
WriteDocuments(results);
//Looking for hotels in Atlanta. Get only certain fields of the document.
Console.WriteLine("Query 2: Search for term 'Atlanta'");
Console.WriteLine("Get only properties: HotelName, Tags, Rating, and Address:\n");
parameters = new SearchParameters()
{
    Select = new[] { "HotelName", "Tags", "Rating", "Address" },
};
results = indexClient.Documents.Search<Hotel>("Atlanta", parameters);
WriteDocuments(results);
//Looking for hotels with restaurants and wifi. Get only the HotelName,
//Description, and Tags properties.
Console.WriteLine("Query 3: Search for terms 'restaurant' and 'wifi'");
Console.WriteLine("Get only properties: HotelName, Description, and Tags:\n");
parameters = new SearchParameters()
{
    Select = new[] { "HotelName", "Description", "Tags" },
};
results = indexClient.Documents.Search<Hotel>("restaurant wifi", parameters);
WriteDocuments(results);
//Use filtering instead of full-text searches.
Console.WriteLine("Query 4: Filter on ratings greater than 4");
Console.WriteLine("Returning only these fields: HotelName, Rating:\n");
parameters = new SearchParameters()
{
    Filter = "Rating gt 4",
    Select = new[] { "HotelName", "Rating" }
};
results = indexClient.Documents.Search<Hotel>("*", parameters);
WriteDocuments(results);
//Getting the two best-scored hotels.
Console.WriteLine("Query 5: Search on term 'boutique'");
Console.WriteLine("Sort by rating in descending order, taking the top two results");
Console.WriteLine("Returning only these fields: HotelId, HotelName, Category, Rating:\n");
parameters = new SearchParameters()
{
    //If you try to use a field that is not configured with the
    //IsSortable attribute, you get an error.
    OrderBy = new[] { "Rating desc" },
    Select = new[] { "HotelId", "HotelName", "Category", "Rating" },
    Top = 2
};
results = indexClient.Documents.Search<Hotel>("boutique", parameters);
WriteDocuments(results);
    	
    
  4. Add the method shown in Listing 6-8 to the Program class.

Listing 6-8 Program.cs WriteDocuments helper method

	
// C# .NET
//Helper method for printing the results of a query.
private static void WriteDocuments(DocumentSearchResult<Hotel> searchResults)
{
foreach (SearchResult<Hotel> result in searchResults.Results)
{
Console.WriteLine(result.Document);
}
Console.WriteLine();
}
	

Need More Review? Querying Azure Search Index Using REST API

Querying the Azure Search REST API to get results is similar to the process that you used to upload documents or create an index. The following articles show how to use the REST API to get results from your Azure Search index, using the simple and full query types:

Query data examples simple syntax https://docs.microsoft.com/en-in/azure/search/search-query-simple-examples

Query data examples full syntax https://docs.microsoft.com/en-in/azure/search/search-query-lucene-examples

Skill 6.3: Establish API Gateways

Most of the applications and solutions that you can find or develop nowadays offer an API for accessing the features available in the solution. In business environments, those solutions usually need to communicate with each other using their respective APIs. Sometimes, you need to expose your solutions to your clients to offer your services. In those situations, you need to ensure that you offer a consistent and secure API. It isn’t easy to implement the necessary mechanism to achieve an enterprise-grade level of security, consistency, and flexibility. If you also need to publish several of your services under a common API, this task is even harder.

Microsoft provides the Azure API Management (APIM) service. This service allows you to create an enterprise-grade API for your existing back-end services. Using APIM, you can securely publish your back-end applications, providing your customers with a platform protected against denial-of-service (DoS) attacks and capable of performing JWT token validation.

This skill covers how to:

  • Create an APIM instance
  • Configure authentication for APIs
  • Define policies for APIs

Create an APIM instance

The API Management service allows you to expose a portion (or all) of the APIs offered by your back-end systems. By using the APIM service, you can unify all your back-end APIs in a common interface that you can offer to external users, such as clients or partners and internal or external developers. In general, the APIM service is a façade of the APIs that you configure in your APIM instance. Thanks to this façade feature, you can customize the front-end API offered by the APIM instance without changing the back-end API.

When exposing your back-end systems, you are not limited to REST API back ends. You can use a back-end service that uses a SOAP API and then publish this SOAP API as a REST API. This means you can update your older back-end systems without needing to modify the code and take advantage of the greater level of integration of the REST APIs. Use the following procedure to create a new APIM instance:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource on the navigation bar on the left side of the Azure Portal.
  3. On the New blade, click Integration in the Azure Marketplace column.
  4. Click API Management in the Featured column. If the API Management service doesn’t appear in the Featured column, you can use the Search The Marketplace text box and look for the API Management service.
  5. On the API Management Service blade, type a Name for your new APIM instance.
  6. Select a subscription from the Subscription drop-down menu.
  7. Select a resource group from the Resource Group drop-down menu. Alternatively, you can create a new one by clicking the Create New link below the drop-down menu.
  8. Select a location from the Location drop-down menu.
  9. In the Organization Name text box, type the name of your organization. This name appears on the developer’s portal and email notifications.
  10. In the Administrator Email text box, type the email address that should receive all notifications from the APIM instance. By default, the value associated with this property is the email address of the logged-in user.
  11. In the Pricing Tier, leave the Developer tier selected.
  12. Click the Create button at the bottom of the blade. The process of creating the APIM instance takes several minutes. When your new APIM instance is ready, you will receive a welcome email at the administrator email address that you configured in step 10.

Note: Pricing Tiers

The Developer pricing tier is appropriate for testing and development environments, but you should not use it for production because the Developer tier does not offer high-availability features and can be affected by disconnections during the updates of the node. You can review the full offer and the features available on each tier at https://azure.microsoft.com/en-us/pricing/details/api-management/

Once you have created your APIM instance, you can start adding APIs to your instance. In the following procedure, you are going to add two different APIs, using different methods. The first method is the OpenAPI specification. For the second API, you are going to create a blank API definition and add only those methods that are appropriate for you.

  1. Open the Azure Portal (https://portal.azure.com).
  2. Type the name of your APIM instance in the Search text box at the top of the portal.
  3. Click the name of your APIM instance in the results list.
  4. Click APIs in the navigation menu on your APIM instance blade.
  5. On the Add A New API blade, click OpenAPI.
  6. On the Create From OpenAPI Specification dialog box, shown in Figure 6-16, copy the following URL into the OpenAPI Specification text box: https://conferenceapi.azurewebsites.net/?format=json

    Figure 6-16 Adding a back-end API to an APIM instance.
    Screenshot_74

  7. Azure automatically fills the Display Name and Name properties text boxes; check them to ensure the entries are correct.
  8. Type conference in the API URL Suffix field. If you are going to connect more than one back-end API to the APIM instance, you need to provide a suffix for each API. The APIM instance uses this suffix to differentiate among the different APIs that you connect to the instance.
  9. Do not select any product for the Products property.
  10. Click the Create button at the bottom of the dialog box.

At this point, you have added your first back-end API to the APIM instance by using the OpenAPI specification of your back-end API. In the following steps, you are going to add a back-end API without using any specification. Creating the front-end endpoints is useful if you need to connect only a few endpoints from your back-end API, or if you don’t have the OpenAPI or SOAP specification of your API in any format:

  1. Click APIs on the navigation menu in your APIM instance blade.
  2. On the Add A New API blade, click Blank API.
  3. On the Create A Blank API dialog box, type Fake API in the Display Name text box.
  4. Leave the Name property with the default value.
  5. Type https://fakerestapi.azurewebsites.net in the Web Service URL text box.
  6. Type fakeapi in the API URL Suffix text box.
  7. On the Design tab of the API blade with the newly added API selected, click Add Operation.
  8. On the Add Operation editor, shown in Figure 6-17, type GetActivities in the Display Name text box.

    Figure 6-17 Adding an API operation to an API in an APIM instance.
    Screenshot_75

  9. In the URL HTTP Method drop-down menu, ensure that the GET method is selected.
  10. In the URL text box, type /api/activities.
  11. Click the Save button at the bottom of the editor.
  12. On the API blade, ensure that Fake API is selected.
  13. Click the Test tab.
  14. Click the GetActivities operation.
  15. Click the Send button at the bottom of the GetActivities operation panel. Using this panel, you can test each of the operations that are defined in your API. Alternatively, you can use the Developer Portal for testing your APIs and applications. You can access the Developer Portal by clicking the appropriate button at the top of the APIs blade.

At this point, you have two back-end APIs connected to your APIM instance. As you can see in the previous example, you don't need to expose the entire back-end API. By adding the appropriate operations, you can publish only those parts of the back-end API that are useful for you. Once you have created the APIs in your APIM instance, you can grant your developers access to these APIs by using the Developer Portal. You can access the APIM Developer Portal at https://<your_APIM_name>.developer.azure-api.net/.

Bear in mind that you need to associate a Product with your API to publish it. Because you didn't associate your APIs with any Product, your APIs won't be available to the external world. Note that you can associate an API with more than one Product. Use the following procedure to create a Product and associate it with your APIs:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Type the name of your APIM instance in the Search text box at the top-middle of the portal.
  3. Click the name of your APIM instance in the results list.
  4. Click Products on the navigation menu in your APIM instance blade.
  5. Click the Add button in the top-left corner of the Products blade.
  6. Type a Name in the Display Name text box on the Add Product panel.
  7. Leave the value in the ID text box as is.
  8. Type a description in the Description text area.
  9. Select the Published value in the State switch control. If you don’t select this option at this time, you can publish later, or you can publish the Product using its panel.
  10. Click the Select API button in the APIs section.
  11. On the APIs blade, select the Demo Conference API and the Fake API by clicking the checkbox next to the name of each API.
  12. Click the Select button at the bottom of the panel.
  13. Click the Create button at the bottom of the Add Product panel.

By default, when you create a new Product, only members of the Administrators built-in group can access the Product. You can configure this by using the Access Control section in the Product.

You can customize the Developer Portal for your APIM instance by modifying the content of the pages, adding more pages, and so on. You can review the details about how to perform these customizations by consulting the article at https://docs.microsoft.com/en-us/azure/api-management/api-management-customize-styles

Need More Review? Revisions and Versions

During the lifetime of your API, you may need to add to, update, or remove operations from your API. You can make these modifications without disrupting the usage of your API by using revisions and versions. You can review how to work with revisions and versions in your API by reading the article at https://azure.microsoft.com/es-es/blog/versions-revisions/

Configure authentication for APIs

Once you have imported your back-end APIs, you need to configure the authentication for accessing these APIs. When you configure the security options in the APIM instance, the back-end API delegates the security to the APIM instance. This means that even though your back-end API implements its own authentication mechanisms, they are never used when the API is accessed through the APIM instance.

This ability to hide the authentication of the back-end APIs is useful for unifying your security using a consistent and unique authentication mechanism. You can manage the authentication options associated with a Product or API by using Subscriptions. A Subscription manages the keys a developer can use to access your API. If an HTTP request made to an API protected by a Subscription does not provide a valid subscription key, the request is immediately rejected by the APIM gateway without reaching your back-end API. When you define a Subscription, you can use three different scopes to apply this Subscription:

Product When a developer wants to use one of your Products, the developer connects to the Developer Portal of your APIM instance and submits a request to subscribe to the product he or she wants to use.

All APIs The developer can access all APIs in your APIM instance using the same subscription key.

Single API The developer can access a single API in your APIM instance using a subscription key. There is no need for the API to be part of a Product.

If you use the All APIs or Single API scope, you don't need to associate the back-end API with a Product. A Subscription using either of these two scopes allows access directly to the API. You can use the following procedure to create a Subscription and associate it with a Product:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Type the name of your APIM instance in the Search text box at the top of the portal.
  3. Click the name of your APIM instance in the results list.
  4. Click Subscriptions in the navigation menu in your APIM instance blade.
  5. Click the Add Subscription button in the top-left corner of the Subscriptions blade.
  6. On the New Subscription panel, shown in Figure 6-18, type a Name for the subscription. This name may only contain letters, numbers, and hyphens.

    Figure 6-18 Creating a new API Management Subscription.
    Screenshot_76

  7. In the Scope drop-down menu, select the Product value.
  8. Click the Product property.
  9. In the Products panel, click the name of the Product that you created in the previous section.
  10. Click the Select button at the bottom of the panel.
  11. Click the Save button at the bottom of the panel.
  12. On the Subscription blade, click the ellipsis at the end of the row for your newly created subscription.
  13. On the contextual menu, click Show/Hide Keys. You can use either of these keys to access the APIs configured in the Product associated with the Subscription. You need to use the Ocp-Apim-Subscription-Key header to provide the subscription key in your HTTP requests.
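The following fragment is a minimal sketch of using one of those keys from C# code; the gateway URL and key value are placeholders, and the fakeapi suffix and /api/activities operation come from the Fake API you configured earlier:

    // C# .NET
    // Sketch: calling an API published in APIM using a subscription key.
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class ApimCallExample
    {
        static async Task Main()
        {
            using (HttpClient client = new HttpClient())
            {
                // Without a valid key, the APIM gateway rejects the request
                // before it ever reaches the back-end API.
                client.DefaultRequestHeaders.Add(
                    "Ocp-Apim-Subscription-Key", "<your_subscription_key>");
                string json = await client.GetStringAsync(
                    "https://<your_APIM_name>.azure-api.net/fakeapi/api/activities");
                Console.WriteLine(json);
            }
        }
    }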

Need More Review? Other Authentication Methods

Using subscription and subscription keys is not the only mechanism for protecting access to your APIs. API Management allows you to use OAuth 2.0, client certificates, and IP whitelisting. You can use the following articles to review how to use other authentication mechanisms for protecting your APIs:

IP whitelisting https://docs.microsoft.com/en-us/azure/api-management/api-management-access-restriction-policies#RestrictCallerIPs

OAuth 2.0 authentication using Azure AD https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad

Mutual authentication using client certificates https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-mutual-certificates

Define policies for APIs

When you publish a back-end API using the API Management service, all the requests made to your APIM instance are forwarded to the correct back-end API, and the response is sent back to the requestor. By default, none of these requests or responses is altered or modified. However, there could be some situations in which you need to modify some requests or responses. An example of such a modification could be transforming the format of a response from XML to JSON. Another example could be throttling the number of incoming calls from a particular IP or user.

A policy is a mechanism that you can use to change the default behavior of the APIM gateway. Policies are XML documents that describe a sequence of inbound and outbound steps or statements. Each policy is made of four sections:

Inbound In this section, you can find any statement that applies to requests from the managed API clients.

Backend This section contains the steps that need to be applied to the request that is sent from the API gateway to the back-end API.

Outbound This section contains statements or modifications that you need to apply to the response before it's sent to the requestor.

On-Error If there is an error in any of the other sections, the engine stops processing the remaining steps in the faulty section and jumps to this section. The skeleton after this list shows how the four sections fit together.
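Put together, a policy document has the following general shape; the <base /> element in each section inherits the statements defined at the enclosing scope:

    <policies>
        <inbound>
            <!-- Statements applied to incoming requests -->
            <base />
        </inbound>
        <backend>
            <!-- Statements applied to the request sent to the back-end API -->
            <base />
        </backend>
        <outbound>
            <!-- Statements applied to the response before it is returned -->
            <base />
        </outbound>
        <on-error>
            <!-- Statements applied when another section raises an error -->
            <base />
        </on-error>
    </policies>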

When you are configuring or defining a policy, you need to bear in mind that you can apply it at different scope levels:

Global The policy applies to all APIs in your APIM instance. You can configure global policies by using the code editor in the All APIs policy editor on the APIs blade of your APIM instance.

Product The policy applies to all APIs associated with a Product. You can configure product policies on the Policies blade of the Product in your APIM instance.

API The policy applies to all operations configured in the API. You can configure API-scoped policies by using the code editor in the All Operations option on the Design tab of the API in your APIM instance.

Operation The policy applies only to a specific operation in your API. You can configure operation-scoped policies by using the code editor in the specific operation.

Policies are a powerful and very flexible mechanism that allow you to do a lot of useful work, such as applying caching to the HTTP requests, performing monitoring on the request and responses, authenticating with your back-end API using different authentication mechanisms, or even interacting with external services. Use the following procedure to apply some transformations to the Demo Conference API that you configured in previous sections:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Type the name of your APIM instance in the Search text box at the top of the portal.
  3. Click the name of your APIM instance in the results list.
  4. Click APIs in the navigation menu on your APIM instance blade.
  5. Click Demo Conference API on the APIs blade.
  6. Click the GetSpeakers operation.
  7. Click the Test tab.
  8. Click the Send button at the bottom of the tab. This will send a request to the Demo Conference API and get results similar to those shown in Figure 6-19. In this procedure, you transform the HTTP headers highlighted in Figure 6-19.
  9. Figure 6-19 Testing an API operation.
    Screenshot_77
  10. Click the Design tab.
  11. Click All Operations in the list of available operations for this API.
  12. Click the icon next to Policies in the Outbound Processing section.
  13. In the Policy Editor, place the cursor before the base tag in the Outbound section and add a new line by pressing the Enter key.
  14. Click the Insert Policy button in the top-left corner of the Policy Editor.
  15. In the list of available policies on the right side of the Policy Editor, navigate to Transformation Policies.
  16. Click the Set HTTP Header policy twice to insert two copies of the policy.
  17. Modify the inserted policies with the following content:

    <set-header name="X-Powered-By" exists-action="delete" />
    <set-header name="X-AspNet-Version" exists-action="delete" />

  18. In the list of available policies on the right side of the Policy Editor, click the Find And Replace String In Body policy. Insert this policy below the two policies that you inserted in the previous step.
  19. Modify the inserted policy by adding the from and to values. The policy should look like this:

    <find-and-replace from="://conferenceapi.azurewebsites.net" to="://<your_APIM_name>.azure-api.net/conference" />

  20. Click the Save button at the bottom of the Policy Editor.
  21. Repeat steps 6 to 8 to apply the transformation policies. You should notice that the X-Powered-By and X-AspNet-Version headers are missing. Also, the href properties point to the correct URL using your APIM instance.

Need More Review? More About Policies

There are many useful things that you can do using policies, far too many to cover in this section.

If you want to learn more about APIM policies, see the following articles:

Error handling in API Management https://docs.microsoft.com/en-us/azure/api-management/api-management-error-handling-policies

How to Set or Edit Azure API Management Policies https://docs.microsoft.com/en-us/azure/api-management/set-edit-policies

Skill 6.4: Develop event-based solutions

One of the main principles of code development is to reuse as much code as possible. To make code reusable, you need to ensure that it is as loosely coupled as possible, which reduces its dependencies on other parts of the code or other systems to a minimum.

With this principle in mind, loosely coupled systems still need a way to communicate. Event-driven architectures allow communication between separate systems by sharing information through events.

In general, an event is a significant change of the system state that happens in the context of the system. An example of an event could be when a user adds an item to the shopping cart in an e-Commerce application or when an IoT device collects the information from its sensors.

Azure provides different services, such as Event Grid, Notification Hubs, and Event Hubs, to cover the different needs that arise when implementing event-driven architectures.

This skill covers how to:

  • Implement solutions that use Azure Event Grid
  • Implement solutions that use Azure Notification Hubs
  • Implement solutions that use Azure Event Hub

Implement solutions that use Azure Event Grid

Azure Event Grid allows you to create applications with serverless architectures by providing a reliable platform for managing events. You can use Azure Event Grid for connecting to several types of event sources, such as Azure Blob Storage, Azure subscriptions, Event Hubs, IoT Hubs, and others; Azure Event Grid also allows you to use different event handlers to manage these events. You can also create your custom events to integrate your application with Azure Event Grid. Before you start using Azure Event Grid in your solutions, you should review some basic concepts:

Event This is a change of state in the source (for example, a new blob added to an Azure Blob Storage account).

Event source This is the service or application in which an event occurs; there is an event source for every event type.

Event handler This is the app or service that reacts to the event.

Topics These are the endpoints where the event source can send the events. You can use topics for grouping several related events.

Event subscriptions When a new event is added to a topic, that event can be processed by one or more event handlers. The event subscription is an endpoint or built-in mechanism for distributing the events between the different event handlers. Also, you can use subscriptions to filter incoming events.

An important consideration that you need to bear in mind is that an event does not contain the full information about the event itself. The event only contains information relevant to the event itself, such as the source of the event, a time when the event took place, and a unique identifier. For example, when a new blob is added to an Azure Blob storage account, the new blob event doesn’t contain the blob. Instead, the event contains a reference to the blob in the Azure Blob storage account.
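For example, an event published when a new blob is created looks similar to the following abridged JSON; the subscription, storage account, and blob names are placeholder values, and the data.url property is only a reference to the blob, not its content:

    [{
      "topic": "/subscriptions/<subscription_id>/resourceGroups/<resource_group>/providers/Microsoft.Storage/storageAccounts/<storage_account>",
      "subject": "/blobServices/default/containers/images/blobs/photo.png",
      "eventType": "Microsoft.Storage.BlobCreated",
      "eventTime": "2019-06-01T12:00:00.000Z",
      "id": "<unique_event_id>",
      "data": {
        "api": "PutBlob",
        "contentType": "image/png",
        "contentLength": 524288,
        "blobType": "BlockBlob",
        "url": "https://<storage_account>.blob.core.windows.net/images/photo.png"
      },
      "dataVersion": "",
      "metadataVersion": "1"
    }]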

When you need to work with events, you configure an event source to send events to a topic. Any system or event handler that needs to process those events subscribes to that topic. When new events are raised, the event source pushes the event to the topic configured in the Azure Event Grid service.

Any event handler subscribed to that topic reads the event and processes it according to its internal programming. There is no need for the event source to have event handlers subscribed to the topic; the event source pushes the event to the topic and forgets it. The following steps show how to create a custom topic. Then we will create console applications using C# to send events to the topic and process these events:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click All Services in the navigation menu on the left side of the Azure Portal.
  3. In the Search Everything text box, type event.
  4. Click Event Grid Topic in the results list.
  5. On the Event Grid Topics blade, click the Create button in the top-left corner of the blade.
  6. On the Create Topic panel, type a Name for the Event Grid Topic.
  7. Select a subscription in the Subscription drop-down menu.
  8. Select a resource group in the Resource Group drop-down menu. Alternatively, you can create a new resource group by clicking the Create New link below the drop-down menu.
  9. Select a location in the Location drop-down menu.
  10. Leave the Event Schema property as is.
  11. Click the Create button at the bottom of the panel.

When the Azure Resource Manager finishes creating your new Event Grid Topic, you can subscribe to the topic to process the events. Also, you can send your custom events to this topic. Use the following steps to publish custom events to your newly created Event Grid Topic:

  1. Open Visual Studio 2019.
  2. On the welcome screen, click Create A New Project.
  3. On the Create A New Project window, select the template Console App (.NET Core).
  4. Click the Next button at the bottom-right corner of the window.
  5. Type a Project Name.
  6. Select a location for your solution.
  7. Click the Create button.
  8. Click Tools > NuGet Package Manager > Manage NuGet Packages For Solution.
  9. On the NuGet – Solution tab, click Browse.
  10. In the Search text box, type Microsoft.Azure.EventGrid.
  11. Click Microsoft.Azure.EventGrid in the results list.
  12. On the right side of the NuGet – Solution tab, click the name of your project.
  13. Click the Install button.
  14. On the Preview Changes window, click the OK button.
  15. In the License Acceptance window, click the I Accept button.
  16. Repeat steps 10 to 15 to install the Microsoft.Extensions.Configuration.Json NuGet package.
  17. In the Solution Explorer window, right-click your project's name.
  18. On the contextual menu, click Add > New Item.
  19. In the Add New Item window, type json in the Search text box.
  20. Click the JSON File template.
  21. Type appsettings.json in the Name text box.
  22. Click the Add button at the bottom-right corner of the window.
  23. In the Solution Explorer window, click the appsettings.json file.
  24. In the Properties window, set the Copy To Output Directory setting to Copy Always.
  25. Open the appsettings.json file and replace the content of the file with the content of Listing 6-9. You can get the access key from the Access Keys blade in your Event Grid Topic.

    Listing 6-9 appsettings.json file

    	
    {
    "EventGridAccessKey": "<Your_EventGridTopic_Access_Key>",
    "EventGridTopicEndpoint": "https://<Your_EventGrid_Topic>.<region_name>-1.eventgrid.
    azure.net/api/events"
    }
    	
    
  26. Create a new empty C# class called NewItemCreatedEvent.
  27. Replace the content of the NewItemCreatedEvent.cs file with the content of Listing 6-10.

    Listing 6-10 NewItemCreatedEvent.cs

    	
    // C# .NET
    using Newtonsoft.Json;
    namespace <your_project_name>
    {
    class NewItemCreatedEvent
    {
    [JsonProperty(PropertyName = "name")]
    public string itemName;
    }
}
    	
    
  28. Open the Program.cs file.
  29. Add the following using statements:

    using Microsoft.Azure.EventGrid;
    using Microsoft.Azure.EventGrid.Models;
    using Microsoft.Extensions.Configuration;
    using System.Collections.Generic;

  30. Replace the content of the Main method with the content in Listing 6-11.

Listing 6-11 Program.cs Main method

	
// C# .NET
IConfigurationBuilder builder = new ConfigurationBuilder().AddJsonFile("appsettings.json");
IConfigurationRoot configuration = builder.Build();
string topicEndpoint = configuration["EventGridTopicEndpoint"];
string apiKey = configuration["EventGridAccessKey"];
string topicHostname = new Uri(topicEndpoint).Host;
TopicCredentials topicCredentials = new TopicCredentials(apiKey);
EventGridClient client = new EventGridClient(topicCredentials);
List<EventGridEvent> events = new List<EventGridEvent>();
events.Add(new EventGridEvent()
{
Id = Guid.NewGuid().ToString(),
EventType = "MyCompany.Items.NewItemCreated",
Data = new NewItemCreatedEvent()
{
itemName = "Item 1"
},
EventTime = DateTime.Now,
Subject = "Store A",
DataVersion = "3.7"
});
client.PublishEventsAsync(topicHostname, events).GetAwaiter().GetResult();
Console.WriteLine("Events published to the Event Grid Topic");
Console.ReadLine();
	

At this point, your console application publishes events to the Event Grid Topic that you created previously. Press F5 to run the console application to ensure that everything compiles and runs correctly; you will not be able to see the published messages yet. Use the following steps to create a subscriber Azure Function that connects to the Event Grid Topic and processes these events:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. On the Create A New Project window, click the template Azure Functions.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for your project.
  7. Click Create.
  8. On the Create A New Azure Functions Application window, click the HTTP Trigger.
  9. Click Create.
  10. Install the Microsoft.Azure.EventGrid NuGet package.
  11. Create a new empty C# class called NewItemCreatedEvent.
  12. Replace the content of the NewItemCreatedEvent.cs file with the content of Listing 6-12.

    Listing 6-12 NewItemCreatedEvent.cs

    	
    // C# .NET
    using Newtonsoft.Json;
    namespace <your_project_name>
    {
    class NewItemCreatedEvent
    {
    [JsonProperty(PropertyName = "name")]
    public string itemName;
    }
}
    	
    
  13. Replace the content of Function1.cs with the content in Listing 6-13.

    Listing 6-13 Function1.cs

    	
// C# .NET
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Azure.EventGrid;
using Microsoft.Azure.EventGrid.Models;

namespace <your_project_name>
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<HttpResponseMessage> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]
            HttpRequestMessage req,
            ILogger log)
        {
            log.LogInformation("C# HTTP trigger handling EventGrid Events.");
            string response = string.Empty;
            // This event type must match the EventType that the publisher
            // sets in Listing 6-11.
            const string CustomTopicEvent = "MyCompany.Items.NewItemCreated";
            string requestContent = await req.Content.ReadAsStringAsync();
            log.LogInformation($"Received events: {requestContent}");
            EventGridSubscriber eventGridSubscriber = new EventGridSubscriber();
            eventGridSubscriber.AddOrUpdateCustomEventMapping(CustomTopicEvent, typeof(NewItemCreatedEvent));
            EventGridEvent[] eventGridEvents = eventGridSubscriber.DeserializeEventGridEvents(requestContent);
            foreach (EventGridEvent eventGridEvent in eventGridEvents)
            {
                if (eventGridEvent.Data is SubscriptionValidationEventData)
                {
                    var eventData = (SubscriptionValidationEventData)eventGridEvent.Data;
                    log.LogInformation($"Got SubscriptionValidation event data, validationCode: {eventData.ValidationCode}, validationUrl: {eventData.ValidationUrl}, topic: {eventGridEvent.Topic}");
                    // Do any additional validation (as required), such as validating
                    // that the Azure resource ID of the topic matches the expected
                    // topic, and then return the response below.
                    var responseData = new SubscriptionValidationResponse()
                    {
                        ValidationResponse = eventData.ValidationCode
                    };
                    return req.CreateResponse(HttpStatusCode.OK, responseData);
                }
                else if (eventGridEvent.Data is StorageBlobCreatedEventData)
                {
                    var eventData = (StorageBlobCreatedEventData)eventGridEvent.Data;
                    log.LogInformation($"Got BlobCreated event data, blob URI {eventData.Url}");
                }
                else if (eventGridEvent.Data is NewItemCreatedEvent)
                {
                    var eventData = (NewItemCreatedEvent)eventGridEvent.Data;
                    log.LogInformation($"Got NewItemCreated event data, item name {eventData.itemName}");
                }
            }
            return req.CreateResponse(HttpStatusCode.OK, response);
        }
    }
}
    	
    
  14. Publish the Azure Function to your Azure subscription. Use the following procedure to publish an Azure Function to Azure: https://docs.microsoft.com/en-us/azure/azure-functions/functions-develop-vs#publish-to-azure
  15. Open your Event Grid Topic in the Azure Portal.
  16. On your Event Grid Topic Overview blade, click the Event Subscription button.
  17. On the Create Event Subscription blade, shown in Figure 6-20, type a Name for the subscription.
  18. In the Endpoint Type drop-down menu, select WebHook.

    Figure 6-20 Creating a subscription using a WebHook endpoint.
    Screenshot_78

  19. Click the Select An Endpoint link below the WebHook endpoint type.
  20. On the Select WebHook panel, type the following URL:

    https://<your_azure_function_plan>.azurewebsites.net/api/<your_azure_function_name>

  21. Click Confirm Selection.
  22. Click Create.

At this point, you should be able to publish and process events using the Event Grid Topic that you created previously. Use the following steps to ensure that everything works correctly:

  1. Open the publisher console application in Visual Studio 2019.
  2. Run the console application to publish an event to the topic.
  3. Open the Azure Portal and navigate to your Azure Function.
  4. In the Azure Functions blade, click Monitor in the tree control.
  5. You should be able to see a list of invocations when the function has been called because a new event arrived at the Event Grid Topic.
  6. Click one of the successful invocations; you will get a result similar to Figure 6-21.

Note: Azure Function Monitoring

You need to have Application Insights integration enabled to be able to see the log messages generated by the Azure Function. Review the article about how to monitor Azure Functions using Application Insights at https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring.

Figure 6-21 Log messages from a successful event processing
Screenshot_79

Need More Review? Dead Letter and Retry Policies

When you work with event-driven architectures, there can be situations in which an event cannot be delivered to the event handler. In those situations, it's appropriate to set a retry strategy to try to recover the event before it expires. You can learn more about retry policies and dead-letter management at https://docs.microsoft.com/en-us/azure/event-grid/manage-event-delivery.

Implement solutions that use Azure Notification Hubs

Developing applications that can be accessed using mobile devices can be challenging because you usually need to allow access to your application from different mobile platforms. The challenge becomes even bigger because the different mobile platforms use different notification systems to send events. You need to deal with the Apple Push Notification Service (APNS), Google Firebase Cloud Messaging (FCM), or the Windows Notification Service (WNS)—and these are just the main mobile platforms on the market.

The Azure Notification Hubs provide an abstraction layer that you can use for connecting to different push notification mobile platforms. Thanks to this abstraction, you can send the notification message to the Notification Hub, which manages the message and delivers it to the appropriate platform. You can also define and use cross-platform templates. Using these templates, you ensure that your solution sends consistent messages independently of the mobile platform that you are using.
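As a simple illustration of such templates, the two entries below sketch equivalent APNS and FCM template bodies; the messageParam placeholder name is hypothetical, and Notification Hubs replaces $(messageParam) with the value provided when the notification is sent:

    An APNS template body: {"aps":{"alert":"$(messageParam)"}}

    An equivalent FCM template body: {"data":{"message":"$(messageParam)"}}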

In Skill 2.2 (see “Add push notifications for mobile apps” in Chapter 2) we discussed how to create an Azure Mobile App that integrates with the Azure Notification Hub service. Based on the example that we reviewed in that section, we can extend the architecture of an enterprise solution.

When you need to add push notification support to your solution, you should think of the notification hub as a part of a bigger architecture. An example of this could be a solution that needs to connect your line-of-business applications with a mobile application. In such a scenario, a possible architecture could be to use Event Grid topics. The line-of-business applications would be the publishers of events to the appropriate topic, and then you could deploy one or more Azure Mobile Apps that are subscribed to these topics. When one of the line-of-business applications publishes an event in the Event Grid topic, your Azure Mobile App, which is acting as an event handler, can process the event and send a notification to your mobile users by using the Azure Notification Hub. Figure 6-22 shows a schema of this architecture. As you can see in that figure, the key component of the architecture is the Event Grid service and the implementation of an event-driven architecture.

Figure 6-22 Diagram of an event-driven architecture, including notification hubs
Screenshot_80

Need More Review? Sample Architecture Implementation

You can review a sample architecture implementation using Service Bus messages instead of Event Grid at https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-enterprise-push-notification-architecture.

Implement solutions that use Azure Event Hub

Azure Event Grid is a great service for implementing event-driven solutions, but it is only one piece of a more complex pipeline. Although Event Grid is appropriate for working with event-driven, reactive programming, it is not the best solution when you need to ingest millions of events per second with low latency.

Azure Event Hub is a more suitable solution when you require a service that can receive and process millions of events per second with low-latency event processing. Azure Event Hub is the front door of a big data pipeline that processes millions of events. Once Azure Event Hub receives the data, it can deliver the event to Azure Event Grid, store the information in an Azure Blob Storage account, or store the data in Azure Data Lake Storage.

When you work with event hubs, you send events to the hub. The entity that sends events to the event hub is known as an Event Publisher. An Event Publisher can send events to the event hub by using any of these protocols: AMQP 1.0, Kafka 1.0 (or later), or HTTPS.

You can publish events to the Event Hub by sending a single event or by grouping several events in a batch operation. Whether you publish a single event or a batch of them, you are limited to a maximum size of 1 MB of data per publication. When Azure Event Hub stores an event, it distributes the events across different partitions based on the partition key provided with the event. Using this pattern, Azure Event Hub ensures that all events sharing the same partition key are delivered in order to the same partition.

A partition stores events as they arrive at the partition, which means newer events are added to the end of the partition. You cannot delete events from a partition. Instead, you need to wait for the events to expire, at which point they are removed from the partition. Because each partition is independent of the other partitions in the Event Hub, the growth rates differ from partition to partition. You define the number of partitions that your Event Hub contains when you create the Event Hub. You can create between 2 and 32 partitions, although you can extend the limit of 32 by contacting the Azure Event Hub team. Bear in mind that once you create the Event Hub and set the number of partitions, you cannot change this number later. When planning the number of partitions to assign to the Event Hub, consider the maximum number of parallel downstream consumers that need to connect to the Event Hub.
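As an illustration, the following fragment is a minimal sketch of sending an event with an explicit partition key, using the EventHubClient type from the Microsoft.Azure.EventHubs package that appears later in Listing 6-14; the key value is a hypothetical example:

    // C# .NET
    // Sketch: events sharing a partition key always land in the same partition,
    // which preserves their relative ordering.
    EventData orderEvent = new EventData(Encoding.UTF8.GetBytes("Order created"));
    await eventHubClient.SendAsync(orderEvent, partitionKey: "customer-42");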

You can connect event receiver applications to an Event Hub by using consumer groups. A consumer group is equivalent to a downstream in a stream-processing architecture. Using consumer groups, you can have different event receivers or consumers accessing different views (state, position, or offset) of the partitions in the Event Hub. Event consumers connect to the Event Hub by using the AMQP protocol, which sends the events to the client as soon as new data is available.
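For illustration, the following fragment is a minimal sketch of reading a single partition through the default consumer group with the same EventHubClient object; the partition id and batch size are hypothetical values:

    // C# .NET
    // Sketch: receive a batch of events from partition "0" using the
    // $Default consumer group that every Event Hub is created with.
    PartitionReceiver receiver = eventHubClient.CreateReceiver(
        PartitionReceiver.DefaultConsumerGroupName,
        "0",
        EventPosition.FromStart());
    IEnumerable<EventData> events = await receiver.ReceiveAsync(10);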

The following procedure shows how to create an Azure Event Hub:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click All Services on the navigation menu on the left side of the Azure Portal.
  3. In the Search Everything text box, type Event.
  4. Click Event Hubs in the results list.
  5. On the Event Hubs blade, click the Create button on the top-left corner of the blade.
  6. On the Create Namespace panel, type a Name for the Event Hub namespace.
  7. Select the Standard tier from the Pricing Tier drop-down menu.
  8. Select a subscription from the Subscription drop-down menu.
  9. Select a resource group from the Resource Group drop-down menu. Alternatively, you can create a new resource group by clicking the Create New link below the drop-down menu.
  10. Select a location from the Location drop-down menu.
  11. Click the Create button at the bottom of the panel.
  12. On the Overview blade in the Event Hub Namespace, click the Event Hub button.
  13. On the Create Event Hub panel, type a Name for the Event Hub.
  14. For this example, leave the Partition Count as 2. (Remember that you cannot change this value once the event hub is created.)
  15. Click the Create button.
  16. Click Shared Access Policies in the navigation menu on the left side of the Event Hub namespace.
  17. Click the RootManageSharedAccessKey policy.
  18. Copy the Connection String-Primary Key value. You need this value in step 9 of the next procedure.

Once you have created your Event Hubs namespace and your hub, you can start sending and consuming events from the hub. Use the following procedure to create two console applications—one for sending events and another for receiving events:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for the project.
  7. Click Create.
  8. Install the Microsoft.Azure.EventHubs NuGet package.
  9. Replace the content of the Program.cs file with the content of Listing 6-14. You copied the Event Hub namespace connection string in the last step of the previous procedure.

Listing 6-14 Program.cs

	
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

namespace <your_project_name>
{
    class Program
    {
        private static EventHubClient eventHubClient;
        private const string EventHubConnectionString = "<Your_event_hub_namespace_connection_string>";
        private const string EventHubName = "<your_event_hub_name>";
        private const int numMessagesToSend = 100;

        static async Task Main(string[] args)
        {
            var connectionStringBuilder = new EventHubsConnectionStringBuilder(EventHubConnectionString)
            {
                EntityPath = EventHubName
            };
            eventHubClient = EventHubClient.CreateFromConnectionString(connectionStringBuilder.ToString());
            for (var i = 0; i < numMessagesToSend; i++)
            {
                try
                {
                    var message = $"Message {i}";
                    Console.WriteLine($"Sending message: {message}");
                    await eventHubClient.SendAsync(new EventData(Encoding.UTF8.GetBytes(message)));
                }
                catch (Exception exception)
                {
                    Console.WriteLine($"{DateTime.Now} > Exception: {exception.Message}");
                }
                await Task.Delay(10);
            }
            Console.WriteLine($"{numMessagesToSend} messages sent.");
            await eventHubClient.CloseAsync();
            Console.WriteLine("Press ENTER to exit.");
            Console.ReadLine();
        }
    }
}
	

At this point, you can press F5 to run the console application. This console application sends 100 messages to the Event Hub that you configured in the EventHubName constant. In the next procedure, you will create another console application that implements an Event Processor Host. The Event Processor Host is an agent that helps you receive events from the Event Hub. The Event Processor Host automatically manages persistent checkpoints and parallel event reception. The Event Processor Host requires an Azure Storage account to persist the checkpoints.

Note: Example Requirements

You need to create an Azure Blob Storage container to run this example. You can review how to create a blob container and how to get the access key by reading the following articles:

Create a container https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-portal#create-a-container

Get access keys https://docs.microsoft.com/en-us/azure/storage/common/storage-account-manage#access-keys

Follow these steps to create the console application that implements the Event Processor Host:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for the project.
  7. Click Create.
  8. Install the following NuGet packages:

     Microsoft.Azure.EventHubs

     Microsoft.Azure.EventHubs.Processor

  9. Create a new empty C# class and name it SimpleEventProcessor. In later steps, this class will implement the IEventProcessor interface, which contains the signature of the methods needed by the Event Processor.
  10. Replace the content of the SimpleEventProcessor.cs file with the content of Listing 6-15.

Listing 6-15 SimpleEventProcessor.cs

    	
// C# .NET
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;
using System;
using System.Collections.Generic;
using System.Text;
using System.Threading.Tasks;

namespace <your_project_name>
{
    public class SimpleEventProcessor : IEventProcessor
    {
        // Called when the processor stops reading from a partition.
        public Task CloseAsync(PartitionContext context, CloseReason reason)
        {
            Console.WriteLine($"Processor Shutting Down. Partition " +
                $"'{context.PartitionId}', Reason: '{reason}'.");
            return Task.CompletedTask;
        }

        // Called when the processor starts reading from a partition.
        public Task OpenAsync(PartitionContext context)
        {
            Console.WriteLine($"SimpleEventProcessor initialized. Partition: " +
                $"'{context.PartitionId}'");
            return Task.CompletedTask;
        }

        public Task ProcessErrorAsync(PartitionContext context, Exception error)
        {
            Console.WriteLine($"Error on Partition: {context.PartitionId}, " +
                $"Error: {error.Message}");
            return Task.CompletedTask;
        }

        public Task ProcessEventsAsync(PartitionContext context,
            IEnumerable<EventData> messages)
        {
            foreach (var eventData in messages)
            {
                var data = Encoding.UTF8.GetString(eventData.Body.Array,
                    eventData.Body.Offset, eventData.Body.Count);
                Console.WriteLine($"Message received. Partition: " +
                    $"'{context.PartitionId}', Data: '{data}'");
            }
            // Write a checkpoint so that processing resumes from this point
            // after a restart.
            return context.CheckpointAsync();
        }
    }
}
    	
    
  11. Replace the content of the Program.cs file with the content of Listing 6-16.

Listing 6-16 Program.cs

	
// C# .NET
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.EventHubs.Processor;
using System;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        private const string EventHubConnectionString =
            "<your_event_hub_namespace_connection_string>";
        private const string EventHubName = "<your_event_hub_name>";
        private const string StorageContainerName = "<your_container_name>";
        private const string StorageAccountName = "<your_storage_account_name>";
        private const string StorageAccountKey = "<your_storage_account_access_key>";
        private static readonly string StorageConnectionString =
            $"DefaultEndpointsProtocol=https;AccountName={StorageAccountName};" +
            $"AccountKey={StorageAccountKey}";

        static async Task Main(string[] args)
        {
            Console.WriteLine("Registering EventProcessor...");
            var eventProcessorHost = new EventProcessorHost(
                EventHubName,
                PartitionReceiver.DefaultConsumerGroupName,
                EventHubConnectionString,
                StorageConnectionString,
                StorageContainerName);

            // Registers the Event Processor Host and starts receiving messages.
            await eventProcessorHost.RegisterEventProcessorAsync<SimpleEventProcessor>();

            Console.WriteLine("Receiving. Press ENTER to stop worker.");
            Console.ReadLine();

            // Disposes of the Event Processor Host.
            await eventProcessorHost.UnregisterEventProcessorAsync();
        }
    }
}
	

Now, you can press F5 and run your console application. The console application registers itself as an Event Processor and starts waiting for unprocessed events in the Event Hub. Because the default retention time for events in the Event Hub is one day, you receive all the messages sent by your publisher console application in the previous example. If you run the event publisher console application without stopping the Event Processor console application, you can see the messages appear in the Event Processor console almost in real time as they are sent to the Event Hub. This simple example also shows how the Event Hub distributes events across the different partitions.
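By default, the client distributes events across the partitions. When related events must preserve their relative order, you can provide a partition key when sending, so that all events sharing that key land in the same partition. The following is a minimal sketch, not part of the book's listings, that assumes the Microsoft.Azure.EventHubs package from Listing 6-14; the deviceId value is a hypothetical example of such a key:

	
// C# .NET
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.EventHubs;

static class PartitionKeySample
{
    // Hypothetical helper: send a payload so that all events from the
    // same device hash to the same partition.
    public static Task SendWithPartitionKeyAsync(
        EventHubClient client, string deviceId, string payload)
    {
        var eventData = new EventData(Encoding.UTF8.GetBytes(payload));
        // The second argument is the partition key. The service hashes it
        // to pick the target partition, so events that share a key keep
        // their relative order.
        return client.SendAsync(eventData, deviceId);
    }
}
	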

Exam Tip

The Azure Event Hub is a service appropriate for ingesting huge numbers of events with low latency. You should consider the Event Hub as the starting point of an event processing pipeline. You can use the Event Hub as the event source of the Event Grid service.

Need More Review? Event Hubs Concepts

The Azure Event Hub service is designed to work with big data pipelines where you need to process millions of events per second. In those scenarios, making a bad decision when planning the deployment of an Event Hub can have a big effect on the performance. You can learn more about the Event Hub service by reading the article at https://docs.microsoft.com/en-in/azure/event-hubs/event-hubs-features.

Skill 6.5: Develop message-based solutions

In the previous skill, we reviewed how to use event-driven services, in which a publisher pushes a lightweight notification, or event, to the event management system and forgets about how the event is handled or whether it is even processed.

In this section, we are going to review how to develop message-based solutions using Azure services. In general terms, a message is raw data produced by a service with the goal of being stored or processed elsewhere. This means that the publisher of the message has an expectation that some other system or subscriber will process the message. Because of this expectation, the subscriber needs to notify the publisher about the status of the message.

This skill covers how to:

Implement solutions that use Azure Service Bus

Implement solutions that use Azure Queue Storage queues

Implement solutions that use Azure Service Bus

Azure Service Bus is an enterprise-level integration message broker that allows different applications to communicate with each other in a reliable way. A message is raw data that an application sends asynchronously to the broker to be processed by another application connected to the broker. The message can contain JSON, XML, or text information.

There are some concepts that we need to review before starting to work with the Azure Service Bus:

Namespace This is a container for all messaging components. A single namespace can contain multiple queues and topics. You can use namespaces as application containers that associate a single solution to a single namespace. The different components of your solution connect to the topics and queues in the namespace.

Queue A queue is the container of messages. The queue stores each message until the receiving application retrieves and processes it. The message queue works on a FIFO (First-In, First-Out) basis. When a new message arrives at the queue, the Service Bus service assigns a timestamp to the message. Once accepted, the message is held safely in redundant storage. Queues are appropriate for point-to-point communication scenarios in which a single application needs to communicate with another single application (see the sketch after this list).

Topic You use topics for sending and receiving messages. The difference between queues and topics is that a topic can have several applications receiving messages, which makes topics appropriate for publish/subscribe scenarios. A topic can have multiple subscriptions, and each subscription to a topic receives a copy of every message sent to the topic.
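Queues use the same SDK surface as topics. As a minimal point-to-point sketch, assuming the Microsoft.Azure.ServiceBus package used later in Listings 6-17 and 6-18 (the connection string and queue name are placeholders, and the queue itself must already exist in the namespace):

	
// C# .NET
using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

static class QueueSample
{
    static IQueueClient queueClient;

    public static async Task RunAsync()
    {
        // Point-to-point: a single sender and a single receiver share the
        // queue. Both placeholders below are hypothetical.
        queueClient = new QueueClient(
            "<your_service_bus_connection_string>", "<your_queue_name>");

        await queueClient.SendAsync(
            new Message(Encoding.UTF8.GetBytes("Hello from the queue")));

        // Register the callback that processes incoming messages.
        queueClient.RegisterMessageHandler(
            async (message, token) =>
            {
                Console.WriteLine($"Received: {Encoding.UTF8.GetString(message.Body)}");
                // Complete the message so that it is not delivered again.
                await queueClient.CompleteAsync(message.SystemProperties.LockToken);
            },
            new MessageHandlerOptions(args =>
            {
                Console.WriteLine($"Handler exception: {args.Exception}");
                return Task.CompletedTask;
            })
            {
                MaxConcurrentCalls = 1,
                AutoComplete = false
            });
    }
}
	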

Use the following procedure to create an Azure Service Bus namespace; then you can create a topic in the namespace. We are going to use that topic to create two console applications to send and receive the messages from the topic:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the navigation menu on the left side of the Portal.
  3. Click Integration in the Azure Marketplace column.
  4. Click Service Bus in the Featured column.
  5. On the Create Namespace panel, type a Name for the Service Bus namespace.
  6. Select the Standard tier in the Pricing Tier drop-down menu. You cannot create topics in the Basic pricing tier; you need to use at least the Standard tier.
  7. Select a subscription in the Subscription drop-down menu.
  8. Select a resource group in the Resource Group drop-down menu.
  9. Select a location in the Location drop-down menu.
  10. Click the Create button at the bottom of the panel.
  11. Go to the resource once the Azure Resource Manager finishes the deployment of your new Service Bus Namespace.
  12. On the Overview blade in the Service Bus Namespace, click the Topic button.
  13. Type a name for the Topic on the Create Topic panel, shown in Figure 6-23.
    Figure 6-23 Creating a new topic
  14. Leave the Max Topic Size and Message Time To Live parameters as is.
  15. Check Enable Duplicate Detection. This option ensures that the topic doesn't store duplicated messages during the configured detection window. (Duplicate detection is based on the MessageId property of the messages; see the sketch after this procedure.)
  16. Click the Create button.
  17. Click Shared Access Policies in the navigation menu on the left side of the Service Bus Namespace.
  18. Click the RootManageSharedAccessKey.
  19. Copy the Primary Connection String. You are going to use the connection string later in this section.
  20. Click Topics in the navigation menu on the left side of the Service Bus Namespace.
  21. Click your topic.
  22. On the Overview blade of the Service Bus Topic, click the Subscription button.
  23. On the Create Subscription panel, shown in Figure 6-24, type a Name for the subscription.

    Figure 6-24 Creating a new subscription
  24. Leave the other properties as is.
  25. Click the Create button at the bottom of the panel.
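As noted in the procedure, duplicate detection relies on the MessageId property: during the configured detection window, the topic discards any additional message that arrives with a MessageId it has already accepted. A minimal sketch, assuming the Microsoft.Azure.ServiceBus package used in Listing 6-17 (the order ID is a hypothetical business key):

	
// C# .NET
using System.Text;
using Microsoft.Azure.ServiceBus;

static class DuplicateDetectionSample
{
    // Hypothetical helper: derive the MessageId from a business key so
    // that resending the same order does not create a duplicate entry.
    public static Message BuildOrderMessage(string orderId, string payload)
    {
        return new Message(Encoding.UTF8.GetBytes(payload))
        {
            // Within the detection window, the topic discards any further
            // message that carries an already-accepted MessageId.
            MessageId = $"order-{orderId}"
        };
    }
}
	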

Now you are going to create two console applications. One console application will publish messages to the Service Bus Topic; the other console application will subscribe to the Service Bus Topic, process the messages, and mark each message as completed. Use the following procedure to create the console application that publishes messages to the Service Bus Topic:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for the project.
  7. Click Create.
  8. Install the Microsoft.Azure.ServiceBus NuGet package.
  9. Replace the content of the Program.cs file with the content of Listing 6-17.

Listing 6-17 Program.cs

	
// C# .NET
using Microsoft.Azure.ServiceBus;
using System;
using System.Text;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        const string ServiceBusConnectionString =
            "<your_service_bus_connection_string>";
        const string TopicName = "<your_topic_name>";
        const int numberOfMessagesToSend = 100;
        static ITopicClient topicClient;

        static async Task Main(string[] args)
        {
            topicClient = new TopicClient(ServiceBusConnectionString, TopicName);
            Console.WriteLine("Press ENTER key to exit after sending all the messages.");
            Console.WriteLine();

            // Send messages.
            try
            {
                for (var i = 0; i < numberOfMessagesToSend; i++)
                {
                    // Create a new message to send to the topic.
                    string messageBody = $"Message {i} {DateTime.Now}";
                    var message = new Message(Encoding.UTF8.GetBytes(messageBody));

                    // Write the body of the message to the console.
                    Console.WriteLine($"Sending message: {messageBody}");

                    // Send the message to the topic and wait for the operation
                    // to finish so that no message is lost.
                    await topicClient.SendAsync(message);
                }
            }
            catch (Exception exception)
            {
                Console.WriteLine($"{DateTime.Now} :: Exception: {exception.Message}");
            }

            Console.ReadKey();
            await topicClient.CloseAsync();
        }
    }
}
	

You can now press F5 and publish messages to the topic. Once you publish the messages, you should be able to see an increase in the Message Count column in the Overview blade of your Service Bus Topic. The next steps show how to create the second console application that subscribes to the topic and processes the messages in the topic:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for the project.
  7. Click Create.
  8. Install the Microsoft.Azure.ServiceBus NuGet package.
  9. Replace the content of the Program.cs file with the content of Listing 6-18.

Listing 6-18 Program.cs

	
// C# .NET
using Microsoft.Azure.ServiceBus;
using System;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace <your_project_name>
{
    class Program
    {
        const string ServiceBusConnectionString =
            "<your_service_bus_connection_string>";
        const string TopicName = "<your_topic_name>";
        const string SubscriptionName = "<your_subscription_name>";
        static ISubscriptionClient subscriptionClient;

        static async Task Main(string[] args)
        {
            subscriptionClient = new SubscriptionClient(ServiceBusConnectionString,
                TopicName, SubscriptionName);
            Console.WriteLine("Press ENTER key to exit after receiving all the messages.");

            // Configure the message handler options in terms of exception
            // handling, number of concurrent messages to deliver, and so on.
            var messageHandlerOptions = new MessageHandlerOptions(ExceptionReceivedHandler)
            {
                // Maximum number of concurrent calls to the callback
                // ProcessMessagesAsync(); set to 1 for simplicity.
                // Set it according to how many messages the application
                // wants to process in parallel.
                MaxConcurrentCalls = 1,

                // Indicates whether the message pump should automatically
                // complete the messages after returning from the user callback.
                // False below indicates that the complete operation is handled
                // by the user callback, as in ProcessMessagesAsync().
                AutoComplete = false
            };

            // Register the function that processes messages.
            subscriptionClient.RegisterMessageHandler(ProcessMessagesAsync,
                messageHandlerOptions);

            Console.ReadKey();
            await subscriptionClient.CloseAsync();
        }

        static async Task ProcessMessagesAsync(Message message, CancellationToken token)
        {
            // Process the message.
            Console.WriteLine($"Received message: " +
                $"SequenceNumber:{message.SystemProperties.SequenceNumber} " +
                $"Body:{Encoding.UTF8.GetString(message.Body)}");

            // Complete the message so that it is not received again.
            // This can be done only if the subscriptionClient is created in
            // ReceiveMode.PeekLock mode (which is the default).
            await subscriptionClient.CompleteAsync(message.SystemProperties.LockToken);

            // Note: Use the cancellation token passed as necessary to determine
            // if the subscriptionClient has already been closed.
            // If subscriptionClient has already been closed, you can choose to
            // not call CompleteAsync() or AbandonAsync() and so on, to avoid
            // unnecessary exceptions.
        }

        // Use this handler to examine the exceptions received on the message pump.
        static Task ExceptionReceivedHandler(
            ExceptionReceivedEventArgs exceptionReceivedEventArgs)
        {
            Console.WriteLine($"Message handler encountered an exception " +
                $"{exceptionReceivedEventArgs.Exception}.");
            var context = exceptionReceivedEventArgs.ExceptionReceivedContext;
            Console.WriteLine("Exception context for troubleshooting:");
            Console.WriteLine($"- Endpoint: {context.Endpoint}");
            Console.WriteLine($"- Entity Path: {context.EntityPath}");
            Console.WriteLine($"- Executing Action: {context.Action}");
            return Task.CompletedTask;
        }
    }
}
	

You can now press F5 and run the console application. As the console application processes the messages in the topic, you can see that the message count in the subscription decreases.

Need More Review? Service Bus Advanced Features

You can learn more about Service Bus in the following articles:

Queues, Topics, and Subscriptions https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions

Service Bus Performance Improvements https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-performance-improvements

Topic Filters and Actions https://docs.microsoft.com/en-us/azure/service-bus-messaging/topic-filters
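The last of these articles covers topic filters, which let each subscription receive only a subset of the messages sent to a topic. As a minimal sketch of the idea, assuming the Microsoft.Azure.ServiceBus package from Listing 6-18 (the rule name and the priority property are hypothetical):

	
// C# .NET
using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus;

static class TopicFilterSample
{
    public static async Task ApplyHighPriorityFilterAsync(
        SubscriptionClient subscriptionClient)
    {
        // Remove the default rule, which matches every message.
        await subscriptionClient.RemoveRuleAsync(RuleDescription.DefaultRuleName);

        // Deliver only messages whose user property "priority" is "high".
        await subscriptionClient.AddRuleAsync(new RuleDescription(
            "HighPriorityOnly", new SqlFilter("priority = 'high'")));
    }
}
	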

Implement solutions that use Azure Queue Storage queues

Azure Queue Storage is the first service that Microsoft released for managing message queues. Although Azure Service Bus and Azure Queue Storage share some features, such as providing message queue services, Azure Queue Storage is more appropriate when your application needs to store more than 80GB of messages in a queue. Also, although the queues in this service generally deliver messages in FIFO (First-In, First-Out) order, the ordering of the messages is not guaranteed.

Note: Azure Queue Storage vs. Azure Service Bus

You can review a complete list of differences between these two queuing services at https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-azure-and-service-bus-queues-compared-contrasted.

The maximum size of a single message that you can send to an Azure Queue is 64KB, although the total size of the queue can grow larger than 80GB. You can access an Azure Queue by using the REST API or an Azure Storage SDK, such as the .NET Azure Storage SDK. Here are the steps to create an Azure Queue Storage account and a queue for sending and receiving messages:

  1. Open the Azure Portal (https://portal.azure.com).
  2. Click Create A Resource in the navigation menu on the left side of the Portal.
  3. Click Storage in the Azure Marketplace column.
  4. Click Storage Account in the Featured column.
  5. On the Create Storage Account blade, select a subscription in the Subscription drop-down menu.
  6. Select a resource group in the Resource Group drop-down menu.
  7. Type a Storage Account Name.
  8. Select a location in the Location drop-down menu.
  9. Select Locally-Redundant Storage in the Replication drop-down menu.
  10. Leave the other properties as is.
  11. Click the Review + Create button.
  12. Click the Create button.
  13. Click the Go To Resource button once the deployment finishes.
  14. Click Access Keys on the navigation menu in the Azure Storage account blade.
  15. Copy the Connection String from the key1 section. You need this value later in this section.

At this point, you can create queues in your Azure Storage account by using the Azure Portal. You can also add messages to a queue using the Azure Portal. This approach is useful for development or testing purposes, but it is not suitable for applications. Use the following steps to create a console application that creates a new queue in your Azure Storage account and then sends messages to and reads messages from the queue:

  1. Open Visual Studio 2019.
  2. On the Welcome screen, click Create A New Project.
  3. Select the Console App (.NET Core) template.
  4. Click Next.
  5. Type a Project Name.
  6. Select a location for the project.
  7. Click Create.
  8. Install the following NuGet packages:

     Microsoft.Azure.Storage.Common

     Microsoft.Azure.Storage.Queue

  9. Replace the content of the Program.cs file with the content of Listing 6-19.

Listing 6-19 Program.cs

	
// C# .NET
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;
using System;
using System.Collections.Generic;
using System.Linq;

namespace <your_project_name>
{
    class Program
    {
        private const string connectionString =
            "<your_storage_account_connection_string>";
        private const string queueName = "az203queue";
        private const int maxNumOfMessages = 10;

        static void Main(string[] args)
        {
            CloudStorageAccount storageAccount =
                CloudStorageAccount.Parse(connectionString);
            CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

            // Get a reference to the queue.
            CloudQueue queue = queueClient.GetQueueReference(queueName);

            // Create the queue if it doesn't exist already.
            queue.CreateIfNotExists();

            // Send messages to the queue.
            for (int i = 0; i < maxNumOfMessages; i++)
            {
                CloudQueueMessage message =
                    new CloudQueueMessage($"Message {i} {DateTime.Now}");
                queue.AddMessage(message);
            }

            // Get the length of the queue.
            queue.FetchAttributes();
            int? cachedMessageCount = queue.ApproximateMessageCount;

            // Read messages from the queue without removing them.
            Console.WriteLine("Reading messages from the queue without removing " +
                "them from the queue");
            List<CloudQueueMessage> peekedMessages =
                queue.PeekMessages((int)cachedMessageCount).ToList();
            foreach (CloudQueueMessage peekedMessage in peekedMessages)
            {
                Console.WriteLine($"Message read from the queue: " +
                    $"{peekedMessage.AsString}");

                // Get the length of the queue; peeking doesn't change it.
                queue.FetchAttributes();
                int? queueLength = queue.ApproximateMessageCount;
                Console.WriteLine($"Current length of the queue: {queueLength}");
            }

            // Read messages and remove them from the queue.
            Console.WriteLine("Reading messages from the queue and removing them");
            List<CloudQueueMessage> messages =
                queue.GetMessages((int)cachedMessageCount).ToList();
            foreach (CloudQueueMessage message in messages)
            {
                Console.WriteLine($"Message read from the queue: {message.AsString}");

                // You need to process and delete the message in less than
                // 30 seconds; otherwise, it becomes visible in the queue again.
                queue.DeleteMessage(message);

                // Get the length of the queue.
                queue.FetchAttributes();
                int? queueLength = queue.ApproximateMessageCount;
                Console.WriteLine($"Current length of the queue: {queueLength}");
            }
        }
    }
}
	

Press F5 to execute the console application, which sends and reads messages from the queue. You can see how the messages are added to the queue by using the Azure Portal, navigating to your Azure Storage account, and choosing Queues > az203queue. You will see a queue similar to the one shown in Figure 6-25.

Figure 6-25 Messages in the az203queue queue

Need More Review? Publish/Subscribe Pattern

Although the Azure Queue Storage service doesn't provide the ability to create subscriptions to the queues, you can easily implement the publish/subscribe pattern for communicating applications using the Azure Queue Storage. You can learn how to implement this pattern by reviewing the article at https://docs.microsoft.com/en-us/learn/modules/communicate-between-apps-with-azure-queue-storage/. A minimal fan-out sketch follows.
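A simple way to approximate the publish/subscribe pattern with Queue Storage is to fan out each message to one queue per subscriber, so that every subscriber reads its own copy. This is a minimal sketch under that assumption, using the Microsoft.Azure.Storage.Queue package from Listing 6-19 (the queue names are hypothetical):

	
// C# .NET
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Queue;

static class QueueFanOutSample
{
    // Hypothetical subscriber queues; each subscriber reads its own queue.
    private static readonly string[] subscriberQueues =
        { "subscriber-a", "subscriber-b" };

    public static void Publish(string connectionString, string messageText)
    {
        CloudQueueClient queueClient = CloudStorageAccount
            .Parse(connectionString)
            .CreateCloudQueueClient();

        // Fan out: add a copy of the message to every subscriber's queue.
        foreach (string queueName in subscriberQueues)
        {
            CloudQueue queue = queueClient.GetQueueReference(queueName);
            queue.CreateIfNotExists();
            queue.AddMessage(new CloudQueueMessage(messageText));
        }
    }
}
	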

Chapter summary

Azure App Service Logic Apps allows you to interconnect different services without needing to create specific code for the interconnection.

Logic App workflows define the steps needed to exchange information between applications.

Microsoft provides connectors for sending and receiving information to and from different services.

Triggers are events fired on the source systems.

Actions are each of the steps performed in a workflow.

Azure Logic Apps provides a graphical editor that eases the process of creating workflows.

You can create custom connectors to connect your application with Azure Logic Apps.

A custom connector is a wrapper for a REST or SOAP API.

You can create custom connectors for Azure Logic Apps, Microsoft Flow, and Microsoft PowerApps.

You cannot reuse custom connectors created for Microsoft Flow or Microsoft PowerApps with Azure Logic Apps.

You can export your Logic Apps as Azure Resource Manager templates.

You can edit and modify the Logic Apps templates in Visual Studio.

The Azure Search service is built on top of the Apache Lucene search engine.

An Azure Search index contains the information you need to add to the search engine.

Azure Search indexes are composed of documents.

Azure Search indexes and documents are conceptually equivalent to tables and rows in a database.

When defining an Azure Search index, you should use the Azure Portal for the initial definition.

You cannot edit or change the definition of an Azure Search index by using the Azure Portal.

You cannot upload data to a search index by using the Azure Portal.

You should use code to edit or modify a search index.

You need to use code to upload data to a search index.

There are two methods to import data into a search index: push and pull.

The push method uploads the actual data, in JSON format, to the search index.

The pull method connects the search index to a supported data source and automatically imports the data.

The push method is less restrictive than the pull method.

The push method has lower latency when performing search operations.

The attributes configured in the field definition affect the physical storage of the index.

The attributes configured in the field definition affect the search queries that you can perform on your search index.

The API Management service allows you to publish your back-end REST or SOAP APIs using a common and secure front end.

You need to create subscriptions in the APIM service to authenticate access to the API.

You need to create a Product to publish a back-end API.

You can publish only some operations of your back-end APIs.

APIM policies allow you to modify the behavior of the APIM gateway.

An event is a change in the state of an entity.

In an event-driven architecture, the publisher doesn't have the expectation that the event is processed or stored by a subscriber.

Azure Event Grid is a service for implementing event-driven architectures.

An Event Grid Topic is an endpoint to which a publisher service can send events.

Subscribers are services that read events from an Event Grid Topic.

You can configure several types of services as event sources or event subscribers in Azure Event Grid.

You can create custom events to send to the Event Grid.

You can subscribe your custom application to an Event Grid Topic by using webhooks.

The Azure Notification Hub is a service that unifies push notifications across mobile platforms.

You can connect the push notification services from the different manufacturers to the Azure Notification Hub.

The Azure Event Hub is the entry point for big data event pipelines.

Azure Event Hub is specialized to ingest millions of events per second with low latency.

You can use Azure Event Hub as an event source for the Event Grid service.

You can use AMQP, Kafka, and HTTPS to connect to Azure Event Hub.

In a message-driven architecture, the publisher application has the expectation that the message is processed or stored by the subscriber.

The subscriber needs to change the state of the message once the message is processed.

A message is raw data sent by a publisher that needs to be processed by a subscriber.

Azure Service Bus and Azure Queue Storage are message broker services.

Thought experiment

In this thought experiment, you will demonstrate your skills and knowledge of the topics covered in this chapter. You can find answers to this thought experiment in the next section.

Your organization has several Line-Of-Business (LOB) applications deployed on Azure and on-premises environments. The information managed by some of these LOB applications overlaps across more than one application. All your LOB applications allow you to use a SOAP or REST API for connecting to the application.

Your organization needs to implement business processes that require sharing information between the LOB applications. Answer the following questions about connecting Azure services and third-party applications:

  1. You need to implement a business process that requires that an application be deployed in Azure to share information with an application deployed in your company’s on-premises datacenter. How can you implement this business process?
  2. Your company needs to share information managed by one of the LOB applications with a partner. The LOB application uses a SOAP API for accessing the data. You need to ensure that the partner is authenticated before accessing the information. Your partner requires information to be gathered from your application in JSON format, so you also need to ensure that the information provided by your application is published using a REST API. Which service should you use?
  3. You need to improve the search features of the LOB applications. You decide to use Azure Search for that purpose. The information that you need to include in the Azure Search service is stored in different types of data sources. Which import data method should you use?

Thought experiment answers

This section contains the solutions to the thought experiment. Each answer explains why the answer choice is correct.

  1. You should use Azure Logic Apps for implementing the business process. Azure Logic Apps allows you to create workflows that can be used to implement your business process. You can connect Azure Logic Apps with your on-premises LOB applications by using the on-premises data gateway. You also need to create custom connectors so that your Azure Logic Apps can work with your LOB applications.
  2. You should use the API Management service. This service allows you to securely share your back-end APIs with partners and external developers. Using the APIM policies, you can also convert the XML messages provided by the SOAP API to JSON documents needed for REST APIs. You can use Azure AD, mutual certificate authentication, or API keys for authenticating the access to the API.
  3. You should use the push method to import data into the Azure Search index. The push method is your only choice because the information that must be included in the Azure Search index is stored in different types of data sources, and the push method is the only method that is independent of the data source type. You need to convert the information stored in the data sources to JSON documents and then upload them to the Azure Search index. The pull method, using indexers, is appropriate only when the data source is one of the following Azure services: Azure Blob storage, Azure Table storage, Azure Cosmos DB, Azure SQL Database, or SQL Server deployed on Azure VMs.