Chapter 4. Implement Authentication and Secure Data

You’ve deployed infrastructure and configured platform as a service (PaaS) applications in many forms: web apps, containers, functions, and logic apps. Running through all of this is your customers’ data, the most valuable piece of any organization’s digital estate.

Access to this data needs to be controlled using one of the four pillars of great Azure architecture: security. As an architect, you need to keep security at the heart of any design. It extends through implementation and deployment and applies at every stage of the application’s life cycle. In the mindset of a good architect, security is not a dirty word!

For the AZ-300 certification exam, you need to understand how to secure access to your applications and how to protect the integrity of the data with security tools like encryption.

Need More Review?

Security

You can find the full “Azure security documentation” at https://docs.microsoft.com/en-us/azure/security. This documentation includes best practices, which are a must-read for any architect.

Skills covered in this chapter:

Skill 4.1: Implement authentication

Skill 4.2: Implement secure data solutions

Skill 4.1: Implement authentication

You’ve deployed applications, but how do you control who or what is accessing those applications? What can you recommend to your customers so that they can ensure their applications are accessed only by users and applications that have been granted access? The answer is that you use the authentication implementations available to you through Azure services. As an architect, you need to be aware of the available choices when you make recommendations, and you need to know how to implement the options.

This skill covers how to:

Implement authentication by using certificates, forms-based authentication, tokens, or Windows-integrated authentication

Implement Multi-Factor Authentication by using Azure AD

Implement OAuth 2.0

Implement Managed Identities for Azure Resources service principal authentication

Implement authentication by using certificates, forms-based authentication, tokens, or Windows-integrated authentication

Authentication is the process whereby a user, application, or service trying to gain access is verified as the entity it claims to be, thereby allowing “entry” to the app and its services. Authentication is not about which services a user or application can access once entry is granted; that is authorization.

Azure gives your customers multiple ways of authenticating depending on use case. When you’re recommending authentication mechanisms, you need to be aware of how they work and how to configure them.

Need More Review?

Authenticating Web Apps

To learn about the options available for authentication, visit the Microsoft Docs article “Authentication and authorization in Azure App Service” at https://docs.microsoft.com/en-us/azure/app-service/overview-authentication-authorization.

Authentication by certificate—TLS mutual authentication

You can enable TLS mutual authentication on an Azure web app or API by configuring it to require client certificate authentication. When this is enabled, the web app or API requests a certificate from the client application during the SSL handshake and then uses that certificate to authenticate the client. This is a form of machine-to-machine authentication that you often see in business-to-business applications; it’s a way for a front-end application to securely interact with a back-end service.

Azure App Service doesn’t have a direct connection to the internet; traffic is proxied through other services. To forward the certificate after SSL termination, the terminating Azure service injects the client certificate as a Base64-encoded value into the X-ARR-ClientCert HTTP request header, and it’s this value that is read by the application. So, to use this form of authentication, your customers’ web application must contain custom code to read the header and perform the authentication.
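As an illustration, the following is a minimal sketch of what that custom code might look like in a PowerShell Azure Function. The X-ARR-ClientCert header name is the one App Service uses for the forwarded certificate, but the EXPECTED_THUMBPRINT app setting and the thumbprint comparison are hypothetical; production code should also validate the issuer and revocation status.

    	
    using namespace System.Net
    param($Request, $TriggerMetadata)

    # App Service forwards the client certificate as a Base64-encoded value
    # in the X-ARR-ClientCert request header after SSL termination.
    $status = [HttpStatusCode]::Unauthorized
    $certHeader = $Request.Headers['X-ARR-ClientCert']
    if ($certHeader) {
        $certBytes = [Convert]::FromBase64String($certHeader)
        $cert = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($certBytes)
        # Hypothetical check: compare against a trusted thumbprint held in app
        # settings and confirm the certificate is within its validity period.
        $now = Get-Date
        if ($cert.Thumbprint -eq $env:EXPECTED_THUMBPRINT -and
            $now -gt $cert.NotBefore -and $now -lt $cert.NotAfter) {
            $status = [HttpStatusCode]::OK
        }
    }
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
        StatusCode = $status
        Body       = "Client certificate check: $status"
    })
    	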

To configure TLS mutual authentication for a web application, first follow the SSL Termination walkthrough in Skill 4.2. TLS mutual authentication requires that HTTPS Only is set on the web app; otherwise, the client certificate won’t be received. Therefore, as in the SSL Termination walkthrough, you need a B1 App Service Plan tier or higher for SSL support. Once SSL termination has been set up, complete the following steps to enable authentication by certificate:

  1. Navigate to the SSL Settings section on the App Service blade, and set Incoming Client Certificates to On. The change is automatically saved. You can also enable this by using the following command in the Azure command-line interface (CLI):

    	
    az webapp update --set clientCertEnabled=true --name <app_name> --resource-group <group_name>
    	
    
  2. Open a new browser and enter the app service URL. The browser asks for a client certificate because it is interpreting the request from the app server. If you decline to send a certificate, you receive a 403 error, “Forbidden: Client Certificate Required.” If you send a certificate, you’re granted access because no further authentication is required.

The same process is used to secure function apps, through Networking -> SSL in the Platform Features section of the function app. Other applications, including logic apps, that can present the correct certificate may then access the web app or function app.

Need More Review? Client Certificates

To learn more about enabling client certificates, visit the Microsoft Docs article “Configure TLS mutual authentication for Azure App Service” at https://docs.microsoft.com/en-us/azure/app-service/app-service-web-configure-tls-mutual-auth#enable-client-certificates.

Forms-based authentication

Forms-based authentication is a legacy form of authentication. You may have come across it when looking to rearchitect legacy on-premises applications for the cloud. This method of authentication presents an HTML-based web form, which means it must be viewed and filled in from a browser.

Therefore, the use case for this authentication is purely interactive login: the user fills in information on the form, normally a username and password, to authenticate against. One of the advantages of this method was that the user didn’t have to be part of a domain to authenticate, because the authentication process could be performed against a username and password stored in a database. Figure 4-1 shows the general flow of forms-based authentication, which works like so:

Figure 4-1 Forms-based authentication process
  1. The user opens a website, and the browser requests a page that requires authentication.
  2. The web server receives the request and serves a page with the login form.
  3. The user enters credentials and submits the form. The form posts the credentials to the web server (in plaintext).
  4. The web server authenticates the user against the data stored in the database. If the information is correct, the user is redirected back to the application entry page with a session cookie.
  5. The browser sends the session cookie to receive the original resource requested in step 1.
  6. The server grants the request because it includes the authentication cookie. The server serves the page and resources.

There are security issues with this implementation that you need to be aware of when you’re determining whether to rearchitect or lift directly into the cloud:

The credentials are sent as plaintext. You must secure any traffic to this site with HTTPS (which is always best practice anyway).

The credentials are stored in a database. If the database stores the passwords poorly protected or unhashed, the data is susceptible to attack. Hash the passwords with a strong, salted hashing algorithm.

Forms use cookies that are vulnerable to cross-site request forgery (CSRF). In CSRF, a rogue site uses a cookie stored on a machine to make the site that created the cookie perform an action of which the user is not aware. If the user is an administrator of the site, this could allow the attacker to gain control of the site. This is mitigated with antiforgery techniques, which must be coded into the application to create additional security tokens that verify the source of requests.

Azure Application Gateway, previously covered in Chapter 2, Skill 2.3, “Implement application load balancing,” includes a web application firewall. This firewall contains cross-site scripting protections, so it also provides protection against CSRF if you place it in front of the web application.

Exam Tip

Knowing how to set up forms-based authentication using .NET or any other language is beyond the scope of the exam. However, understanding the concepts of how it works, why it is used, and how to mitigate the security concerns around it is important.

Application authentication with Azure AD

As mentioned earlier, forms-based authentication is a legacy mechanism. Where possible, you need to recommend that your customers move to modern, secure authentication.

You can see such a mechanism when you log in to the Azure portal. To the user, this login looks like the same functionality as web forms; however, it uses Azure AD as the back end. This method, where Azure AD authenticates the user and grants access to the portal, can be extended to your customers’ applications.

To set this up, you need to register an application with Azure AD and then use this registration on the web app to perform the authentication for you. To explore how this works, follow these steps:

  1. Using knowledge from previous chapters in this book, create a web app service. Select to publish a Docker image, and choose the Python Hello World quickstart sample for the app service. Note that on the lower tiers, the container can take a while to pull down. Enter Container Settings on the app blade to check the log for progress.
    Note

    App Service Tier

    If you’re planning to use authentication via Azure AD for your app, you have to be using HTTPS. Therefore, if you’re using a custom domain, you need your SSL bindings set (see Skill 4.2, “SSL/TLS”) and at least a B1 App Service Plan.

  2. Copy the web app URL from the Overview section of the Web App blade. Save the URL for later.
  3. Navigate to Azure Active Directory, select App Registrations, and select New Registration. Fill in the following information:

    Name Enter a name that will make the app easily identifiable to you and any users.

    Account Types If you’re developing an internal-use app, choose Accounts In This Organizational Directory Only. Choose Accounts In Any Organizational Directory for internal use plus guests from any Azure AD. The final choice, Accounts In Any Organizational Directory And Personal Microsoft Accounts, allows you to also include any Microsoft personal accounts (Skype and so on).

    Redirect URI Ignore this for now.

    Click Register to save the application definition into Azure AD. You now need to define how the service will use it.

  4. The application is listed in the App Registrations page. Click the one you just created. You need to set up the link to the web app by doing the following:

    Click Branding on the app registration blade, and paste the URL you copied in step 2 into the Home Page URL field. Click Save. Here you’re setting up the URL for the home page of your web app.

    Click Authentication. Under Redirect URIs, make sure Web is selected as the Type, and paste the URL from step 2 into the Redirect URI field, adding /.auth/login/aad/callback to the end. It should look like this:

    	
    https://<webappname>.<domainname>/.auth/login/aad/callback
    	
    

    Click Save. Here you’re establishing that you want to send the authentication information back to the page your user logged in from.

    Select the box for ID Tokens. This is only required in this example because we’re using a single-page application with no back end to accept the authorization information; otherwise, the authentication of the application will return an error message. Click Save.

  5. You need to copy some values so that you can point your web app to authenticate using this application registration:

    In the App Registrations blade, choose Quickstarts. Copy the Application (client) ID from the app properties on the right. Keep this for later use; it uniquely identifies your application.

    In the App Registrations blade, choose Overview and then Endpoints. Copy the WS-Federation Sign-On Endpoint and save it for later use. This endpoint allows the authentication information to be stored in the session.

  6. You’re ready to configure your web app. Navigate to it in the portal and choose Authentication/Authorization to enable it; then configure it to use Azure AD:

    In the Action To Take When Request Is Not Authenticated drop-down menu, choose Log In With Azure Active Directory.

    Select Azure Active Directory under Authentication Providers, and in Management Mode choose Advanced.

    Paste the Application (client) ID you saved in step 5 into the Client ID field.

    Paste the WS-Federation endpoint URL from step 5 into the Issuer URL field, but remove /wsfed from the end. This points the web app at the single sign-on URL for your Azure AD.

    Click OK to save the Active Directory authentication settings, and then click Save on the Authentication/Authorization settings.

  7. Open a new private browsing window to ensure you’re not logged in, and navigate to your web app URL. You can authenticate with any user from the Azure AD tenant you registered the app to and be taken to the Hello World page.
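The portal steps above can also be scripted. The following Azure CLI sketch sets the equivalent Authentication/Authorization values; the placeholders are yours to substitute, and the issuer URL shown assumes the Azure AD v1 issuer format used by the WS-Federation endpoint you copied:

	
az webapp auth update --name <app_name> --resource-group <group_name> \
    --enabled true \
    --action LoginWithAzureActiveDirectory \
    --aad-client-id <application_client_id> \
    --aad-token-issuer-url https://sts.windows.net/<tenant_id>/
	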

Need More Review?

App Registration and Authentication

To learn more about enabling web app authentication through Azure AD, visit the Microsoft Docs article “Register an application with the Microsoft identity platform” at https://docs.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app. Then check out “Configure a client application to access web APIs” in the “Next steps” section on the same page.

Tokens

In the previous section’s walkthrough of web application authentication with Azure AD, you registered an application for authentication. This process registers the app with the Microsoft identity platform. The Microsoft identity platform is an identity provider that verifies the identity of users and applications against an organization’s directory. Once the identity platform has successfully authenticated a user or application, it issues security tokens back to the calling application, which must then validate the tokens to ensure authentication was successful. These tokens include the refresh token from Azure AD, which is returned on authentication, and the access token for the application, which verifies the user’s access to the application.

Once tokens are returned to the web app, the App Service Token Store automatically collects and stores the tokens to make it easy for the application to access them by making them available in the request header for back-end code or by sending a GET request to the authentication endpoint. If your customer doesn’t need to do anything with the tokens once the user is authenticated, you can turn off the token store. Continuing with the earlier example, use the following steps to turn off the store:

  1. Navigate to the app in the portal, choose Authentication/Authorization, and then scroll to the bottom of the page.
  2. Under Advanced Settings, set Token Store to Off.
  3. Execute the authentication URL once more, and the token is no longer on the address bar.
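If the token store is left on, back-end or client code can also fetch the tokens over HTTP from the built-in /.auth/me endpoint. A rough PowerShell sketch, assuming you already hold a valid App Service session token to present in the X-ZUMO-AUTH header:

	
# Query the built-in authentication endpoint for the signed-in
# session's tokens (requires the token store to be enabled).
$me = Invoke-RestMethod -Uri "https://<app_name>.azurewebsites.net/.auth/me" `
    -Headers @{ 'X-ZUMO-AUTH' = $sessionToken }
# The response is an array of provider entries holding the tokens.
$me[0].id_token
	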

Integrated Windows Authentication (IWA)

On-premises web applications often leverage Windows Active Directory (AD) as a user store, authenticating a user’s login against Active Directory. The web servers can be configured for single sign-on, where the user signs in to the client once and the credentials are sent silently to the web application for authentication, so the user does not need to sign in again. This is Integrated Windows Authentication (IWA), and it can be achieved with Windows AD configured to use NTLM or Kerberos.

One of the issues with this authentication implementation is that the client must be able to complete part of the authentication process by communicating with the identity server, in this case Windows AD. Therefore, the client needs to be on the domain. However, with applications moving to the cloud and remote work becoming more common, this isn’t always possible. In this scenario, as new applications are written or migrated, you should recommend hybrid identity management so that authentication and authorization of users is available wherever they are located.

You explored configuring Azure AD Connect in Chapter 1, Skill 1.9 “Implement and manage hybrid identities,” including the User Sign-In screen shown in Figure 4-2. (Note this is from the edit screen of AD Connect configuration rather than the setup screen.)

Figure 4-2 Hybrid identities and sign-on options

Note the Enable Single Sign-On check box at the bottom of the page. If this is selected along with Password Hash Synchronization or Pass-Through Authentication, then with some more configuration steps, the single sign-on capabilities of IWA can be made available to your customers’ applications in the cloud.

Exam Tip

The additional steps to configure Seamless Single Sign-On are beyond the scope of the exam. However, it’s beneficial for you to know which sign-on methods suit specific use cases and which methods support Seamless Single Sign-On.

Need More Review?

Azure AD Connect and Seamless Single Sign-On

To learn more about configuring Seamless Single Sign-On, visit the Microsoft Docs article “Active Directory Seamless Single Sign-On: Quick Start” at https://docs.microsoft.com/en-us/azure/active-directory/hybrid/how-to-connect-sso-quick-start.

Implement Multi-Factor Authentication by using Azure AD

The previous section explained using Azure AD to provide the authentication mechanism for a web app. When you set up the account type in step 3 of the section “Application authentication with Azure AD,” you selected Accounts In Any Organizational Directory. Selecting this setting means any user in the entire directory can be authenticated. Even for a line-of-business application, granting access to every user in the directory is unlikely to be desirable. Therefore, you need to be able to educate your customers on how to grant access to their applications only to the users that require it. You can achieve this through the Enterprise Applications settings in Azure AD by using the following steps:

  1. Use the web app and app registration configured in “Application authentication with Azure AD” in the previous section.
  2. Navigate to Azure AD and select Enterprise Applications in the AD blade. Click the application you registered in step 1.
  3. Choose Properties on the web app’s Enterprise Application blade, change the value of User Assignment Required to Yes, and click Save.
  4. In a private browsing window, open the URL to the web app you created in step 1 and sign in as any user from the directory. You see a 401 access denied error; this confirms that the User Assignment Required setting is functioning correctly.
  5. Go back to the web app’s Enterprise Application blade, and select Users And Groups. This is where you assign users and groups access to the application. Click Add User at the top to open the Add Assignment wizard. Select Users on the left, and then select the user(s) you’d like to grant access to. Click Select at the bottom when all users have been chosen. Click Assign, and the selected users are assigned access to the application.
  6. In a private browsing window, navigate back to the web app you created in step 1. Log in with one of the users you selected in step 5 to be authenticated.
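The User Assignment Required switch can also be flipped from the command line by updating the corresponding property on the application’s service principal. A sketch with the Azure CLI, assuming you have the service principal’s object ID to hand:

	
az ad sp update --id <service_principal_object_id> \
    --set appRoleAssignmentRequired=true
	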

Your customers may want to require users who are outside their organization, or users with specific rights on the web application, to authenticate with extra security measures. When architecting such a solution, Microsoft recommends you use conditional access policies and authenticate with Multi-Factor Authentication (MFA). Conditional access gives your customers the ability to enforce access requirements when a specific set of conditions occurs. The remaining steps in this section go through how this works for a web application.

Exam Tip

Don’t forget you need an Azure AD Premium P1 or P2 license to take advantage of conditional access.

For this part of the walkthrough, you need two users in Azure AD: one in a group and one not in that group.

  1. If you don’t have a group set up, navigate to Azure AD in the portal, select Groups in the blade, and then select New Group. The group type is Security because you’re using it to set up a security-related feature. Enter an appropriate Group Name and Group Description; the Membership Type must be Assigned. Click Members, select the members to assign to your group, click Select, and then click Create.
  2. Navigate back to Azure AD and Enterprise Applications on the Azure AD blade; select the web app you registered earlier. In the web app’s Enterprise Application blade, select Conditional Access. The following sub-steps set up the requirements to enforce MFA for a group:

    a. Select Add New Policy at the top.

    b. In Name, use something relevant to the access policy being defined.

    c. Click Users And Groups, select Include Select Users And Groups, and select the Users And Groups check box. Click Select; then select the group created in step 1 of this section. Click Select, and then click Done.

    d. You entered Conditional Access through the web app in Enterprise Applications, so the app is already selected in Cloud Apps Or Actions. If it was not, select Apps, and then click Select; select the app, click Select, and then click Done.

    e. For this use case, you want every user in the admin group to use MFA, regardless of how they log in, so skip over Conditions.

    f. Select Access Controls, click Grant Access, and then select Require Multi-Factor Authentication. Click Select. Click Enable Policy and click Save.
  3. Configuration for MFA is complete. Now you can test MFA on the created group. Open a private browsing window and enter the URL of your web app to log in as a user who was not placed in the group. You’ll be logged in to the web app and see the Hello World page.
  4. Open another private browsing window. This time, log in to the web app as the user in the group assigned in step 2c of this section. You’re asked to fill in the additional security verification shown in Figure 4-3.
    Figure 4-3 Multi-Factor Authentication additional security
  5. Select the country you’re in and enter your mobile number. Once this is completed, you receive a text on your mobile phone to verify the phone is accessible to you. Future logins for this user will always require a code sent to this phone number (see Figure 4-4).
    Figure 4-4 Entering a security code to verify the mobile phone is accessible

    You can check that MFA is being enforced only for the application by logging the same user into the Azure Portal. The MFA code is not requested.

  6. If the user switches phone numbers, the number needs to be reset. To do this, enter Azure AD in the portal, select Users, and select the user you added to the group. In the User blade, select Authentication Methods. Here you can change the user’s phone number or select Require Re-Register MFA. Re-registering sends the user back through the security validation process, where the person can change his or her own phone number. Also note the Revoke MFA Sessions option, which expires the user’s refresh tokens so that an MFA login must be completed on the next sign-in attempt.
  7. Now that the admin group is secured, it’s time to require MFA for anyone using your application from outside the organization’s network. To do this, you first need to set up a named location in Conditional Access by following these steps:

  1. In the portal, enter Azure AD and click Security in the Azure AD blade. In the Security blade, click Conditional Access and then Named Locations. Add a new named location, completing the following fields:

    Name Enter a name for your organization.

    Location Here you could instead set up a named location for a country or region, which you could use to deny access from that region.

    IP Ranges Add your static IP address ranges. Note that this uses CIDR notation.

  2. Click Create.

Now you’re ready to configure a location-based conditional access policy. Follow these steps to create the configuration:

  1. Select Add New Policy.
  2. In Name, enter something relevant to the access policy you’re defining.
  3. Click Users And Groups. Select Include Select Users And Groups, and then select the Users And Groups check box. Click Select, and then select the user who is not in the group from earlier in this example. Click Select, and then click Done.
  4. You entered Conditional Access through the web app in Enterprise Applications, so the app is already selected in Cloud Apps Or Actions. If it wasn’t, select Apps, and then click Select and select the app. Click Select, and then click Done.
  5. For this use case, you want every user outside of the corporate network to log in using MFA, so select Conditions and then click Locations. Click Yes to enable this feature, and then click Exclude because you want to exclude the corporate network. Select The Locations, click Select, and select the named location you created earlier. In this example, it’s named “My Office.” Click Select, and then click Done.
  6. You also see trusted IPs listed in the Named Locations section. You could add your IP range to the trusted IPs (by selecting Azure AD > Security > MFA); however, this has further implications for security beyond just this application.

    The other conditions available to select are:

    Sign In Risk Azure calculates the likelihood that a sign in isn’t coming from the owner of the account.

    Device Platforms You can select mobile platforms here to force MFA from a mobile device.

    Device State You can use this to enforce MFA on unmanaged devices.

    Exam Tip

    Know the conditions that can be applied to a Conditional Access Policy. Check out the Need More Review documentation for a deeper dive into these.

  7. Select Access Controls. Click Grant Access and select Require Multi-Factor Authentication. Click Select.
  8. Click Enable Policy and click Save.
  9. The configuration of a location-based conditional access policy is now complete. You can see an example of this configuration in Figure 4-5. You need to test to make sure it’s functioning correctly. Using the user who was not placed in the group earlier in this example, log in to your application from a browser at the IP you added to the address range. You’re logged in without MFA because you’re accessing from the excluded IP. Now try the same thing on your phone on a cellular network, and you’re asked for your MFA code. Finally, log in as the user you added to the web app admins group earlier in the example, from your excluded IP range. MFA is still triggered because of the group conditional access policy.
Figure 4-5 Location-based conditional access configuration

Note

Conditional Access

Here we’ve covered conditional access for a web app hosted in Azure. The process is identical for on-premises applications published through Application Proxy.

Need More Review?

Implement MFA with Azure AD

To learn more about enabling MFA for a web application through Azure AD, visit the Microsoft Docs article “Require MFA for specific apps with Azure Active Directory conditional access” at https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/app-based-mfa. Also see the generic conditional access documentation at https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/.

Implement OAuth 2.0

You may have heard OAuth discussed in the same breath as authentication. However, OAuth 2.0 is not an authentication protocol, although it’s a common misconception that it is. OAuth is an authorization protocol that’s used in tandem with authentication protocols. Here’s an illustration of the difference, using a real-world example of checking in for a flight:

Authentication You go to the check-in desk and hand over your passport, and the check-in officer verifies your identity against the passport because it’s a trusted document. It doesn’t say whether you can fly or not, but it proves you are who you say you are.

Authorization You also hand over your flight ticket or proof of booking. At this point, the combination of the passport and ticket enables the check-in officer to verify that you have the correct permissions to go on the flight. The check-in officer hands you a boarding pass, which is your proof of authorization for the flight.

If we map the flight example to a solution, we could have authentication handled by an authentication protocol such as OpenID Connect. OpenID Connect is an extension of OAuth, and it provides identity information as part of an ID token (id_token). It’s this identity information that is the extension; OAuth has no method for this in its definition. In the walkthrough in the section “Application authentication with Azure AD,” the authentication endpoint selected was WS-Federation; however, you could have chosen the OpenID endpoint. The additional identity information is sent in the form of claims: key-value pairs present in the ID token, such as email or name.

Going back to the flight example, OAuth would check whether you can have access to the flight (resource) by validating the id_token and checking it against the booking (permissions, dependent on resource type). It would confirm authorization when you hand over a boarding pass (access token) so that you may board your flight (access your resource). There is also the scope of the access; in other words, what does the token grant you access to in the resource? In the context of the example, this could be first-class or economy seating.

It’s important that solution architects understand the differences between these protocols and how they are implemented. The Microsoft identity platform enables the implementation of OpenID and OAuth to allow your customers’ developers to authenticate against many identities, including Azure AD work and school accounts, personal Microsoft accounts (Xbox, Outlook, and so on), and social accounts such as Facebook or GitHub. The identity platform returns access tokens in the form of JSON Web Tokens (JWTs); these are also known as “bearer tokens” because the bearer of the token is granted access.

OAuth authorization is implemented using “flows.” There are different flows depending on the client type and what the application being accessed needs to achieve:

Implicit Grant Flow Browser-based client accessing a single-page application, such as a JavaScript application.

Authorization Code Flow Application installed on a device—either mobile or desktop—that requires access to resources such as Web APIs.

Client Credentials Grant Server-to-server communications happening in the background and utilizing service accounts or daemons.

On-Behalf-Of (OBO) Flow An application calls a web service that invokes another web service. The first web service gets consent to use the second web service on behalf of the user.
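The walkthrough that follows demonstrates the On-Behalf-Of flow in code. For contrast, here is a minimal PowerShell sketch of the Client Credentials Grant, assuming an app registration with its own client ID and secret; no user is involved, so the token is issued to the application itself:

	
# Request an app-only access token from the v2.0 token endpoint
# using the Client Credentials Grant (no signed-in user).
$body = @{
    client_id     = "<application_client_id>"
    client_secret = "<client_secret>"
    scope         = "https://graph.microsoft.com/.default"
    grant_type    = "client_credentials"
}
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/<tenant_id>/oauth2/v2.0/token" `
    -ContentType "application/x-www-form-urlencoded" -Body $body
$accessToken = $tokenResponse.access_token
	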

Figure 4-6 shows the flow of access requests and tokens between the client, APIs, and the Microsoft identity platform.

Figure 4-6 On-Behalf-Of flow

Instead of just talking through this, we’ll show you how to set it up using a function app and Microsoft Graph. We note where each part of the flow occurs based on the numbered circles in Figure 4-6:

  1. Create an Azure Function using the instructions from Chapter 2, Skill 2.2, “Configure serverless computing.” Select a Windows function running PowerShell. Create a new quickstart function, select In-Portal Editing, and then select Webhook+API. Test that the function app is functioning correctly from a browser before you continue.
  2. Follow steps 2 through 7 from the section “Application authentication with Azure AD” earlier in this skill to secure the function app against a directory. For steps 5 and 6 of that section, use the OpenID endpoint rather than WS-Federation. While in the app registration’s Endpoints page, copy the OAuth 2.0 Token Endpoint (v2) for use later. Note that Authentication/Authorization for a function app is under Platform Features. Test that the function app is functioning correctly with authentication from a browser before continuing.
    This is Circle 1 in Figure 4-6. The user is authenticated, and you have an id_token to send, but you can’t see it yet.

  3. In the app registration blade, select Certificates & Secrets, click New Client Secret, enter an appropriate description and an expiry of 2 years, and click OK. You created this because the flow requires the application to prove its identity using a client ID and client secret. Copy the client secret and paste it into the AD authentication configuration in the function app’s Authentication/Authorization Client Secret (Optional) field. Click OK to save the change.
  4. Go to Azure AD in the portal and navigate to the app registration for the function app created in step 1 of this walkthrough. Click View API Permissions from the Overview in the App Registration blade:

    Select Add A Permission.

    Select Microsoft Graph at the top.

    Select Delegated Permissions and then select On Behalf Of, Access API As Signed In User. Note that the Application Permissions option here would be used in the Client Credentials Grant flow.

    Scroll to the bottom, select User Permission and then User.Read. This is the scope; the API requires read access to the user record.

    Select Add Permissions. The permissions are added, and you are returned to the API Permissions page.

  5. Run the function app from a browser once more, and you see a second page after the authentication login, as in Figure 4-7.

    Figure 4-7 Consenting for an application to use a secured resource

    Click Accept to grant the function app consent to use the permissions listed on your behalf. This consent stays in place until either the user or admin revokes it. The function app should still work correctly once consent is given.

  6. Go back to the function app and edit the function code to call the Microsoft Graph API by doing the following:

    Delete everything between the following two lines:

    	
    Write-Host "PowerShell HTTP trigger function processed a request."
    to
    # Associate values to output bindings by calling 'Push-OutputBinding'
    	
    

    Paste in this code under the Write-Host:

    	
    $status = [HttpStatusCode]::OK
    $aadToken = $Request.headers['x-ms-token-aad-id-token']
    	
    

    The first line sets up a 200 OK status code to return at the end. The second line is part of Circle 1 from Figure 4-6. The token store of the function app (refer to the “Tokens” section earlier in this skill) contains your id_token. It’s been injected into the request header so you can pick it up for use here.

    Paste in the next bit of code. You need the tenant ID, taken from the OAuth 2.0 token endpoint you copied in step 2:

    	
    $uri = "https://login.microsoftonline.com/<tenantid>/oauth2/v2.0/token"
    # Body required for the on-behalf-of flow
    $body = @{
        client_id           = "$env:WEBSITE_AUTH_CLIENT_ID"
        assertion           = $aadToken
        scope               = "https://graph.microsoft.com/user.read"
        requested_token_use = "on_behalf_of"
        client_secret       = "$env:WEBSITE_AUTH_CLIENT_SECRET"
        grant_type          = "urn:ietf:params:oauth:grant-type:jwt-bearer"
    }
    # Get OAuth 2.0 access and refresh tokens with user.read scope for Microsoft Graph
    $tokenRequest = Invoke-WebRequest -Method Post -Uri $uri `
        -ContentType "application/x-www-form-urlencoded" -Body $body
    $token = ($tokenRequest.Content | ConvertFrom-Json).access_token
    	
    
    

    Following this code through, you’re setting up a web POST using the client ID, client secret, and id_token from the authorization, stating that the request is for the On-Behalf-Of flow and that the token being sent is of type JWT bearer. The POST is invoked, and the access token for the Microsoft Graph user.read scope is stored. See Circle 2 and Circle 3 in Figure 4-6.

    Paste the final section of code:

    	
    # Pass the authorization token to Graph
    $graphResp = Invoke-RestMethod -ContentType "application/json" `
        -Headers @{Authorization = "Bearer $token"} `
        -Uri https://graph.microsoft.com/v1.0/me `
        -Method Get
    	
    

    This invokes a RESTful call to the Microsoft Graph API, sending the bearer token in the header. The /me segment returns user information for the logged-in user. This is Circle 4 and Circle 5 from Figure 4-6.

    Change the OutputBinding Body to the Graph API response:

    	
    Body = $graphResp
    	
    
  7. Run the function app in a browser once more; you’ll see the JSON for the user record of the logged-in user, as shown in Figure 4-8. This is the full On-Behalf-Of flow in action.

    Figure 4-8 Microsoft Graph API metadata for the logged-in user
  8. Now you can revoke the permission for access. Here you are revoking the token; the user must grant consent once more before the function app can access Microsoft Graph on the user’s behalf. Log in to https://myapps.microsoft.com as the user who was running the function app and had consented to the permission request. Click the ellipsis next to the app’s name and select Remove, as shown in Figure 4-9.

    Figure 4-9 Revoking consent for an application

This will expire the access token and revoke the consent given by the user.

Note

Validating the Tokens

The example given here doesn’t validate the JWT tokens after they’re returned. You should ensure your customer’s developers validate tokens against the signature on return to make sure they were issued by the correct identity provider. This is best practice, along with validating some claims within the token, in particular the audience (aud) claim, to ensure that the ID token was meant to be used by your customer’s application.
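As a rough illustration of what such an inspection involves, the following PowerShell sketch decodes a JWT’s payload (assuming $jwt holds the raw token string) and checks the audience claim. This is only a sketch; full signature validation needs a proper JWT library plus the identity platform’s published signing keys.

	
# Decode the JWT payload (the second dot-separated segment).
# This does NOT validate the signature.
$payload = ($jwt -split '\.')[1].Replace('-', '+').Replace('_', '/')
# Restore the Base64 padding before decoding.
switch ($payload.Length % 4) { 2 { $payload += '==' } 3 { $payload += '=' } }
$claims = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) |
    ConvertFrom-Json
# The aud claim should match your application's client ID.
if ($claims.aud -ne $env:WEBSITE_AUTH_CLIENT_ID) {
    throw "Token was not issued for this application."
}
	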

Need More Review?

OAuth 2.0 On-Behalf-Of Flow

To delve deeper into securing APIs using OAuth 2.0, check out “Microsoft identity platform and OAuth 2.0 On-Behalf-Of flow” at https://docs.microsoft.com/en-gb/azure/active-directory/develop/v2-oauth2-on-behalf-of-flow.

Protecting an API with Azure API Management and OAuth 2.0

You have explored the methods to secure calls between APIs, web apps, and devices using OAuth 2.0. However, these have been predominantly geared to interactions between your customer’s own products. What if you need to architect a solution that requires APIs to be accessed externally? For this, you could recommend Azure API Management rather than exposing the APIs directly.

Azure API Management is a service that sits between back-end services (APIs) and the applications that call them. At its most basic level, the Azure API Management service is a proxy. It enables organizations to securely publish APIs internally and to partners or other third parties that want to leverage their data through the API gateway. The combination of the API gateway and the developer and admin portals delivers the following features:

Onboards developers with a portal that can be branded

Caches responses for performance

Protects APIs from misuse by verifying security tokens, certificates, and other credentials

Protects APIs from abuse by throttling with rate limits and quotas

Gives usage metrics for each API to the owners of the managed APIs and the developers subscribed to them

Allows use of on-premises APIs, custom APIs in Azure, or third-party APIs

Open, standardized data exchange is a key component of enabling digital business, and solution architects need to know the services available to deliver this and how to configure them. API Management sits on the blurred line between architecting a solution and developing one. However, as mentioned previously, a good architect has some development skills as part of their armory.

To explore using API Management to secure an API with Azure AD and OAuth 2.0, you need to create an API and an API Management instance, and then you configure the API to be protected by the instance. The following example looks at this using an Azure logic app, because logic apps have integration built in to API Management:

  1. Create a logic app in the portal using the guidance from Chapter 2, Skill 2.2, “Configure serverless computing.” In this case, it doesn’t matter what your logic app does, but it needs to be triggered from an HTTP POST and return a response so that you can test it.
  2. Create an API Management instance. Search for API Management in the portal; you should see API Management Services. Click Add, and then do the following:

    Add a name for the management service.

    Select your Subscription, Resource Group, and Location.

    Enter your Organization Name. This is used on the developer portal and in email notifications; for example, a notification to a developer that their quota is nearly exhausted.

    Enter the Administrator Email. This is the email address that will receive any system notifications.

    Select a Pricing Tier. Note you can’t use Basic or Consumption because they have no Azure AD integration, which is required in this walkthrough.

    Click Create. When the service is ready to go, the administrator email designated earlier receives an email.
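Provisioning the instance can likewise be scripted. A sketch with the Azure CLI follows; the Developer tier is used here because, as noted above, Basic and Consumption lack the Azure AD integration this walkthrough needs:

	
az apim create --name <apim_name> --resource-group <group_name> \
    --publisher-name "<organization_name>" \
    --publisher-email <administrator_email> \
    --sku-name Developer
	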

  3. Let’s have a quick look at products in the API Management service. Products contain APIs with quota definitions and terms of use. You aren’t going to edit any of these; you’re just getting a grasp of what API Management has to offer:

    In the API Management Service blade of the instance created in step 2, click Products, and then click Starter. In Settings, look at the Requires Subscription check box. When checked, this forces a developer to subscribe before the product and its APIs can be used. The State must be Published before APIs in a product can be called.

    The APIs section lists the APIs included in the Starter product.

    In Policies, you see a limit of 5 calls every 60 seconds and 100 calls total in a week. This applies to the whole Starter product.

    Access Control is where you can add groups that are created in the Users And Groups section of the API Management blade.

    Subscriptions are required if a subscription key is needed to access an API.

  4. Now publish an API using the API Management service:

    Select the API Management service you created, and then select APIs in the Management Service blade.

    Here you select what type of API to add. There’s obviously tight integration with Azure services, so you can select a logic app, API app, or function app directly. For other external services, you can pick one of the other options as appropriate. Note the Echo API on the left. This is used to create an API that echoes back the response headers and body, which is useful if your back-end API isn’t quite ready and you need to test. Select Logic App.

    Click Browse to find the logic app you created in step 1 and select it. The first three lines of the form are auto-populated. Set the URL Suffix to a relevant suffix, and in Products select Starter, the product you explored in step 3. Click Create. The logic app is ingested.

  5. Your API is listed under APIs on the API Management Service blade. You now need to check that the response is returned correctly:

    Click Test, select POST manual-invoke, and scroll down. Take a copy of the Request URL.

    Open PowerShell and execute the following command to call the API, ensuring you are not logged in to the Azure AD account that owns the API Management service (or use Postman or an equivalent):

    	
    invoke-webrequest -method Post -uri "<Request URL from step 5>"
    	
    

    An error is returned because no subscription key was included with this anonymous request:

    	
    invoke-webrequest : { "statusCode": 401, "message": "Access denied due to
    missing subscription key. Make sure to include subscription key when making
    requests to an API." }
    	
    

    You’re not going to secure this API using subscriptions, because you’re going to secure it with Azure AD and OAuth 2.0. Go back to the Test tab, but this time select Settings to the left of it, scroll down, and clear Subscription Required. Click Save and try the call once more. You see a 200 OK response with the logic app response in Content.

  6. You now have the building blocks in place to protect the logic app API with Azure AD. To protect any app or API with Azure AD, the app or API needs to be registered. Navigate to Azure Active Directory, select App Registrations, and then select New Registration:

    Enter a Name that will make the back-end API easily identifiable to yourself and any users.

    For Account Types, select Any Organizational Directory because it’s a good fit for internal use and guests within any Azure AD. Ignore the Redirect URI for now. Click Register.

    Copy the Application ID from the newly registered app’s Overview and save it for later use.

  7. Now you need to expose this application as an API. This will allow you to grant delegated permissions to it:

    Select the app registration you created earlier and select Expose An API in the blade.

    Select Add A Scope.

    In Scope Name, enter App.Read. This is the scope of the permission.

    In Who Can Consent, select Admins And Users. Admins can consent for the tenant; users can consent for themselves.

    In Admin Consent Display Name and Description, enter Application Read And Return Response.

    The State should be Enabled. Click Add Scope.

  8. All client applications that use your API are registered against Azure AD. Rather than building another application, you can test this using the API developer portal:

    Register another application. This time, put the URL to your API’s developer portal as the Redirect URI: https://<apiname>.developer.azure-api.net/signin.

    Click Register and copy the Application ID for the client application registration from its Overview page.

    Staying in the new app registration’s blade, select Certificates And Secrets and generate a New Client Secret. Keep a copy of this for later use.

    Again staying in the app registration’s blade, select API Permissions. Select Add A Permission, and then My APIs. Select the back-end API registration created in steps 6 and 7 of this walkthrough. Select Delegated Permissions and select App.Read. Click Add Permission.

  9. While in the App Registrations section of the Azure AD blade, select Endpoints and note the authorization and token endpoints for OAuth 2.0 v1. Also note the OpenID endpoint. You’ll need these to set up OAuth authorization and authentication validation.
  10. Your app registrations are now in place, so you need to join the dots with an OAuth server in the API Management service:

    Navigate to the API Management service you created in step 2. Click OAuth 2.0 in the Management blade. Click Add.

    In Display Name, enter a relevant name for the OAuth2 service. Note that this auto-populates ID.

    For the client registration page URL, you can use any URL because no users will be registering their own accounts for OAuth2 in this example. Enter http://localhost.

    Select the Authorization Code grant type. This corresponds to the grant types listed in how OAuth2 works at the beginning of the OAuth 2.0 section of this skill.

    In the Authorization Endpoint URL, paste the OAuth 2.0 authorization endpoint you copied in step 9 of this walkthrough.

    The API is stateless, so you don’t require a state parameter. Leave the authorization request method as GET.

    In the Token Endpoint URL, paste the OAuth 2.0 token endpoint copied in step 9 of this walkthrough.

    Scroll down to Client Credentials; paste the Application ID from the client app registration (step 8) into Client ID and the client secret generated in step 8 into Client Secret.

    Copy the redirect_uri (it’s grayed out) by selecting the box and pressing Ctrl+C. Save this for later.

    Click Create.

  11. Navigate back to the client app registration created in step 8 of this walkthrough, select Authentication on the App Registration blade, and overwrite the Redirect URI with the redirect_uri you copied from the OAuth 2.0 server configuration. Click Save.
  12. Navigate back to the API Management service you created in step 2 of this walkthrough. Select APIs in the blade and pick the logic app API created in step 4 of this walkthrough. Click Settings at the top, and then scroll down to Security. In User Authorization, select OAuth 2.0 and then the server created in step 10 of this walkthrough. Click Save.
  13. Now call the API once more, as in step 5 of this walkthrough. The API returns 200 OK, and the Content is the response. Why is this, when you haven’t obtained an id_token or logged in to an app? The OAuth 2.0 configuration is performing correctly, but at the moment the token (or lack of a token) is not being checked; therefore, the call proceeds successfully. To preauthorize a request in API Management, you need to add a policy named validate-jwt to the inbound policy section of the API:

    Navigate back to the API Management service you created in step 2 of this walkthrough. Select APIs in the blade, and pick the logic app API created in step 4 of this walkthrough.

    Click Design, and then select Add Policy in the Inbound Processing section.

    Select Other Policies. Paste the following into the inbound policy, directly under the opening <inbound> tag and above the closing </inbound> tag; do not overwrite any other text that may be present between these two tags. Note that you need to substitute your own OpenID endpoint and client application ID into this code block:

    		
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401"
        failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
        <openid-config url="<OpenID Endpoint>" />
        <issuers>
            <issuer>https://sts.windows.net/<TenantId from OpenID Endpoint>/</issuer>
        </issuers>
        <required-claims>
            <claim name="appid">
                <value><Client Application ID from step 8></value>
            </claim>
        </required-claims>
    </validate-jwt>
    		
    	

    Staying in the inbound policy configuration section: logic apps cannot handle the Authorization header and error with “The request must be authenticated only by Shared Access scheme.” To mitigate this, you need to strip the header before the request is forwarded, so add the following line after the validate-jwt policy and above the closing </inbound> tag:

    	
    <set-header name="Authorization" exists-action="delete" />
    	
    
  14. Retest the API call using PowerShell. You’ll receive a 401 Unauthorized message; the protection is now working correctly. To test this further, click Developer Portal at the top of the APIs section of the API Management service. The developer portal opens, and you’re already logged in as your user (authentication is complete, with a token in the header). Now call the API:

    Select APIs at the top.

    Select the name of the API you published in step 4 of this walkthrough.

    Before clicking Try It, note that the portal has picked up the Authorization request header with the OAuth 2.0 access token.

    Scroll to the bottom of the page, and click Send. The response is still Unauthorized. Scroll up and look at the Authorization section. You see the name of your OAuth 2.0 server from step 10 of this walkthrough and, next to it, the choice of No Auth. Change this to Authorization Code; a permissions request displays because this application requires your permission to access your API with App.Read and the Graph API. If you recall, this was the scope created in step 7 of this walkthrough. Click Accept. The 200 OK message, content, and other metadata display.

Need More Review?

Protecting an API with OAuth

For more on protecting APIs using OAuth 2.0 and API Management, read “Protect an API by using OAuth 2.0 with Azure Active Directory and API Management” at https://docs.microsoft.com/en-us/azure/api-management/api-management-howto-protect-backend-with-aad.

Exam Tip

Understand that Azure AD authentication is available only in certain API Management pricing tiers, and have a general grasp of the core features of API Management. The best practice of validating the JWT token, whether in API Management or a custom application, may also prove useful to know.

Implement managed identities for Azure resources service principal authentication

One of the issues we face when architecting solutions is managing the credentials required when integrating Azure resources. Ideally, the solutions being architected should not have credentials out in the open, and developers should not have access to them. Managed identities for Azure resources (formerly Managed Service Identity, or MSI) provide this feature, authenticating Azure services through Azure AD using an automatically managed identity. This managed identity can be used to authenticate to any service that supports Azure AD authentication, and once the identity is assigned, authorized actions can be performed against that service without having to store credentials in code. Architects need to understand how this works and how it’s configured.

There are two types of managed identity: system-assigned and user-assigned. Let’s look at system-assigned managed identities first.

When you enable a system-assigned identity on an Azure service, Azure AD creates an AD identity and then creates a service principal that can be used to represent the service for access control, access policies, and role-based access control (RBAC), whichever is supported by the Azure service. If you delete the Azure resource, the identity is automatically deleted; it’s required only for the lifetime of the resource. To see this in action, set up a blob to be read and output by an Azure function by completing the following steps:

  1. Create a storage account with a publicly accessible blob container following the guidance from Chapter 1, Skill 1.2, “Create and configure storage accounts.” Upload a text file with a simple sentence in it to the container and copy the blob URL to this file for later use. Check that the URL returns the file by opening it in a browser.
  2. Create an Azure Function following the instructions from Chapter 2, Skill 2.2, “Configure serverless computing.” On creation, select a Windows function running PowerShell. Create a new quickstart function and select In-Portal Editing and then Webhook+API.
  3. Edit the function to the following, pasting your blob URL from step 1 of this walkthrough into $blobURL=:

    	
    using namespace System.Net
    # Input bindings are passed in via param block.
    param($Request, $TriggerMetadata)
    # Write to the Azure Functions log stream.
    Write-Host "PowerShell HTTP trigger function processed a request."
    $status = [HttpStatusCode]::OK
    $blobURL = "<Place Blob URL here>"
    # Grab the text from the Blob
    $blobText=invoke-webrequest -URI $blobURL
    # Associate values to output bindings by calling 'Push-OutputBinding'.
    Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = $status
    Body = $blobText.Content
    })
    	
    

    Save and run the function. You see the sentence from the blob in the Output pane on the bottom right. This is due to the blob being publicly accessible.

  4. Return to the blob container in the portal and click the ellipsis to the right of the storage container. On the container blade, select Overview, then Change Access Level, and select Private.
  5. Rerun the function URL in the browser. The blob is inaccessible, so only the 200 OK is returned; the webpage is empty.
  6. Go back to the function and set up a managed identity:

    Select Platform Features at the top of the function app.

    Click Identity.

    In the System Assigned section, set the Status to On.

    Click Save.

  7. Once the assignment is complete, go back to the storage account to assign access to the identity:

    Return to the blob container in the portal and click the ellipsis to the right of the storage container. Select Container Properties. On the container blade, select Access Control (IAM).

    Click Add at the top, and then Add Role Assignment. Add RoleAssignment opens.

    Select Storage Blob Data Reader for Role.

    In Assign Access To, select Function App; then select the function app created in step 2 of this section. The role assignment selection is shown in Figure 4-10. Click Save.

    Figure 4-10 Add role assignment for managed identity
    Screenshot_107
  8. Now go back to the function app and edit the PowerShell snippet. You made the blob container private in step 4 and set up access control in step 7 of this section. Therefore, you have to retrieve a token to access the blob container. When the identity was created, an endpoint to a local token service was set up in the background by Azure. This is set as the MSI_ENDPOINT environment variable, and it's available to the function. Edit the code to call this using its Azure-managed secret, MSI_SECRET. Using these together enables you to obtain the access token for a given resource. (Note you can view these variables in Kudu Env Variables.) Delete from under $status= to above #Associate values and paste in the following code:
	
#Get the token for storage access from the MSI_ENDPOINT using the MSI_SECRET on the header
$apiVersion = "2017-09-01"
$resourceURI = "https://storage.azure.com"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token
# Standard blob GET headers; x-ms-date must be the current UTC time
$headers = @{}
$headers.Add("x-ms-version","2018-03-28")
$headers.Add("x-ms-client-request-id",[guid]::NewGuid())
$headers.Add("x-ms-date",(Get-Date).ToUniversalTime().ToString("ddd, dd MMM yyyy HH:mm:ss 'GMT'"))
$headers.Add("Authorization","Bearer $($accessToken)")
$resp = Invoke-WebRequest -UseBasicParsing -Uri "https://<yourstorageaccountname>.blob.core.windows.net/msiblobcontainer/<yourblobfilename>" -Method GET -Headers $headers
$resp.StatusCode
	

Execute the function in a test and then in a browser. The blob contents are displayed once more as shown in Figure 4-11.

Figure 4-11 Blob contents returned from the browser call
Screenshot_108

The process to retrieve the token for a VM is slightly different. You explore that in Skill 4.2 when you review Azure Key Vault.

A user-assigned managed identity is currently in preview. It is created as a standalone identity in Azure AD and can be assigned to more than one Azure service. For example, you could have 100 VMs and assign the same user-assigned managed identity to all of them, meaning multiple resources can access other services through a single managed identity.

If you delete some of your VMs, the user-assigned managed identity is still available because it is a standalone resource. Assigning and granting access to a user-assigned managed identity is identical to the system-assigned process explored earlier; however, you need to know how to configure a user-assigned managed identity:

	
az identity create -g myResourceGroup -n myUserIdentity
	

On the command line, creating an identity that can be reused across resources is a one-line command. To view managed identities in the portal, search for the Managed Identities resource, where you can also add and remove managed identities.
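To see the reuse in practice, here's a minimal CLI sketch; the resource group, identity, and VM names are hypothetical, and it assumes the VMs already exist:

az identity create -g myResourceGroup -n myUserIdentity
# Grab the full resource ID of the new identity
identityId=$(az identity show -g myResourceGroup -n myUserIdentity --query id -o tsv)
# Attach the same identity to two different VMs
az vm identity assign -g myResourceGroup -n myVM01 --identities $identityId
az vm identity assign -g myResourceGroup -n myVM02 --identities $identityId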

Skill 4.2: Implement secure data solutions

Your customers’ applications are now secured. You know what to recommend when architecting user-to-application and application-to-application authentication. That’s just half the story, though. What about the data that is transported on premises or across the public internet—or even from a private network to the cloud? What happens to that data when it’s stored for access at a later date? For this, you need to understand the end-to-end encryption possibilities in Azure and which ones you should recommend based on the scenarios presented to you.

This skill covers how to:

  • Create, read, update, delete keys, secrets, and certificates by using the key vault API
  • Encrypt and decrypt data at rest and in transit
  • Encrypt data with Always Encrypted
  • Implement Azure Confidential Compute and SSL/TLS communications

Create, read, update, delete keys, secrets, and certificates by using the key vault API

Before you start looking at encryption, you first need to think about the mechanisms used in encryption and secure transit and how these require an extra level of care. Your customers may have a web application, and it may need a connection string to a Redis Cache or perhaps a key to access some file storage. There could be a VM on the back end that cannot be called without some secure form of credentials.

In all of this, you have secrets and keys. Your customers’ developers may have embedded some in the code or perhaps as plaintext in configuration files. Then there’s infrastructure as code: IT pros should be deploying this in a reusable manner, but usernames and passwords could have been accidentally left in scripts and templates. What happens to this infrastructure when embedded secrets and keys expire? How can you tell where they are being used and that they’re being used for the correct purpose rather than being abused? What if these secrets end up in a public source code repository such as GitHub and a malicious party checks out your code and has access to your secrets? Even if the secrets don’t get outside your customer’s organization, how can they be sure a rogue employee hasn’t managed to get access to secrets? Either way, your customer’s infrastructure is now insecure, and their data is at risk. As a cloud architect, you need to ensure your customers know how to mitigate these risks, so how can you store these items securely? You need to use Azure Key Vault, a cloud-based, security-enhanced secret store.

Key Vault gives your customers the ability to centrally store their secrets. These secrets are split into three groups:

Keys

Symmetric and asymmetric (public/private key pair) keys. Private keys cannot be retrieved; they’re used only in cryptographic operations.

They’re generated by key vault or imported (bring your own key, or BYOK).

They’re used for Azure services like Always Encrypted and Transparent Data Encryption (TDE).

They’re stored in a cryptographic key service. Keys are irretrievable.

Secrets

For any sensitive information: database connection strings, Redis Cache connections, storage connections used in applications.

Other sensitive information you may need at runtime; if highly sensitive, it should be encrypted before storing.

25 KB is the maximum size.

Certificates

X.509 certificates, used in HTTPS/SSL/TLS.

They can link to some external CAs for autorenewal.

They’re stored as JSON-wrapped secrets.

With centralized storage of secrets comes control of their distribution. You should be recommending that your customers secure access to the key vault to users via RBAC (currently only at the vault level) and access policies, creating three types of actors:

SecOps

Create vaults; manage the keys and secrets in the vault; revoke/delete.

Grant permissions to users and applications to perform cryptographic and management operations, read URIs, add keys, and so on.

Enable logging for auditors.

Can see some keys/secrets depending on how they’re stored.

Developers

Add links to the keys and secrets into applications using URIs instead of actual values or keys.

Never see the keys or secrets.

Auditors

Monitor the log files and review usage for compliance and security standards.

Never see keys or secrets.

With Azure Key Vault, Microsoft is unable to view or use the keys or secrets. This protection is by design and is a key component of a cryptographic vault.

Architects need to know how to create a key vault and secrets and then use this knowledge of key vault and secret creation to delve deeper into other key vault features.

Need More Review?

Azure Key Vault

To learn more about key vault, check out the article "What is Azure Key Vault" at https://docs.microsoft.com/en-us/azure/key-vault/key-vault-overview

Creating a key vault, keys, secrets, and certificates

The introduction for this skill looked at the theory of key vaults, the secrets that can be stored within them, and how the actors within an organization would interact with the key vault. You now need to create a key vault to start storing your secrets. Follow these steps:

  1. Select Create A Resource and search for Key Vault. Select Key Vault and click Create. Enter a name for the key vault. Choose something that’s applicable to the key vault’s use.
  2. Select your Subscription, Resource Group, and Location.
  3. For Pricing Tier, select Standard. The two tiers are as follows:

    Standard Software-protected keys. Free to store; billed per transaction.

    Premium Keys stored in a Hardware Security Module (HSM), certified to FIPS 140-2 (required by some regulated industries). Cost to store and per transaction.

  4. Access Policies has one principal selected. If you click Access Policies, you can see your user as the security principal; click your user to view the permissions. The security principal has been given management access to all secret types but not cryptographic permissions; these are the defaults for a key vault creator. Leave everything as it is because you’ll revisit this later. Click OK twice to return to the Key Vault creation blade.
  5. Select Virtual Network Access to view the following options:

    All Networks Can be accessed from anywhere, including the public internet.

    Selected Networks Defaults to allow any trusted service: VMs, Disk Encryption, backup. Click the information icon next to Allow Trusted Microsoft Services To Bypass This Firewall? to see a full list. If you select No here, select the VNets and public IP address ranges that require access.

    For this example, leave the setting on All Networks, and click Save.

  6. Your key vault creation blade should look as shown in Figure 4-12. Click Create.
Figure 4-12 Create a key vault
Screenshot_109

Now that you have a key vault created, it’s time to add some secrets. In the portal, navigate to the key vault you just added and use the following steps to add a secret to your newly created vault:

  1. In the key vault plane, select Secrets.
  2. Click Generate/Import at the top.
  3. Under Upload Options, Certificate is deprecated, so you can only select Manual.
  4. Enter a Name that identifies your secret appropriately.
  5. In the Value field, enter the text string for the secret.
  6. In Content Type — Optional, enter a description of the type of secret.
  7. Activation Date and Expiration Date are for information only. They aren’t enforced by Azure. For this example, you should leave them blank.
  8. Your secret creation blade should look as shown in Figure 4-13. Click Create.
Figure 4-13 Create a key vault secret
Screenshot_110

Your user was granted access to create secrets through the access policy created on the key vault’s creation; therefore, the secret is created. Let’s take a further look at access policies.

If List was not enabled for the secrets in the access policy, your user would not be able to list the secrets. You can see whether it’s enabled by choosing Access Policies from the key vault plane. Click the access policy created for your user, and remove List against the Secrets. Log out of Azure and log back in to force a permissions refresh. You can no longer see the secret you just added. Add List back in through the access policy.

List the secrets once more and click the secret name. You haven’t updated this secret yet, so there’s only one version displayed. Click this version to see the key details. Your user was granted Get permission on key vault creation, so when you click Show Secret Value at the bottom, you can view the secret. Go back to Access Policies on the key vault plane and remove Get permission on the secret by unchecking it, clicking OK, and then clicking Save. Now go back to the Secrets section of the blade; you can view the secrets list and see the versions, but on clicking a version, you can no longer see the details of the secret or show the secret value. Reenable the Get permission for the secret in the access policy.
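The same toggling can be scripted. A hedged sketch (the vault and user names are hypothetical); note that az keyvault set-policy replaces the caller's existing secret permissions rather than merging them:

# Grant only List on secrets (drops Get)
az keyvault set-policy --name vaultExample --upn user@contoso.com --secret-permissions list
# Restore both permissions
az keyvault set-policy --name vaultExample --upn user@contoso.com --secret-permissions get list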

Finally, you’ll need to turn on logging so that your customer’s auditors can review how the secrets are being used. For this, you need a storage account so that key vault can have a container to write the logs into. You can do this in the portal, on the command-line interface (CLI), or in PowerShell, as shown here:

  1. Open a PowerShell session and log in to Azure. Create a storage account named vaultlogs to house the logs container:

    $sa = New-AzStorageAccount -ResourceGroupName vaultRg -Name vaultlogs `
        -SkuName Standard_LRS -Location 'northeurope'
  2. Retrieve the key vault definition:

    $kv = Get-AzKeyVault -VaultName 'vaultExample'
  3. Create a new diagnostic setting on the key vault, which writes to the new storage container:

    Set-AzDiagnosticSetting -ResourceId $kv.ResourceId -StorageAccountId $sa.Id `
        -Enabled $true -Category AuditEvent

The logs can be read from the container insights-logs-auditevent.
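The same diagnostic setting can be created with Azure CLI; a minimal sketch, assuming the storage account and vault from the previous steps:

kvId=$(az keyvault show --name vaultExample --query id -o tsv)
saId=$(az storage account show --name vaultlogs --resource-group vaultRg --query id -o tsv)
az monitor diagnostic-settings create --name KeyVaultAudit --resource $kvId \
    --storage-account $saId --logs '[{"category":"AuditEvent","enabled":true}]'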

Need More Review?

Azure Key Vault Security

To learn more about key vault Security through RBAC and access policies, read "Secure access to a key vault" at https://docs.microsoft.com/en-us/azure/key-vault/key-vault-secure-your-key-vault

Throughout this section, the discussion has been predominantly about secrets. The processes to manage and use keys and certificates are similar and are covered in the sections “Encrypt and decrypt data at rest and in transit” (keys) and “SSL/TLS” (certificates).

Reading secrets, keys, and certificates

So far in this section, you’ve explored secrets using the portal. When interacting with the key vault, the portal is a wrapper around the key vault REST APIs. The APIs touched on so far give you the ability to manage the keys, secrets, and certificates (the management plane). Another set of APIs allows you to get and use the secrets (the data plane). You can see an example of the REST APIs when you view the secret created in the previous section. When you click the version number of the secret, the secret identifier appears as shown in Figure 4-14.

Figure 4-14 Secret identifier for a key vault secret
Screenshot_111

The identifier takes the following form:

	
https://<keyvaultname>.vault.azure.net/<secrets>/<secretName>/<version>?
	

Here you can replace <secrets> with <keys> or <certificates> and their corresponding names and versions; the make-up of the identifier stays the same. You can try to access the secret directly with the URI, as shown in the following PowerShell example. Note that you need to use the URI from your own secret identifier and make sure you are logged in to Azure in PowerShell before executing the following:

	
Invoke-RestMethod -Uri "https://kvvaultexample.vault.azure.net/secrets/myAppDbConnect/eeabaxxxxxxxxxxxxxxxxxxxxxxxxx31?api-version=7.0"
Invoke-RestMethod : The remote server returned an error: (401) Unauthorized.
	

Even though you’re logged in to Azure and therefore authenticated as the user the secret was created with, the API call cannot GET this secret because the OAuth 2.0 access and refresh tokens are not present as part of the request. The (401) Unauthorized message is returned because the request doesn’t yet carry any authorization.

Azure portal, Azure CLI, and the PowerShell Az module are all wrappers around Azure APIs, including the key vault REST API. Using the PowerShell command Get-AzKeyVaultSecret, you can try to access the secret once more:

	
$secret = (Get-AzKeyVaultSecret -VaultName 'kvvaultexample' -Name 'myAppDbConnect').SecretValueText
$secret
kvdbexample.database.windows.net,1433
	

Here, you can see part of an Azure SQL Database connection string that was stored in the secret created in the key vault example. The command Get-AzKeyVaultSecret substitutes the given VaultName and Name into the URI for you as part of the wrapping process, along with the tokens required for access. Because you haven’t specified the secret version, you receive the current one.
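Azure CLI wraps the same data-plane API. A hedged equivalent using the example names from above:

az keyvault secret show --vault-name kvvaultexample --name myAppDbConnect --query value -o tsv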

Note

On Tokens, API Usage, and CLI/PowerShell

It’s possible to retrieve the OAuth 2.0 tokens and access the secret in this way, but it’s way beyond the knowledge required for the exam. This example is to explore what’s happening on the back end.

Reading secrets, keys, and certificates with Managed Service Identity

The walkthroughs so far have looked at what you can do as a user (User Principal). However, as an architect, you need to be able to instruct on how to set up access for your applications to the key vault and its secrets. To grant access to an application or other Azure service, you need to create an access policy to an identity that has been assigned to the Azure service. This can be a Managed Service Identity (MSI) or a User Assigned Identity as explored in Skill 4.1. To see this in action, in the following example you deploy a Linux VM. You then access a key vault secret via MSI and check access from inside the VM. This simulates a developer using a software development kit (SDK) to access a secret.

The following steps break up a single Azure CLI script, describing the tasks each section performs. Execute each section in this script to explore accessing secrets through an MSI:

  1. Initially, set up some variables for use throughout the script. Here you’re setting resource group, location, and secret names. These should always be set as appropriate for usage and follow naming conventions. Location should be as required by your customer:
    rgName="kvrgexample"
    rgLocation="northeurope"
    kvName="kvvaultexample"
    kvSecretName="kvsecretexample"
    vmName="kvvmexample"
    		
    	
  2. Create your resource group to house your resources:
    az group create --name $rgName --location $rgLocation
    		
    	
  3. Use the resource group to create a key vault. Key vaults need a resource group and a location. A key vault needs to be in the same location as the services accessing it:
    az keyvault create --name $kvName --resource-group $rgName --location $rgLocation
    		
    	
  4. Add a secret to your vault and record the ID return value for later use. That’s your URI:
    az keyvault secret set --vault-name $kvName --name $kvSecretName --value "Shhh it's secret!"
    		
    	
  5. Create a VM and set it up for SSH; save the publicIpAddress return value for later use:
    az vm create --resource-group $rgName --name $vmName --image UbuntuLTS \
        --admin-username sysadmin --generate-ssh-keys
    		
    	
  6. Assign a managed identity to the VM:
    az vm identity assign --name $vmName --resource-group $rgName
    		
    	
  7. Get the service principal ID of the managed identity for use in the key vault policy addition:
    spID=$(az vm show --resource-group $rgName --name $vmName --query identity.principalId --out tsv)
    		
    	
  8. Grant the VM identity access to the secret. The VM only needs list and get to retrieve the secret:
    az keyvault set-policy --name $kvName --object-id $spID --secret-permissions get list
    		
    	

    Note

    Azure Key Vault Access

    When access to secrets, keys, and certificates is given, it’s for the entire vault. Therefore, it’s part of best practice to separate key vaults across applications and then again across environments (dev, pre-prod, prod). This ensures that each environment’s applications cannot use the other’s keys. This separation also helps to make sure key vault transaction limits are not hit.

  9. The VM is ready to test. Use SSH to get into the VM using the publicIpAddress saved in step 5. This opens a secure shell where you can install curl:
    ssh publicIpAddress -l sysadmin
    sudo apt-get update
    sudo apt-get install -y curl
    
    		
    	
  10. Staying in the secure shell, use curl to grab an access token. This request calls the Azure Instance Metadata Service endpoint; it returns metadata about a running VM and can only be accessed from inside the VM. To return the access token, you need to access the identity endpoint as shown here. Note that the address of the Azure Instance Metadata Service is static:
    curl 'http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net' -H Metadata:true
    		
    	
  11. Staying in the shell once more, copy the access token that has been output, as in Figure 4-15, and use curl again to grab the secret, substituting the ID from step 4 (the URI) and the access token from the last step:
    curl https://<YOUR-SECRET-URI>?api-version=2016-10-01 -H "Authorization: Bearer <ACCESS TOKEN>"
    curl https://kvvaultexample.vault.azure.net/secrets/kvsecretexample/b8f1xxxxxxxxxxxxxxxx99c?api-version=2016-10-01 -H "Authorization: Bearer <ACCESS TOKEN>"
    		
    	
Figure 4-15 Azure Instance Metadata service endpoint request and return from command line
Screenshot_112

Once the curl request is executed, you can view the secret, as shown in Figure 4-16, confirming you have access directly from the VM via the identity and that the access policy is functioning correctly.

	
{"value":"Shhh it's secret!","id":"https://kvvaultexample.vault.azure.net/secrets
/kvsecretexample/ b8f1xxxxxxxxxxxxxxxx99c","attributes":{"enabled":true,
"created":1559775793,"updated":1559775793,"recoveryLevel":"Purgeable"},"tags":
{"file-encoding":"utf-8"}}
	
Figure 4-16 Retrieving a key vault secret on the command line using a bearer token
Screenshot_113

Exam Tip

You will not be required to know every command to set up access from an Azure Service to a key vault secret; however, knowing the order of setup and where to retrieve an access token is important.

Updating and deleting secrets

Keys, secrets, and certificates can be updated within the key vault. It’s possible to update the metadata or the secret itself. If the update is to the secret itself, a new version of the secret is created that can be referenced in the URI, either by its version number or by leaving the version number off the URI, which always returns the most recent version. Follow these steps in the portal to create a key and update this key to a new version to see versioning of keys in action:

  1. Create a key in your key vault:

    Click Keys on the Key Vault blade, and select Generate/Import.

    Select Import and select any private key from your local machine in File Upload.

    Enter the password you used when creating the private key in Password. Note that you can only import an RSA Key.

    Enter a name for your key, and click Create.

  2. The key you have just created is listed in the Keys list. Now you can update it:

    Click the key in the Keys list and select New Version.

    Select Import. The information required is identical to that above, except you don’t need to supply a name. You may select the same private key here because it’s a demo. Click Create.

    You are returned to the Keys version list; there are now Current and Older Versions listed. Both are enabled and can be used.

From an architect’s standpoint, you need to ensure your customers are updating their secrets regularly. This is called rotation, and for some secrets, such as keys, rotation could be a regulatory requirement.

Note

Secret Rotation

The example uses manual rotation. You either update the version after the rotation or point your application at a new key. This task can be automated using Azure Automation and runbooks.
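A minimal sketch of a manual rotation on the command line (the vault, secret, and new value are hypothetical); setting an existing secret again creates a new version, and list-versions shows them all:

az keyvault secret set --vault-name vaultExample --name myAppDbConnect \
    --value "Server=kvdbexample2.database.windows.net,1433"
az keyvault secret list-versions --vault-name vaultExample --name myAppDbConnect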

The deletion of a key or an entire vault is a straightforward task in the portal. On the Vault blade, select Overview and then Delete. You can also use PowerShell (Remove-AzKeyVault) and the CLI (az keyvault delete) to perform vault deletion, with associated commands for secret deletion.

The problem you need to be aware of as an architect is accidental deletion of a key vault or secret. In this scenario, unless you have a backup of the secret, you may no longer be able to communicate with your application or read encrypted data. You could place a resource lock on the key vault to stop it being deleted; however, key vault gives you two flags you can set at the vault level that are recommended as best practice. The following is an updated version of the CLI command used in the VM and key vault walkthrough from earlier in this section. You can execute this command to create a protected key vault:

	
kvName="ssekvexamplecli"
az keyvault create 
--name $kvName 
--resource-group $rgName 
--location $rgLocation 
--enable-soft-delete 
--enable-purge-protection
	

Looking closely, you can see that the following options have been added at the bottom:

--enable-soft-delete If a secret or an entire vault is deleted, you can recover it for up to 90 days after deletion.

--enable-purge-protection If a secret or an entire vault is deleted and has gone into soft-delete, it cannot be purged until the 90-day period after deletion has passed.

These options are not enabled by default and must be specified on creation or in an update to the key vault properties; updates must be performed on the command line. Here is an example in PowerShell of updating a key vault that was not created with soft delete:

	
($vault = Get-AzResource -ResourceId (Get-AzKeyVault -VaultName $keyVaultName).ResourceId).Properties |
    Add-Member -MemberType NoteProperty -Name enableSoftDelete -Value 'True'
Set-AzResource -ResourceId $vault.ResourceId -Properties $vault.Properties
	

These two lines of PowerShell are retrieving the properties of the key vault, updating them to add the soft delete, and then setting the updated properties back to the vault.
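With soft delete enabled, a deletion can be reversed during the retention period. A hedged sketch using the protected vault created above (the secret name is hypothetical):

az keyvault secret delete --vault-name ssekvexamplecli --name myAppDbConnect
# The secret is soft-deleted, not purged; list the deleted secrets and recover
az keyvault secret list-deleted --vault-name ssekvexamplecli
az keyvault secret recover --vault-name ssekvexamplecli --name myAppDbConnect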

Need More Review?

Soft Delete for Key Vault

To learn more about configuring soft delete and purge protection on a key vault, read "Azure Key Vault soft-delete overview" at https://docs.microsoft.com/en-us/azure/key-vault/key-vault-ovw-soft-delete. Also review the other items listed under the Concepts section heading on the same documentation page.

Encrypt and decrypt data at rest and in transit

As a solution architect, you need to educate your customers in designing solutions that protect their data, taking into account all the possible states the data can occur in and the appropriate controls for those states. These data states are

At rest Data that is inactive and stored physically (persisted) in digital form—for example, databases, files, disks, and messages.

In transit Data that is being transferred. This could be between locations, over a network, or between programs or components.

Encryption at rest is designed to prevent a possible attacker from gaining easy access to data at rest on compromised physical media.

So, why use encryption at rest when there’s a low chance of an attacker gaining access to Azure’s physical media? Aside from encryption at rest being part of best practice for data security, our customers’ data may have mandatory requirements for data protection from compliance and internal governance—for example, PCI DSS, HIPAA, or perhaps the European data privacy law, GDPR. Encryption at rest adds an additional layer of defense on top of Azure’s already highly compliant platform, which is why it’s enabled by default where possible.

By default, Azure resource providers (in this example, Azure Storage) use service-managed symmetric keys to encrypt the data as it is written to storage. This process is transparent to the user; the Azure resource provider manages the key and encryption process. The same key is then used to decrypt the data into memory before the data is accessible in an unencrypted format by the application, meaning that no code changes are required to use this feature. It also carries no cost to the customer. This is server-side encryption at rest and is shown in Figure 4-17.

Figure 4-17 Server-side encryption at rest
Screenshot_114
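You can confirm the default on any storage account from the CLI. As a hedged check (the account and resource group names are hypothetical), the encryption settings are returned with Microsoft.Storage as the key source when service-managed keys are in use:

az storage account show --name mystorageacct --resource-group myRg --query encryption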

The following services support encryption at rest with service-managed keys:

Azure Storage (Storage Service Encryption or SSE)

Azure SQL Database (Transparent Data Encryption or TDE)

Azure Cosmos DB

Azure Data Lake

Managed disks (via SSE)

Encryption at rest also supports customer-managed keys on some services. With customer-managed keys, you can bring your own key (BYOK), importing a key into key vault, or you can create one directly within key vault. Customer-managed keys give your customers greater control over their key management process, including the following:

Import your own key or create one in key vault, which lets you decide when to rotate your keys. You can disable a key if it’s compromised.

Define access controls to your key.

Audit of key usage.

Customer-managed keys are supported on Azure Storage (SSE) for blobs and files, Azure SQL Database (TDE), and Azure Disk Encryption for encryption at rest.

Exam Tip

Know what types of data are encrypted by Storage Service Encryption (SSE), Transparent Data Encryption (TDE), and Azure Disk Encryption.

Need More Review?

Azure Data Encryption at Rest

To learn more about encrypting Azure data at rest, visit the Microsoft Docs article "Azure Data Encryption-at-Rest" at https://docs.microsoft.com/en-us/azure/security/azure-security-encryption-atrest

Enable customer-managed keys for Azure Storage encryption

As discussed above, Azure Storage uses Storage Service Encryption (SSE) for encryption at rest by default using service-managed keys. To use customer-managed keys, you need to specify your key for use. Follow these steps to configure a storage account to use customer-managed keys, using a new key vault and generating a new key:

  1. Open a storage account you’ve already created. The one in this walkthrough is a Standard General Purpose V2 LRS account.
  2. In the Configuration blade, find the Encryption setting. Click this setting to see a description of how data is already encrypted at rest and how you can use your own key.
  3. Check the box marked Use Your Own Key. Select A Key From Key Vault is selected by default.
  4. Click Select under Key Vault and select Create A New Vault. When you create a new key vault for use with SSE, the storage account and key vault must be in the same region; however, they can be in different subscriptions.
  5. Enter a name for the key vault and select the same resource group as the storage account for simple resource grouping. The location should be set to the same as the storage account. Leave the other options as default and click Create. Once the key vault is deployed, you’re returned to the Use Your Own Key setup.

  6. Click Select under Key and then select Create A New Key. This creates a new key in your new key vault. Enter a name for your key, and leave the other parameters as default. Once the key is added, you’re returned to the Use Your Own Key setup.

Note

SSE Customer-Managed Keys

When creating a key in the key vault, you must select an RSA Key if it is to be used with SSE. You also can generate your own RSA key outside of key vault. To use such a key, import it into the vault, and then select the vault and the imported key in steps 4 and 5.

  7. Click Save at the top of the page. The change is submitted, and the Storage Service Encryption for this account will now use your customer-managed key.
  8. If you need to change the key used for encryption, go back to the same Encryption setting and select a new key in the key vault, or uncheck Use Your Own Key to revert to service-managed keys.

    Exam Tip

    Know how the PowerShell command Set-AzStorageAccount and the Azure CLI equivalent command az storage account update are used to accomplish this task.
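    As a hedged sketch of the CLI route (the account, vault, and key names are hypothetical, and it assumes the storage account's identity has already been granted access to the vault), the switch to a customer-managed key looks like this:

      az storage account update --name mystorageacct --resource-group myRg \
          --encryption-key-source Microsoft.Keyvault \
          --encryption-key-vault https://myvault.vault.azure.net \
          --encryption-key-name myKey --encryption-key-version <key-version>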

    Need More Review?

    Configure Customer-Managed Keys for Azure Storage

    To learn more about configuring customer-managed keys on SSE, visit the Microsoft Docs article “Configure customer-managed keys for Azure Storage encryption from the Azure portal” at https://docs.microsoft.com/en-us/azure/storage/common/storage-encryption-keys-portal. Follow the linked articles for PowerShell and Azure CLI.

    Azure Disk Encryption

    Unmanaged Disks in Azure Storage are not encrypted at rest by Storage Service Encryption. Managed disks are automatically encrypted by SSE. As a solution architect, you need to be teaching your customers to encrypt at rest where possible because encryption at rest follows best practice in data security. To encrypt unmanaged disks, you must use a customer-managed key and Azure Disk Encryption. Like SSE, Azure Disk Encryption integrates with key vault for management of encryption keys.

    Note

    Azure Disk Encryption

    Azure Security Center will flag unencrypted IaaS disks and recommend the disks be encrypted. This is listed as a High Severity alert.

    Azure Disk Encryption uses the industry-standard features of BitLocker (Windows) and DM-crypt (Linux Kernel 2.6+) to encrypt the Data and operating system disks.

    Now you configure Azure Disk Encryption on an existing VM image with a single unmanaged disk. Before you do anything, back up your VM! Then complete the following steps:

    1. Navigate to your existing VM in the portal, select Disks from the Settings blade, and verify that the operating system disks and any data disks are unencrypted. In this case, the VM only has an operating system disk. See Figure 4-18.

      Figure 4-18 Unencrypted operating system disk
      Screenshot_115
    2. To enable Azure Disk Encryption, you need to switch to the command line. For this example, you work in Azure CLI. The existing VM is named "adevmexample" and runs Windows Server 2016 with unmanaged disks. You can verify that your existing VM isn’t using encrypted disks in Azure CLI, too. Execute the following set of Azure CLI commands; you’ll see the message "Azure Disk Encryption is not enabled" returned to the console:
      vmName="adevmexample"
      rgName="diskEncryption"
      rgLocation="northeurope"
      az vm encryption show --name $vmName --resource-group $rgName
      	
      
    3. Staying in Azure CLI, execute the following to create the key vault. Note the --enabled-for-disk-encryption flag; you can't encrypt a disk with a key from the vault without it:
      kvName="adekvexamplecli"
      az keyvault create --name $kvName --resource-group $rgName --location
      $rgLocation 
      --enabled-for-disk-encryption true
      	
      
    4. Staying in Azure CLI, execute the following to create a key in the new key vault. The protection setting only matters if you have a premium-tier key vault, where you can pick between hardware and software protection as required:
      keyName="adekeyexamplecli"
      az keyvault key create --vault-name $kvName --name $keyName --prote
      	
      
    5. Staying in Azure CLI, execute the following to create a new service principal name, and then use it to create the service principal and store the ID and password. You need these to set up security on the key vault and then to access the key for encryption:
      spName="https://adespexample"
      read spPassword <<< $(az ad sp create-for-rbac --name $spName --query password
      --output tsv)
      spId=$(az ad sp show --id $spName --query appId --output tsv)
      	
      
    6. Still staying in Azure CLI, execute the following to add a policy on the new key vault. This allows the service principal to access keys from this key vault to encrypt the disk with the wrapKey permission:
      az keyvault set-policy --name $kvName --spn $spId --key-permissions wrapKey --secret-permissions set
      	
      

    7. Now that the key vault, key, service principal, and key vault policy are in place, you can encrypt your VM disks using the service principal's access to the key vault. Staying in Azure CLI, execute the following to encrypt the disks. In this command, a volume-type of all encrypts all your disks. If you add more data disks after encrypting a VM, use the same command with --volume-type data; otherwise, your new disks won’t be encrypted:

      	
      az vm encryption enable --resource-group $rgName --name $vmName \
          --aad-client-id $spId --aad-client-secret $spPassword \
          --disk-encryption-keyvault $kvName --key-encryption-key $keyName \
          --volume-type all
      	
      
    8. Depending on how many disks there are and their size, this command may take a while. It may also reboot the VM. The status can be checked periodically by issuing the following command:
      az vm encryption show --resource-group $rgName --name $vmName --query [osDisk] -o tsv
      	
      
    9. Now reissue the encryption status check az vm encryption show from step 2. If everything’s working as planned, you see a JSON response with "dataDisk": "Encrypted", "osDisk": "Encrypted" followed by the key vault and key details.
    10. Switching back to the portal, follow the instructions in step 1 once more and check that Disk Encryption is set to “Enabled.” For belt-and-braces verification, you can look on the VM itself: RDP in, open an Explorer window, and select This PC. The open padlock sign above the disk shows it is encrypted at rest by BitLocker, as shown in Figure 4-19.
    Figure 4-19 BitLocker encryption at rest
    Screenshot_116

    Need More Review?

    Azure Disk Encryption

    To learn more about encrypting Azure disks, including some important prerequisites and quickstarts, visit the Microsoft Docs article "Azure Disk Encryption for IaaS VMs" at https://docs.microsoft.com/en-us/azure/security/azure-security-disk-encryption-overview.

    Azure Storage client-side encryption

    The encryption examples in this skill have all been server-side so far, so it’s possible that data in transit to Azure Storage arrived over an unencrypted channel. Because that data was not encrypted, it was open to attack. When architecting a secure data solution, you need to address unencrypted data in transit because compliance and governance requirements may make encryption in transit mandatory. Either way, it’s best practice to encrypt in transit.

    Client-side encryption for Azure Storage encrypts the data on the application side; therefore, if the data is intercepted over an unencrypted communication channel, it’s not as easily compromised. Figure 4-20 shows a diagram of Azure Storage Encryption reworked for client-side encryption. All data leaving the application is encrypted.

    Figure 4-20 Client-side encryption process
    Screenshot_117

    Client-side encryption requires the Azure Storage Client Library to be called as part of an application. The encryption is performed using this process:

    1. The Azure Storage client library generates a content encryption key (CEK), which is a one-time-use symmetric key.
    2. The client library encrypts the data with the CEK.
    3. The CEK is encrypted by a Key Encryption Key (KEK). The KEK has a key identifier, can be symmetric or asymmetric depending on the requirement, and can be stored locally or in Azure Key Vault. The client library doesn’t have access to the KEK; it just uses the key-wrapping algorithm the KEK provides.
    4. The encrypted data is loaded to Azure. The wrapped CEK is stored with the data for a blob or inserted (interpolated) into the data (queues and tables).

    This process of wrapping the CEK is called the envelope technique. During the decryption process, a key resolver uses the key identifier to work out which key wrapped the CEK, meaning that, again, the client library has no need to access the KEK itself; it just calls the KEK’s unwrap algorithm.

    Exam Tip

    You won’t need to know how to invoke the client library in code as part of an application for the exam, but it’s important to have a grasp of the process of encryption/decryption.

    Need More Review?

    Client-Side Encryption for Azure Storage

    To learn more about configuring client-side encryption for Azure Storage, visit the Microsoft Docs article “Client-side encryption and Azure Key Vault for Microsoft Azure Storage” at https://docs.microsoft.com/en-us/azure/storage/common/storage-client-side-encryption.

    Azure SQL encryption at rest—Transparent Data Encryption

    Azure Storage is only one of the areas within Azure where data is at rest. The SQL family of products in Azure also stores data at rest in its data files and log files. This includes

    Azure SQL Database

    Azure SQL Managed Instance

    Azure SQL Data Warehouse

    SQL Server (within an IaaS VM on Azure)

    The SQL family of products uses Transparent Data Encryption (TDE) to encrypt data at rest. When engaging with customers to design a storage solution on SQL in Azure, you need to ensure they are aware that only new Azure SQL Databases have TDE enabled by default on creation. The other products and older Azure SQL Databases require TDE to be enabled manually.

    TDE protects the data at rest by performing real-time I/O encryption of the data and log files at the page level. The pages are encrypted by a database encryption key (DEK), which is stored in the boot page (boot record) of the database, allowing startup and database operations within Azure, including the following:

    Geo-restore and self-service point-in-time restore

    Restoration of a deleted database

    Geo-replication

    Database Copy

    The now encrypted pages are then written to disk, and the same DEK is used to decrypt the page before it’s read into memory for use. The master database holds the components required to encrypt or decrypt the pages using a DEK, which is why the master database is not encrypted by TDE. This all happens without interaction from the user or the application developers, and it happens with no extra cost.

    TDE uses service-managed keys by default, with Azure managing the key rotation and so on and storing the keys in its own secure location. TDE also can be configured to use customer-managed keys just like Azure Storage Encryption, which gives your customers the data security implementation they may need for governance and compliance purposes, as described in the skill introduction. Using customer-managed keys is also called bring your own key (BYOK).

    Unlike Azure Storage Encryption, the customer-managed key encrypts the DEK, not the data itself and is therefore known as a TDE protector. Once BYOK is enabled and the DEK is encrypted, it’s stored at the boot page level, replacing the Azure-managed DEK if there was one. The customermanaged key used to encrypt the DEK must be stored in Azure Key Vault, where it can be imported to or created within the key vault. Storing the key in the vault gives your customers the same degree of management of the key as that described earlier in the discussion about Azure Storage Encryption.

    Note

    Exporting an Encrypted Azure SQL Database to BACPAC

    When exporting an encrypted database to BACPAC, the data is first read into memory before it’s sent as part of the BACPAC file. This means the data in the BACPAC file is unencrypted, because TDE decrypts the data before writing it to memory. Therefore, you need to ensure your customers are aware they must secure the BACPAC file by other means once it’s exported.

    Use the following steps to take a look at encrypting an existing Azure SQL Database using a customer-managed key. In this example, you need to use an already created Azure SQL Server/Database:

    1. In the Azure portal, navigate to the Azure SQL Server you have already created, click SQL Databases on the blade, and then select the database you created. In the Security section of the blade, select Transparent Data Encryption. Figure 4-21 shows that it’s not possible to use your own key at the database level; a customer-managed key is applied at the server level, and all databases with Data Encryption turned on are encrypted by the same key. Switching Data Encryption to off at this level turns off encryption just for this database.

      Figure 4-21 Transparent Data Encryption for Azure SQL Database
      Screenshot_118
    2. Switch back to the Security blade for the Azure SQL Server and select Transparent Data Encryption. Click Yes for Use Your Own Key. You have the option to Select A Key or Enter A Key Identifier. The Key Identifier can be copied from the properties of a key stored in a key vault. In this example, you create a new key.
    3. Click Select A Key Vault and select the key vault you would like to use. You need to select one with Soft Delete enabled. Once you’ve selected a key vault, you’re returned to the Transparent Data Encryption page.
    4. Select A Key, and then Create A New Key. Leave the Options as Generate and enter a relevant name for your key. TDE only supports RSA, so leave the Key Type as RSA. If the compliance policies your customer needs to follow require a minimum RSA key size above 2,048, select a different size; for this walkthrough, leave it at 2,048. If you don’t want your new key to activate straight away, set an activation date. If your key needs to expire at a specific date or time, set the expiration date. Click Create.
    5. The key is created in the key vault, and you’re back on the TDE page. Click Save at the top; the save process creates a key vault access policy for the Azure SQL Server to the new key if possible, and then encrypts the DEK and completes the customer-managed key encryption process.

    Need More Review?

    Transparent Data Encryption

    To learn more about configuring Transparent Data Encryption, visit the Microsoft Docs article “Transparent data encryption for SQL Database and Data Warehouse” at https://docs.microsoft.com/en-us/azure/sql-database/transparent-data-encryption-azure-sql.

    Encrypt data with Always Encrypted

    You’ve made sure your customers know they need their data to be encrypted at rest, but what if there is some data that certain users, even power users, should not be able to read? This is sensitive data.

    Sensitive data could be personally identifiable information such as a Social Security number (SSN), email address, or date of birth, or perhaps financial data such as a credit card number. You should ensure your customers’ sensitive data is protected from attackers by encrypting at rest, but there are also times when power users, in this case database administrators, shouldn’t be able to view sensitive information. Would you want the database administrator of your employer’s human resources package to be able to view your SSN, date of birth, and so on? A solution architect needs to be able to advise how to prevent this, which is where Always Encrypted comes in.

    Always Encrypted is a security feature within SQL products that is designed to protect sensitive data. The encryption happens on the client side, so it protects the data while it’s being used, while it’s moving between client and server, and while it’s at rest on the server. With Always Encrypted, the data is never viewable in the database in plain text, even in memory. Because the client handles the encryption and decryption process, the application needs a driver installed for this to happen. This can be .NET, ODBC, JDBC, PHP, and so on, which opens up a variety of programming languages.

    However, unlike TDE, Always Encrypted may require some code changes in the application.

    Unlike TDE, which is database wide, Always Encrypted should be set against just the columns that are required to be encrypted. There’s a slight performance hit with Always Encrypted; there also are limitations on types of columns that can be encrypted. It’s important to define a set of rules with your customers as to which fields these should be, although this may be decided for you with compliance and governance requirements.

    When Always Encrypted is set up, it requires two keys: the column encryption key (CEK) and the column master key (CMK). The CEK is used to encrypt the data; the CEK itself is then encrypted by the CMK and stored in the column encryption key metadata in the database. The CMK is stored outside the database in a trusted key store such as Azure Key Vault, Windows Certificate Store, or Java Key Store; the database just stores metadata about where this key is. As in previous sections, by storing the CMK in a trusted key store, your customers have control of key management, including rotation and revocation of keys.

    To see how this all works theoretically in an application, follow a standard select statement as shown in Figure 4-22.

    Figure 4-22 Always Encrypted encryption process
    Screenshot_119
    1. The client application executes a simple select. It creates a parameterized query and sends this to the Always Encrypted client driver.
    2. The client driver checks against the SQL Server to see whether the columns or parameters it’s selecting against are encrypted. If so, the location of the CMK and the encrypted CEK are returned to the client driver.
    3. The client driver uses the CMK location to retrieve the CMK and uses the CMK to decrypt the CEK.
    4. The client driver uses the decrypted CEK to encrypt the parameters.
    5. The client driver is now able to execute the SQL statement against the database.
    6. The results are returned to the client driver.
    7. The client driver decrypts the columns returned if required and returns the result set to the application.

    In steps 4 and 5, the client library is executing the SQL statement using encrypted parameters in the where clause. The ability to do this depends on which column encryption type you select. There are two column encryption types. Here’s how they differ:

    Deterministic Always generates the same encrypted value for a given plaintext value. This makes equality operations possible, so the column can support point lookups, equality joins, grouping, and indexing. The trade-off is that patterns may be guessable in columns containing few distinct values.

    Randomized Generates a different encrypted value for the same plaintext each time. This is more secure, but the column cannot be searched, grouped, indexed, or joined on.

    In the example, the SSN column must have been using Deterministic as its column encryption type; otherwise, SQL Server would have returned an error. It’s not possible to search on a Randomized column.

    It’s time to set up a couple of encrypted columns and test this out. In this example, set up an empty key vault and a simple single Azure SQL Database with an employees table. Make this SQL Database accessible directly to your local IP on the Azure SQL Server Firewall. Use the following steps to set up Always Encrypted and encrypt two of the columns:

    1. Open SQL Server Management Studio (SSMS), log in to your database, expand the Tables folder in the Object Explorer, right-click the Employees table, and select Encrypt Columns.
    2. The Always Encrypted wizard opens. Click Next on the introduction page. On the Column Selection page, select SSN and Salary. You want to search on SSN but just display Salary, so SSN uses Deterministic encryption and Salary uses Randomized, as shown in Figure 4-23. The wizard automatically sets the CEK name; the same key is used for both columns. Click Next.
    Figure 4-23 Column selection and encryption type for Always Encrypted
      Screenshot_120
    3. Select where the CMK is stored so that the CEK can be protected. Select Azure Key Vault, log in to Azure, select the key vault created for this walkthrough, and click Next. Figure 4-24 shows the selection of the key vault to store the CMK.
    Figure 4-24 Azure Key Vault for Column Master Key store
      Screenshot_121
    4. You can choose to generate a PowerShell script for automation purposes later, or click Proceed to finish now. You see a summary of the choices you have made in the wizard to check against. Click Finish to encrypt the columns.
    5. The wizard asks you to log in to Azure once more so that the CMK can be created in the chosen key vault. Once each step shows as passed, the columns are encrypted. You can see the created CEK and CMK by navigating to Security > Always Encrypted Keys > Column Master Keys And Column Encryption Keys in the Object Explorer under the database. If you right-click the CMK and CEK and select Script To Query Window, you can see the URI to the key vault and the encrypted key value, respectively. You also can see how the table definition has changed by scripting it to the query window.

    Note

    Permissions for Creating the CMK in the Key Vault

    At this point, if you’ve forgotten to set the permissions on the key vault, you see a wrapKey error, and you need to start the wizard again. The permissions required via an access policy are create, get, list, sign, verify, wrapKey, and unwrapKey.
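    A hedged sketch of granting those permissions up front from the CLI (the vault and user names are hypothetical):

      az keyvault set-policy --name myVault --upn dba@contoso.com \
          --key-permissions create get list sign verify wrapKey unwrapKey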

    You can use SQL Server Management Studio to mimic an application that uses Always Encrypted by using some session settings. These settings are mandatory for an application that wants to use Always Encrypted. Follow these steps to set up the session settings and see Always Encrypted from an application perspective:

    1. Open a new SSMS session to your Azure SQL database, but before you log in, select the Additional Connection Parameters tab, enter the following parameter, and click Connect.
    2. 		
      Column Encryption Setting=enabled
      		
      	
    2. In the menu, select Query; at the bottom of Query Options, click Advanced and check Enable Parameterization For Always Encrypted. Click OK. With these two settings, you can perform simple CRUD operations on the encrypted columns, as shown in Figure 4-25.
    Figure 4-25 Select Always Encrypted columns with the encryption setting enabled
      Screenshot_122

    However, if you reconnect and don’t enable Column Encryption Setting, you can’t read the data, as you can see in Figure 4-26, and Create/Update/Delete will error.

    Figure 4-26 Select Always Encrypted columns with the encryption setting disabled
    Screenshot_123

    Exam Tip

    Setting up an application with the client library, connection string setting, and code changes required to use Always Encrypted is beyond the scope of this exam. However, it’s important to understand the concepts and naming conventions. It’s also useful as solution architects to see this in action and know how it can be circumvented in SSMS unless permissions to the CMK are set correctly.

    Need More Review? Always Encrypted

    To learn more about configuring Always Encrypted, visit the Microsoft Docs article “Always Encrypted (Database Engine)” at https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine. Have a good look at the limitations—not for the exam, but when designing solutions. You’ll need to keep your customers aware that a range of data types and scenarios are not catered for.

    Implement Azure Confidential Compute and SSL/TLS communications

    Beyond data at rest, there are further scenarios we haven’t touched on so far: how to add a secure layer to data in transit and how data in use can be secured. These are covered in the following sections, “SSL/TLS” and “Azure Confidential Compute.”

    SSL/TLS

    Data in transit that isn’t protected is more susceptible to attack, including session hijacking and man-in-the-middle attacks, which allow malicious users to gain access to possibly confidential data. Transport Layer Security (TLS) and Secure Sockets Layer (SSL) are both cryptographic protocols that provide encrypted communication between servers and applications over a network, protecting the customer’s data in transit.

    SSL was the predecessor to TLS. With each new version of the SSL protocol, more secure ciphers and algorithms were used; however, more and more vulnerabilities were discovered. For this reason, all SSL protocols have been deprecated by the Internet Engineering Task Force (IETF), along with TLS 1.0 and 1.1.

    Note

    SSL Certificates

    SSL certificates and the SSL protocol are not the same thing. The SSL protocol is deprecated, but SSL certificates are used by both the SSL and TLS protocols; SSL and TLS certificates are the same thing.

    When recommending protocols for communication of data to customers, the current best practice is TLS 1.2, with settings for SSL disabled. You may be thinking, “If SSL is deprecated, why is it still available on many server configurations and often enabled by default?” The answer is backward compatibility. Some systems still do not support TLS 1.2. For example, on a recent project, a proxy server at one customer site did not support TLS 1.2, so it couldn’t initiate the handshake with the Azure Application Gateway that had been configured for TLS 1.2 only. Traffic did not flow, and the application appeared to be down. Having configured the Application Gateway to support only TLS 1.2 per best practice, the changes had to be backed out to TLS 1.0+.

    You’ve already explored cross-premises connections using Azure VPN Gateway Point-to-Site and Site-to-Site VPNs and ExpressRoute connections in Chapter 2, Skill 2.4 “Integrate an Azure virtual network and an on-premises network.” Each of these resources leverages SSL/TLS for private tunnels/networks between on-premises networks and Azure. Azure SQL Database and Azure Storage also leverage SSL/TLS for communication by default. This leaves traffic over the public internet—most commonly between a browser and a web server.
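    The transport settings configured in the portal in the following steps can also be scripted. A minimal sketch, assuming an existing web app (the app and resource group names are hypothetical):

      # Redirect all HTTP traffic to HTTPS
      az webapp update --name mywebapp --resource-group myRg --https-only true
      # Enforce TLS 1.2 as the minimum protocol version
      az webapp config set --name mywebapp --resource-group myRg --min-tls-version 1.2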

    Use the following steps to create an Azure Web App service, deploying with an App Service Certificate for a custom domain, checking SSL/TLS 1.0/1.1 and HTTP support, and verifying the SSL configuration as secure (Azure CLI equivalents for the key settings are sketched after the steps):

    1. Using knowledge from previous chapters in this book, create a custom domain (if you no longer have one) and a web app service, making sure the tier is at least a B1. Select to publish a Docker image using the Quickstart Python Hello World sample on the app service. Note: On the lower App Service tiers, the container can take a while to pull down; enter Container Settings on the App blade to check the log for progress.
    Exam Tip

      It’s good to know which App Service tiers support custom domains and SSL bindings and which do not.

    2. Navigate to the web app you have just created within the portal, copy the URL from the Overview section of the blade, paste it into a browser address bar, change https:// to http://, and press Enter to navigate to the page. Depending on the browser you’re running, you see either “Not secure” or an open padlock next to the address, verifying that HTTP is not secure. Go back to the portal, and on the same blade, select SSL Settings and change the HTTPS Only setting to On. Refresh the page with the HTTP link; it is now redirected to an HTTPS page, and it’s no longer possible to access this page insecurely over HTTP.
    3. To verify the current settings as secure, navigate to https://ssllabs.com/ssltest in a browser. Copy and paste the app service URL in for the test; it takes a few minutes to run. Once complete, you should see the configuration presented with an A result. The default implementation of HTTPS in Azure App Service is secure and well configured.
    Note

      ssllabs.com/ssltest

      Ssllabs.com is a free service that analyzes the configuration of a public internet SSL web server. The resultant grade and report contain details on how to resolve SSL configuration issues that could leave your server open to attack through vulnerabilities.

    4. Go back to the portal and select SSL Settings in the App Service blade. Downgrade the minimum TLS version to 1.0 for backward compatibility. Note that SSL versions are not configurable due to their deprecation. Rerun the SSL test from step 3. The SSL configuration is still A grade; however, there are now more orange warnings.
    5. Navigate to Custom Domains in the App Service blade, select Add Custom Domain, and enter the custom domain name from step 1.
      When you click OK, the custom domain is assigned to the Azure App Service, but because there is no SSL certificate for this domain, a warning displays. If you try the SSL test from step 3 with the newly assigned custom domain, the test errors because the SSL certificate doesn’t exist, and the certificate chain points to the *.azurewebsites.net certificate.

    6. Go back to the portal and select to create a new App Service Certificate in the marketplace. Then enter the following responses:

      Name A name for the App Service Certificate; the certificate is identified by this name in Azure services.

      Naked Host Domain Enter the custom domain created for this walkthrough.

      Certificate SKU Standard (S1) is enough for this example. A Wildcard (W1) certificate is recommended only if multiple subdomains are required.

      Agree to the legal terms and click Create.

      Note

      Using App Service Certificate

      Even though the price for the SSL certificate is given per month, the entire year will be charged to your account when you click Create.

    7. Navigate back to the App Service Certificate in the portal. Notice the key vault store warning message at the top of the Overview section. To use the certificate, you need to import it as a secret into a key vault and verify the domain. Click Configure Required Key Vault Store and create a new vault if required.

      Next, perform a Domain Verification by copying the Domain Verification Token displayed and creating a TXT record. Manage DNS Settings is held under App Service Domain on the App Service blade rather than Azure AD. Add a Record Set with the following details:

      Name The App Service name.

      Type TXT.

      TTL This is the Time To Live, the time until the cached address expires; a short TTL speeds up DNS record updates for this example. Set it to 5 minutes.

      Value Paste in the Domain Verification Token.

      Click OK. It shouldn’t take more than 5 minutes for the DNS records to update. The domain can now be verified: go back to the App Service Certificate in the portal, select Domain Verification, and verify the domain.

    8. Navigate back to the app service that was created for this walkthrough; select SSL Settings in the blade. Now select the Private Certificates (.pfx) tab and Import an App Service Certificate. Select the certificate created in step 6 and click OK. The certificate is now available to bind to the app service. Note that here you could also upload your own certificates or import one already uploaded to a key vault.
    9. Click the Bindings tab and Add SSL Binding. Select the custom domain created in step 1 and then the private certificate imported in step 8. Because this is a B1-tier web app, select the SNI SSL type, and then click Add Binding.
    10. Navigate back to the Overview section of the App Service blade. The URL should have changed to the custom domain. Paste the URL into a browser to check that the app service is still responding on the new domain. Enter the URL on the SSL Test page from step 3 to verify the SSL settings for the new domain. Once again, you should see a grade A.
    You can use the same process to add an App Service Domain and Certificate to secure a function app on a custom domain.
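
    The key settings from this walkthrough can also be scripted. The following is a minimal Azure CLI sketch, assuming a hypothetical resource group, app name, DNS zone (hosted in Azure DNS), and certificate thumbprint; it mirrors steps 2, 4, 7, and 9 rather than replacing them:

      # Step 2: force HTTPS-only on the web app
      az webapp update --resource-group rg-walkthrough --name contoso-app --https-only true

      # Step 4: set the minimum TLS version (1.2 is best practice)
      az webapp config set --resource-group rg-walkthrough --name contoso-app --min-tls-version 1.2

      # Step 7: create the TXT record used for domain verification
      az network dns record-set txt add-record --resource-group rg-walkthrough \
        --zone-name contoso.example --record-set-name contoso-app \
        --value "<domain-verification-token>"

      # Step 9: bind an already-imported certificate to the custom domain using SNI
      az webapp config ssl bind --resource-group rg-walkthrough --name contoso-app \
        --certificate-thumbprint <thumbprint> --ssl-type SNI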

    For connections coming in over the public internet, it’s also likely you’ll be architecting solutions on a VM scale set or Azure App Service that require load balancing or the features of the Web Application Firewall (or both). These requirements mean placing an application gateway in front of the app service or VM scale set. You’ve already explored the application gateway configuration settings in Chapter 2, Skill 2.3, “Implement application load balancing,” but let’s take a look at the SSL/TLS options:

    SSL Termination (Offloading) Traffic flows between the local client and the application gateway using HTTPS. The traffic is decrypted by the application gateway and travels between the application gateway and the back-end applications in Azure in plaintext. This moves the overhead of decryption from the app server to the application gateway. To enable SSL termination in the portal, follow these steps (an example configuration is shown in Figure 4-27):
      1. On the Configuration section of Create An Application Gateway, select to add a routing rule.

      2. Enter the Listener name. This should be appropriate to the route; in this case, SSL termination.

      3. Select Public for the Frontend IP, a Protocol of HTTPS, and a Port of 443.

      4. Upload the PFX certificate from your local machine. Note the domains must match, or it should be a wildcard subdomain certificate. Enter an appropriate name for the certificate and the password used when you created the private key.

      5. Leave the Listener type as Basic for a single site. Click Add to add the route.

      Figure 4-27 Azure Application Gateway SSL termination

      Exam Tip

      Understanding that Application Gateway provides SSL termination and knowing how to configure it on an existing app gateway is useful.

      It’s also possible to use a key vault certificate (in preview) if you’re using an application gateway V2 SKU. This uses a managed identity to access and assign the certificate from the key vault.

    End-to-End SSL Traffic flowing between the local client, the application gateway, and the back-end applications in Azure is encrypted at all times. For the application gateway to communicate with the back end over SSL/TLS, the back-end certificates must be trusted: on a v2 SKU, you upload trusted root certificates to the application gateway, and certificates for app services and other Azure web services are implicitly trusted. To configure HTTPS on the back end for a v1 SKU application gateway in the portal, complete the following: Open an existing application gateway in the portal. Select HTTP Settings on the blade, and then select the AppGatewayBackendHTTPSettings line. On Protocol, select HTTPS. You now need to upload the public key of the PFX certificate from the back-end server/service and add it in the Backend Authentication Certificates section.
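
    Both certificate uploads can also be scripted. The following is a minimal Azure CLI sketch, assuming hypothetical resource group, gateway, and file names rather than a definitive configuration:

      # Add the front-end SSL certificate used for termination
      az network application-gateway ssl-cert create --resource-group rg-walkthrough \
        --gateway-name contoso-appgw --name frontend-cert \
        --cert-file ./frontend.pfx --cert-password "<pfx-password>"

      # For end-to-end SSL on a v1 SKU, upload the back end's public certificate
      az network application-gateway auth-cert create --resource-group rg-walkthrough \
        --gateway-name contoso-appgw --name backend-cert --cert-file ./backend.cer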

    Azure Confidential Compute (ACC)

    When data is in use, it often needs to be plaintext, or “in the clear,” while loaded in memory; this is usually required for processing to happen efficiently. During this time, the data is susceptible to attack from malicious power users, hackers, malicious software, or anyone (or anything) that can read a server’s memory. Your customers may have sensitive data, such as financial and medical information, that requires protection during processing in memory. This is particularly problematic in the cloud, where the customer may have less control over the underlying operating system and no control over the security of the hardware. Architects need to be able to advise on a technology that addresses this requirement.

    This is where Azure Confidential Compute comes in. It uses Trusted Execution Environments (TEE) to protect data and code while an application is running. Figure 4-28 shows a high-level overview of how TEEs work:

    Figure 4-28 Trusted Execution Environment

    A TEE can be implemented at the hardware or software (hypervisor) level. In hardware, it’s part of the CPU instruction set. A TEE implementation gives an application the ability to execute sections of code within a protected area called an enclave. The enclave acts like a bubble, protecting the application from the host machine. As you can see from Figure 4-28, even the operating system cannot penetrate the bubble to access or tamper with the program or code. In the diagram, the TEE has created two enclaves. Enclave 1 is expanded to show that it has code and data within it.

    An enclave has a trust relationship with a host application. Going back to Figure 4-28, only Host App 1 can access Enclave 1, and only Host App 2 can access Enclave 2. This trust relationship uses attestation, a protocol that ensures the code in the enclave is signed and trusted before protected data is sent. The two enclaves have no visibility of each other; neither can access the other’s data or code. This opens the possibility of running multiple workloads with different sources of protected data on one server without compromising data security.

    You may not have realized it, but it’s likely you’ve already encountered TEE/enclave technology on your smartphone, especially if it stores your fingerprint or facial recognition (biometric) data for phone or app access. Use cases that can currently be fulfilled within Azure, using DC-series VMs running the Open Enclave SDK for TEE-enabled applications, include:

    SQL Server (2019+ IaaS) Always Encrypted with secure enclaves running on a DC-series VM. (This brings complex SQL searching, which is not currently available in Azure SQL.)

    Multi-source machine learning.

    Confidential Consortium Blockchain, running the COCO framework.

    This is an expanding area; future use cases will include IoT Edge for processing sensitive data before aggregating to the cloud.

    Exam Tip

    Setting up an application to use Confidential Compute via Open Enclave SDK is beyond the scope of the exam. However, understanding the basic concepts and potential use cases will be beneficial.

    To complete this section, follow these steps to provision an Ubuntu DC-series VM ready for confidential computing:

    1. In the portal, navigate to the Search bar and search for confidential computing. Select Confidential Compute VM Deployment from the marketplace. On the overview screen, there are further links that may be useful. Click Create.
    2. On the Basics blade, note the text at the top; only a handful of regions support ACC. The values entered in this section are no different from a standard VM apart from four places, as shown in Figure 4-29:

      Image Select a Ubuntu or Windows 2016 Datacenter image as required. For this example, select Ubuntu.

      Include Open Enclave SDK Select yes to have this installed for you; however, it can be installed later.

      Resource Group Must be an empty resource group.

      Location Must be one of the regions listed at the top of the blade.

      Figure 4-29 Creating a confidential compute DC series VM
    3. Click OK and enter the VM Settings blade. Once again, the settings are no different from a standard VM. You can configure a standard VNet with the default subnet. In this walkthrough, allow SSH so that the installation of the Open Enclave SDK can be verified. Note that only DC-size VMs are available to select, as shown in Figure 4-29. Click OK to validate the configuration, and then click OK to create the VM.
    4. Once the VM has been created, go to Overview on the VM blade and copy the SSH connect string into a console. SSH into the VM and verify the SDK is installed. If it is, you have an openenclave directory under /opt:

      sysadmin@accsvmexample:/$ cd /opt/openenclave
      sysadmin@accsvmexample:/opt/openenclave$
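
    If you prefer to script the provisioning, the following is a minimal Azure CLI sketch, assuming hypothetical names and a region that offers DC-series sizes; note that the Open Enclave SDK still has to be installed afterward if you don’t use the marketplace template:

      # Create an enclave-capable (DC-series) Ubuntu VM in a supported region
      az group create --name rg-acc-demo --location eastus
      az vm create --resource-group rg-acc-demo --name accsvmexample \
        --image UbuntuLTS --size Standard_DC2s \
        --admin-username sysadmin --generate-ssh-keys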

    Need More Review?

    Confidential Computing

    To learn more about Confidential Computing, visit the Solutions Overview “Azure confidential computing” at https://azure.microsoft.com/en-gb/solutions/confidential-compute. A further excellent resource is at https://azure.microsoft.com/en-gb/blog/azure-confidential-computing.

    Chapter summary

    Windows-Integrated Authentication in a legacy application leverages NTLM or Kerberos for SSO. In the cloud, this is accomplished using Azure AD Connect and the seamless single sign-on feature, which requires password hash synchronization or pass-through authentication. Microsoft recommends using conditional access policies for Multi-Factor Authentication with Azure AD.

    App-to-API and API-to-API security via OAuth 2.0 authorization can be configured in the client application or by using Azure API Management. The flow used depends on the use case and determines how the access token is retrieved. OAuth 2.0 uses authorization and access tokens in the form of JSON Web Tokens (JWTs). JWTs should be validated; in API Management, use the validate-jwt policy.

    Managed identities allow you to authenticate against services in Azure without having to use credentials or certificates.

    Authentication is performed through Azure AD. When a managed identity is created, a service principal is automatically created, registered with Azure AD, and trusted. System-assigned managed identities exist for the lifetime of a resource. User-assigned managed identities can be reused on multiple resources.

    When enabling an identity on an app service/function app, the environment variables MSI_SECRET and MSI_ENDPOINT are created so that an access token can be obtained for use on a resource. VMs use the Azure Instance Metadata Service endpoint to obtain an access token; this endpoint is accessible only from inside the VM (a sketch of the token request follows at the end of this summary).

    To secure secrets, keys, and certificates, use Key Vault. If you need HSM-backed keys for FIPS compliance, you need the premium tier. Microsoft can’t read the keys, secrets, or certificates in your vault.

    The key vault APIs can be called directly or via wrappers using the Azure portal, PowerShell, the CLI, and code libraries. Key vaults should be protected from deletion by mistake (a sketch follows at the end of this summary).

    Storage Service Encryption (SSE) provides data encryption at rest for Azure Storage. Microsoft-managed keys support all services; customer-managed keys are for blobs and files. Managed disks are automatically encrypted by SSE and do not support BYOK; unmanaged disks must use BYOK. Azure Security Center flags unencrypted, unmanaged disks as a risk.

    Azure SQL supports encryption at rest with Transparent Data Encryption (TDE) and column-level encryption with Always Encrypted. Both technologies support BYOK. Always Encrypted requires code changes in the application; TDE does not.

    SSL is the predecessor to TLS. Both are protocols that enable encrypted transmission of data. TLS 1.2 is the current best-practice minimum to support, and SSL protocols should be disabled. Application Gateway supports SSL termination (offloading): communication with the application gateway is over SSL/TLS, but on termination the communication between the application gateway and the back-end services is not encrypted.
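
    To make the managed identity and Key Vault points above concrete, here is a minimal sketch of requesting a token from the Instance Metadata Service inside a VM; the target resource (Key Vault) is just an example:

      # IMDS is a link-local endpoint, reachable only from inside the VM
      curl -s -H "Metadata: true" \
        "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net"
      # The JSON response contains an access_token field to present as a Bearer token

    And here is a sketch of creating a vault protected from deletion by mistake, with a hypothetical vault name; purge protection prevents permanent deletion even after a soft delete:

      # Create a vault with purge protection, then store a secret
      az keyvault create --resource-group rg-walkthrough --name contoso-vault-demo \
        --location eastus --enable-purge-protection true
      az keyvault secret set --vault-name contoso-vault-demo --name SqlPassword --value "<secret>"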

    Thought experiment

    In this thought experiment, demonstrate your skills and knowledge of the topics covered in this chapter. You can find answers to this thought experiment in the next section.

    You have been hired as a consultant solutions architect by Contoso Stocks, a company managing investment portfolios. Contoso Stocks have decided to move their in-house software from an on-premises data center to Azure. Business and integration analysis of the current offering has identified the following key requirements:

    Transactional data and user financial data (including bank details) are stored in a SQL Server 2012 Standard Edition database.

    A front-end web application installed on a virtual machine uses HTML forms to authenticate using Windows Authentication (NTLM) against a single AD. The current implementation is Single Sign-On.

    The front-end web application is ASP.Net running on IIS 7 on a Windows 2012 Server. It isn’t published beyond the firewall, so it isn’t deemed worth securing.

    A back-end VM runs a bespoke mathematical algorithm to forecast portfolio growth and feed the results back into the portfolios on the SQL database. This is Contoso Stocks’ own IP and major value-add offering. The VM stores data locally after every run so that it can learn from the history.

    As for current Azure/cloud services, all users are running Office 365, with AD Connect synchronizing domain users.

    Contoso Stocks leadership and SecOps requirements include the following:

    All data should be encrypted at rest.

    Data transmitted between users and the Web API should be encrypted.

    Application administrators need to provide extra security for authentication when not in one of the organization’s offices.

    Bank details should not be visible to developers or SQL DBAs at any time.

    SecOps require any encryption keys or secrets to be rotated regularly for compliance.

    SecOps would like to see the current hard-coded database, VM, and storage account credentials removed and stored securely.

    SecOps don’t allow password hashes from Windows AD to be stored in the cloud.

    The preference is to keep Single Sign-On if possible; however, the app must be secured using the Windows AD credentials.

    It’s desirable to rearchitect the app away from IIS on a VM. However, it’s not possible to make code changes to the mathematical algorithm on the VM until they’re ready to leverage Machine Learning in Azure.

    With this information in mind, please answer the following questions:

    1. What solution(s) would you implement to ensure the user’s financial and transactional information is secured at rest and personally identifiable information is protected?
    2. How could Contoso Stocks meet the requirements of SecOps for the removal of hard-coded secrets and rotation of encryption keys for compliance?
    3. What would you recommend to Contoso Stocks for the authentication of the web app users and administrators?

    Thought experiment answers

    This section contains the solution to the thought experiment. Each answer explains why the answer choice is correct.

    1. Data is currently stored in two resources: SQL Server and the VM. These are the resources that need to be secured at rest on migration to the cloud. The data on SQL Server can be migrated to Azure SQL, where Transparent Data Encryption is enabled by default. However, this database contains financial information, so customer-managed keys should be used to further encrypt the TDE key, as best practice.

    The personally identifiable information in the Azure SQL Database includes bank details. To secure this information from possibly malicious employees, you would recommend using Always Encrypted to encrypt the financial information columns. This requires code changes within the web app; however, code changes on the web app were marked as acceptable. You should recommend these changes happen as part of a rearchitecture to an Azure Web App.

    The data on the VM must also be encrypted at rest. As the VM stores historical financial information, you should recommend that Contoso Stocks rotate encryption keys. The application code on the VM can’t be altered at this stage; therefore, the recommendation is to migrate this VM “lift and shift,” utilizing unmanaged disks. With unmanaged disks, Contoso Stocks can encrypt the VM disks with a customer-managed key and Azure Disk Encryption.

    Further encryption mechanisms on the VM cannot be used as they require code changes to implement.

    2. The answer to question 1 requires access to customer-managed keys for all encryption scenarios. Coupled with the requirement for secret management, Azure Key Vault must be recommended as the first part of this solution. Azure Key Vault stores keys and secrets, which gives the Contoso Stocks SecOps team the ability to audit key and secret usage. Azure Key Vault facilitates key rotation, either manually or automated using Azure Automation. You should also recommend setting the soft delete and purge protection options on the key vault to stop accidental vault deletion, which could lead to loss of data.

    Managed identities should be recommended for use on the VM and the rearchitected Azure Web App. This removes the necessity for credentials to be hard-coded, with access to these secrets granted to each identity through access policies.

    3. The requirements for this question call for using Windows AD credentials, but there is also a mention that Contoso Stocks’ users are already using Office 365 with AD Connect. This points to Azure AD already being available to use for authentication. It may even be possible that any Windows AD security groups for this application have already been synchronized for use in Azure. You should recommend that Contoso Stocks review the AD Connect settings to ensure that Single Sign-On and Pass-Through Authentication are the chosen settings. This satisfies the requirement of no password hashes in the cloud from Windows AD and keeps Single Sign-On where possible.

    The second part of this solution is for application administrators to provide further credentials when logging in from outside the offices. Your recommendation should be to upgrade Azure AD to a premium tier and implement location-based conditional access for an administrative Azure AD group created for this application.

    Always be aware there will often be more information given in the requirements than needed to answer use-case style questions.
