CompTIA Cloud+ CV0-003 – Domain 2.0 Cloud Security

  1. Security Policies

Let’s talk about company security policies. It is fairly standard for an organisation to define what is acceptable use of its computer assets as well as the proper response to specific security issues. What is the preferred method for addressing threats? Security policy can entail many different things, but in general, let’s just talk about what a security policy should essentially deal with. Security policies govern the processes and procedures taken to protect business or privacy information from intrusion by the use of IT systems or cloud resources. Essentially, we want to make sure that your data is safe and your data is private. When it comes to security policies, your organisation should have established controls and an acceptable use policy. Once again, security policies are going to be time-consuming, but they are needed. If you don’t have rules set around how, when, and why to use your cloud resources, then you should expect your cloud resources to not be used in a proper fashion.

Once again, having an AUP is critical. You need to train your user base as well. One of the most common issues I see is that companies will spend a lot of money migrating data but never actually train their employees; the user base never receives any training. Once again, train your user base. This can not only make your job easier, but it can also make your company’s data safer and more secure. Audit your policies every year: you need to look at and re-establish what is acceptable and what has changed in the organization.

When it comes to security policies, you want to look at auditing, administration, passwords, bring your own device (BYOD), your AUP, monitoring, and remote access as areas that you need to control in the policy. You need to address each of these areas in the policy at a minimum. For example, you need to understand that you need to audit your environment. In general, auditing will entail some kind of logging and some kind of compliance aspect: knowing who logs in and when, and ensuring that you can back up and restore those logs as well.

When it comes to passwords, you need to have strict adherence to a password policy. One of the things that’s very common is that users just love to set passwords that are easy to remember. Unfortunately, with your company’s data, that’s the last thing you want to allow. You need to have very strict password policies. What about when an employee brings their own device? Well, how do you handle that? That’s a really good question, and I’m surprised to see that a lot of organisations haven’t actually dealt with it. There are plenty of mobile device management (MDM) solutions out there, and there are plenty of data loss prevention (DLP) solutions that can be enabled as well to address bring-your-own-device issues.
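A strict password policy like the one described above can even be enforced in code. Here is a minimal sketch in Python; the specific rules (12-character minimum plus four character classes) are an example policy, not a mandated standard:

```python
import re

def meets_password_policy(password: str, min_length: int = 12) -> bool:
    """Return True only if the password satisfies an example strict policy:
    minimum length plus upper-case, lower-case, digit, and symbol characters."""
    if len(password) < min_length:
        return False
    required_classes = [r"[A-Z]", r"[a-z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return all(re.search(pattern, password) for pattern in required_classes)
```

A directory service or MDM solution would enforce something similar centrally; the point is that an “easy to remember” password like `password` fails every one of these checks.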

You could push out policies via typical Exchange services, and you could also use other solutions and third parties to address these concerns. Monitoring is important as well, and remote access needs to be addressed separately. For example, remote access is great, but you want to use at least some kind of secure technology to allow your administrators to access resources. You may want to set up a bastion host or whatever is required. You certainly don’t want to enable Telnet over the internet. Of course, once again, you want to be aware of what is acceptable and apply accepted security standards. There are plenty of them out there; look at them, understand them, and apply those standards to your policy as required. When it comes to security guidelines, these are generally not considered mandatory in a lot of organizations. It is a best practice, though, and you should implement them. Look at your organization and see if you have specific guidelines, for example, on how to address spam or someone writing down passwords where they shouldn’t be, like under the keyboard. Make sure your security guidelines address these in your policies. Acceptable use policies need to be specific about resource usage that is acceptable to the organization. So remember, for the exam, an AUP needs to be specific, and it is going to address cloud resource usage. Here is an exam tip: know what an AUP is. Once again, this is the most heavily tested area in this module.

  2. Encryption Technologies

Let’s talk about data encryption technologies. On this exam, you’ll likely see a question or two on data encryption. It’s not a major part of the test, but once again, you want to understand what encryption is and what the main ways to encrypt data are, and we’ll go ahead and talk about those. Encryption refers to any process used to make sensitive data more secure and less likely to be intercepted by those unauthorized to view it. Essentially, that’s the definition. Now let’s talk about the data encryption standards. We have the Federal Information Processing Standards (FIPS). These are generally required for US government agencies and organizations, but they are also widely used by industry. FIPS Publication 140-2 is one you’ll want to review, just for your own personal knowledge, to understand essentially what FIPS is and why it’s important when it comes to how data needs to be handled, when it’s transferred, or when it’s at rest, for example. For the test, I just want you to be aware of what FIPS is. With data encryption standards, there are generally two widely accepted standards when it comes to data encryption in the cloud. Again, this is not exclusive, just at a high level.

We want to make sure that you are aware of the Advanced Encryption Standard (AES) and the Data Encryption Standard (DES). Those are going to be more or less the focus when it comes to encryption technologies. There are some other areas we want to focus on as well. I’m not going to define each of these per se, but I will touch on them. First is IPsec. IPsec stands for Internet Protocol Security. This is a framework of standards, basically for how to send and receive data over a secure channel; it’s essentially a protocol suite. Then there’s SSL, which we’re all probably familiar with and use on a daily basis. SSL is a standard that is primarily concerned with establishing an encrypted link between a client and a server; for the exam, understand how a server presents a certificate and how that certificate is validated. TLS is once again becoming more widely used in the industry. It is more secure than SSL, and it is also much more flexible in some respects.

You see, we use this quite a bit. Of course, TLS will eventually replace SSL as a supported protocol. When it comes to security key and certificate management, you’ll want to know that public key cryptography is an encryption technique that uses a paired public-and-private key algorithm for secure data communications. There are also symmetric and asymmetric keys. Once again, I won’t spend a lot of time going through this; I just want you to be aware of what it is. There will be one question on encryption on the exam, and we pretty much covered it when it comes to key and certificate management. Understand that when you send a message, the sender uses a public key to encrypt it. Remember that the sender encrypts with the public key, while the receiver decrypts with the private key. One of the questions on the exam that I saw around encryption was: does the sender use a private key, or does the receiver? If you know that, you should be good. PKI is public key infrastructure. It’s widely used, and it uses a public and a private key. It is frequently used for compliance; the federal government uses it extensively as well. Vendors out there such as SafeNet offer certified security solutions that are widely used. When it comes to PKI, there are five components. You don’t need to know these for the test, but from a high-level perspective, I want you to be aware of why PKI is important and why it’s widely used. We have a certification authority; at a high level, we’ll call it an organization. Essentially, the certification authority is the one vouching that the keys are good, and it also manages key distribution. We also have a registration authority, a certificate database, a certificate store, and a key archive server. Each of these has its own responsibilities. Remember that SSL is the predecessor to TLS.
Note that fewer websites will likely support SSL over time. When it comes to SSL, this is a standard security technology for establishing an encrypted link between a client and the server. One thing I did want to emphasize—and I probably should have done so as an exam tip as well—is that I want to emphasise this again in relation to key and certificate management. Generally, in most deployments of keys and PKI environments, you have the sender. The sender uses a public key to encrypt. The receiver uses a private key to decrypt. Generally, that is something else I’d like you to know for the exam.
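To make the sender-encrypts-with-the-public-key, receiver-decrypts-with-the-private-key rule concrete, here is a toy textbook-RSA sketch in Python. The tiny primes are for illustration only and offer no real security:

```python
def toy_rsa_keys():
    """Generate a textbook RSA key pair from tiny, insecure primes."""
    p, q = 61, 53
    n = p * q                           # public modulus (3233)
    e = 17                              # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (needs Python 3.8+)
    return (e, n), (d, n)               # (public key, private key)

def encrypt(message: int, public_key) -> int:
    e, n = public_key
    return pow(message, e, n)           # the SENDER uses the PUBLIC key

def decrypt(ciphertext: int, private_key) -> int:
    d, n = private_key
    return pow(ciphertext, d, n)        # the RECEIVER uses the PRIVATE key
```

Encrypting the number 65 and then decrypting it returns 65 again. Real systems use the same idea with enormous primes and padding, and usually only wrap a symmetric session key this way.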

  3. Tunneling

One of the main areas of security on the exam was around virtual private networks; at least, that’s what I saw. There were several questions. Again, this was just one of many areas of security that were part of the objectives, but I want to cover everything equally to make sure that you have an understanding based on the objectives of the certification. The reason is that the question pools I’ve seen on the beta exam won’t necessarily be the same ones that you’re going to have. Of course, I can’t tell you what the questions were, but I can certainly address areas you may want to study. So once again, just be aware that tunneling protocols, at least from what I’ve seen, were tested a little more heavily than, say, security tools, data classification, or API security. Let’s talk about tunnelling protocols and VPNs.

An IP tunnel is an Internet protocol network communications channel between two networks. With VPNs, this is essentially point-to-point data encapsulation; that is tunneling. VPN tunnelling is essentially port forwarding, though there are other facets of VPN tunneling as well. You have PPTP and L2TP, and PPP and GRE are available to you. Some of these are proprietary to Cisco, but some are also championed by Microsoft; PPTP, for example, is used quite a bit. Anyway, with that said, just be aware that there are different types of tunneling. There are also the MS-CHAP and EAP-TLS authentication protocols, which are widely used. As an area I’d like to make sure you’re aware of, there are generally two types of VPNs: you have a site-to-site VPN and remote access.
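The encapsulation idea behind all of these tunneling protocols can be sketched in a few lines. This toy model uses plain dictionaries in place of real IP/GRE headers, purely to show the wrap-and-unwrap that tunnel endpoints perform:

```python
import json

def encapsulate(inner_packet: dict, tunnel_src: str, tunnel_dst: str) -> dict:
    """Wrap an inner packet inside an outer tunnel header (the GRE-style idea).
    Routers in between only ever see the outer addresses."""
    return {"outer_src": tunnel_src,
            "outer_dst": tunnel_dst,
            "payload": json.dumps(inner_packet)}

def decapsulate(tunnel_packet: dict) -> dict:
    """At the far tunnel endpoint, strip the outer header and recover the
    original inner packet unchanged."""
    return json.loads(tunnel_packet["payload"])
```

A real VPN also encrypts and authenticates the payload before wrapping it; this sketch only shows the encapsulation step.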

Now, remote access is essentially where you’re going to log in from your home or wherever to access resources. A site-to-site VPN is usually a dedicated link between two data centers. This could typically be used for services, replication, et cetera. When it comes to VPN encryption, these connections are generally going to have encapsulation, authentication, and data encryption. When it comes to VPN configurations, you generally want to configure VPNs in one of two ways: in front of the firewall or behind the firewall. Again, this is very simple to get, but be aware that you may want to understand, at a high level, the ways to configure a VPN for this exam. So the VPN could go in front of the firewall or behind it. VPNs are critical to securing cloud infrastructure, and there are two types; remember for the exam that you have site-to-site and remote access. You may want to know that VPN tunnelling is also known as port forwarding. That is your exam tip.

  4. Security Templates

Security templates and ACLs. One thing about the cloud is that when deploying resources to the cloud, or in your own cloud, it’s important to understand that you want to establish some kind of baseline for your security. One of the easiest ways to do that is with security templates. When you deploy resources in the cloud, you generally have the ability to customise your machine images. This can include adding more templates or creating a template that you would deploy with additional resources linked to it, such as antivirus if it’s a user-based VM, firewall settings, or intrusion detection systems. So the best way to create a baseline for security when it comes to VM images is to use a security template. Security templates are typically pre-configured, repeatable processes that you deploy.

You’re generally going to use these with orchestration and automation in your cloud. You could do the same thing on a Windows machine, where these templates are generally in .inf format. ACLs, on the other hand, are essentially a list of permissions for operations around an object. If you want to secure an object, create an ACL. ACLs generally describe who, what, when, where, et cetera. Basically, you will create an ACL for an object: a file, a folder, et cetera. You also want to understand how the different vendors allow you to create, maintain, and manage ACLs. Be aware that what you can do in GCP, AWS, or VMware may differ significantly. ACLs are considered a good practice, and a necessary practice, for that matter. When deploying security templates and ACLs, consider the various methods available, as well as orchestration and automation workflows; use security templates in your blueprints and templates to ensure a baseline. And here is a test tip: ACLs are generally a list of permissions for operations around an object. Remember that ACLs cover who, what, when, where, et cetera, and an object is what that applies to: a file, a folder, etc.
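The who/what/object shape of an ACL is easy to picture in code. This minimal Python sketch uses made-up users, permissions, and a made-up file path; real cloud ACLs (GCP, AWS, VMware) each have their own richer formats:

```python
# A hypothetical ACL: object -> (who -> set of allowed operations)
ACL = {
    "/finance/report.xlsx": {
        "alice": {"read", "write"},
        "bob": {"read"},
    }
}

def is_allowed(acl: dict, obj: str, user: str, operation: str) -> bool:
    """Deny by default: permit only if the user has that operation on the object."""
    return operation in acl.get(obj, {}).get(user, set())
```

Note the deny-by-default design choice: an unknown object or unknown user simply gets no permissions, which is how ACL evaluation generally behaves.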

  5. Security Tools

Security tools and concepts. Let’s go ahead and discuss some of the tools out there from VM and cloud providers that you could use to enable better security and much more robust access management. Security tools can vary based on the use case. Many tools for VM monitoring, management, and access management are available. Definitely look at your cloud providers and see what they offer. Look at your virtualization provider as well, and see what is offered in their solution. When it comes to security tools, the ones you want to understand the most are guest tools, monitoring tools, management tools, and access management tools.

In the cloud, it’s important to understand guest tools, especially if you’re going to be using infrastructure as a service. This is especially true if you have a private cloud where you’re going to be managing your VMs as well as the infrastructure under those VMs. Guest tools should be installed on the VM. These guest tools are typically provided by the hypervisor vendor and are used to manage virtual servers. When it comes to VMware Tools, they comprise three main components. The first is the VMware device drivers; these are essentially the device drivers required for the virtual machine to interact with the hypervisor as well as the hardware and software. Then there are the VMware user processes, and finally the VMware services, like vmtoolsd.exe; these are the services required to run and utilise the tools. For example, being able to access a VM requires specific services, as does doing anything on a VM. When it comes to monitoring and management, these tools allow you to monitor your resources in a single pane.

For example, Google Stackdriver allows you to monitor both Google Cloud and AWS. It’s an extremely powerful tool that allows you to aggregate your resources, maintain consistency, and report on auditing and performance. AWS CloudWatch, ScienceLogic for the enterprise, and Datadog are other services that you could use to monitor and/or manage your resources. When it comes to access management, this is the process of granting authorised users the right to use a service while preventing access to non-authorised users. For example, with AWS and GCP, you may want to look at the IAM utilities. There are also third-party tools like Okta, IAMCloud, and OneLogin that are frequently used and that I see quite a bit of. When it comes to using tools, be aware that these tools can be used to document trends. Tools can also be used to establish baselines, and tools can then be used to solve problems as well. So, for example, if you’re able to use a tool to create a baseline and then compare against that baseline, that’s a great way to document where you were and where you are now. Once again, access management is the process of granting authorised users the right to use a service while preventing access to non-authorised users. Now, I just want to point out: don’t get confused between authorization and authentication. Here is an exam tip: understand that tools can be used to assess security, establish baselines, and manage the cloud environment.
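The create-a-baseline-then-compare idea can be sketched very simply. The metric names and the 20% tolerance below are arbitrary examples, not values from any particular tool:

```python
def deviations(baseline: dict, current: dict, tolerance: float = 0.20) -> list:
    """Return the names of metrics whose current value deviates from the
    recorded baseline by more than the given fraction."""
    flagged = []
    for metric, base_value in baseline.items():
        now = current.get(metric, 0.0)
        if base_value and abs(now - base_value) / base_value > tolerance:
            flagged.append(metric)
    return flagged
```

A monitoring tool doing exactly this at scale is how trends get documented and anomalies get surfaced.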

  6. Classifying Data

Let’s go ahead and discuss data classifications. Now, whether you’re in the government sector or the commercial sector, it’s important to understand when you generally want to classify data. There are several reasons for this, but let’s just clarify what this means in terms of the objectives: for Cloud+, there are some references to classifying data. What we want to make sure you understand is that you categorise data assets based on the values associated with their sensitivity. Essentially, if something is critical to your business, you need to protect it at a higher level than something that probably isn’t as important. So privacy concerns, commercial secrets, whatever: those would likely be protected at a higher level. Let’s go ahead and talk about data classifications. Now, generally, the government sector is really good at classifying data; not so much the commercial sector. Data classification should clearly define data types and confidentiality. Data classification should be part of a solid life cycle and data management process. Generally, you want to have the stakeholders, the data owners, the application owners, etc., classify the data appropriately. For example, as a cloud architect, it’s really not your job to classify data.

You’re not really going to know what data needs to be classified and what doesn’t. You should be aware, before you do any migrations, of what data needs to be protected or complied with from a compliance standpoint. With that said, let’s go ahead and talk about setting these classifications. Generally, the first thing you want to do is set criteria: basically, identify what exactly the criteria or requirements are for data classification in your organization. In other words, is it privacy? Is it compliance? Is it a commercially sensitive issue? Whatever it is, determine the controls for classifying that data. Is it the stakeholders that determine it? Are there audits that are done? Next, identify the owner of the data; one of the best sources to identify the data is, of course, the owner. Document any exceptions; this is important from a compliance standpoint as well. Identify who the custodian is and the transfer agent. Now, one of the challenges when we get to the cloud is that some folks believe that Amazon Web Services or Google Cloud has some sort of responsibility to be the custodian of your data. That is not true. You are responsible for your data in their cloud.

Once again, you are responsible, not the cloud provider. There may be shared responsibilities, but in the end, you are essentially responsible for the data and its classification. When it comes to commercial data classification levels, these are fairly widely used (once again, assuming the commercial organisation actually does this). These are generally the classification levels: sensitive, confidential, private, proprietary, and public. Sensitive means that the data could pose a significant liability to the company. Confidential generally could refer to anything from privacy concerns to a commercial secret; let’s say the company has the recipe for Kentucky Fried Chicken or something else like that, which is essentially going to be confidential or sensitive. Private is generally going to be human resources information. Proprietary covers company-specific requirements that are kept somewhat hidden from public view. Public is information that has been made public. Then, when it gets to the government, one of the areas that you may see on the exam asks you about the data classification levels for the government sector. One of the things that CompTIA does really well is certify a lot of people for the US government.

So they seem to typically address the government sector really well, and from an objective standpoint, it is fair game to ask about classification levels: top secret, secret, confidential, sensitive but unclassified, and unclassified. Top secret generally means that disclosure would cause grave harm to the country. Secret means it has the potential to cause significant damage, and so on down through confidential and sensitive but unclassified, to unclassified, which means there is no real classification in terms of sensitivity. I’ve seen “sensitive but unclassified” again and again from my experience in the government sector and the military. There are also some other types of what I would call sub-classifications, such as “no foreign nationals” (NOFORN), which I would double-check so you have an idea what these are; you may see one question. Classification should clearly define the types of data and their confidentiality. For the exam, make sure you know that classifications not only define the type of data but also its confidentiality. Understand that the sensitivity of data used in government and military services is classified as top secret, secret, confidential, sensitive but unclassified, and unclassified.
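One way to reason about these levels in code is to rank them by sensitivity. The numeric ranks and the “extra controls” threshold below are illustrative, not official:

```python
# Higher rank = more sensitive, using the levels discussed above.
COMMERCIAL = {"public": 0, "proprietary": 1, "private": 2,
              "confidential": 3, "sensitive": 4}
GOVERNMENT = {"unclassified": 0, "sensitive but unclassified": 1,
              "confidential": 2, "secret": 3, "top secret": 4}

def requires_extra_controls(label: str, scheme: dict, threshold: int = 3) -> bool:
    """Hypothetical rule: levels at or above the threshold get stricter
    handling (encryption, restricted access, audit logging)."""
    return scheme[label] >= threshold
```

This is also why you classify before migrating: the rank determines which controls (and which costs) apply to each data set.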

  7. Segmentation of Resources

Micro-segmentation, when it comes to software-defined networking and IT security, is a great example of how you can use some of the benefits of SDN to actually enhance your security policies. Micro-segmentation is a security technique that enables fine-grained security policies to be assigned to data centre apps, down to the workload level. When we talk about micro-segmentation, there are numerous benefits, and for this exam we at least want you to be aware of them. It integrates security into the virtual machine without the need for a hardware-based firewall; this is actually a fairly robust benefit from a security standpoint. You could install a firewall locally on each of the VMs, but traditionally firewalls were deployed in a north-south direction, so basically there was no east-west traffic protection on the corporate network. Micro-segmentation enables that east-west protection that clearly wasn’t there in a lot of data centers, along with synchronized policies.

This enables you to apply policies to all virtual machine firewalls, configurations, and settings, as well as dedicated segments. Another benefit is that you could deploy specific policies in specific segments. This is a big deal in the sense that you could have specific development rules, for example, or production rules, without a lot of the manual work that you used to have to do before. When it comes to micro-segmentation, just be aware that there’s some terminology out there like NFV, which is network functions virtualization. This is typically where a service is deployed on a virtual appliance. A service provider may need to deploy additional DNS instances, NTP, or something else without having to go out and buy additional hardware; they could simply deploy a virtual appliance for that service. SDN is different in that it is typically deployed in enterprises and is intended to provide mobility, micro-segmentation, scalability, and less reliance on hardware. When it comes to software-defined networking, there are some vendors out there that definitely have a lead in this area, like VMware NSX, Cisco ACI, Arista, and OpenFlow.

Those names are well known. Software-defined networking is a technology aimed at making the network as agile and flexible as the server and storage infrastructure of the modern data center. Essentially, this lets you avoid having to rebuild networking configurations when you move VMs; you could just move a VM, and its networking configuration essentially stays with it. This gives you that agility and mobility. There’s a lot more to it than that, but for this specific exam, we just want you to have an idea of what SDN is. When it comes to network functions virtualization, NFV decouples network functions such as firewalling, intrusion detection, DNS, and NAT from proprietary hardware so that they can run in software. It removes the need for a dedicated hardware appliance. So, for example, if an organisation needed to deploy a new data center, what did you have to do? You had to deploy three more DNS servers, some Cisco ASAs, or whatever. Now, you don’t need to do that as much. Again, there are, of course, scalability and performance issues to look at, but this could definitely be a huge benefit. Before micro-segmentation, we had VLANs.

VLANs were really good for defining security granularity, to a certain level, that is. But of course, there are security vulnerabilities that could be exposed: a hacker could compromise a host on a VLAN and then have access to all the other hosts on that VLAN. Traffic within a VLAN is generally unimpeded by firewalls and intrusion detection systems. So once again, if a hacker compromises a host, guess what? He could hop around that whole network unimpeded in most cases, unless you have an east-west firewall or each of your hosts is individually firewalled, which is typically not what you would see in a lot of enterprise data centers. Micro-segmentation is a security technique that enables fine-grained security policies to be assigned to data centre apps, down to the workload level. Remember that micro-segmentation allows you to control that east-west traffic much more easily and also allows for much better policy management, among numerous other benefits. Make sure you understand what micro-segmentation is for this exam; you will likely see it again. Here’s a test tip: there are security concerns with VLANs, especially when an attacker compromises a host in the VLAN.
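A micro-segmentation policy is essentially a deny-by-default rule table evaluated per workload. This sketch uses invented tier tags and ports to show why a compromised host can no longer hop east-west:

```python
# Hypothetical per-workload rules: (source tag, destination tag, port) -> allowed
EAST_WEST_POLICY = {
    ("web", "app", 8443): True,   # web tier may call the app tier
    ("app", "db", 5432): True,    # app tier may query the database
}

def east_west_allowed(src_tag: str, dst_tag: str, port: int) -> bool:
    """Deny by default: lateral traffic passes only if a rule explicitly allows it."""
    return EAST_WEST_POLICY.get((src_tag, dst_tag, port), False)
```

A compromised web VM can still reach the app tier on its allowed port, but any attempt to talk straight to the database is dropped; on a flat VLAN, that same attempt would sail through.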

  8. Compliance Requirements

Let’s go ahead and talk about compliance. Now, I like to compare compliance to an old family saying: it’s the redheaded stepchild of the family. Essentially, it is an area that we know is part of the job and that we must deal with, but no one wants to deal with it. So compliance is one of those areas we need to be aware of. When it comes to compliance, we must understand that it is either the state of being in accordance with established guidelines or specs, or the process of becoming compliant. In other words, compliance is essentially conforming to specific guidelines or specifications. In reality, compliance is somewhat more than that when we consider the broad spectrum of areas that we have to deal with, especially if you’re a US-based company traded on the stock exchange or you’re a healthcare provider in the US; then you have compliance issues you have to deal with. When it comes to types of compliance, we have corporate compliance, we have regulatory compliance, and then we have industry standards. Let’s go ahead and talk about the different standards around compliance. First of all, we have regulatory compliance. Regulatory compliance is an organization’s adherence to laws, regulations, guidelines, et cetera.

If you are not following the guidance, the regulations, et cetera, and are basically not compliant, it could result in financial and even criminal impacts. Noncompliance rarely results in criminal consequences, but financial consequences are pretty common. You see this quite a bit in the financial industry, for example, with SOX or other accounting-related compliance like SOC 1 or SOC 2 reports. When it comes to other regulatory compliance, there could be tons of other areas that your company may need to deal with. We’ll go ahead and talk more about some of the others coming up.

When it comes to corporate compliance, this is something that’s done internally. Your corporation, your company, and your stakeholders should be defining how your organisation is structured from a policy perspective. They should be proactive in the sense that they’re trying to mitigate any kind of liabilities, any kind of violations of the law, et cetera. That is corporate compliance. When it comes to industry compliance, it’s very common for folks to confuse regulatory compliance with industry compliance. Regulatory compliance is typically determined by government bureaucracy, whereas industry compliance is determined more by things like a trade group or an association, a community of users, and so on. That’s generally where industry compliance comes from. For example, PCI is very common, as are HIPAA and ISO; a lot of organisations follow ISO standards.

If you’re in the federal government, you may be looking at FISMA or FedRAMP, for example, as well. When it comes to compliance, a lot of areas really need to be addressed in the appropriate manner. For example, geography could make a big difference in compliance requirements. Essentially, if you have US entities and you also have entities in Europe, Canada, Brazil, Singapore, or wherever, you could have numerous local, geography-based compliance issues to deal with. In Europe, there’s now something called the GDPR; I believe that’s an area of compliance that is really getting a lot of companies stirred up, as we’ve seen lately. There are areas in the United States, particularly around HIPAA, that are of great concern. In Canada, we also have the new privacy concerns that come up around PII data.

PIPEDA is the acronym for that. Once again, just be aware of where your company is located, the user base that you serve, et cetera. Data classification is important to be aware of as well. You don’t want to make everything compliant with something that doesn’t need to be compliant. For example, if only 10% of your data is required to be kept for, let’s say, PCI audits, then that’s what you need to do; you don’t need to do 100% or 30%. That would cost you additional cloud resources. Encryption is a big deal as well. Again, the controls in each of these compliance requirements have specific instructions. As for corporate compliance, I won’t read the whole definition to you again; know what corporate compliance means, and know what industry compliance means. For this exam, you’ll want to make sure that you understand what compliance is at a high level. You don’t need to know what PCI is. You don’t need to know what HIPAA is. You certainly don’t need to know SOX or PIPEDA or anything like that.

  9. Security Automation

Security automation. One of the goals in your cloud environment, especially if you have a hybrid cloud, is to automate whatever you can. You definitely don’t want to be tied up in manual efforts, and one of those areas is security: automating the handling of security operations. Everything from moving log files to analyzing log files to looking at logins, you name it. You need to automate whatever you can. Some of the benefits of automation should be to reduce manual tasks, increase productivity, increase return on investment, and obtain resolution to issues quicker. These are just some of the benefits that automation can provide. It could also provide better insight, faster decision making, and analytics, to name a few more.

When it comes to automation, some other areas to focus on would be monitoring, threat analysis, data insights, incident response, resource permissions like ACLs, business continuity, and disaster recovery. Another area I didn’t put there is compliance as well. One of the things I did want to point out about automation is that it’s going to take some time to get it right. You need to automate but also orchestrate everything. You need to set policies, create procedures, document, and identify what you need to accomplish to build a really robust, automated, security-policy-driven platform for your environment. Security automation is the automatic handling of security operations. As it relates to the exam, you want to know the benefits of automation: gaining insight, reacting to events better, you name it. There are four major benefits; make sure you know what they are.
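As one small example of automating a manual security task, here is a sketch that tallies failed logins per source address from log lines. The log format is invented for illustration:

```python
def count_failed_logins(log_lines):
    """Tally failed logins per source address: the kind of log analysis you
    would script once and then run automatically on every new log batch."""
    counts = {}
    for line in log_lines:
        if "FAILED LOGIN" in line:
            source = line.rsplit("from ", 1)[-1].strip()
            counts[source] = counts.get(source, 0) + 1
    return counts
```

Feed the output into an alerting rule (say, more than ten failures from one address) and you have turned a manual review into an automated incident-response trigger.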

  10. Security Automation Techniques

Security automation techniques. When it comes to automating and orchestrating, there are generally two different ways to do this in the cloud. The first is really focused on orchestration, or an orchestrator. The second is scripting. We’ll talk more about those in a second. Security automation techniques can take many forms, but we’re going to talk mainly about two in this module, because this is what you really want to know for Cloud+. When it comes to these techniques, we have a cloud orchestrator, and you can use scripting. A cloud orchestrator is essentially a program that manages the interconnections and interactions among cloud-based and on-premises business units. When you choose to deploy an EC2 instance in AWS, it will be provisioned by a cloud orchestrator: essentially, you as the user request the VM, and the orchestrator does everything in the background for you. With orchestration, essentially, it provides a single pane of glass.

It is essentially a portal to deploy resources, and it puts together all the workflows so that automation can happen. One example is Ansible; this is a common tool that is used with AWS. Scripting is also used; for example, with the Google Cloud Platform, Cloud Deployment Manager is commonly used to deploy custom virtual machines using templates written in formats such as YAML or JSON. So make sure you understand what an orchestrator is: it is what manages the interconnections and interactions among the cloud-based business units. One of the areas I always want to point out is how common it is for people to get confused between automation and orchestration. Just remember that orchestration is essentially what’s going to manage the automation, where automation is usually a single task. So, think of orchestration as the automation of many tasks, essentially in a workflow. And make sure you understand what scripting is used for; generally, it’s YAML or JSON. That’s pretty actively used now, not exclusively, but generally, that’s what you’d see in a cloud infrastructure.
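To show what “scripting a deployment” looks like, here is a sketch that builds a JSON resource definition of the kind a deployment manager consumes. The field names mirror the general shape of such templates but should be treated as illustrative, not an exact vendor schema:

```python
import json

def build_vm_template(name: str, machine_type: str, zone: str) -> str:
    """Emit a JSON deployment template describing one virtual machine."""
    resource = {
        "resources": [{
            "name": name,
            "type": "compute.v1.instance",   # illustrative resource type string
            "properties": {"machineType": machine_type, "zone": zone},
        }]
    }
    return json.dumps(resource, indent=2)
```

A single definition like this is the “automation of one task” piece; an orchestrator takes many such definitions and sequences them into a workflow, which is the distinction to remember for the exam.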