PL-300 Microsoft Practice Test Questions and Exam Dumps


Question No 1:

You are managing a project management application that is fully integrated and hosted within Microsoft Teams. The application was developed using Microsoft Power Apps.
You have been tasked with creating a Power BI report that directly connects to the project's data.

Which data connector should you use to connect Power BI to the project management app's underlying data?

A. Microsoft Teams Personal Analytics
B. SQL Server database
C. Dataverse
D. Dataflows

Answer:

C. Dataverse

Explanation:

Microsoft Power Apps applications, particularly those integrated within Microsoft Teams, typically store their data in Microsoft Dataverse or Dataverse for Teams. Dataverse is a secure, scalable, and cloud-based storage platform for managing data used by business applications. It acts as the database behind many Power Platform solutions, including Power Apps.

Here’s why Dataverse is the correct choice:

  • Integration with Power Apps: When an app is developed using Power Apps, especially in Teams, it stores its tables and data in Dataverse.

  • Native Support in Power BI: Power BI provides a native connector to Dataverse, allowing you to connect directly to the tables and create reports based on that data.

  • Security and Access Control: Dataverse includes built-in security features that manage data access, ensuring compliance and secure reporting.

  • Minimized Complexity: Using the Dataverse connector is straightforward and minimizes the complexity of building reports compared to setting up external SQL Server databases or building dataflows.
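
For illustration, a minimal Power Query (M) sketch of a Dataverse connection, assuming the legacy Common Data Service connector function (CommonDataService.Database) and a hypothetical environment URL; the exact connector entry point and navigation steps depend on your environment:

    let
        // Hypothetical Dataverse environment URL; replace with your organization's environment
        Source = CommonDataService.Database("contoso.crm.dynamics.com")
    in
        // Source is a navigation table listing the Dataverse tables available to report on
        Source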

Why the other options are incorrect:

  • Microsoft Teams Personal Analytics (A): This connector is used for personal Teams usage statistics, not app-specific data.

  • SQL Server database (B): Unless the app was explicitly designed to store data in an external SQL Server, this would not apply.

  • Dataflows (D): Dataflows are used for data preparation and transformation but are not the direct database source of Power Apps apps.

Thus, selecting C. Dataverse is the correct choice for building your Power BI report efficiently.

Question No 2:

You have published a Power BI report for your company's sales department. This report imports its data from a Microsoft Excel file that is stored inside a Microsoft SharePoint folder. The data model used in the report already contains several calculated measures and transformations.
Now, you are required to create a new Power BI report based on the existing dataset while minimizing development effort.

Which type of data source should you use for the new report?

A. Power BI dataset
B. SharePoint folder
C. Power BI dataflows
D. Excel workbook

Answer: A. Power BI dataset

Explanation:

In Power BI, reusing existing datasets is highly recommended to avoid duplicating work, especially when the existing dataset already contains:

  • Relationships

  • Measures

  • Calculated columns

  • Business logic

By connecting to a Power BI dataset, you can reuse all the previously built modeling work without needing to rebuild the data model, thus saving significant development effort.

Here’s why Power BI dataset is the best choice:

  • Reuse Existing Models: Directly connect to an existing dataset and create new reports without reimporting or restructuring the underlying data.

  • Consistency: Using the existing dataset ensures that the same measures, calculations, and KPIs are consistently used across all reports.

  • Efficiency: Minimizes the time and effort needed to build new reports since you start with a fully modeled and structured dataset.

  • Performance Optimization: Any tuning and transformations already applied to the original model carry over, so that work is not repeated.

Why the other options are incorrect:

  • SharePoint folder (B): Connecting directly to the SharePoint folder would require rebuilding the model, creating measures, and applying transformations again.

  • Power BI dataflows (C): Dataflows prepare and standardize data but still require a model to be built in Power BI Desktop.

  • Excel workbook (D): Connecting directly to the Excel file means starting fresh with data modeling, which is more time-consuming.

Thus, A. Power BI dataset is the best choice to minimize development effort while maintaining consistency.

Question No 3:

You are working in Power Query and have imported two separate Microsoft Excel tables named Customer and Address.

The structure of each table is as follows:

  • Customer Table:

    • Customer ID

    • Customer Name

    • Phone

    • Email Address

    • Address ID

  • Address Table:

    • Address ID

    • Address Line 1

    • Address Line 2

    • City

    • State/Region

    • Country

    • Postal Code

In this setup:

  • Each Customer ID uniquely identifies a customer in the Customer table.

  • Each Address ID uniquely identifies an address in the Address table.

Your goal is to create a query where each row represents one customer, and for each customer, you need to display their City, State/Region, and Country alongside their other customer information.

Which action should you take in Power Query to achieve this result?

A. Merge the Customer and Address tables.
B. Group the Customer and Address tables by the Address ID column.
C. Transpose the Customer and Address tables.
D. Append the Customer and Address tables.

Answer: A. Merge the Customer and Address tables

Explanation:

The correct approach is to merge the Customer and Address tables.

Merging in Power Query is similar to performing a JOIN operation in SQL. It allows you to combine rows from two tables based on a related column. In this case:

  • The Address ID is the common key between the two tables.

  • By merging on the Address ID, you can bring in address-related information (City, State/Region, and Country) for each customer.

Here’s how the merge works:

  1. In Power Query, select the Customer table.

  2. Choose the Merge Queries option.

  3. Select the Address table as the second table.

  4. Use Address ID from both tables as the matching column.

  5. After merging, expand the fields you want (City, State/Region, and Country) into your Customer table.
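
A minimal Power Query (M) sketch of these steps, assuming the two queries are named Customer and Address exactly as above:

    let
        // Join each customer to its address row on the shared Address ID key (left outer keeps every customer)
        Merged = Table.NestedJoin(Customer, {"Address ID"}, Address, {"Address ID"}, "Address", JoinKind.LeftOuter),
        // Expand only the address columns required by the report
        Expanded = Table.ExpandTableColumn(Merged, "Address", {"City", "State/Region", "Country"})
    in
        Expanded

Because each Address ID is unique in the Address table, the merged result still contains exactly one row per customer.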

Why the other options are incorrect:

  • Group (B): Grouping organizes data based on shared values and typically aggregates it, which isn't needed here. We want individual customer records, not grouped summaries.

  • Transpose (C): Transposing would flip rows and columns, which is not the desired outcome.

  • Append (D): Appending combines rows from two tables with the same structure (e.g., stacking tables), not joining related information based on a common field.

Thus, merging is the only method that effectively combines customer data with their related address data into one table with one row per customer.

By using Merge, you will meet the requirement: one row per customer with their City, State/Region, and Country correctly associated.

Question No 4:

You are working with a Microsoft SharePoint Online site that contains multiple document libraries. One specific document library stores manufacturing reports, all of which are saved as Microsoft Excel files.
All the manufacturing report files have the same data structure (i.e., identical columns and format). Your goal is to use Power BI Desktop to load only the manufacturing reports into a table for further analysis.

Which approach should you take to correctly import and filter the required files?

A. Use Get data from a SharePoint folder, enter the site URL, select Transform, and filter the data based on the folder path corresponding to the manufacturing reports library.
B. Use Get data from a SharePoint list, enter the site URL, select Combine & Transform, and filter based on the folder path of the manufacturing reports library.
C. Use Get data from a SharePoint folder, enter the site URL, and then select Combine & Load directly.
D. Use Get data from a SharePoint list, enter the site URL, and then select Combine & Load directly.

Answer:

A. Get data from a SharePoint folder, enter the site URL, select Transform, then filter by the folder path to the manufacturing reports library.

Explanation:

The correct solution is to use the SharePoint folder connector, transform the data, and filter to only include files from the manufacturing reports library.

When you connect to a SharePoint folder in Power BI Desktop:

  • You enter the site URL (not the library URL).

  • Power BI retrieves a list of all files across all document libraries within that site.

  • This includes all kinds of files — Excel files, Word documents, PDFs — across different folders and libraries.

Since the site contains multiple libraries (not just manufacturing reports), you must filter the file list based on the folder path or directory name that points specifically to the manufacturing reports.

After filtering:

  • You can combine (i.e., consolidate) the Excel files into a single table because all the files have the same data structure.

  • Then, you can load the data into Power BI for further analysis.
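
A minimal Power Query (M) sketch of this approach, assuming a hypothetical site URL and library folder name:

    let
        // Site URL only; SharePoint.Files lists every file in every library on the site
        Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Manufacturing", [ApiVersion = 15]),
        // Keep only files whose folder path points to the manufacturing reports library (library name is an assumption)
        ReportsOnly = Table.SelectRows(Source, each Text.Contains([Folder Path], "Manufacturing Reports")),
        // Keep Excel files and read each workbook; identical structures allow the tables to be combined
        ExcelFiles = Table.SelectRows(ReportsOnly, each [Extension] = ".xlsx"),
        Workbooks = Table.AddColumn(ExcelFiles, "Data", each Excel.Workbook([Content], true))
    in
        Workbooks

From here, the built-in Combine Files experience can expand the Data column into a single consolidated table.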

Why the other options are incorrect:

  • Options B and D (SharePoint list): SharePoint lists are not meant for file storage; they store structured items such as tasks, contacts, or issues. You cannot reliably extract Excel files from a list connection.

  • Option C (Combine & Load immediately): Using Combine & Load directly without transforming and filtering would load all files from the entire SharePoint site, not just the manufacturing reports.

Thus, Option A is the correct approach: connect to the SharePoint folder with the site URL, select Transform, filter by the folder path of the manufacturing reports library, and then combine and load the Excel files.

Question No 5:

You have a CSV file containing user complaints data.
One of the columns in the file is named Logged, which records the date and time of each complaint.
The format of the Logged column values is as follows:
2018-12-31 at 08:59

You need to prepare the data so that you can analyze complaints based on the date and take advantage of Power BI’s built-in date hierarchy (Year, Quarter, Month, Day).

What action should you take?

Options:

A. Apply a transformation to extract the last 11 characters of the Logged column and set the data type of the new column to Date.
B. Change the data type of the Logged column to Date.
C. Split the Logged column using "at" as the delimiter.
D. Apply a transformation to extract the first 11 characters of the Logged column.

Answer:

D. Apply a transformation to extract the first 11 characters of the Logged column.

Explanation:

The Logged column contains both a date and a time, but they are separated by the word "at" rather than a typical space or time delimiter.
The goal is to extract only the date so you can use it for time-based analysis in Power BI.

  • Option D is correct because:

    • The first 11 characters ("2018-12-31" followed by a space) contain the date portion of the field.

    • By extracting these first 11 characters, you isolate the date.

    • After extraction, you can set the data type of the resulting column to Date.

    • Once the column is properly typed as Date, Power BI automatically creates a date hierarchy, allowing you to easily slice and filter by Year, Quarter, Month, and Day.

  • Why not the others?

    • Option A: Extracting the last 11 characters would pull the end of the string ("31 at 08:59"), which includes the time and the word "at", not a usable date.

    • Option B: Simply changing the data type without first cleaning the value would fail, because the word "at" makes it an invalid date format.

    • Option C: Splitting on "at" is possible, but more work than just extracting the date directly.

Thus, the quickest and cleanest method is to extract the first 11 characters to isolate the date part of the Logged field. 
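
A minimal Power Query (M) sketch of this transformation, assuming the source query is named Complaints (a hypothetical name):

    let
        // Take the leading characters of Logged (e.g. "2018-12-31 ") and trim the trailing space
        WithDate = Table.AddColumn(Complaints, "Logged Date", each Text.Trim(Text.Start([Logged], 11))),
        // Typing the new column as Date enables the built-in Year/Quarter/Month/Day hierarchy
        Typed = Table.TransformColumnTypes(WithDate, {{"Logged Date", type date}})
    in
        Typed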

Question No 6:

You are developing a Power BI report that sources data from an Azure SQL Database named erp1.
You have imported the following tables into Power BI:

  • Orders

  • Order Line Items

  • Products

You are required to perform the following types of analyses:

  • Analyze orders sold over time, including a measure for the total order value.

  • Analyze orders by attributes of the products sold, such as category or brand.

Additionally, your solution must aim to minimize update times when users interact with visuals in the report (i.e., improve performance during filtering and slicing).

What action should you perform first to meet these requirements?

Options:

A. In Power Query, merge the Order Line Items query with the Products query.
B. Create a calculated column to add product category information to the Orders table using a DAX function.
C. Calculate the count of orders per product by using a DAX function.
D. In Power Query, merge the Orders query with the Order Line Items query.

Answer:

A. In Power Query, merge the Order Line Items query and the Products query.

Explanation:

To efficiently analyze orders over time and orders by product attributes while minimizing update times:

  • Order Line Items contain details such as products sold per order and quantities.

  • Products contain product-specific attributes like category, brand, etc.

In Power BI, merging these two tables at the Power Query stage (before loading data into the model) helps:

  • Combine product attributes (from Products) with each line item (from Order Line Items).

  • Avoid runtime lookups when users interact with visuals, significantly improving performance.

  • Precompute the join that would otherwise have to be resolved through relationships or DAX lookups each time users interact with the report.

By merging in Power Query, you create a single, flattened table that the visuals can query more efficiently.
This approach reduces the model complexity and speeds up the filtering and aggregations at report runtime.
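
A minimal Power Query (M) sketch of this first step, assuming the shared key and product attribute columns are named ProductID, Category, and Brand (hypothetical names):

    let
        // Join each order line item to its product row on the shared product key
        Merged = Table.NestedJoin(#"Order Line Items", {"ProductID"}, Products, {"ProductID"}, "Product", JoinKind.LeftOuter),
        // Expand only the product attributes needed for the analysis
        Expanded = Table.ExpandTableColumn(Merged, "Product", {"Category", "Brand"})
    in
        Expanded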

  • Option B and Option C suggest using DAX, which would introduce additional runtime computations, slowing down visual interactions.

  • Option D suggests merging Orders with Order Line Items, but that merge does not bring in the product attributes (such as category or brand) needed to analyze orders by product properties.

Thus, merging Order Line Items with Products is the first and best step to achieve both analytical goals and performance optimization. 
