Crack the DP-700 in 2025: A Step-by-Step Guide for Aspiring Data Engineers

Understanding the DP-700 Microsoft Fabric Data Engineering Certification

Introduction to DP-700 Certification

The DP-700 certification exam, officially titled Implementing Data Engineering Solutions Using Microsoft Fabric, is a professional-level exam offered by Microsoft that leads to the Microsoft Certified: Fabric Data Engineer Associate credential. It is designed to validate an individual’s proficiency in using Microsoft Fabric to create, manage, and optimize data engineering solutions. Microsoft Fabric is a unified data platform that incorporates tools for the entire data lifecycle, from ingestion to visualization, and integrates AI features to improve efficiency and insights.

The purpose of this certification is to ensure that professionals can effectively work with data workflows using Microsoft Fabric’s various tools and services. It is part of Microsoft’s broader initiative to promote cloud-based, AI-driven data solutions in enterprise environments. This certification has become particularly valuable as businesses move toward cloud-native data engineering infrastructures.

What Is Microsoft Fabric?

Microsoft Fabric is a comprehensive analytics and data management platform. It brings together various technologies under a unified framework to manage data across its lifecycle. From a data engineering perspective, Microsoft Fabric provides tools to:

  • Ingest data from diverse sources

  • Perform data transformations and cleansing

  • Store data using modern storage models like lakehouses and warehouses

  • Analyze data using queries, AI models, and statistical tools

  • Visualize data insights using dashboards and reports

The platform supports both code-first and low-code approaches, allowing flexibility for developers and data engineers of varying expertise levels. It also includes features for collaboration, version control, security, and deployment, making it suitable for enterprise-grade data workflows.

Objectives of the DP-700 Certification

The DP-700 certification assesses the skills required to work with data engineering tools and processes within Microsoft Fabric. The main objectives are to test the candidate’s ability to:

  • Implement and manage data pipelines and analytics solutions

  • Ingest and transform data using multiple tools within the platform

  • Monitor, troubleshoot, and optimize data workflows for performance and accuracy

  • Leverage different types of data stores depending on the use case

  • Use real-time and batch processing techniques to meet specific business needs

Professionals who pass this certification are expected to contribute effectively to data-driven decision-making processes within organizations by building reliable, scalable, and secure data pipelines.

Benefits of the Certification

There are several benefits to earning the DP-700 Microsoft Fabric Data Engineering Certification, especially in today’s data-centric job market.

Career Advancement

Certified professionals often gain access to better job opportunities, including roles such as data engineer, analytics engineer, business intelligence developer, and data platform architect. The certification helps differentiate candidates in competitive job markets and adds value to their professional profiles.

Validation of Skills

Earning this certification proves that the individual has hands-on experience and theoretical understanding of modern data engineering principles using Microsoft Fabric. It serves as a recognized credential for employers and clients.

Increased Productivity

Certified professionals can utilize Microsoft Fabric more efficiently, leading to optimized workflows, better collaboration across teams, and faster implementation of data engineering solutions. The skills gained during the certification process also contribute to a deeper understanding of how to troubleshoot, secure, and maintain data systems.

Preparation for Future Learning

The DP-700 certification lays the groundwork for other advanced certifications in data science, machine learning, or cloud architecture. It also aligns well with other Microsoft learning paths, allowing professionals to diversify their expertise.

Prerequisites and Ideal Candidates

While there are no formal prerequisites for taking the DP-700 exam, it is recommended that candidates have some foundational experience in data engineering or related fields. Familiarity with cloud environments, SQL, data transformation, and analytics concepts is advantageous.

Ideal candidates for the certification include:

  • Data engineers seeking to validate their skills in Microsoft Fabric

  • Business intelligence developers looking to upgrade their technical capabilities

  • ETL developers and database administrators transitioning into data engineering roles

  • IT professionals who manage data workflows or support analytics operations

  • Students or entry-level professionals aiming to build a career in data engineering

Candidates should be comfortable with both visual and code-first tools and have an interest in working on end-to-end data solutions.

Format and Structure of the DP-700 Exam

The DP-700 exam typically includes 50 to 60 multiple-choice and scenario-based questions. Among these, there is usually one detailed case study that includes approximately 10 interrelated questions. This format ensures that the candidate not only understands theoretical concepts but also knows how to apply them in practical situations.

Exam Structure Overview

  • Number of Questions: 50 to 60

  • Case Study: Yes, includes one scenario with around 10 questions

  • Time Allotted: 100 minutes

  • Passing Score: 700 out of 1000

  • Certification Validity: 12 months

  • Cost: Approximately $165

The questions span across three main functional areas: implementing and managing analytics solutions, ingesting and transforming data, and monitoring and optimizing performance.

Role of AI and Cloud in Microsoft Fabric

One of the standout features of Microsoft Fabric is its built-in support for AI and cloud integration. The platform is hosted on Microsoft Azure, allowing it to scale dynamically based on workloads and integrate seamlessly with other Azure services like Synapse, Azure ML, and Power BI.

Fabric also employs AI to optimize pipeline execution, automate error detection, and recommend performance improvements. This integration makes the platform especially powerful for organizations looking to deploy advanced analytics solutions with minimal manual effort.

Data engineers using Microsoft Fabric can build real-time pipelines, apply machine learning models, and analyze results within a unified ecosystem. This reduces complexity, shortens project timelines, and improves overall efficiency.

Use Cases of Microsoft Fabric in the Industry

Microsoft Fabric is widely used across different industries due to its flexibility and scalability. Here are some example use cases:

Financial Services

Banks and insurance companies use Microsoft Fabric to ingest transaction data, apply fraud detection models, and visualize financial risks in real time.

Healthcare

Healthcare providers leverage the platform to process patient records, optimize treatment plans using machine learning, and ensure compliance with data privacy regulations.

Retail

Retailers use Fabric to manage customer data, analyze purchasing patterns, and optimize supply chains by integrating IoT and inventory data in real time.

Manufacturing

Manufacturers implement Fabric solutions for predictive maintenance, real-time equipment monitoring, and quality control analysis using streaming data.

These use cases demonstrate the platform’s versatility and make the DP-700 certification valuable for professionals in a wide range of industries.

The DP-700: Microsoft Fabric Data Engineering Certification is a key credential for professionals working in or aspiring to enter the field of data engineering. It validates the ability to implement real-world data workflows using a combination of Microsoft Fabric tools. The certification is particularly relevant in today’s job market, where cloud-based, AI-powered analytics solutions are becoming the standard.

Professionals who earn this certification are well-positioned to support complex data projects across industries. The certification not only boosts career potential but also equips candidates with the practical knowledge and confidence to design and operate scalable, secure, and intelligent data systems.

Core Learning Areas and Essential Tools in the DP-700 Exam

Overview of the Exam Focus

The DP-700 exam is structured around assessing practical capabilities in designing, building, monitoring, and optimizing data engineering solutions using Microsoft Fabric. While Microsoft Fabric includes a variety of services, the exam primarily focuses on the most impactful areas that reflect real-world data engineering responsibilities.

The exam structure includes approximately 50–60 questions, which are grouped into three core domains. Each of these domains tests critical skill sets that a Microsoft Fabric data engineer is expected to have.

Core Learning Area 1: Implement and Manage an Analytics Solution

This section evaluates the candidate’s proficiency in setting up and managing a complete analytics ecosystem within Microsoft Fabric. Candidates must understand not just how to configure components but also how to ensure their effective operation, security, and lifecycle management.

Key Concepts and Tasks

  • Workspace Settings: Understanding how to create, configure, and manage workspaces is crucial. This includes roles, permissions, linking data connections, and assigning users.

  • Version Control: You will need to know how to implement version control using Git repositories and how version history impacts pipeline and notebook development.

  • Deployment Practices: Deploying resources between environments (dev, test, production) using automation techniques or deployment pipelines is a tested skill.

  • Access Control: You will be expected to manage data access using Microsoft Entra ID, assign appropriate workspace roles, and ensure role-based access control to sensitive data.

  • Data Governance: Skills in tagging, classification, data cataloging, and policy enforcement are part of governance.

  • Orchestration: You need to understand how to coordinate workflows using triggers, dependencies, and execution schedules for automated solutions.

These tasks represent the backbone of any production-level analytics solution, and mastering them is essential for real-world success.

Core Learning Area 2: Ingest and Transform Data

This section tests knowledge of the tools and methods used to collect, clean, shape, and load data from various sources into Microsoft Fabric. It covers both batch and streaming ingestion patterns.

Key Concepts and Tasks

  • Data Ingestion: You must demonstrate how to bring in data from databases, cloud services, on-premises sources, APIs, or file systems using pipelines or event-based tools.

  • Data Loading Techniques: Understand the difference between full loading and incremental loading. You’ll be expected to use watermarking, change tracking, and timestamp-based methods for efficient loading.

  • Data Stores: A clear understanding of when to use Lakehouse, Warehouse, or Eventhouse is required. You’ll need to decide the best fit for storing structured, semi-structured, or real-time data.

  • Data Transformation: Using both low-code (Dataflow Gen2) and code-first (Notebook) tools, you’ll need to clean, filter, enrich, and reshape datasets for analytics.

  • Streaming Data: You should be familiar with processing data in motion using Eventstream, including schema inference, transformations, and output routing.

Data ingestion and transformation make up a large portion of the real-world tasks handled by data engineers, which is why this is one of the most emphasized domains in the exam.
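The watermark-based incremental loading mentioned above can be sketched in plain Python. This is only an illustration of the pattern (the row data, field names, and function are invented here; in Fabric you would typically implement it inside a pipeline or notebook): each run loads only rows modified after the last recorded watermark, then advances the watermark.

```python
from datetime import datetime

# Hypothetical source table; "modified" plays the role of the watermark column.
source_rows = [
    {"id": 1, "modified": datetime(2025, 1, 1)},
    {"id": 2, "modified": datetime(2025, 1, 5)},
    {"id": 3, "modified": datetime(2025, 1, 9)},
]

def incremental_load(rows, watermark):
    """Return rows changed since the watermark, plus the new watermark."""
    new_rows = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

# First run: watermark starts at the minimum, so every row loads.
loaded, wm = incremental_load(source_rows, datetime.min)

# Second run: no rows changed, so nothing loads and the watermark holds.
delta, wm2 = incremental_load(source_rows, wm)
```

The same idea underlies change tracking and timestamp-based loading: the only state carried between runs is the high-water mark.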

Core Learning Area 3: Monitor and Optimize an Analytics Solution

This domain focuses on ensuring that solutions built on Microsoft Fabric operate reliably, efficiently, and can scale when necessary. It also includes troubleshooting and error management skills.

Key Concepts and Tasks

  • Monitoring of Fabric Items: You need to know how to monitor pipelines, notebooks, lakehouses, and event streams. This includes setting up monitoring dashboards and alerts.

  • Error Handling: Managing errors in data flows, handling failures in pipelines, and implementing retry logic are crucial exam topics.

  • Performance Optimization: You’ll be expected to optimize queries, manage partitioning, understand data formats, and reduce memory or compute usage across data stores and transformations.

This domain tests your ability to ensure system health, optimize performance for cost and speed, and resolve problems before they impact business operations.
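The retry logic named in the error-handling bullet can be illustrated with a generic exponential-backoff sketch. Fabric pipeline activities expose retry count and retry interval as settings rather than code, so this stand-alone function is only an assumption-level model of the behavior:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Run a task, retrying with exponential backoff on failure.

    A generic sketch of pipeline-style retry behavior; the function
    name and parameters are invented for this example.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the error
            time.sleep(base_delay * 2 ** (attempt - 1))  # 1x, 2x, 4x, ...

# Simulated flaky activity that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky)
```

Backoff matters because immediate retries against a struggling source tend to amplify the original failure.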

Types of Questions

In each of these three domains, the exam presents a mix of multiple-choice, case study, and scenario-based questions. The case study section, typically comprising around 10 questions, presents a detailed business scenario requiring candidates to recommend or implement the correct technical solutions based on best practices.

Essential Microsoft Fabric Tools to Know for the Exam

To pass the DP-700 exam, you must be proficient with a range of tools offered within the Microsoft Fabric ecosystem. These tools are central to most tasks a data engineer will perform and are heavily featured throughout the exam questions.

Category 1: Data Movement and Transformation Tools

These tools allow for orchestrating data workflows, transforming datasets, and building data pipelines using both visual and coding methods.

Data Pipeline

This tool is a visual orchestration engine for building ETL workflows. It allows the integration of various data sources, movement of data between systems, and scheduling of workflows.

  • Used for both batch and streaming operations

  • Supports triggers, conditional paths, and error handling

  • Integrates easily with Lakehouse, Warehouse, and Dataflow Gen2

Dataflow Gen2

This is a visual, low-code data transformation tool designed to prepare data for analytics. It supports filtering, joining, splitting, grouping, and aggregating data.

  • Works well for users familiar with Power Query or Excel

  • Good for building repeatable transformations without coding

  • Easily connects to multiple data sources

Notebook

Notebooks in Microsoft Fabric provide a flexible, code-first interface that supports languages like Python, R, and Scala. These are used for complex transformations, data exploration, machine learning, and scripting.

  • Ideal for data scientists and engineers who prefer code

  • Excellent for integrating AI/ML workflows into pipelines

  • Supports interactive visualizations and Markdown-based documentation
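A notebook transformation step often boils down to filter, cast, and aggregate. The toy records below are invented for illustration, and a real Fabric notebook would more likely use PySpark or pandas, but the shape of the logic is the same:

```python
from collections import defaultdict

# Hypothetical raw records with one bad row to demonstrate cleansing.
raw = [
    {"region": "west", "amount": "100"},
    {"region": "west", "amount": "50"},
    {"region": "east", "amount": None},   # bad record: dropped below
    {"region": "east", "amount": "75"},
]

# Filter out nulls, then cast and aggregate per region.
clean = [r for r in raw if r["amount"] is not None]
totals = defaultdict(int)
for r in clean:
    totals[r["region"]] += int(r["amount"])
```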

Eventstream

This tool is designed for real-time data ingestion and processing. It allows the capture, transformation, and routing of streaming data from events, sensors, and transactional systems.

  • Handles high-velocity data streams

  • Can output to dashboards or storage layers

  • Useful for use cases like fraud detection, telemetry, or instant alerts
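The capture → transform → route pattern that Eventstream implements can be sketched in a few lines of Python. The event fields, threshold, and destinations here are invented for illustration; in Fabric the routing would be configured visually in the Eventstream item:

```python
# Simulated incoming events from sensors.
events = [
    {"sensor": "a", "temp": 20},
    {"sensor": "b", "temp": 95},
    {"sensor": "a", "temp": 97},
]

storage, alerts = [], []
for event in events:
    # Transform: enrich each event with a severity tag.
    enriched = {**event, "severity": "high" if event["temp"] > 90 else "normal"}
    storage.append(enriched)          # route: every event lands in storage
    if enriched["severity"] == "high":
        alerts.append(enriched)       # route: hot readings also go to alerts
```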

Category 2: Data Storage Tools

These tools are used to store and manage data in Microsoft Fabric for both structured and unstructured formats. Each one is optimized for specific types of workloads.

Lakehouse

Lakehouse combines the advantages of data lakes and data warehouses. It allows for storing raw, semi-structured, and structured data in one place with support for SQL querying.

  • Supports Parquet, Delta Lake, and open file formats

  • Used for both data exploration and analytical querying

  • Integrates with all Fabric tools for easy ingestion and processing

Warehouse

This is a traditional SQL-based data storage option within Microsoft Fabric. It’s used for structured data where performance, indexing, and query optimization are critical.

  • Supports T-SQL

  • Optimized for reporting and analytical queries

  • Works seamlessly with Dataflow Gen2 and Pipelines

Eventhouse

Eventhouse is a specialized storage component for managing real-time streaming data. It retains incoming events and allows querying in near real-time.

  • Ideal for streaming applications

  • Stores large volumes of time-series or event-driven data

  • Often used in combination with Eventstream for ingestion and analysis

The DP-700 Microsoft Fabric Data Engineering Certification centers around three main functional areas: implementing analytics solutions, ingesting and transforming data, and monitoring and optimizing performance. Within these areas, the exam focuses on real-world scenarios and practical implementations using a set of key tools.

Mastering tools like Data Pipeline, Dataflow Gen2, Notebooks, Eventstream, Lakehouse, Warehouse, and Eventhouse is critical to passing the exam. These tools enable the full range of tasks a data engineer would perform on Microsoft Fabric, from ingesting and preparing data to deploying and maintaining complex data systems.

By focusing on the core learning areas and dedicating time to hands-on practice with these tools, candidates will be well-prepared to take on the DP-700 exam and apply their skills in modern data engineering roles.

Who Should Take the DP-700 Microsoft Fabric Data Engineering Certification?

Introduction

Microsoft’s DP-700 certification is not just for traditional data engineers. It is a valuable credential for a wide range of IT professionals who are interested in working with cloud-based data platforms, building scalable data pipelines, and enabling analytics across their organizations. Whether you’re already involved in data workflows or looking to transition into the data engineering field, this certification offers a structured path to enhance your knowledge and improve your career prospects.

Understanding who this certification is best suited for can help you evaluate whether it aligns with your career goals, current role, or technical interests.

Career Paths That Benefit from DP-700

There are several roles in technology and data-driven industries that benefit significantly from the DP-700 certification. Below is a breakdown of the professionals who should consider this certification and how it can enhance their skills and responsibilities.

Business Intelligence (BI) Developers

BI Developers are responsible for creating dashboards, reports, and other visualization tools that help stakeholders make informed business decisions. Although BI roles traditionally focus more on reporting, modern BI tools are increasingly integrated with data engineering processes.

By taking the DP-700 certification, BI developers can:

  • Learn to build and manage robust data pipelines using Fabric

  • Work directly with data stores like Lakehouse and Warehouse to prepare data

  • Ingest and clean data before it enters visualization tools

  • Enhance collaboration with data engineers and data scientists

This certification allows BI professionals to gain more control over their data sources and reduce dependency on separate data engineering teams.

ETL Developers

ETL (Extract, Transform, Load) developers are already familiar with data ingestion and transformation. However, many ETL processes are being modernized with cloud-native tools. Microsoft Fabric offers new low-code and code-first tools for these tasks.

The DP-700 certification enables ETL developers to:

  • Transition from traditional ETL tools to Fabric’s Data Pipelines and Dataflow Gen2

  • Incorporate real-time data processing using Eventstream

  • Work within a scalable, cloud-native architecture

  • Improve performance, monitoring, and reliability of data workflows

This certification is a natural progression for ETL developers who want to stay current with modern data stack technologies.

Database Administrators (DBAs)

Database administrators manage data storage systems, ensure data integrity, and monitor performance. As organizations adopt cloud-first strategies, DBAs are expected to understand data engineering workflows and analytics infrastructure.

By completing the DP-700 certification, DBAs can:

  • Learn how to integrate traditional database systems into cloud-based workflows

  • Manage Lakehouse and Warehouse components within Microsoft Fabric

  • Participate in data pipeline design and transformation logic

  • Optimize query performance and storage strategy

The certification expands a DBA’s role into cloud data management and makes them more versatile in hybrid environments.

Quality Assurance (QA) Engineers

QA professionals involved in data-centric projects often need to validate data accuracy, monitor workflows, and ensure end-to-end reliability of data systems.

Through this certification, QA engineers can:

  • Learn how to monitor Fabric pipelines and event streams

  • Implement automated validation steps using notebooks

  • Understand orchestration workflows to test them more effectively

  • Work closely with data engineers to improve data system reliability

It is especially useful for QA engineers who work in data migration, analytics, or business intelligence projects.

AI and Machine Learning Professionals

Data scientists and machine learning engineers often work with large volumes of structured and unstructured data. Microsoft Fabric provides tools like Notebooks, which support Python, R, and Scala, making it an ideal environment for both data preprocessing and model training.

This certification benefits AI professionals by helping them:

  • Set up data ingestion pipelines that feed machine learning models

  • Use Notebooks to clean and analyze data before modeling

  • Store large datasets in optimized formats for training

  • Deploy AI workflows as part of a scalable, automated solution

Understanding the underlying data infrastructure allows AI professionals to reduce latency, improve efficiency, and deploy models more reliably.

IT Students and Entry-Level Professionals

Students and recent graduates looking to enter the data field can use the DP-700 certification as a strong entry point. Since the certification doesn’t have strict prerequisites, it provides a great opportunity to gain foundational knowledge in data engineering using Microsoft Fabric.

For students, the benefits include:

  • Building hands-on experience with real-world data tools

  • Gaining credibility in job applications and interviews

  • Learning core concepts that apply to multiple roles in data and analytics

  • Becoming familiar with low-code and code-first development environments

It also provides a structured learning path for those who are unsure of which specific data role they want to pursue.

Data Warehouse Developers and Architects

Professionals responsible for building and managing data warehouses are increasingly moving toward architectures that combine the best features of data lakes and warehouses. Microsoft Fabric supports this transition through Lakehouse and Warehouse tools.

For data warehouse specialists, the certification helps to:

  • Understand how to integrate Lakehouse and Warehouse in Fabric

  • Design modern storage solutions using T-SQL and low-code tools

  • Support real-time data processing needs alongside historical data models

  • Plan architecture that aligns with business requirements for scale and speed

This knowledge is essential for architects looking to modernize legacy systems or design greenfield cloud-native platforms.

Aspiring Data Engineers

For those aiming to specialize in data engineering, this certification provides one of the most targeted learning experiences available. Microsoft Fabric offers a comprehensive environment for mastering ingestion, transformation, storage, and analysis workflows.

Aspiring data engineers will benefit by:

  • Learning all the tools and skills needed to start building production-ready pipelines

  • Practicing real-world tasks in a cloud-native platform

  • Acquiring knowledge applicable across industries and use cases

  • Becoming job-ready with both theory and practical implementation knowledge

The certification positions new entrants well for junior roles and internships in data teams.

Cross-Industry Relevance

One of the most compelling reasons to pursue the DP-700 certification is its broad relevance across industries. Organizations in healthcare, finance, education, logistics, manufacturing, and e-commerce all rely on data for operational efficiency and competitive advantage. Microsoft Fabric is designed to serve these needs with its scalable architecture, real-time processing capabilities, and integration with analytics and AI.

Some examples of industry-specific applications include:

  • Healthcare: Managing patient records and predicting treatment outcomes

  • Retail: Analyzing sales data and tracking inventory trends

  • Finance: Monitoring fraud in real-time and forecasting risk

  • Education: Tracking student performance and resource allocation

  • Manufacturing: Predictive maintenance and quality assurance using sensor data

The DP-700 certification ensures that professionals have the skills to participate in or lead data initiatives in any of these contexts.

The DP-700 Microsoft Fabric Data Engineering Certification is suitable for a diverse group of professionals, from seasoned data engineers to students entering the field. It is particularly useful for those in business intelligence, ETL, database administration, AI, and quality assurance roles who are ready to transition into or expand their capabilities in cloud-based data engineering.

By focusing on Microsoft Fabric, this certification positions candidates at the forefront of modern data architecture, combining the power of real-time analytics, AI integration, and scalable cloud storage. Whether you’re looking to advance your career, switch roles, or simply broaden your technical expertise, this certification can serve as a foundational step.

How to Study for the DP-700 Exam and Frequently Asked Questions

Introduction

Successfully passing the DP-700: Microsoft Fabric Data Engineering Certification exam requires more than just theoretical knowledge. It involves hands-on familiarity with Microsoft Fabric’s tools, a strong understanding of cloud-based data architecture, and regular practice with realistic scenarios. Because the exam is designed to assess both conceptual knowledge and practical implementation, a well-rounded study strategy is essential.

This section will guide you through the study process step-by-step, from understanding the exam modules to developing an effective preparation plan, using mock tests, and addressing common concerns about the certification.

How to Study for the DP-700 Exam

Understand the Syllabus Structure

The DP-700 exam is organized into five major learning modules. Each module is built around a practical theme or technical objective. Understanding this modular breakdown allows candidates to focus their study efforts more efficiently.

Module 1: Ingesting Data Using Microsoft Fabric

This module focuses on how to import data from various sources into Microsoft Fabric.

  • Learn how to use Data Pipelines to ingest data from SQL, CSV, APIs, and cloud platforms.

  • Practice incremental and full data loading scenarios.

  • Use Eventstream for handling streaming ingestion in real time.

  • Explore options for scheduled and triggered ingestion jobs.

Practical tasks to perform:

  • Build a pipeline that pulls data from a CSV in cloud storage into a Lakehouse.

  • Use Eventstream to receive simulated sensor data and push it into an Eventhouse.
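Before building the first practical task in Fabric, it helps to understand what the copy step actually does with a CSV source. The sketch below parses invented CSV content with Python’s standard library, standing in for the parsing a pipeline performs before landing rows in a Lakehouse table:

```python
import csv
import io

# Hypothetical file content; column names are invented for this sketch.
csv_text = """order_id,customer,amount
1,alice,120.50
2,bob,80.00
"""

# DictReader maps the header row onto each record, as a pipeline's
# schema mapping would.
rows = list(csv.DictReader(io.StringIO(csv_text)))
total = sum(float(r["amount"]) for r in rows)
```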

Module 2: Implementing a Lakehouse Using Microsoft Fabric

This module centers on understanding the Lakehouse architecture and how it is used in modern data platforms.

  • Learn how to create and manage Lakehouse objects.

  • Practice creating tables, writing queries, and storing different types of data.

  • Understand how Lakehouse combines the flexibility of a data lake with the structure of a data warehouse.

Practical tasks to perform:

  • Create a Lakehouse and ingest structured and semi-structured data.

  • Use Dataflow Gen2 or Notebooks to clean and transform the data before storing it in the Lakehouse.

Module 3: Implementing Real-Time Intelligence Using Microsoft Fabric

This module focuses on real-time data processing, often referred to as stream analytics.

  • Use Eventstream to capture and transform real-time event data.

  • Analyze real-time data for KPIs, metrics, or alerts.

  • Configure output destinations to dashboards or Lakehouse tables.

Practical tasks to perform:

  • Build a simple alerting system based on sensor data.

  • Stream data from an API or synthetic generator into Microsoft Fabric.
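The alerting task above usually involves a windowed check rather than a single-reading threshold. This toy class (window size, threshold, and readings are all invented) shows the rolling-average idea; in Fabric you would express it as a query over the event stream rather than hand-rolled Python:

```python
from collections import deque

class RollingAlert:
    """Fire when the rolling average of the last n readings exceeds a limit."""

    def __init__(self, window=3, threshold=90.0):
        self.readings = deque(maxlen=window)  # keeps only the last n values
        self.threshold = threshold

    def push(self, value):
        self.readings.append(value)
        avg = sum(self.readings) / len(self.readings)
        return avg > self.threshold

monitor = RollingAlert()
fired = [monitor.push(v) for v in [80, 85, 95, 99, 100]]
```

Averaging over a window suppresses one-off spikes, so the alert fires only on a sustained rise.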

Module 4: Implementing a Data Warehouse Using Microsoft Fabric

This module teaches you how to use Fabric’s Warehouse features for structured analytical workloads.

  • Learn to create T-SQL-based Warehouses.

  • Understand best practices for indexing, partitioning, and querying.

  • Load and transform data using Pipelines and Dataflow Gen2.

Practical tasks to perform:

  • Design a schema for a retail database and implement it in a Warehouse.

  • Write complex queries to extract business insights from transactional data.
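A Fabric Warehouse speaks T-SQL, but the schema-design task can be rehearsed anywhere SQL runs. The sketch below uses SQLite purely as a portable stand-in, with an invented two-table retail schema and one revenue query of the "business insight" kind:

```python
import sqlite3

# In-memory database standing in for a Warehouse; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE sales (
        id INTEGER PRIMARY KEY,
        product_id INTEGER,
        qty INTEGER,
        FOREIGN KEY (product_id) REFERENCES products(id)
    );
    INSERT INTO products VALUES (1, 'widget', 2.50), (2, 'gadget', 10.00);
    INSERT INTO sales VALUES (1, 1, 4), (2, 2, 1), (3, 1, 2);
""")

# Revenue per product, highest first.
revenue = conn.execute("""
    SELECT p.name, SUM(s.qty * p.price) AS revenue
    FROM sales s JOIN products p ON p.id = s.product_id
    GROUP BY p.name
    ORDER BY revenue DESC
""").fetchall()
```

The join, aggregate, and ordering here carry over directly to T-SQL, even though Warehouse-specific features such as distribution and partitioning do not.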

Module 5: Managing a Microsoft Fabric Environment

This module deals with administration, monitoring, optimization, and deployment.

  • Configure workspaces, manage user access, and apply governance policies.

  • Set up monitoring dashboards and alerting for performance issues.

  • Learn techniques for optimizing cost, resource usage, and reliability.

Practical tasks to perform:

  • Use Fabric monitoring tools to track pipeline failures or bottlenecks.

  • Set up a version-controlled workspace with Git integration for reproducibility.

Recommended Study Strategy

Step 1: Create a Study Plan

Divide the learning modules over a four- to six-week period. Focus on one module per week while spending extra time on hands-on labs during weekends. This pacing allows you to absorb information gradually without being overwhelmed.

Step 2: Learn the Theory

Use official Microsoft Learn documentation and free online tutorials to understand each module. Read about Fabric components like Lakehouse, Warehouse, Eventstream, and Notebooks. Focus on use cases and architecture diagrams to develop a clear understanding.

Step 3: Perform Hands-On Labs

Learning by doing is crucial. Use a Microsoft Fabric trial account or access through a work/school subscription. Practice every key tool covered in the exam. Document each project or task you complete for future reference.

Step 4: Take Practice Exams

Mock tests provide insight into how questions are asked, what areas are frequently targeted, and how to manage your time. Begin taking them after completing at least three modules. Review your incorrect answers carefully to identify knowledge gaps.

Step 5: Schedule the Exam

Commit to a test date once you’ve completed all modules and feel confident with your mock test performance. Booking a date keeps you accountable and helps you stay focused during the final weeks of preparation.

Final Thoughts

The DP-700 exam is not just a test of memorization but an assessment of practical capability in building and managing data solutions on Microsoft Fabric. Preparing for it requires a balanced approach, including learning the theory, practicing hands-on tasks, reviewing exam-like questions, and understanding how the tools integrate in real-world projects.

Whether you are advancing in your career or starting a new journey into data engineering, this certification provides a valuable, recognized path to validate your skills and open new opportunities in the data-driven economy.

 
