Available Number of Questions: Maximum of 401 Questions
Exam Name: Google Cloud Certified Professional Data Engineer
Exam Duration: 120 Minutes
Related Certification(s):
Google Cloud Certified Certification
Google Professional Data Engineer Exam Topics - What You’ll Be Tested on in the Actual Exam
The Google Professional-Data-Engineer exam is a comprehensive assessment designed to evaluate your expertise in data engineering and its practical application within the Google Cloud Platform (GCP) ecosystem. This exam delves into various critical aspects of data engineering, including designing and building data processing systems, leveraging GCP tools for efficient data management, ensuring data security and privacy, and optimizing data solutions for performance and scalability. You'll also explore data modeling and architecture, data ingestion and transformation techniques, and the effective use of BigQuery, Cloud Storage, and other GCP services. Additionally, the exam covers best practices for data engineering, such as version control, testing, and documentation. By passing this exam, you'll demonstrate a strong understanding of data engineering principles and the ability to implement them using GCP tools, making you a valuable asset for organizations seeking to leverage data-driven insights and solutions.
Google Professional Data Engineer Exam Short Quiz
Attempt this Google Professional Data Engineer exam quiz to self-assess your preparation for the actual Google Cloud Certified Professional Data Engineer exam. CertBoosters also provides premium Google Professional Data Engineer exam questions to pass the Google Cloud Certified Professional Data Engineer exam in the shortest possible time. Be sure to try our free practice exam software for the Google Professional Data Engineer exam.
Google Professional Data Engineer Exam Quiz
Google Professional Data Engineer
Q1:
You are building a data pipeline on Google Cloud. You need to prepare data using a casual method for a
machine-learning process. You want to support a logistic regression model. You also need to monitor and
adjust for null values, which must remain real-valued and cannot be removed. What should you do?
○ A. Use Cloud Dataprep to find null values in sample source data. Convert all nulls to 'none' using a Cloud Dataproc job.
○ B. Use Cloud Dataprep to find null values in sample source data. Convert all nulls to 0 using a Cloud Dataprep job.
○ C. Use Cloud Dataflow to find null values in sample source data. Convert all nulls to 'none' using a Cloud Dataprep job.
○ D. Use Cloud Dataflow to find null values in sample source data. Convert all nulls to 0 using a custom script.
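The transformation behind the 0-conversion options can be sketched in a few lines of plain Python; this is only an illustration (the sample records and feature names are hypothetical), showing why nulls must become a number rather than a string like 'none':

```python
# Hypothetical sample records; None marks a missing real-valued feature.
rows = [
    {"age": 34.0, "income": 72000.0},
    {"age": None, "income": 58000.0},
    {"age": 41.0, "income": None},
]

def fill_nulls_with_zero(records):
    """Replace None with 0.0 so every feature stays numeric.

    A logistic regression model expects real-valued inputs, so
    converting nulls to a string such as 'none' would break training.
    """
    return [
        {key: (0.0 if value is None else value) for key, value in record.items()}
        for record in records
    ]

cleaned = fill_nulls_with_zero(rows)
```
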
Google Professional Data Engineer
Q2:
You are planning to use Cloud Storage as part of your data lake solution. The Cloud Storage bucket will contain objects ingested from external systems. Each object will be ingested once, and the access patterns of individual objects will be random. You want to minimize the cost of storing and retrieving these objects. You want to ensure that any cost optimization efforts are transparent to the users and applications. What should you do?
○ A. Create a Cloud Storage bucket with Autoclass enabled.
○ B. Create a Cloud Storage bucket with an Object Lifecycle Management policy that transitions objects from Standard to Coldline storage class when an object's age reaches 30 days.
○ C. Create a Cloud Storage bucket with an Object Lifecycle Management policy that transitions objects from Standard to Coldline storage class when an object is not live.
○ D. Create two Cloud Storage buckets. Use the Standard storage class for the first bucket, and use the Coldline storage class for the second bucket. Migrate objects from the first bucket to the second bucket after 30 days.
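For reference, the age-based lifecycle policy described in option B would look roughly like the following. This is a sketch of the Cloud Storage lifecycle configuration JSON, built here in Python only so its shape can be checked:

```python
import json

# Sketch of an Object Lifecycle Management policy matching option B:
# transition objects from Standard to Coldline once they are 30 days old.
lifecycle_policy = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 30},
        }
    ]
}

policy_json = json.dumps(lifecycle_policy, indent=2)
```

A policy file like this can be applied with `gsutil lifecycle set policy.json gs://BUCKET`. Note that Coldline carries retrieval fees and a minimum storage duration, which is why a fixed age-based transition can be costly when access patterns are random.
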
Google Professional Data Engineer
Q3:
You have a Standard Tier Memorystore for Redis instance deployed in a production environment. You need to simulate a Redis instance failover under the most realistic disaster recovery conditions, and ensure that the failover has no impact on production data. What should you do?
○ A. Create a Standard Tier Memorystore for Redis instance in a development environment. Initiate a manual failover by using the force-data-loss data protection mode.
○ B. Initiate a manual failover by using the limited-data-loss data protection mode on the Memorystore for Redis instance in the production environment.
○ C. Add one replica to the Redis instance in the production environment. Initiate a manual failover by using the force-data-loss data protection mode.
○ D. Create a Standard Tier Memorystore for Redis instance in the development environment. Initiate a manual failover by using the limited-data-loss data protection mode.
Google Professional Data Engineer
Q4:
You are administering a BigQuery on-demand environment. Your business intelligence tool is submitting hundreds of queries each day that aggregate a large (50 TB) sales history fact table at the day and month levels. These queries have a slow response time and are exceeding cost expectations. You need to decrease response time, lower query costs, and minimize maintenance. What should you do?
○ A. Build materialized views on top of the sales table to aggregate data at the day and month level.
○ B. Build authorized views on top of the sales table to aggregate data at the day and month level.
○ C. Enable BI Engine and add your sales table as a preferred table.
○ D. Create a scheduled query to build sales day and sales month aggregate tables on an hourly basis.
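A day-level materialized view of the kind option A describes could be defined with DDL along these lines. This is a sketch only; the project, dataset, table, and column names are hypothetical, and the DDL string is assembled in Python so it can be inspected without a BigQuery connection:

```python
def day_aggregate_view_ddl(project, dataset, fact_table, view_name):
    """Build CREATE MATERIALIZED VIEW DDL that pre-aggregates a sales
    fact table at the day level. BigQuery maintains the view
    incrementally and can rewrite matching aggregate queries to read
    the small view instead of scanning the full fact table."""
    return (
        f"CREATE MATERIALIZED VIEW `{project}.{dataset}.{view_name}` AS\n"
        f"SELECT DATE(sale_timestamp) AS sale_day,\n"
        f"       SUM(sale_amount) AS total_sales\n"
        f"FROM `{project}.{dataset}.{fact_table}`\n"
        f"GROUP BY sale_day"
    )

ddl = day_aggregate_view_ddl("my-project", "sales_ds", "sales_fact", "sales_by_day")
```
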
Google Professional Data Engineer
Q5:
You have one BigQuery dataset which includes customers' street addresses. You want to retrieve all occurrences of street addresses from the dataset. What should you do?
○ A. Create a deep inspection job on each table in your dataset with Cloud Data Loss Prevention and create an inspection template that includes the STREET_ADDRESS infoType.
○ B. Create a de-identification job in Cloud Data Loss Prevention and use the masking transformation.
○ C. Write a SQL query in BigQuery by using REGEXP_CONTAINS on all tables in your dataset to find rows where the word 'street' appears.
○ D. Create a discovery scan configuration on your organization with Cloud Data Loss Prevention and create an inspection template that includes the STREET_ADDRESS infoType.
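The inspection template mentioned in options A and D pairs built-in infoTypes with scan settings. A minimal sketch of the template body, following the shape of the Cloud DLP v2 REST schema (the display name and likelihood threshold here are illustrative assumptions, not values from the question):

```python
import json

# Sketch of a DLP inspect template that detects street addresses.
# STREET_ADDRESS is one of Cloud DLP's built-in infoType detectors.
inspect_template = {
    "displayName": "street-address-scan",
    "inspectConfig": {
        "infoTypes": [{"name": "STREET_ADDRESS"}],
        "minLikelihood": "POSSIBLE",
    },
}

request_body = json.dumps({"inspectTemplate": inspect_template})
```

The same template can then be referenced by an inspection job scoped to a single dataset (option A) or by an organization-wide discovery scan (option D).
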
🎉 Google Professional Data Engineer Quiz Complete!