Google Professional Cloud Developer Exam Questions
Exam number/code: Professional Cloud Developer
Release/Update Date: 01 May, 2026
Available Number of Questions: Maximum of 265 Questions
Exam Name: Professional Cloud Developer
Related Certification(s): Google Cloud Certified
Google Professional Cloud Developer Exam Topics - You’ll Be Tested in Actual Exam
The Google Professional-Cloud-Developer exam is a comprehensive assessment designed to evaluate your expertise in cloud architecture and development using Google Cloud Platform (GCP). This exam covers a wide range of topics, including designing and developing cloud-based solutions, implementing security and privacy measures, managing GCP resources, and utilizing various GCP services and tools. You'll need a solid understanding of cloud computing fundamentals, such as networking, storage, and virtualization, as well as hands-on experience with GCP's core services like Compute Engine, Cloud Storage, and BigQuery. Additionally, the exam tests your ability to optimize and troubleshoot cloud applications, ensure data integrity and availability, and implement best practices for cloud-native development. With a focus on practical skills and real-world scenarios, the Google Professional-Cloud-Developer exam is an essential step for professionals looking to validate their cloud development expertise and advance their careers in the dynamic world of cloud computing.
Google Professional Cloud Developer Exam Short Quiz
Attempt this Google Professional Cloud Developer exam quiz to self-assess your preparation for the actual Google Professional Cloud Developer exam. CertBoosters also provides premium Google Professional Cloud Developer exam questions to pass the Google Professional Cloud Developer exam in the shortest possible time. Be sure to try our free practice exam software for the Google Professional Cloud Developer exam.
Q1:
A governmental regulation was recently passed that affects your application. For compliance purposes, you are now required to send a duplicate of specific application logs from your application's project to a project that is restricted to the security team. What should you do?
A. Modify the _Default log bucket sink rules to reroute the logs into the security team's log bucket.
B. Create user-defined log buckets in the security team's project. Configure a Cloud Logging sink to route your application's logs to log buckets in the security team's project.
C. Create a job that copies the System Event logs from the _Required log bucket into the security team's log bucket in their project.
D. Create a job that copies the logs from the _Required log bucket into the security team's log bucket in their project.
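The sink-based routing in option B hinges on a filter that selects which entries get copied to the destination bucket. Below is a minimal local sketch of that selection logic in Python; it is purely illustrative, the entry fields and simplified filter are hypothetical stand-ins for the real Cloud Logging query language, not the actual sink API.

```python
# Illustrative sketch only: simulates how a Cloud Logging sink's filter
# decides which log entries are routed to a destination log bucket.
# Field names and the filter shape are simplified stand-ins.

SEVERITIES = ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

def matches_filter(entry: dict, resource_type: str, min_severity: str) -> bool:
    """Return True if a log entry should be routed by the sink."""
    return (entry["resource_type"] == resource_type
            and SEVERITIES.index(entry["severity"]) >= SEVERITIES.index(min_severity))

def route(entries, resource_type, min_severity):
    """Copy matching entries toward the security team's bucket.

    A sink duplicates matching entries; the originals stay in the
    source project's buckets untouched.
    """
    return [e for e in entries if matches_filter(e, resource_type, min_severity)]

entries = [
    {"resource_type": "gae_app", "severity": "ERROR", "msg": "payment failed"},
    {"resource_type": "gce_instance", "severity": "ERROR", "msg": "vm restart"},
    {"resource_type": "gae_app", "severity": "DEBUG", "msg": "trace detail"},
]
routed = route(entries, "gae_app", "WARNING")
```

Note the key property the sketch demonstrates: routing is a copy, not a move, which is exactly what "send a duplicate of specific application logs" requires.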
Q2:
You are monitoring a web application that is written in Go and deployed in Google Kubernetes Engine. You notice an increase in CPU and memory utilization. You need to determine which function is consuming the most CPU and memory resources. What should you do?
A. Import the Cloud Profiler package into your application, and initialize the Profiler agent. Review the generated flame graph in the Google Cloud console to identify time-intensive functions.
B. Create a Cloud Logging query that gathers the web application's logs. Write a Python script that calculates the difference between the timestamps from the beginning and the end of the application's longest functions to identify time-intensive functions.
C. Import OpenTelemetry and Trace export packages into your application, and create the trace provider. Review the latency data for your application on the Trace overview page, and identify which functions cause the most latency.
D. Add print commands to the application source code to log when each function is called, and redeploy the application.
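The flame graph in option A answers one question: which function accounts for the most CPU time. The same question can be answered locally with any CPU profiler. Below is a small sketch using Python's built-in cProfile (as a stand-in for Cloud Profiler, which the question's Go app would use instead); the `cheap`/`expensive` functions are hypothetical workloads invented for the demonstration.

```python
# Illustrative sketch: a CPU profile identifies the hot function, which is
# what Cloud Profiler's flame graph shows in production. cProfile here is a
# local stand-in; the workload functions are made up for the demo.
import cProfile
import io
import pstats

def cheap():
    return sum(range(1_000))

def expensive():
    return sum(i * i for i in range(200_000))  # deliberately heavier

def handler():
    cheap()
    expensive()

profiler = cProfile.Profile()
profiler.enable()
handler()
profiler.disable()

# stats.stats maps (file, line, funcname) -> (calls, ncalls, tottime,
# cumtime, callers); rank our two functions by cumulative time.
stats = pstats.Stats(profiler, stream=io.StringIO())
times = {}
for (_file, _line, funcname), (_cc, _nc, _tt, ct, _callers) in stats.stats.items():
    if funcname in ("cheap", "expensive"):
        times[funcname] = ct
hottest = max(times, key=times.get)
```

The same ranking, rendered as a flame graph, is what makes the profiler approach far more reliable than log timestamps or print statements.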
Q3:
You work on an application that relies on Cloud Spanner as its main datastore. New application features have occasionally caused performance regressions. You want to prevent performance issues by running an automated performance test with Cloud Build for each commit made. If multiple commits are made at the same time, the tests might run concurrently. What should you do?
A. Create a new project with a random name for every build. Load the required data. Delete the project after the test is run.
B. Create a new Cloud Spanner instance for every build. Load the required data. Delete the Cloud Spanner instance after the test is run.
C. Create a project with a Cloud Spanner instance and the required data. Adjust the Cloud Build build file to automatically restore the data to its previous state after the test is run.
D. Start the Cloud Spanner emulator locally. Load the required data. Shut down the emulator after the test is run.
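The crux of this question is isolation under concurrency: two builds running at once must not share test state. A per-build resource with a collision-free name, created and torn down inside the build, gives that isolation. The sketch below simulates the naming and lifecycle in plain Python; `create_instance`/`delete_instance` are hypothetical stand-ins for the real Cloud Spanner instance API or gcloud calls, not actual client-library functions.

```python
# Illustrative sketch: each concurrent build gets its own ephemeral,
# uniquely named test instance, so builds cannot clobber each other's data.
# create/delete are local stand-ins for the real provisioning calls.
import uuid

active_instances = set()  # pretend inventory of provisioned instances

def create_instance() -> str:
    """Provision an isolated instance with a collision-free name."""
    instance_id = f"perf-test-{uuid.uuid4().hex[:8]}"
    active_instances.add(instance_id)
    return instance_id

def delete_instance(instance_id: str) -> None:
    active_instances.discard(instance_id)

def run_build() -> None:
    instance = create_instance()
    try:
        pass  # load the required data, run the performance test
    finally:
        delete_instance(instance)  # tear down even if the test fails

# Two "concurrent" builds receive distinct instances.
a, b = create_instance(), create_instance()
```

The `try/finally` teardown matters as much as the unique name: a failed test must not leave a billable instance behind.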
Q4:
You manage a microservice-based ecommerce platform on Google Cloud that sends confirmation emails to a third-party email service provider using a Cloud Function. Your company just launched a marketing campaign, and some customers are reporting that they have not received order confirmation emails. You discover that the services triggering the Cloud Function are receiving HTTP 500 errors. You need to change the way emails are handled to minimize email loss. What should you do?
A. Increase the Cloud Function's timeout to nine minutes.
B. Configure the sender application to publish the outgoing emails in a message to a Pub/Sub topic. Update the Cloud Function configuration to consume the Pub/Sub queue.
C. Configure the sender application to write emails to Memorystore and then trigger the Cloud Function. When the function is triggered, it reads the email details from Memorystore and sends them to the email service.
D. Configure the sender application to retry the execution of the Cloud Function every second if a request fails.
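The property that minimizes email loss in a setup like option B is durable buffering: a failed delivery is redelivered instead of dropped. The sketch below models that with an in-memory queue standing in for a Pub/Sub topic; the flaky provider, the `fail_first` knob, and the attempt cap are all invented for the demonstration and are not real Pub/Sub or Cloud Functions APIs.

```python
# Illustrative sketch: a queue decouples the sender from a flaky email
# provider. Failed deliveries are requeued (like a nacked Pub/Sub message)
# rather than lost. All names here are hypothetical stand-ins.
from collections import deque

topic = deque()  # stand-in for a Pub/Sub topic + subscription
sent = []

def publish(email: dict) -> None:
    """Sender returns immediately; the email is safely buffered."""
    topic.append(email)

def flaky_email_service(email: dict, fail_first: set) -> bool:
    """Pretend provider that fails the first attempt for some recipients,
    simulating the HTTP 500s seen during the campaign."""
    if email["to"] in fail_first:
        fail_first.discard(email["to"])
        return False
    sent.append(email["to"])
    return True

def consume(fail_first: set) -> None:
    """Subscriber loop: requeue on failure, drop (ack) on success."""
    attempts = 0
    while topic and attempts < 100:  # cap attempts so the demo terminates
        email = topic.popleft()
        attempts += 1
        if not flaky_email_service(email, fail_first):
            topic.append(email)  # redeliver later instead of losing it

publish({"to": "a@example.com"})
publish({"to": "b@example.com"})
consume(fail_first={"a@example.com"})
```

Contrast this with options A and D: a longer timeout or a hand-rolled retry still loses the email if the sender process dies mid-retry, whereas a queued message survives.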
Q5:
You are working on a new application that is deployed on Cloud Run and uses Cloud Functions. Each time new features are added, new Cloud Functions and Cloud Run services are deployed. You use ENV variables to keep track of the services and enable interservice communication, but the maintenance of the ENV variables has become difficult. You want to implement dynamic discovery in a scalable way. What should you do?
A. Create a Service Directory namespace. Use API calls to register the services during deployment, and query during runtime.
B. Configure your microservices to use the Cloud Run Admin and Cloud Functions APIs to query for deployed Cloud Run services and Cloud Functions in the Google Cloud project.
C. Deploy HashiCorp Consul on a single Compute Engine instance. Register the services with Consul during deployment, and query during runtime.
D. Rename the Cloud Functions and Cloud Run services endpoints using a well-documented naming convention.
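The pattern behind option A is register-at-deploy, resolve-at-runtime: the deployment pipeline writes each new service's endpoint into a namespace, and callers look endpoints up instead of carrying ENV variables. The sketch below models that contract in plain Python; the `Namespace` class and the example service names and URLs are hypothetical illustrations, not the real Service Directory client library.

```python
# Illustrative sketch: a namespace that services register into at deploy
# time and resolve from at runtime, mirroring the role Service Directory
# plays. Class, service names, and URLs are hypothetical stand-ins.

class Namespace:
    def __init__(self, name: str):
        self.name = name
        self._services: dict[str, str] = {}

    def register(self, service: str, endpoint: str) -> None:
        """Called from the deployment pipeline after each new deploy."""
        self._services[service] = endpoint

    def resolve(self, service: str) -> str:
        """Called by other services at runtime instead of reading ENV vars."""
        try:
            return self._services[service]
        except KeyError:
            raise LookupError(f"{service!r} not registered in {self.name!r}")

ns = Namespace("ecommerce")
ns.register("checkout", "https://checkout-abc123-uc.a.run.app")
ns.register("emailer", "https://emailer-def456-uc.a.run.app")
endpoint = ns.resolve("checkout")
```

Because registration happens in the pipeline, adding a new Cloud Run service or Cloud Function requires no configuration change in any existing caller, which is exactly the maintenance burden the question describes.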