SAA-C03 Sample Questions and Answers

Question 4

A company uses an Amazon EC2 Auto Scaling group to host an API. The EC2 instances are in a target group that is associated with an Application Load Balancer (ALB). The company stores data in an Amazon Aurora PostgreSQL database.

The API has a weekly maintenance window. The company must ensure that the API returns a static maintenance response during the weekly maintenance window.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.

Create a table in Aurora PostgreSQL that has fields to contain keys and values. Create a key for a maintenance flag. Set the flag when the maintenance window starts. Configure the API to query the table for the maintenance flag and to return a maintenance response if the flag is set. Reset the flag when the maintenance window is finished.

B.

Create an Amazon Simple Queue Service (Amazon SQS) queue. Subscribe the EC2 instances to the queue. Publish a message to the queue when the maintenance window starts. Configure the API to return a maintenance message if the instances receive a maintenance start message from the queue. Publish another message to the queue when the maintenance window is finished to restore normal operation.

C.

Create a listener rule on the ALB to return a maintenance response when the path on a request matches a wildcard. Set the rule priority to one. Perform the maintenance. When the maintenance window is finished, delete the listener rule.

D.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the EC2 instances to the topic. Publish a message to the topic when the maintenance window starts. Configure the API to return a maintenance response if the instances receive the maintenance start message from the topic. Publish another message to the topic when the maintenance window finishes to restore normal operation.
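
For context, option C can be implemented with a single API call and removed just as easily, which is why it carries the least operational overhead. A minimal boto3 sketch, in which the listener ARN, status code, and maintenance message are placeholders:

import boto3

elbv2 = boto3.client("elbv2")

# Placeholder ARN for the ALB listener that fronts the API.
LISTENER_ARN = "arn:aws:elasticloadbalancing:us-east-1:111122223333:listener/app/api-alb/abc123/def456"

# Priority 1 makes this rule win over every other listener rule.
rule = elbv2.create_rule(
    ListenerArn=LISTENER_ARN,
    Priority=1,
    Conditions=[{"Field": "path-pattern", "Values": ["*"]}],  # match all request paths
    Actions=[{
        "Type": "fixed-response",
        "FixedResponseConfig": {
            "StatusCode": "503",
            "ContentType": "application/json",
            "MessageBody": '{"message": "API is under scheduled maintenance"}',
        },
    }],
)

# When the maintenance window ends, delete the rule to restore normal routing.
elbv2.delete_rule(RuleArn=rule["Rules"][0]["RuleArn"])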

Question 5

A developer needs to export the contents of several Amazon DynamoDB tables into Amazon S3 buckets to comply with company data regulations. The developer uses the AWS CLI to run commands to export from each table to the proper S3 bucket. The developer sets up AWS credentials correctly and grants resources appropriate permissions. However, the exports of some tables fail.

What should the developer do to resolve this issue?

Options:

A.

Ensure that point-in-time recovery is enabled on the DynamoDB tables.

B.

Ensure that the target S3 bucket is in the same AWS Region as the DynamoDB table.

C.

Ensure that DynamoDB streaming is enabled for the tables.

D.

Ensure that DynamoDB Accelerator (DAX) is enabled.
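
A likely root cause here is that DynamoDB table exports to Amazon S3 require point-in-time recovery (PITR) on the source table, which is what option A addresses. A short boto3 sketch with placeholder table and bucket names:

import boto3

dynamodb = boto3.client("dynamodb")

TABLE_NAME = "telemetry"  # placeholder table name
TABLE_ARN = "arn:aws:dynamodb:us-east-1:111122223333:table/telemetry"
BUCKET = "example-export-bucket"  # placeholder bucket name

# Exports fail with an error unless PITR is enabled on the table.
dynamodb.update_continuous_backups(
    TableName=TABLE_NAME,
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)

# With PITR enabled, the export can run. The bucket does not have to be in
# the same Region as the table.
dynamodb.export_table_to_point_in_time(
    TableArn=TABLE_ARN,
    S3Bucket=BUCKET,
    ExportFormat="DYNAMODB_JSON",
)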

Question 6

A media company hosts a web application on AWS for uploading videos to an Amazon S3 bucket. Only authenticated users should be able to upload videos, and only within a specified time frame after authentication.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Configure the application to generate IAM temporary security credentials for authenticated users.

B.

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.
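
Option B works because a pre-signed URL carries its own expiry, so the upload window closes automatically. A minimal Lambda handler sketch; the bucket name, key pattern, and event shape are illustrative only:

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Invoked after the user authenticates; "userId" is a placeholder field.
    user_id = event["userId"]
    url = s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": "example-video-uploads",
                "Key": f"uploads/{user_id}/video.mp4"},
        ExpiresIn=900,  # the URL stops working 15 minutes after authentication
    )
    return {"uploadUrl": url}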

Question 7

A company runs its production workload on an Amazon Aurora MySQL DB cluster that includes six Aurora Replicas. The company wants near-real-time reporting queries from one of its departments to be automatically distributed across three of the Aurora Replicas. Those three replicas have a different compute and memory specification from the rest of the DB cluster.

Which solution meets these requirements?

Options:

A.

Create and use a custom endpoint for the workload.

B.

Create a three-node cluster clone and use the reader endpoint.

C.

Use any of the instance endpoints for the selected three nodes.

D.

Use the reader endpoint to automatically distribute the read-only workload.
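
Option A relies on Aurora custom endpoints, which put a chosen subset of replicas behind one DNS name and distribute connections across them. A boto3 sketch with placeholder cluster and instance identifiers:

import boto3

rds = boto3.client("rds")

# The three reporting replicas with the different compute specification.
rds.create_db_cluster_endpoint(
    DBClusterIdentifier="prod-aurora-cluster",
    DBClusterEndpointIdentifier="reporting",
    EndpointType="READER",
    StaticMembers=["replica-4", "replica-5", "replica-6"],
)
# The reporting department then connects to the custom endpoint, for example:
# reporting.cluster-custom-<hash>.us-east-1.rds.amazonaws.com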

Question 8

A company has separate AWS accounts for its finance, data analytics, and development departments. Because of cost and security concerns, the company wants to control which services each AWS account can use.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS Systems Manager templates to control which AWS services each department can use.

B.

Create organizational units (OUs) for each department in AWS Organizations. Attach service control policies (SCPs) to the OUs.

C.

Use AWS CloudFormation to automatically provision only the AWS services that each department can use.

D.

Set up a list of products in AWS Service Catalog in the AWS accounts to manage and control the usage of specific AWS services.
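
For option B, a service control policy (SCP) attached to an OU caps what every account in that OU can do. A sketch using a Deny/NotAction allow-list, which is one common SCP pattern; the permitted services and the OU ID are placeholders:

import boto3, json

org = boto3.client("organizations")

# Deny everything except the approved services for this OU.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "NotAction": ["s3:*", "athena:*", "cloudwatch:*"],  # placeholder allow-list
        "Resource": "*",
    }],
}

policy = org.create_policy(
    Name="finance-allowed-services",
    Description="Limit the finance OU to approved services",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)

# ou-xxxx-xxxxxxxx is a placeholder for the department's OU ID.
org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-xxxx-xxxxxxxx",
)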

Question 9

A company runs an application on Amazon EC2 instances. The application needs to access an Amazon RDS database. The company wants to grant the EC2 instances access permissions to the RDS database while following the principle of least privilege.

Which solution will meet these requirements?

Options:

A.

Create an IAM user that has a policy that grants administrative permissions. Use the IAM user's access keys on the EC2 instances to access the RDS database.

B.

Create an IAM user that has a policy that grants the minimum required permissions to access the RDS database. Embed the IAM user's access keys on the EC2 instances to access the RDS database.

C.

Create an IAM role that has a policy that grants the minimum required permissions to access the RDS database. Attach the IAM role access key and the IAM role secret key to the EC2 instance profile.

D.

Create an IAM role that has a policy that grants the minimum required permissions to access the RDS database. Attach the IAM role to an EC2 instance profile. Associate the instance profile with the instances.
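
A boto3 sketch of option D, assuming the instances reach the database through IAM database authentication, so the role needs only the rds-db:connect action; every name and ARN is a placeholder:

import boto3, json

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

# Only the EC2 service may assume this role.
trust = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow",
                   "Principal": {"Service": "ec2.amazonaws.com"},
                   "Action": "sts:AssumeRole"}],
}
iam.create_role(RoleName="app-rds-role", AssumeRolePolicyDocument=json.dumps(trust))

# Least privilege: one action, one database user.
policy = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow",
                   "Action": "rds-db:connect",
                   "Resource": "arn:aws:rds-db:us-east-1:111122223333:dbuser:db-ABCDEFG/app_user"}],
}
iam.put_role_policy(RoleName="app-rds-role", PolicyName="rds-connect",
                    PolicyDocument=json.dumps(policy))

# Wrap the role in an instance profile and associate it with the instance.
iam.create_instance_profile(InstanceProfileName="app-rds-profile")
iam.add_role_to_instance_profile(InstanceProfileName="app-rds-profile",
                                 RoleName="app-rds-role")
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "app-rds-profile"},
    InstanceId="i-0123456789abcdef0",  # placeholder instance ID
)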

Question 10

A company is testing an application that runs on an Amazon EC2 Linux instance. A single 500 GB Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume is attached to the EC2 instance.

The company will deploy the application on multiple EC2 instances in an Auto Scaling group. All instances require access to the data that is stored in the EBS volume. The company needs a highly available and resilient solution that does not introduce significant changes to the application's code.

Which solution will meet these requirements?

Options:

A.

Provision an EC2 instance that uses NFS server software. Attach a single 500 GB gp2 EBS volume to the instance.

B.

Provision an Amazon FSx for Windows File Server file system. Configure the file system as an SMB file store within a single Availability Zone.

C.

Provision an EC2 instance with two 250 GB Provisioned IOPS SSD EBS volumes.

D.

Provision an Amazon Elastic File System (Amazon EFS) file system. Configure the file system to use General Purpose performance mode.

Question 11

A company uses a set of Amazon EC2 instances to host a website. The website uses an Amazon S3 bucket to store images and media files.

The company wants to automate website infrastructure creation to deploy the website to multiple AWS Regions. The company also wants to provide the EC2 instances access to the S3 bucket so the instances can store and access data by using AWS Identity and Access Management (IAM).

Which solution will meet these requirements MOST securely?

Options:

A.

Create an AWS CloudFormation template for the web server EC2 instances. Save an IAM access key in the UserData section of the AWS::EC2::Instance entity in the CloudFormation template.

B.

Create a file that contains an IAM secret access key and access key ID. Store the file in a new S3 bucket. Create an AWS CloudFormation template. In the template, create a parameter to specify the location of the S3 object that contains the access key and access key ID.

C.

Create an IAM role and an IAM access policy that allows the web server EC2 instances to access the S3 bucket. Create an AWS CloudFormation template for the web server EC2 instances that contains an IAM instance profile entity that references the IAM role and the IAM access policy.

D.

Create a script that retrieves an IAM secret access key and access key ID from IAM and stores them on the web server EC2 instances. Include the script in the UserData section of the AWS::EC2::Instance entity in an AWS CloudFormation template.

Question 12

A company runs a web application in a single AWS Region. A solutions architect wants to ensure that the web application can continue to operate if the application becomes unavailable in the Region.

Which solution will meet this requirement?

Options:

A.

Deploy the application in multiple Regions. Use Amazon Route 53 DNS health checks to route traffic to a healthy Region.

B.

Deploy the application in multiple Availability Zones within a single Region. Use Amazon Route 53 DNS health checks to route traffic to healthy application resources.

C.

Deploy the application in multiple Regions. Use an Amazon Route 53 simple routing record to route traffic to a healthy Region.

D.

Deploy the application in multiple Availability Zones within a single Region. Use an Amazon Route 53 latency record in each Availability Zone to route traffic to a healthy Availability Zone.
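
A sketch of the failover routing behind option A: a health check watches the primary Region, and Route 53 answers with the secondary Region's endpoint when the check fails. The hosted zone ID, domain names, and health check settings are placeholders:

import boto3

r53 = boto3.client("route53")

# Health check against the primary Region's endpoint.
hc = r53.create_health_check(
    CallerReference="app-primary-hc-1",  # any unique string
    HealthCheckConfig={
        "Type": "HTTPS",
        "FullyQualifiedDomainName": "primary.example.com",
        "Port": 443,
        "ResourcePath": "/health",
    },
)

# PRIMARY answers while healthy; SECONDARY takes over on failure.
r53.change_resource_record_sets(
    HostedZoneId="Z0000000000000",
    ChangeBatch={"Changes": [
        {"Action": "UPSERT", "ResourceRecordSet": {
            "Name": "app.example.com", "Type": "CNAME", "TTL": 60,
            "SetIdentifier": "primary", "Failover": "PRIMARY",
            "HealthCheckId": hc["HealthCheck"]["Id"],
            "ResourceRecords": [{"Value": "primary.example.com"}]}},
        {"Action": "UPSERT", "ResourceRecordSet": {
            "Name": "app.example.com", "Type": "CNAME", "TTL": 60,
            "SetIdentifier": "secondary", "Failover": "SECONDARY",
            "ResourceRecords": [{"Value": "secondary.example.com"}]}},
    ]},
)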

Question 13

A company has a web application that has thousands of users. The application uses 8-10 user-uploaded images to generate AI images. Users can download the generated AI images once every 6 hours. The company also has a premium user option that gives users the ability to download the generated AI images at any time.

The company uses the user-uploaded images to run AI model training twice a year. The company needs a storage solution to store the images.

Which storage solution meets these requirements MOST cost-effectively?

Options:

A.

Move uploaded images to Amazon S3 Glacier Deep Archive. Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).

B.

Move uploaded images to Amazon S3 Glacier Deep Archive. Move all generated AI images to S3 Glacier Flexible Retrieval.

C.

Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move premium user-generated AI images to S3 Standard. Move non-premium user-generated AI images to S3 Standard-Infrequent Access (S3 Standard-IA).

D.

Move uploaded images to Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA). Move all generated AI images to S3 Glacier Flexible Retrieval.

Question 14

A company tracks customer satisfaction by using surveys that the company hosts on its website. The surveys sometimes reach thousands of customers every hour. Survey results are currently sent in email messages to the company so company employees can manually review results and assess customer sentiment.

The company wants to automate the customer survey process. Survey results must be available for the previous 12 months.

Which solution will meet these requirements in the MOST scalable way?

Options:

A.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to poll the SQS queue, call Amazon Comprehend for sentiment analysis, and save the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

B.

Send the survey results data to an API that is running on an Amazon EC2 instance. Configure the API to store the survey results as a new record in an Amazon DynamoDB table, call Amazon Comprehend for sentiment analysis, and save the results in a second DynamoDB table. Set the TTL for all records to 365 days in the future.

C.

Write the survey results data to an Amazon S3 bucket. Use S3 Event Notifications to invoke an AWS Lambda function to read the data and call Amazon Rekognition for sentiment analysis. Store the sentiment analysis results in a second S3 bucket. Use S3 Lifecycle policies on each bucket to expire objects after 365 days.

D.

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the SQS queue to invoke an AWS Lambda function that calls Amazon Lex for sentiment analysis and saves the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

Question 15

A solutions architect is provisioning an Amazon Elastic File System (Amazon EFS) file system to provide shared storage across multiple Amazon EC2 instances. The instances all exist in the same VPC across multiple Availability Zones. There are two instances in each Availability Zone. The solutions architect must make the file system accessible to each instance with the lowest possible latency.

Which solution will meet these requirements?

Options:

A.

Create a mount target for the EFS file system in the VPC. Use the mount target to mount the file system on each of the instances.

B.

Create a mount target for the EFS file system in one Availability Zone of the VPC. Use the mount target to mount the file system on the instances in that Availability Zone. Share the directory with the other instances.

C.

Create a mount target for each instance. Use each mount target to mount the EFS file system on each respective instance.

D.

Create a mount target in each Availability Zone of the VPC. Use the mount target to mount the EFS file system on the instances in the respective Availability Zone.

Question 16

A company maintains its accounting records in a custom application that runs on Amazon EC2 instances. The company needs to migrate the data to an AWS managed service for development and maintenance of the application data. The solution must require minimal operational support and provide immutable, cryptographically verifiable logs of data changes.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Copy the records from the application into an Amazon Redshift cluster.

B.

Copy the records from the application into an Amazon Neptune cluster.

C.

Copy the records from the application into an Amazon Timestream database.

D.

Copy the records from the application into an Amazon Quantum Ledger Database (Amazon QLDB) ledger.

Question 17

A company needs a cloud-based solution for backup, recovery, and archiving while retaining encryption key material control.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Create an AWS Key Management Service (AWS KMS) key without key material. Import the company's key material into the KMS key.

B.

Create an AWS KMS encryption key that contains key material generated by AWS KMS.

C.

Store the data in Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Use S3 Bucket Keys with AWS KMS keys.

D.

Store the data in an Amazon S3 Glacier storage class. Use server-side encryption with customer-provided keys (SSE-C).

E.

Store the data in AWS Snowball devices. Use server-side encryption with AWS KMS keys (SSE-KMS).

Question 18

A company is developing a new application that uses a relational database to store user data and application configurations. The company expects the application to have steady user growth. The company expects the database usage to be variable and read-heavy, with occasional writes.

The company wants to cost-optimize the database solution. The company wants to use an AWS managed database solution that will provide the necessary performance.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy the database on Amazon RDS. Use Provisioned IOPS SSD storage to ensure consistent performance for read and write operations.

B.

Deploy the database on Amazon Aurora Serverless to automatically scale the database capacity based on actual usage to accommodate the workload.

C.

Deploy the database on Amazon DynamoDB. Use on-demand capacity mode to automatically scale throughput to accommodate the workload.

D.

Deploy the database on Amazon RDS. Use magnetic storage and read replicas to accommodate the workload.

Question 19

A company wants to migrate applications from its on-premises servers to AWS. As a first step, the company is modifying and migrating a non-critical application to a single Amazon EC2 instance. The application will store information in an Amazon S3 bucket. The company needs to follow security best practices when deploying the application on AWS.

Which approach should the company take to allow the application to interact with Amazon S3?

Options:

A.

Store the files in an Amazon S3 bucket. Use the S3 Glacier Instant Retrieval storage class. Create an S3 Lifecycle policy to transition the files to the S3 Glacier Deep Archive storage class after 1 year.

B.

Store the files in an Amazon S3 bucket. Use the S3 Standard storage class. Create an S3 Lifecycle policy to transition the files to the S3 Glacier Flexible Retrieval storage class after 1 year.

C.

Store the files on an Amazon Elastic Block Store (Amazon EBS) volume. Use Amazon Data Lifecycle Manager to create snapshots of the EBS volumes and to store those snapshots in Amazon S3.

D.

Store the files on an Amazon Elastic File System (Amazon EFS) mount. Configure EFS lifecycle management to transition the files to the EFS Standard-Infrequent Access (Standard-IA) storage class after 1 year.

Question 20

A company uses AWS Cost Explorer to monitor its AWS costs. The company notices that Amazon Elastic Block Store (Amazon EBS) storage and snapshot costs increase every month. However, the company does not purchase additional EBS storage every month. The company wants to optimize monthly costs for its current storage usage.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use logs in Amazon CloudWatch Logs to monitor the storage utilization of Amazon EBS. Use Amazon EBS Elastic Volumes to reduce the size of the EBS volumes.

B.

Use a custom script to monitor space usage. Use Amazon EBS Elastic Volumes to reduce the size of the EBS volumes.

C.

Delete all expired and unused snapshots to reduce snapshot costs.

D.

Delete all nonessential snapshots. Use Amazon Data Lifecycle Manager to create and manage the snapshots according to the company's snapshot policy requirements.

Question 21

A company is designing a new multi-tier web application that consists of the following components:

• Web and application servers that run on Amazon EC2 instances as part of Auto Scaling groups

• An Amazon RDS DB instance for data storage

A solutions architect needs to limit access to the application servers so that only the web servers can access them.

Which solution will meet these requirements?

Options:

A.

Deploy AWS PrivateLink in front of the application servers. Configure the network ACL to allow only the web servers to access the application servers.

B.

Deploy a VPC endpoint in front of the application servers. Configure the security group to allow only the web servers to access the application servers.

C.

Deploy a Network Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the network ACL to allow only the web servers to access the application servers.

D.

Deploy an Application Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the security group to allow only the web servers to access the application servers.

Question 22

A company uses Apache Hadoop and Apache Spark on premises. The infrastructure is complex and does not scale well. The company wants to reduce operational complexity but keep data processing on premises.

Which solution will meet these requirements?

Options:

A.

Use an AWS Site-to-Site VPN connection to access the on-premises HDFS data. Use Amazon EMR to process the data.

B.

Use AWS DataSync to connect to the on-premises HDFS data. Use Amazon EMR to process the data.

C.

Migrate to Amazon EMR on AWS Outposts.

D.

Use AWS Snowball to migrate the data to Amazon S3. Use Amazon EMR to process the data.

Question 23

A company is developing software that uses a PostgreSQL database schema. The company needs to configure development environments and test environments for its developers.

Each developer at the company uses their own development environment, which includes a PostgreSQL database. On average, each development environment is used for an 8-hour workday. The test environments will be used for load testing that can take up to 2 hours each day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure development environments and test environments with their own Amazon Aurora Serverless v2 PostgreSQL database.

B.

For each development environment, configure an Amazon RDS for PostgreSQL Single-AZ DB instance. For the test environment, configure a single Amazon RDS for PostgreSQL Multi-AZ DB instance.

C.

Configure development environments and test environments with their own Amazon Aurora PostgreSQL DB cluster.

D.

Configure an Amazon Aurora global database. Allow developers to connect to the database with their own credentials.

Question 24

A company needs to store confidential files on AWS. The company accesses the files every week. The company must encrypt the files by using envelope encryption, and the encryption keys must be rotated automatically. The company must have an audit trail to monitor encryption key usage.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Store the confidential files in Amazon S3.

B.

Store the confidential files in Amazon S3 Glacier Deep Archive.

C.

Use server-side encryption with customer-provided keys (SSE-C).

D.

Use server-side encryption with Amazon S3 managed keys (SSE-S3).

E.

Use server-side encryption with AWS KMS managed keys (SSE-KMS).

Question 25

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date.

Which solution will meet these requirements MOST securely?

Options:

A.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date.

B.

Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC).

C.

Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket.

D.

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.

Question 26

A company is migrating a new application from an on-premises data center to a new VPC in the AWS Cloud. The company has multiple AWS accounts and VPCs that share many subnets and applications. The company wants to have fine-grained access control for the new application. The company wants to ensure that all network resources across accounts and VPCs that are granted permission to access the new application can access the application.

Which solution will meet these requirements?

Options:

A.

Set up a VPC peering connection for each VPC that needs access to the new application VPC. Update route tables in each VPC to enable connectivity.

B.

Deploy a transit gateway in the account that hosts the new application. Share the transit gateway with each account that needs to connect to the application. Update route tables in the VPC that hosts the new application and in the transit gateway to enable connectivity.

C.

Use an AWS PrivateLink endpoint service to make the new application accessible to other VPCs. Control access to the application by using an endpoint policy.

D.

Use an Application Load Balancer (ALB) to expose the new application to the internet. Configure authentication and authorization processes to ensure that only specified VPCs can access the application.

Question 27

A company needs to collect streaming data from several sources and store the data in the AWS Cloud. The dataset is heavily structured, but analysts need to perform several complex SQL queries and need consistent performance. Some of the data is queried more frequently than the rest. The company wants a solution that meets its performance requirements in a cost-effective manner.

Which solution meets these requirements?

Options:

A.

Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to ingest the data and save it to Amazon S3. Use Amazon Athena to perform SQL queries over the ingested data.

B.

Use Amazon Managed Streaming for Apache Kafka (Amazon MSK) to ingest the data and save it to Amazon Redshift. Enable Amazon Redshift workload management (WLM) to prioritize workloads.

C.

Use Amazon Data Firehose to ingest the data and save it to Amazon Redshift. Enable Amazon Redshift workload management (WLM) to prioritize workloads.

D.

Use Amazon Data Firehose to ingest the data and save it to Amazon S3. Load frequently queried data to Amazon Redshift using the COPY command. Use Amazon Redshift Spectrum for less frequently queried data.

Question 28

A company is building a serverless application to process orders from an ecommerce site. The application needs to handle bursts of traffic during peak usage hours and to maintain high availability. The orders must be processed asynchronously in the order the application receives them.

Which solution will meet these requirements?

Options:

A.

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use an AWS Lambda function to process the orders.

B.

Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to receive orders. Use an AWS Lambda function to process the orders.

C.

Use an Amazon Simple Queue Service (Amazon SQS) standard queue to receive orders. Use AWS Batch jobs to process the orders.

D.

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use AWS Batch jobs to process the orders.
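
A sketch of option B: a FIFO queue preserves order within a message group, and an event source mapping lets Lambda poll the queue. Queue and function names are placeholders:

import boto3

sqs = boto3.client("sqs")
lam = boto3.client("lambda")

# FIFO preserves ordering; content-based deduplication drops duplicate orders.
queue = sqs.create_queue(
    QueueName="orders.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
)

arn = sqs.get_queue_attributes(
    QueueUrl=queue["QueueUrl"], AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Lambda polls the queue and processes messages in order per message group.
lam.create_event_source_mapping(
    EventSourceArn=arn,
    FunctionName="process-orders",  # placeholder function name
    BatchSize=10,
)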

Question 29

A company has a transaction-processing application that is backed by an Amazon RDS MySQL database. When the load on the application increases, a large number of database connections are opened and closed frequently, which causes latency for the database transactions.

A solutions architect determines that the root cause of the latency is poor connection handling by the application. The solutions architect cannot modify the application code. The solutions architect needs to manage database connections to improve the database performance during periods of high load.

Which solution will meet these requirements?

Options:

A.

Upgrade the database instance to a larger instance type to handle a large number of database connections.

B.

Configure Amazon RDS storage autoscaling to dynamically increase the provisioned IOPS.

C.

Use Amazon RDS Proxy to pool and share database connections.

D.

Convert the database instance to a Multi-AZ deployment.
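
A sketch of the RDS Proxy approach in option C, assuming the MySQL credentials are already stored in AWS Secrets Manager; every identifier, subnet ID, and ARN is a placeholder:

import boto3

rds = boto3.client("rds")

rds.create_db_proxy(
    DBProxyName="app-proxy",
    EngineFamily="MYSQL",
    Auth=[{"AuthScheme": "SECRETS",
           "SecretArn": "arn:aws:secretsmanager:us-east-1:111122223333:secret:mysql-creds"}],
    RoleArn="arn:aws:iam::111122223333:role/rds-proxy-secrets-role",
    VpcSubnetIds=["subnet-0aaaa", "subnet-0bbbb"],
)

# Put the existing DB instance behind the proxy.
rds.register_db_proxy_targets(
    DBProxyName="app-proxy",
    DBInstanceIdentifiers=["prod-mysql"],
)
# The application then connects to the proxy endpoint instead of the instance
# endpoint; only the hostname changes, not the application code.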

Question 30

A company has an application that processes information from documents that users upload. When a user uploads a new document to an Amazon S3 bucket, an AWS Lambda function is invoked. The Lambda function processes information from the documents.

The company discovers that the application did not process many recently uploaded documents. The company wants to ensure that the application processes each document with retries if there is an error during the first attempt to process the document.

Which solution will meet these requirements?

Options:

A.

Create an Amazon API Gateway REST API that has a proxy integration to the Lambda function. Update the application to send requests to the REST API.

B.

Configure a replication policy on the S3 bucket to stage the documents in another S3 bucket that an AWS Batch job processes on a daily schedule.

C.

Deploy an Application Load Balancer in front of the Lambda function that processes the documents.

D.

Configure an Amazon Simple Queue Service (Amazon SQS) queue as an event source for the Lambda function. Configure an S3 event notification on the S3 bucket to send new document upload events to the SQS queue.

Question 31

A solutions architect is designing the cloud architecture for a new stateless application that will be deployed on AWS. The solutions architect created an Amazon Machine Image (AMI) and launch template for the application.

Based on the number of jobs that need to be processed, the processing must run in parallel while adding and removing application Amazon EC2 instances as needed. The application must be loosely coupled. The job items must be durably stored.

Which solution will meet these requirements?

Options:

A.

Create an Amazon Simple Notification Service (Amazon SNS) topic to send the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on CPU usage.

B.

Create an Amazon Simple Queue Service (Amazon SQS) queue to hold the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on network usage.

C.

Create an Amazon Simple Queue Service (Amazon SQS) queue to hold the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on the number of items in the SQS queue.

D.

Create an Amazon Simple Notification Service (Amazon SNS) topic to send the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on the number of messages published to the SNS topic.
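
A sketch of the scaling policy in option C. AWS guidance generally prefers a computed backlog-per-instance metric, but tracking the raw queue-depth metric is enough to illustrate the idea; the group name, queue name, and target value are placeholders:

import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="job-workers",
    PolicyName="scale-on-queue-depth",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "ApproximateNumberOfMessagesVisible",
            "Namespace": "AWS/SQS",
            "Dimensions": [{"Name": "QueueName", "Value": "jobs"}],
            "Statistic": "Average",
        },
        # Add instances when the visible backlog rises above the target and
        # remove them as the backlog drains.
        "TargetValue": 100.0,
    },
)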

Question 32

A company recently migrated a monolithic application to an Amazon EC2 instance and Amazon RDS. The application has tightly coupled modules. The existing design of the application gives the application the ability to run on only a single EC2 instance.

The company has noticed high CPU utilization on the EC2 instance during peak usage times. The high CPU utilization corresponds to degraded performance on Amazon RDS for read requests. The company wants to reduce the high CPU utilization and improve read request performance.

Which solution will meet these requirements?

Options:

A.

Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Configure an RDS read replica for read requests.

B.

Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Add an RDS read replica and redirect all read/write traffic to the replica.

C.

Configure an Auto Scaling group with a minimum size of 1 and maximum size of 2. Resize the RDS DB instance to an instance type that has more CPU capacity.

D.

Resize the EC2 instance to an EC2 instance type that has more CPU capacity. Configure an Auto Scaling group with a minimum and maximum size of 1. Resize the RDS DB instance to an instance type that has more CPU capacity.

Question 33

A company is migrating applications from an on-premises Microsoft Active Directory that the company manages to AWS. The company deploys the applications in multiple AWS accounts. The company uses AWS Organizations to manage the accounts centrally.

The company's security team needs a single sign-on solution across all the company's AWS accounts. The company must continue to manage users and groups that are in the on-premises Active Directory.

Which solution will meet these requirements?

Options:

A.

Create an Enterprise Edition Active Directory in AWS Directory Service for Microsoft Active Directory. Configure the Active Directory to be the identity source for AWS IAM Identity Center.

B.

Enable AWS IAM Identity Center. Configure a two-way forest trust relationship to connect the company's self-managed Active Directory with IAM Identity Center by using AWS Directory Service for Microsoft Active Directory.

C.

Use AWS Directory Service and create a two-way trust relationship with the company's self-managed Active Directory.

D.

Deploy an identity provider (IdP) on Amazon EC2. Link the IdP as an identity source within AWS IAM Identity Center.

Question 34

A company stores sensitive customer data in an Amazon DynamoDB table. The company frequently updates the data. The company wants to use the data to personalize offers for customers.

The company's analytics team has its own AWS account. The analytics team runs an application on Amazon EC2 instances that needs to process data from the DynamoDB tables. The company needs to follow security best practices to create a process to regularly share data from DynamoDB to the analytics team.

Which solution will meet these requirements?

Options:

A.

Export the required data from the DynamoDB table to an Amazon S3 bucket as multiple JSON files. Provide the analytics team with the necessary IAM permissions to access the S3 bucket.

B.

Allow public access to the DynamoDB table. Create an IAM user that has permission to access DynamoDB. Share the IAM user with the analytics team.

C.

Allow public access to the DynamoDB table. Create an IAM user that has read-only permission for DynamoDB. Share the IAM user with the analytics team.

D.

Create a cross-account IAM role. Create an IAM policy that allows the AWS account ID of the analytics team to access the DynamoDB table. Attach the IAM policy to the IAM role. Establish a trust relationship between accounts.

Question 35

A company has a VPC with multiple private subnets that host multiple applications. The applications must not be accessible to the internet. However, the applications need to access multiple AWS services. The applications must not use public IP addresses to access the AWS services.

Which solution will meet these requirements?

Options:

A.

Configure interface VPC endpoints for the required AWS services. Route traffic from the private subnets through the interface VPC endpoints.

B.

Deploy a NAT gateway in each private subnet. Route traffic from the private subnets through the NAT gateways.

C.

Deploy internet gateways in each private subnet. Route traffic from the private subnets through the internet gateways.

D.

Set up an AWS Direct Connect connection between the private subnets. Route traffic from the private subnets through the Direct Connect connection.

Question 36

A financial services company must retain log data for 1 year. The company stores log files in an Amazon S3 bucket and wants to prevent any user from deleting or overwriting the log files during this period. The data must remain available for read-only requests.

Which solution will meet these requirements?

Options:

A.

Enable S3 Versioning on the bucket. Use Object Lock in compliance mode with a 1-year retention period.

B.

Enable S3 Transfer Acceleration on the bucket. Create an S3 Lifecycle Configuration rule to move objects to Amazon S3 Glacier Flexible Retrieval after 1 year.

C.

Enable S3 Versioning on the bucket. Create an S3 Lifecycle Configuration rule to move objects to Amazon S3 Glacier Flexible Retrieval after 1 year.

D.

Create an AWS Lambda function to programmatically check the timestamp of S3 data and to move the data to Amazon S3 Glacier Deep Archive if the data is older than 1 year.
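
A sketch of the compliance-mode Object Lock setup in option A; the bucket name is a placeholder, and the create_bucket call assumes the us-east-1 Region (other Regions need a location constraint):

import boto3

s3 = boto3.client("s3")

# Object Lock can only be turned on at bucket creation, and it enables
# S3 Versioning automatically.
s3.create_bucket(Bucket="example-log-archive", ObjectLockEnabledForBucket=True)

# Compliance mode: no user, including the root user, can delete or overwrite
# a locked object version until its retention period expires.
s3.put_object_lock_configuration(
    Bucket="example-log-archive",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 1}},
    },
)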

Question 37

A company has launched an Amazon RDS for MySQL DB instance. Most of the connections to the database come from serverless applications. Application traffic to the database changes significantly at random intervals. At times of high demand, users report that their applications experience database connection rejection errors.

Which solution will resolve this issue with the LEAST operational overhead?

Options:

A.

Create a proxy in RDS Proxy. Configure the users' applications to use the DB instance through RDS Proxy.

B.

Deploy Amazon ElastiCache (Memcached) between the users' applications and the DB instance.

C.

Migrate the DB instance to a different instance class that has higher I/O capacity. Configure the users' applications to use the new DB instance.

D.

Configure Multi-AZ for the DB instance. Configure the users' applications to switch between the DB instances.

Question 38

A global company is migrating its workloads from an on-premises data center to AWS. The AWS environment includes multiple AWS accounts, IAM roles, AWS Config rules, and a VPC.

The company wants an automated process to provision new accounts on demand when the company's business units require new accounts.

Which solution will meet these requirements with the LEAST effort?

Options:

A.

Use AWS Control Tower to set up an organization in AWS Organizations. Use AWS Control Tower Account Factory for Terraform (AFT) to provision new AWS accounts.

B.

Create an organization in AWS Organizations. Use the AWS CLI CreateAccount API action to provision new AWS accounts. Organize the business units with organizational units (OUs).

C.

Create an AWS Lambda function that uses the AWS Organizations API to create new accounts. Invoke the Lambda function from an AWS CloudFormation template in AWS Service Catalog.

D.

Create an organization in AWS Organizations. Use AWS Step Functions to orchestrate the account creation process. Send account creation requests to an Amazon API Gateway API endpoint to invoke an AWS Lambda function that creates new accounts.

Question 39

A company recently migrated a large amount of research data to an Amazon S3 bucket. The company needs an automated solution to identify sensitive data in the bucket. A security team also needs to monitor access patterns for the data 24 hours a day, 7 days a week to identify suspicious activities or evidence of tampering with security controls.

Which solution will meet these requirements?

Options:

A.

Set up AWS CloudTrail reporting, and grant the security team read-only access to the CloudTrail reports. Set up an Amazon S3 Inventory report to identify sensitive data. Review the findings with the security team.

B.

Enable Amazon Macie and Amazon GuardDuty on the account. Grant the security team access to Macie and GuardDuty. Review the findings with the security team.

C.

Set up an Amazon S3 Inventory report. Use Amazon Athena and Amazon QuickSight to identify sensitive data. Create a dashboard for the security team to review findings.

D.

Use AWS Identity and Access Management (IAM) Access Advisor to monitor for suspicious activity and tampering. Create a dashboard for the security team. Set up an Amazon S3 Inventory report to identify sensitive data. Review the findings with the security team.

Question 40

An ecommerce company is launching a new marketing campaign. The company anticipates that the campaign will generate ten times the normal number of daily orders through the company's ecommerce application. The campaign will last 3 days.

The ecommerce application architecture is based on Amazon EC2 instances in an Auto Scaling group and an Amazon RDS for MySQL database. The application writes order transactions to an Amazon Elastic File System (Amazon EFS) file system before the application writes orders to the database. During normal operations, the application write operations peak at 5,000 IOPS.

A solutions architect needs to ensure that the application can handle the anticipated workload during the marketing campaign.

Which solution will meet this requirement?

Options:

A.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Bursting throughput.

B.

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Elastic throughput.

C.

Convert the database to a Multi-AZ deployment. Set the Amazon EFS throughput mode to Elastic throughput for the duration of the campaign.

D.

Use AWS Database Migration Service (AWS DMS) to convert the database to RDS for PostgreSQL. Set the Amazon EFS throughput mode to Bursting throughput.

Question 41

A company uses an AWS Transfer for SFTP public server endpoint and Amazon S3 storage to host large datasets for its customers. The company provides customers SSH private keys to authenticate and download their datasets. The Transfer for SFTP server is configured with structured logging that is saved to an S3 bucket. The company wants to charge customers based on their monthly data download usage.

Which solution will meet these requirements?

Options:

A.

Configure VPC Flow Logs to write to a new S3 bucket. Run monthly queries on the flow logs to identify customer usage and calculate cost. Add the charges to the customers' monthly bills.

B.

Each month, use AWS Cost Explorer to examine the costs for Transfer for SFTP and obtain a breakdown by customer. Add the charges to the customers' monthly bills.

C.

Enable requester pays on the S3 bucket that hosts the datasets. Allocate the charges to each customer based on the customer's requests.

D.

Run Amazon Athena queries on the logging S3 bucket monthly to identify customer usage and calculate costs. Add the charges to the customers' monthly bills.

Question 42

A company collects 10 GB of telemetry data every day from multiple devices. The company stores the data in an Amazon S3 bucket that is in a source data account.

The company has hired several consulting agencies to analyze the company's data. Each agency has a unique AWS account. Each agency requires read access to the company's data.

The company needs a secure solution to share the data from the source data account to the consulting agencies.

Which solution will meet these requirements with the LEAST operational effort?

Options:

A.

Set up an Amazon CloudFront distribution. Use the S3 bucket as the origin.

B.

Make the S3 bucket public for a limited time. Inform only the agencies that the bucket is publicly accessible.

C.

Configure cross-account access for the S3 bucket to the accounts that the agencies own.

D.

Set up an IAM user for each agency in the source data account. Grant each agency IAM user access to the company's S3 bucket.
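
A sketch of option C: a bucket policy in the source data account grants read access to each agency account; the bucket name and account IDs are placeholders:

import boto3, json

s3 = boto3.client("s3")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": ["arn:aws:iam::222233334444:root",
                              "arn:aws:iam::555566667777:root"]},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-telemetry-data",
                     "arn:aws:s3:::example-telemetry-data/*"],
    }],
}

s3.put_bucket_policy(Bucket="example-telemetry-data", Policy=json.dumps(policy))
# Each agency must also attach an identity policy in its own account that
# allows these S3 actions on the bucket.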

Question 43

A company has an application that receives and processes purchase orders. The application supports only XML data. The company needs to configure the application to accept orders in JSON format. The company does not want to modify the application.

A solutions architect is using an Amazon API Gateway HTTP API to create a new purchase order API. The solutions architect needs to modify the application DNS record to point to the new HTTP API.

Which solution will meet these requirements?

Options:

A.

Use an HTTP proxy integration to pass XML requests to the application. For JSON requests, use API Gateway mappings to convert the purchase orders to XML. Use an AWS Lambda function that is integrated with API Gateway to call the application.

B.

Use an HTTP proxy integration to pass XML requests to the application. For JSON requests, use an AWS Lambda function that is integrated with API Gateway to convert the purchase orders from JSON to XML and to call the application.

C.

Use an HTTP custom integration to pass XML requests to the application. For JSON requests, use API Gateway mappings to convert the purchase orders to XML. Use an AWS Lambda function that is integrated with API Gateway to call the application.

D.

Use an HTTP custom integration to pass XML requests to the application. For JSON requests, use an AWS Lambda function that is integrated with API Gateway to convert the purchase orders to JSON and to call the application.

Question 44

A company wants to store a large amount of data as objects for analytics and long-term archiving. Resources from outside AWS need to access the data. The external resources need to access the data with unpredictable frequency. However, the external resources must have immediate access when necessary.

The company needs a cost-optimized solution that provides high durability and data security.

Which solution will meet these requirements?

Options:

A.

Store the data in Amazon S3 Standard. Apply S3 Lifecycle policies to transition older data to S3 Glacier Deep Archive.

B.

Store the data in Amazon S3 Intelligent-Tiering.

C.

Store the data in Amazon S3 Glacier Flexible Retrieval. Use expedited retrieval to provide immediate access when necessary.

D.

Store the data in Amazon Elastic File System (Amazon EFS) Infrequent Access (IA). Use lifecycle policies to archive older files.

Question 45

A company is developing a social media application that must scale to meet demand spikes and handle ordered processes.

Which AWS services meet these requirements?

Options:

A.

ECS with Fargate, RDS, and SQS for decoupling.

B.

ECS with Fargate, RDS, and SNS for decoupling.

C.

DynamoDB, Lambda, DynamoDB Streams, and Step Functions.

D.

Elastic Beanstalk, RDS, and SNS for decoupling.

Question 46

A company has an application that uses a MySQL database that runs on an Amazon EC2 instance. The instance currently runs in a single Availability Zone. The company requires a fault-tolerant database solution that provides a recovery time objective (RTO) and a recovery point objective (RPO) of 2 minutes or less.

Which solution will meet these requirements?

Options:

A.

Migrate the MySQL database to Amazon RDS. Create a read replica in a second Availability Zone. Create a script that detects availability interruptions and promotes the read replica when needed.

B.

Migrate the MySQL database to Amazon RDS for MySQL. Configure the new RDS for MySQL database to use a Multi-AZ deployment.

C.

Create a second MySQL database in a second Availability Zone. Use native MySQL commands to sync the two databases every 2 minutes. Create a script that detects availability interruptions and promotes the second MySQL database when needed.

D.

Create a copy of the EC2 instance that runs the MySQL database. Deploy the copy in a second Availability Zone. Create a Network Load Balancer. Add both instances as targets.

Question 47

A company is building a serverless application that processes large volumes of data from a mobile app. The application uses an AWS Lambda function to process the data and store the data in an Amazon DynamoDB table.

The company needs to ensure that the application can recover from failures and continue processing data without losing any records.

Which solution will meet these requirements?

Options:

A.

Configure the Lambda function to use a dead-letter queue with an Amazon Simple Queue Service (Amazon SQS) queue. Configure Lambda to retry failed records from the dead-letter queue. Use a retry mechanism by implementing an exponential backoff algorithm.

B.

Configure the Lambda function to read records from Amazon Data Firehose. Replay the Firehose records in case of any failures.

C.

Use Amazon OpenSearch Service to store failed records. Configure AWS Lambda to retry failed records from OpenSearch Service. Use Amazon EventBridge to orchestrate the retry logic.

D.

Use Amazon Simple Notification Service (Amazon SNS) to store the failed records. Configure Lambda to retry failed records from the SNS topic. Use Amazon API Gateway to orchestrate the retry calls.

Question 48

A company has an application that runs on Amazon EC2 instances within a private subnet in a VPC. The instances access data in an Amazon S3 bucket in the same AWS Region. The VPC contains a NAT gateway in a public subnet to access the S3 bucket. The company wants to reduce costs by replacing the NAT gateway without compromising security or redundancy.

Which solution meets these requirements?

Options:

A.

Replace the NAT gateway with a NAT instance.

B.

Replace the NAT gateway with an internet gateway.

C.

Replace the NAT gateway with a gateway VPC endpoint.

D.

Replace the NAT gateway with an AWS Direct Connect connection.
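
A sketch of option C, creating a gateway VPC endpoint for S3; the VPC ID, the Region in the service name, and the route table ID are placeholders:

import boto3

ec2 = boto3.client("ec2")

ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],  # route table of the private subnet
)
# S3 traffic from the private subnet now stays on the AWS network, and the
# gateway endpoint itself has no hourly or data processing charge, so the
# NAT gateway can be removed.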

Question 49

A company runs a critical public application on Amazon Elastic Kubernetes Service (Amazon EKS) clusters. The application has a microservices architecture. The company needs to implement a solution that collects, aggregates, and summarizes metrics and logs from the application in a centralized location.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Run the Amazon CloudWatch agent in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

B.

Configure a data stream in Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to read events and to deliver the events to an Amazon S3 bucket. Use Amazon Athena to view the events.

C.

Configure AWS CloudTrail to capture data events. Use Amazon OpenSearch Service to query CloudTrail.

D.

Configure Amazon CloudWatch Container Insights in the existing EKS cluster. Use a CloudWatch dashboard to view the metrics and logs.

Question 50

A company runs an order management application on AWS. The application allows customers to place orders and pay with a credit card. The company uses an Amazon CloudFront distribution to deliver the application. A security team has set up logging for all incoming requests. The security team needs a solution to generate an alert if any user modifies the logging configuration.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Configure an Amazon EventBridge rule that is invoked when a user creates or modifies a CloudFront distribution. Add the AWS Lambda function as a target of the EventBridge rule.

B.

Create an Application Load Balancer (ALB). Enable AWS WAF rules for the ALB. Configure an AWS Config rule to detect security violations.

C.

Create an AWS Lambda function to detect changes in CloudFront distribution logging. Configure the Lambda function to use Amazon Simple Notification Service (Amazon SNS) to send notifications to the security team.

D.

Set up Amazon GuardDuty. Configure GuardDuty to monitor findings from the CloudFront distribution. Create an AWS Lambda function to address the findings.

E.

Create a private API in Amazon API Gateway. Use AWS WAF rules to protect the private API from common security problems.
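
A sketch of the EventBridge half of options A and C: a rule that matches CloudFront configuration changes and forwards them to a Lambda function. The pattern assumes CloudTrail is recording CloudFront management events (CloudFront is a global service, so this is set up in us-east-1); the ARNs are placeholders:

import boto3, json

events = boto3.client("events")

pattern = {
    "source": ["aws.cloudfront"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {"eventName": ["CreateDistribution", "UpdateDistribution"]},
}

events.put_rule(Name="cloudfront-config-change", EventPattern=json.dumps(pattern))

# The function inspects the change and notifies the security team via SNS.
# EventBridge also needs resource-based permission to invoke the function
# (lambda add_permission), omitted here for brevity.
events.put_targets(
    Rule="cloudfront-config-change",
    Targets=[{"Id": "notify",
              "Arn": "arn:aws:lambda:us-east-1:111122223333:function:check-logging"}],
)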

Question 51

A company's expense tracking application gives users the ability to upload images of receipts. The application analyzes the receipts to extract information and stores the raw images in Amazon S3. The application is written in Java and runs on Amazon EC2 On-Demand Instances in an Auto Scaling group behind an Application Load Balancer.

The compute costs and storage costs have increased with the popularity of the application.

Which solution will provide the MOST cost savings without affecting application performance?

Options:

A.

Purchase a Compute Savings Plan for the maximum number of necessary EC2 instances. Store the uploaded files in Amazon Elastic File System (Amazon EFS).

B.

Decrease the minimum number of EC2 instances in the Auto Scaling group. Use On-Demand Instances for peak scaling. Store the uploaded files in Amazon Elastic File System (Amazon EFS).

C.

Decrease the maximum number of EC2 instances in the Auto Scaling group. Set up S3 Lifecycle policies to archive the raw images to lower-cost storage tiers after 30 days.

D.

Purchase a Compute Savings Plan for the minimum number of necessary EC2 instances. Use On-Demand Instances for peak scaling. Set up S3 Lifecycle policies to archive the raw images to lower-cost storage tiers after 30 days.

Question 52

A company operates an online photo-sharing service and stores data in AWS Account A in a centralized Amazon S3 bucket. The company wants to grant a second AWS account named Account B access to the centralized S3 bucket. The company owns Account B.

Which solution will meet this requirement?

Options:

A.

Enable S3 Transfer Acceleration to provide Account B access to the centralized S3 bucket in Account A.

B.

Enable cross-Region replication between Account A and Account B to share the S3 bucket data.

C.

Use Amazon CloudFront to distribute the S3 bucket contents. Grant Account B access to the bucket contents through a signed URL.

D.

Create a bucket policy that grants Account B permission to access the centralized S3 bucket in Account A.

Question 53

A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.

B.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.

C.

Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.

D.

Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.

Question 54

A healthcare company is developing an AWS Lambda function that publishes notifications to an encrypted Amazon Simple Notification Service (Amazon SNS) topic. The notifications contain protected health information (PHI).

The SNS topic uses AWS Key Management Service (AWS KMS) customer-managed keys for encryption. The company must ensure that the application has the necessary permissions to publish messages securely to the SNS topic.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.

Create a resource policy for the SNS topic that allows the Lambda function to publish messages to the topic.

B.

Use server-side encryption with AWS KMS keys (SSE-KMS) for the SNS topic instead of customer-managed keys.

C.

Create a resource policy for the encryption key that the SNS topic uses that has the necessary AWS KMS permissions.

D.

Specify the Lambda function's Amazon Resource Name (ARN) in the SNS topic's resource policy.

E.

Associate an Amazon API Gateway HTTP API with the SNS topic to control access to the topic by using API Gateway resource policies.

F.

Configure a Lambda execution role that has the necessary IAM permissions to use a customer-managed key in AWS KMS.
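
A sketch of the execution-role policy behind option F, pairing sns:Publish with the KMS permissions an encrypted topic requires; the role name, topic ARN, and key ARN are placeholders:

import boto3, json

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": "sns:Publish",
         "Resource": "arn:aws:sns:us-east-1:111122223333:phi-notifications"},
        # Publishing to a topic encrypted with a customer-managed key requires
        # permission to generate data keys with that key.
        {"Effect": "Allow",
         "Action": ["kms:GenerateDataKey", "kms:Decrypt"],
         "Resource": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab"},
    ],
}

iam.put_role_policy(RoleName="phi-publisher-role",
                    PolicyName="sns-publish-kms",
                    PolicyDocument=json.dumps(policy))
# The SNS topic policy and the KMS key policy must also allow this role,
# which is what options A and C cover.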

Question 55

A company runs an application on several Amazon EC2 instances. Multiple Amazon Elastic Block Store (Amazon EBS) volumes are attached to each EC2 instance. The company needs to back up the configurations and the data of the EC2 instances every night. The application must be recoverable in a secondary AWS Region.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Configure an AWS Lambda function to take nightly snapshots of the application's EBS volumes and to copy the snapshots to a secondary Region.

B.

Create a backup plan in AWS Backup to take nightly backups. Copy the backups to a secondary Region. Add the EC2 instances to a resource assignment as part of the backup plan.

C.

Create a backup plan in AWS Backup to take nightly backups. Copy the backups to a secondary Region. Add the EBS volumes to a resource assignment as part of the backup plan.

D.

Configure an AWS Lambda function to take nightly snapshots of the application's EBS volumes and to copy the snapshots to a secondary Availability Zone.

Question 56

A company is developing a latency-sensitive application. Part of the application includes several AWS Lambda functions that need to initialize as quickly as possible. The Lambda functions are written in Java and contain initialization code outside the handlers to load libraries, initialize classes, and generate unique IDs.

Which solution will meet the startup performance requirement MOST cost-effectively?

Options:

A.

Move all the initialization code to the handlers for each Lambda function. Activate Lambda SnapStart for each Lambda function. Configure SnapStart to reference the $LATEST version of each Lambda function.

B.

Publish a version of each Lambda function. Create an alias for each Lambda function. Configure each alias to point to its corresponding version. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding alias.

C.

Publish a version of each Lambda function. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding version. Activate Lambda SnapStart for the published versions of the Lambda functions.

D.

Update the Lambda functions to add a pre-snapshot hook. Move the code that generates unique IDs into the handlers. Publish a version of each Lambda function. Activate Lambda SnapStart for the published versions of the Lambda functions.
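
For reference, a minimal boto3 sketch of enabling Lambda SnapStart on a published version, as options C and D describe. The function name is hypothetical, and SnapStart snapshots are taken only when a version is published:

    import boto3

    lam = boto3.client("lambda")

    # Enable SnapStart for published versions of the Java function.
    lam.update_function_configuration(
        FunctionName="order-id-generator",  # hypothetical function
        SnapStart={"ApplyOn": "PublishedVersions"},
    )

    # Snapshots are created at publish time, so publish a version.
    # (In practice, wait for the configuration update to finish first.)
    version = lam.publish_version(FunctionName="order-id-generator")
    print("Invoke version", version["Version"], "for snapshot-based startup")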

Questions 57

A company is redesigning its data intake process. In the existing process, the company receives data transfers and uploads the data to an Amazon S3 bucket every night. The company uses AWS Glue crawlers and jobs to prepare the data for a machine learning (ML) workflow.

The company needs a low-code solution to run multiple AWS Glue jobs in sequence and provide a visual workflow.

Which solution will meet these requirements?

Options:

A.

Use an Amazon EC2 instance to run a cron job and a script to check for the S3 files and call the AWS Glue jobs. Create an Amazon CloudWatch dashboard to visualize the workflow.

B.

Use Amazon EventBridge to call an AWS Step Functions workflow for the AWS Glue jobs. Use Step Functions to create a visual workflow.

C.

Use S3 Event Notifications to invoke a series of AWS Lambda functions and AWS Glue jobs in sequence. Use Amazon QuickSight to create a visual workflow.

D.

Create an Amazon Elastic Container Service (Amazon ECS) task that contains a Python script that manages the AWS Glue jobs and creates a visual workflow. Use Amazon EventBridge Scheduler to start the ECS task.
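
To illustrate the Step Functions approach in option B, here is a minimal sketch of a state machine definition that runs two AWS Glue jobs in sequence; the .sync integration makes each state wait for its job to finish. Job names and the role ARN are hypothetical:

    import json
    import boto3

    definition = {
        "StartAt": "PrepareData",
        "States": {
            "PrepareData": {
                "Type": "Task",
                "Resource": "arn:aws:states:::glue:startJobRun.sync",
                "Parameters": {"JobName": "prepare-data-job"},
                "Next": "TransformData",
            },
            "TransformData": {
                "Type": "Task",
                "Resource": "arn:aws:states:::glue:startJobRun.sync",
                "Parameters": {"JobName": "transform-data-job"},
                "End": True,
            },
        },
    }

    sfn = boto3.client("stepfunctions")
    sfn.create_state_machine(
        name="glue-ml-prep",
        definition=json.dumps(definition),
        roleArn="arn:aws:iam::111122223333:role/sfn-glue-role",  # hypothetical
    )

Step Functions renders this definition as a visual workflow, and an EventBridge rule can start executions when the nightly upload arrives.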

Questions 58

A company wants to optimize costs for its AWS infrastructure. The company wants to receive notifications when actual costs or forecasted costs exceed a specified budget. The company does not want to develop a custom solution.

Which solution will meet these requirements?

Options:

A.

Use AWS Trusted Advisor to set up budget notifications. Configure Amazon CloudWatch to monitor costs. Export CloudWatch data to Amazon S3. Use machine learning (ML) to estimate future trends based on the CloudWatch data.

B.

Create a budget in AWS Budgets that has a specified cost threshold. Create an AWS Lambda function that sends a notification to the company when costs reach the specified threshold. Use AWS Billing and Cost Management reports to monitor costs.

C.

Use AWS Cost Explorer to set a specified budget threshold. Create an AWS Lambda function to calculate cost estimates. Configure the Lambda function to send a notification to an Amazon Simple Notification Service (Amazon SNS) topic if estimated costs exceed the specified threshold.

D.

Create a budget in AWS Budgets that has a specified cost threshold. Configure AWS Budgets to send budget alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Use AWS Cost Explorer to monitor costs.
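
As a sketch of the AWS Budgets approach in option D, the following boto3 snippet creates a cost budget with actual and forecasted alerts delivered to an SNS topic. The account ID, amounts, and topic ARN are hypothetical:

    import boto3

    budgets = boto3.client("budgets")

    budgets.create_budget(
        AccountId="111122223333",
        Budget={
            "BudgetName": "monthly-cost-budget",
            "BudgetLimit": {"Amount": "1000", "Unit": "USD"},
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
        },
        # Alert at 80% of actual spend and 100% of forecasted spend.
        NotificationsWithSubscribers=[
            {
                "Notification": {
                    "NotificationType": "ACTUAL",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 80.0,
                },
                "Subscribers": [{
                    "SubscriptionType": "SNS",
                    "Address": "arn:aws:sns:us-east-1:111122223333:budget-alerts",
                }],
            },
            {
                "Notification": {
                    "NotificationType": "FORECASTED",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 100.0,
                },
                "Subscribers": [{
                    "SubscriptionType": "SNS",
                    "Address": "arn:aws:sns:us-east-1:111122223333:budget-alerts",
                }],
            },
        ],
    )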

Questions 59

A company wants to migrate an application to AWS. The application runs on Docker containers behind an Application Load Balancer (ALB). The application stores data in a PostgreSQL database. The cloud-based solution must use AWS WAF to inspect all application traffic. The application experiences most traffic on weekdays. There is significantly less traffic on weekends.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.

Use a Network Load Balancer (NLB). Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the NLB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon RDS for PostgreSQL as the database.

B.

Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the ALB. Run the application on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon RDS for PostgreSQL as the database.

C.

Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the ALB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon Aurora Serverless as the database.

D.

Use a Network Load Balancer (NLB). Create a web access control list (web ACL) in AWS WAF that has the necessary rules. Attach the web ACL to the NLB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon Aurora Serverless as the database.

Questions 60

A company has a website that handles dynamic traffic loads. The website architecture is based on Amazon EC2 instances in an Auto Scaling group that is configured to use scheduled scaling. Each EC2 instance runs code from an Amazon Elastic File System (Amazon EFS) volume and stores shared data back to the same volume.

The company wants to optimize costs for the website.

Which solution will meet this requirement?

Options:

A.

Reconfigure the Auto Scaling group to set a desired number of instances. Turn off scheduled scaling.

B.

Create a new launch template version for the Auto Scaling group that uses larger EC2 instances.

C.

Reconfigure the Auto Scaling group to use a target tracking scaling policy.

D.

Replace the EFS volume with instance store volumes.

Questions 61

A company has an ecommerce application that users access through multiple mobile apps and web applications. The company needs a solution that will receive requests from the mobile apps and web applications through an API.

Request traffic volume varies significantly throughout each day. Traffic spikes during sales events. The solution must be loosely coupled and ensure that no requests are lost.

Which solution will meet these requirements?

Options:

A.

Create an Application Load Balancer (ALB). Create an AWS Elastic Beanstalk endpoint to process the requests. Add the Elastic Beanstalk endpoint to the target group of the ALB.

B.

Set up an Amazon API Gateway REST API with an integration to an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue. Create an AWS Lambda function to poll the queue to process the requests.

C.

Create an Application Load Balancer (ALB). Create an AWS Lambda function to process the requests. Add the Lambda function as a target of the ALB.

D.

Set up an Amazon API Gateway HTTP API with an integration to an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to process the requests. Subscribe the function to the SNS topic to process the requests.

Questions 62

A company wants to create a long-term storage solution that will allow users to upload terabytes of images and videos. The company will use the images and videos to train machine learning (ML) models. The storage solution must be scalable and cost-optimized.

Which solution will meet these requirements?

Options:

A.

Provision an Amazon S3 bucket for users to upload images and videos. Copy the data from the S3 bucket to an Amazon FSx for Lustre file system to make the data available for ML model training.

B.

Provision an Amazon S3 bucket for users to upload images and videos. Configure the S3 bucket to make the data available to Amazon SageMaker AI training. Store the data in the S3 Intelligent-Tiering storage class.

C.

Configure an Amazon SageMaker AI notebook instance with 16 GB of storage. Create a custom application to allow users to upload images and videos directly to the notebook instance.

D.

Provision an Amazon S3 bucket for users to upload images and videos. Copy the data from the S3 bucket to an Amazon Elastic File System (Amazon EFS) file system to make the data available for ML model training.

Questions 63

A healthcare provider is planning to store patient data on AWS as PDF files. To comply with regulations, the company must encrypt the data and store the files in multiple locations. The data must be available for immediate access from any environment.

Which solution will meet these requirements?

Options:

A.

Store the files in an Amazon S3 bucket. Use the Standard storage class. Enable server-side encryption with Amazon S3 managed keys (SSE-S3) on the bucket. Configure cross-Region replication on the bucket.

B.

Store the files in an Amazon Elastic File System (Amazon EFS) volume. Use an AWS KMS managed key to encrypt the EFS volume. Use AWS DataSync to replicate the EFS volume to a second AWS Region.

C.

Store the files in an Amazon Elastic Block Store (Amazon EBS) volume. Configure AWS Backup to back up the volume on a regular schedule. Use an AWS KMS key to encrypt the backups.

D.

Store the files in an Amazon S3 bucket. Use the S3 Glacier Flexible Retrieval storage class. Ensure that all PDF files are encrypted by using client-side encryption before the files are uploaded. Configure cross-Region replication on the bucket.

Questions 64

A company is developing a highly available natural language processing (NLP) application. The application handles large volumes of concurrent requests. The application performs NLP tasks such as entity recognition, sentiment analysis, and key phrase extraction on text data.

The company needs to store data that the application processes in a highly available and scalable database.

Which solution will meet these requirements?

Options:

A.

Create an Amazon API Gateway REST API endpoint to handle incoming requests. Configure the REST API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Comprehend to perform NLP tasks on the text data. Store the processed data in Amazon DynamoDB.

B.

Create an Amazon API Gateway HTTP API endpoint to handle incoming requests. Configure the HTTP API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Translate to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.

C.

Create an Amazon SQS queue to buffer incoming requests. Deploy the NLP application on Amazon EC2 instances in an Auto Scaling group. Use Amazon Comprehend to perform NLP tasks. Store the processed data in an Amazon RDS database.

D.

Create an Amazon API Gateway WebSocket API endpoint to handle incoming requests. Configure the WebSocket API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Textract to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.

Questions 65

A media company uses an Amazon CloudFront distribution to deliver content over the internet. The company wants only premium customers to have access to the media streams and file content. The company stores all content in an Amazon S3 bucket. The company also delivers content on demand to customers for a specific purpose, such as movie rentals or music downloads.

Which solution will meet these requirements?

Options:

A.

Generate and provide S3 signed cookies to premium customers.

B.

Generate and provide CloudFront signed URLs to premium customers.

C.

Use origin access control (OAC) to limit the access of non-premium customers.

D.

Generate and activate field-level encryption to block non-premium customers.
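
To show what option B looks like in practice, here is a minimal sketch that generates a time-limited CloudFront signed URL with botocore and the cryptography library. The key-pair ID, key path, and distribution domain are hypothetical:

    import datetime
    from botocore.signers import CloudFrontSigner
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def rsa_signer(message):
        # Sign with the private key that matches the public key registered
        # with CloudFront (key path is hypothetical).
        with open("private_key.pem", "rb") as f:
            key = serialization.load_pem_private_key(f.read(), password=None)
        return key.sign(message, padding.PKCS1v15(), hashes.SHA1())

    signer = CloudFrontSigner("K2JCJMDEHXQW5F", rsa_signer)  # hypothetical key ID

    # URL for one rented movie that expires 24 hours after purchase.
    url = signer.generate_presigned_url(
        "https://d111111abcdef8.cloudfront.net/movies/rental-1234.mp4",
        date_less_than=datetime.datetime.utcnow() + datetime.timedelta(hours=24),
    )
    print(url)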

Questions 66

A company runs a production database on Amazon RDS for MySQL. The company wants to upgrade the database version for security compliance reasons. Because the database contains critical data, the company wants a quick solution to upgrade and test functionality without losing any data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an RDS manual snapshot. Upgrade to the new version of Amazon RDS for MySQL.

B.

Use native backup and restore. Restore the data to the upgraded new version of Amazon RDS for MySQL.

C.

Use AWS Database Migration Service (AWS DMS) to replicate the data to the upgraded new version of Amazon RDS for MySQL.

D.

Use Amazon RDS Blue/Green Deployments to deploy and test production changes.

Questions 67

A company wants to provide a third-party system that runs in a private data center with access to its AWS account. The company wants to call AWS APIs directly from the third-party system. The company has an existing process for managing digital certificates. The company does not want to use SAML or OpenID Connect (OIDC) capabilities and does not want to store long-term AWS credentials.

Which solution will meet these requirements?

Options:

A.

Configure mutual TLS to allow authentication of the client and server sides of the communication channel.

B.

Configure AWS Signature Version 4 to authenticate incoming HTTPS requests to AWS APIs.

C.

Configure Kerberos to exchange tickets for assertions that can be validated by AWS APIs.

D.

Configure AWS Identity and Access Management (IAM) Roles Anywhere to exchange X.509 certificates for AWS credentials to interact with AWS APIs.

Questions 68

A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials.

B.

Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials.

C.

Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Create another S3 file to store and retrieve the credentials.

D.

Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials.

Questions 69

A company wants to use a data lake that is hosted on Amazon S3 to provide analytics services for historical data. The data lake consists of 800 tables but is expected to grow to thousands of tables. More than 50 departments use the tables, and each department has hundreds of users. Different departments need access to specific tables and columns.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an IAM role for each department. Use AWS Lake Formation based access control to grant each IAM role access to specific tables and columns. Use Amazon Athena to analyze the data.

B.

Create an Amazon Redshift cluster for each department. Use AWS Glue to ingest into the Redshift cluster only the tables and columns that are relevant to that department. Create Redshift database users. Grant the users access to the relevant department's Redshift cluster. Use Amazon Redshift to analyze the data.

C.

Create an IAM role for each department. Use AWS Lake Formation tag-based access control to grant each IAM role access to only the relevant resources. Create LF-tags that are attached to tables and columns. Use Amazon Athena to analyze the data.

D.

Create an Amazon EMR cluster for each department. Configure an IAM service role for each EMR cluster to access relevant S3 files. For each department's users, create an IAM role that provides access to the relevant EMR cluster. Use Amazon EMR to analyze the data.
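
For a concrete view of the tag-based model in option C, here is a minimal boto3 sketch that grants a department role SELECT on every resource carrying a matching LF-tag. The role ARN, tag key, and tag values are hypothetical:

    import boto3

    lf = boto3.client("lakeformation")

    # One grant per department covers all current and future tables and
    # columns that carry the matching LF-tag.
    lf.grant_permissions(
        Principal={
            "DataLakePrincipalIdentifier":
                "arn:aws:iam::111122223333:role/marketing-analysts",
        },
        Resource={"LFTagPolicy": {
            "ResourceType": "TABLE",
            "Expression": [{"TagKey": "department", "TagValues": ["marketing"]}],
        }},
        Permissions=["SELECT"],
    )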

Questions 70

A company has an employee web portal. Employees log in to the portal to view payroll details. The company is developing a new system to give employees the ability to upload scanned documents for reimbursement. The company runs a program to extract text-based data from the documents and attach the extracted information to each employee's reimbursement IDs for processing.

The employee web portal requires 100% uptime. The document extract program runs infrequently throughout the day on an on-demand basis. The company wants to build a scalable and cost-effective new system that will require minimal changes to the existing web portal. The company does not want to make any code changes.

Which solution will meet these requirements with the LEAST implementation effort?

Options:

A.

Run Amazon EC2 On-Demand Instances in an Auto Scaling group for the web portal. Use an AWS Lambda function to run the document extract program. Invoke the Lambda function when an employee uploads a new reimbursement document.

B.

Run Amazon EC2 Spot Instances in an Auto Scaling group for the web portal. Run the document extract program on EC2 Spot Instances. Start document extract program instances when an employee uploads a new reimbursement document.

C.

Purchase a Savings Plan to run the web portal and the document extract program. Run the web portal and the document extract program in an Auto Scaling group.

D.

Create an Amazon S3 bucket to host the web portal. Use Amazon API Gateway and an AWS Lambda function for the existing functionalities. Use the Lambda function to run the document extract program. Invoke the Lambda function when the API that is associated with a new document upload is called.

Questions 71

A company has an application that runs on a single Amazon EC2 instance. The application uses a MySQL database that runs on the same EC2 instance. The company needs a highly available and automatically scalable solution to handle increased traffic.

Which solution will meet these requirements?

Options:

A.

Deploy the application to EC2 instances that run in an Auto Scaling group behind an Application Load Balancer. Create an Amazon Redshift cluster that has multiple MySQL-compatible nodes.

B.

Deploy the application to EC2 instances that are configured as a target group behind an Application Load Balancer. Create an Amazon RDS for MySQL cluster that has multiple instances.

C.

Deploy the application to EC2 instances that run in an Auto Scaling group behind an Application Load Balancer. Create an Amazon Aurora Serverless MySQL cluster for the database layer.

D.

Deploy the application to EC2 instances that are configured as a target group behind an Application Load Balancer. Create an Amazon ElastiCache (Redis OSS) cluster that uses the MySQL connector.

Questions 72

A company hosts an Amazon EC2 instance in a private subnet in a new VPC. The VPC also has a public subnet that has the default route set to an internet gateway. The private subnet does not have outbound internet access.

The EC2 instance needs to have the ability to download monthly security updates from an outside vendor. However, the company must block any connections that are initiated from the internet.

Which solution will meet these requirements?

Options:

A.

Configure the private subnet route table to use the internet gateway as the default route.

B.

Create a NAT gateway in the public subnet. Configure the private subnet route table to use the NAT gateway as the default route.

C.

Create a NAT instance in the private subnet. Configure the private subnet route table to use the NAT instance as the default route.

D.

Create a NAT instance in the private subnet. Configure the private subnet route table to use the internet gateway as the default route.
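
As a sketch of the NAT gateway pattern in option B, the following boto3 snippet creates the gateway in the public subnet and points the private subnet's default route at it, so outbound downloads work while inbound connections from the internet are blocked. All IDs are hypothetical:

    import boto3

    ec2 = boto3.client("ec2")

    # NAT gateway lives in the public subnet and uses an Elastic IP.
    nat = ec2.create_nat_gateway(
        SubnetId="subnet-0public1234example",
        AllocationId="eipalloc-0abcd1234example",
    )
    nat_id = nat["NatGateway"]["NatGatewayId"]
    ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])

    # Default route for the private subnet's route table.
    ec2.create_route(
        RouteTableId="rtb-0private1234example",
        DestinationCidrBlock="0.0.0.0/0",
        NatGatewayId=nat_id,
    )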

Questions 73

A company needs to set up a centralized solution to audit API calls to AWS for workloads that run on AWS services and non-AWS services. The company must store logs of the audits for 7 years.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Set up a data lake in Amazon S3. Incorporate AWS CloudTrail logs and logs from non-AWS services into the data lake. Use CloudTrail to store the logs for 7 years.

B.

Configure custom integrations for AWS CloudTrail Lake to collect and store CloudTrail events from AWS services and non-AWS services. Use CloudTrail to store the logs for 7 years.

C.

Enable AWS CloudTrail for AWS services. Ingest non-AWS services into CloudTrail to store the logs for 7 years.

D.

Create new Amazon CloudWatch Logs groups. Send the audit data from non-AWS services to the CloudWatch Logs groups. Enable AWS CloudTrail for workloads that run on AWS. Use CloudTrail to store the logs for 7 years.

Questions 74

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes.

Which networking solution meets these requirements?

Options:

A.

Place the EC2 instances in multiple VPCs, and configure VPC peering.

B.

Attach an Elastic Fabric Adapter (EFA) to each EC2 instance.

C.

Run the EC2 instances in a spread placement group.

D.

Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.

Questions 75

A weather forecasting company collects temperature readings from various sensors on a continuous basis. An existing data ingestion process collects the readings and aggregates the readings into larger Apache Parquet files. Then the process encrypts the files by using client-side encryption with AWS KMS managed keys (CSE-KMS). Finally, the process writes the files to an Amazon S3 bucket with separate prefixes for each calendar day.

The company wants to run occasional SQL queries on the data to take sample moving averages for a specific calendar day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Configure Amazon Athena to read the encrypted files. Run SQL queries on the data directly in Amazon S3.

B.

Use Amazon S3 Select to run SQL queries on the data directly in Amazon S3.

C.

Configure Amazon Redshift to read the encrypted files. Use Redshift Spectrum and Redshift query editor v2 to run SQL queries on the data directly in Amazon S3.

D.

Configure Amazon EMR Serverless to read the encrypted files. Use Apache SparkSQL to run SQL queries on the data directly in Amazon S3.

Questions 76

A company has a large fleet of vehicles that are equipped with internet connectivity to send telemetry to the company. The company receives over 1 million data points every 5 minutes from the vehicles. The company uses the data in machine learning (ML) applications to predict vehicle maintenance needs and to preorder parts. The company produces visual reports based on the captured data. The company wants to migrate the telemetry ingestion, processing, and visualization workloads to AWS.

Which solution will meet these requirements?

Options:

A.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon QuickSight to visualize the data.

B.

Use Amazon DynamoDB to store the data points. Use DynamoDB Connector to ingest data from DynamoDB into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.

C.

Use Amazon Neptune to store the data points. Use Amazon Kinesis Data Streams to ingest data from Neptune into an AWS Lambda function for processing. Use Amazon QuickSight to visualize the data.

D.

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon Athena to visualize the data.

Questions 77

A company is building a new application that uses multiple serverless architecture components. The application architecture includes an Amazon API Gateway REST API and AWS Lambda functions to manage incoming requests.

The company needs a service to send messages that the REST API receives to multiple target Lambda functions for processing. The service must filter messages so each target Lambda function receives only the messages the function needs.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Send the requests from the REST API to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe multiple Amazon Simple Queue Service (Amazon SQS) queues to the SNS topic. Configure the target Lambda functions to poll the SQS queues.

B.

Send the requests from the REST API to a set of Amazon EC2 instances that are configured to process messages. Configure the instances to filter messages and to invoke the target Lambda functions.

C.

Send the requests from the REST API to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Configure Amazon MSK to publish the messages to the target Lambda functions.

D.

Send the requests from the REST API to multiple Amazon Simple Queue Service (Amazon SQS) queues. Configure the target Lambda functions to poll the SQS queues.
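
To make the fan-out-with-filtering pattern in option A concrete, here is a minimal boto3 sketch that subscribes an SQS queue to an SNS topic with a filter policy and publishes a message carrying the matching attribute. The ARNs and the event_type attribute are hypothetical:

    import json
    import boto3

    sns = boto3.client("sns")

    # Each target gets its own SQS queue; the filter policy ensures the
    # queue receives only the messages its Lambda function needs.
    sns.subscribe(
        TopicArn="arn:aws:sns:us-east-1:111122223333:api-requests",
        Protocol="sqs",
        Endpoint="arn:aws:sqs:us-east-1:111122223333:orders-queue",
        Attributes={"FilterPolicy": json.dumps({"event_type": ["order"]})},
    )

    # Publishers set the attribute that the filter policies match on.
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:111122223333:api-requests",
        Message=json.dumps({"order_id": "1234"}),
        MessageAttributes={
            "event_type": {"DataType": "String", "StringValue": "order"},
        },
    )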

Questions 78

A company plans to use AWS to run high-performance computing (HPC) workloads and analytics workloads. The company will run HPC workloads on Amazon EC2 instances. The workloads require a high-performance file system that can scale to millions of input/output operations per second (IOPS).

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Use Amazon Elastic File System (Amazon EFS) as a high-performance file system.

B.

Use Amazon FSx for Lustre as a high-performance file system.

C.

Create an Auto Scaling group of Amazon EC2 instances. Use Reserved Instances. Configure a spread placement group. Use AWS Batch to run the analytics workloads.

D.

Use Mountpoint for Amazon S3 as a high-performance file system.

E.

Create an Auto Scaling group of Amazon EC2 instances. Use a mix of On-Demand Instances, Reserved Instances, and Spot Instances. Configure a cluster placement group. Use Amazon EMR to run the analytics workloads.

Questions 79

A company needs to run a critical data processing workload that uses a Python script every night. The workload takes 1 hour to finish.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type. Use the Fargate Spot capacity provider. Schedule the job to run once every night.

B.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type. Schedule the job to run once every night.

C.

Create an AWS Lambda function that uses the existing Python code. Configure Amazon EventBridge to invoke the function once every night.

D.

Create an Amazon EC2 On-Demand Instance that runs Amazon Linux. Migrate the Python script to the instance. Use a cron job to schedule the script. Create an AWS Lambda function to start and stop the instance once every night.

Questions 80

A company has stored millions of objects across multiple prefixes in an Amazon S3 bucket by using the Amazon S3 Glacier Deep Archive storage class. The company needs to delete all data older than 3 years except for a subset of data that must be retained. The company has identified the data that must be retained and wants to implement a serverless solution.

Which solution will meet these requirements?

Options:

A.

Use S3 Inventory to list all objects. Use the AWS CLI to create a script that runs on an Amazon EC2 instance that deletes objects from the inventory list.

B.

Use AWS Batch to delete objects older than 3 years except for the data that must be retained.

C.

Provision an AWS Glue crawler to query objects older than 3 years. Save the manifest file of old objects. Create a script to delete objects in the manifest.

D.

Enable S3 Inventory. Create an AWS Lambda function to filter and delete objects. Invoke the Lambda function with S3 Batch Operations to delete objects by using the inventory reports.
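
For reference, a minimal boto3 sketch of the S3 Batch Operations job in option D, which invokes a Lambda function for every object listed in an S3 Inventory report. All ARNs, the ETag, and the function name are hypothetical, and the Lambda function is assumed to apply the age and retention-list checks:

    import boto3

    s3control = boto3.client("s3control")

    s3control.create_job(
        AccountId="111122223333",
        ConfirmationRequired=False,
        Priority=10,
        RoleArn="arn:aws:iam::111122223333:role/batch-ops-role",
        # Invoke the filtering/deletion Lambda function for each object.
        Operation={"LambdaInvoke": {
            "FunctionArn":
                "arn:aws:lambda:us-east-1:111122223333:function:filter-and-delete",
        }},
        # Drive the job from the S3 Inventory manifest.
        Manifest={
            "Spec": {"Format": "S3InventoryReport_CSV_20161130"},
            "Location": {
                "ObjectArn": "arn:aws:s3:::example-inventory/manifest.json",
                "ETag": "example-etag",
            },
        },
        Report={
            "Bucket": "arn:aws:s3:::example-reports",
            "Enabled": True,
            "Format": "Report_CSV_20180820",
            "ReportScope": "FailedTasksOnly",
        },
    )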

Questions 81

A company is developing a public web application that needs to access multiple AWS services. The application will have hundreds of users who must log in to the application first before using the services.

The company needs to implement a secure and scalable method to grant the web application temporary access to the AWS resources.

Which solution will meet these requirements?

Options:

A.

Create an IAM role for each AWS service that the application needs to access. Assign the roles directly to the instances that the web application runs on.

B.

Create an IAM role that has the access permissions the web application requires. Configure the web application to use AWS Security Token Service (AWS STS) to assume the IAM role. Use STS tokens to access the required AWS services.

C.

Use AWS IAM Identity Center to create a user pool that includes the application users. Assign access credentials to the web application users. Use the credentials to access the required AWS services.

D.

Create an IAM user that has programmatic access keys for the AWS services. Store the access keys in AWS Systems Manager Parameter Store. Retrieve the access keys from Parameter Store. Use the keys in the web application.
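
As a sketch of the STS pattern in option B, the backend exchanges a user login for short-lived credentials by assuming a scoped role, so no long-term keys are stored. The role ARN and session name are hypothetical:

    import boto3

    sts = boto3.client("sts")

    # Assume a role scoped to what the web application is allowed to do.
    resp = sts.assume_role(
        RoleArn="arn:aws:iam::111122223333:role/webapp-user-access",
        RoleSessionName="user-1234",  # hypothetical per-user session name
        DurationSeconds=3600,
    )
    creds = resp["Credentials"]

    # The temporary credentials expire automatically after an hour.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )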

Questions 82

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static content and dynamic content to users worldwide. The company wants to decrease the latency of transferring the content for API requests.

Which solution will meet these requirements?

Options:

A.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

B.

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

C.

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

D.

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

Questions 83

A company that has multiple AWS accounts maintains an on-premises Microsoft Active Directory. The company needs a solution to implement Single Sign-On for its employees. The company wants to use AWS IAM Identity Center.

The solution must meet the following requirements:

Allow users to access AWS accounts and third-party applications by using existing Active Directory credentials.

Enforce multi-factor authentication (MFA) to access AWS accounts.

Centrally manage permissions to access AWS accounts and applications.

Which solution will meet these requirements?

Options:

A.

Create an IAM identity provider for Active Directory in each AWS account. Ensure that Active Directory users and groups access AWS accounts directly through IAM roles. Use IAM Identity Center to enforce MFA in each account for all users.

B.

Use AWS Directory Service to create a new AWS Managed Microsoft AD Active Directory. Configure IAM Identity Center in each account to use the new AWS Managed Microsoft AD Active Directory as the identity source. Use IAM Identity Center to enforce MFA for all users.

C.

Use IAM Identity Center with the existing Active Directory as the identity source. Enforce MFA for all users. Use AWS Organizations and Active Directory groups to manage access permissions for AWS accounts and application access.

D.

Use AWS Lambda functions to periodically synchronize Active Directory users and groups with IAM users and groups in each AWS account. Use IAM roles and policies to manage application access. Create a second Lambda function to enforce MFA.

Questions 84

A company is migrating its databases to Amazon RDS for PostgreSQL. The company is migrating its applications to Amazon EC2 instances. The company wants to optimize costs for long-running workloads.

Which solution will meet this requirement MOST cost-effectively?

Options:

A.

Use On-Demand Instances for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year Compute Savings Plan with the No Upfront option for the EC2 instances.

B.

Purchase Reserved Instances for a 1 year term with the No Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the No Upfront option for the EC2 instances.

C.

Purchase Reserved Instances for a 1 year term with the Partial Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the Partial Upfront option for the EC2 instances.

D.

Purchase Reserved Instances for a 3 year term with the All Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 3 year EC2 Instance Savings Plan with the All Upfront option for the EC2 instances.

Questions 85

A company is planning to connect a remote office to its AWS infrastructure. The office requires permanent and secure connectivity to AWS. The connection must provide secure access to resources in two VPCs. However, the VPCs must not be able to access each other.

Which solution will meet these requirements?

Options:

A.

Create two transit gateways. Set up one AWS Site-to-Site VPN connection from the remote office to each transit gateway. Connect one VPC to each transit gateway. Configure route table propagation to the appropriate transit gateway based on the destination VPC IP range.

B.

Set up one AWS Site-to-Site VPN connection from the remote office to each of the VPCs. Update the VPC route tables with static routes to the remote office resources.

C.

Set up one AWS Site-to-Site VPN connection from the remote office to one of the VPCs. Set up VPC peering between the two VPCs. Update the VPC route tables with static routes to the remote office and peered resources.

D.

Create a transit gateway. Set up an AWS Direct Connect gateway and one Direct Connect connection between the remote office and the Direct Connect gateway. Associate the transit gateway with the Direct Connect gateway. Configure a separate private virtual interface (VIF) for each VPC, and configure routing.

Questions 86

How can trade data from DynamoDB be ingested into an S3 data lake for near real-time analysis?

Options:

A.

Use DynamoDB Streams to invoke a Lambda function that writes to S3.

B.

Use DynamoDB Streams to invoke a Lambda function that writes to Data Firehose, which writes to S3.

C.

Enable Kinesis Data Streams on DynamoDB. Configure it to invoke a Lambda function that writes to S3.

D.

Enable Kinesis Data Streams on DynamoDB. Use Data Firehose to write to S3.
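
To illustrate the Streams-to-Firehose path in option B, here is a minimal sketch of a Lambda handler that forwards DynamoDB Streams records to a Firehose delivery stream, which buffers and writes them to S3. The delivery stream name is hypothetical:

    import json
    import boto3

    firehose = boto3.client("firehose")

    def handler(event, context):
        # Triggered by DynamoDB Streams; forward each change record.
        records = [
            {"Data": (json.dumps(record["dynamodb"]) + "\n").encode("utf-8")}
            for record in event["Records"]
        ]
        if records:
            firehose.put_record_batch(
                DeliveryStreamName="trades-to-s3",  # hypothetical stream
                Records=records,
            )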

Questions 87

A company hosts an application on AWS. The application gives users the ability to upload photos and store the photos in an Amazon S3 bucket. The company wants to use Amazon CloudFront and a custom domain name to upload the photo files to the S3 bucket in the eu-west-1 Region.

Which solution will meet these requirements? (Select TWO.)

Options:

A.

Use AWS Certificate Manager (ACM) to create a public certificate in the us-east-1 Region. Use the certificate in CloudFront.

B.

Use AWS Certificate Manager (ACM) to create a public certificate in eu-west-1. Use the certificate in CloudFront.

C.

Configure Amazon S3 to allow uploads from CloudFront. Configure S3 Transfer Acceleration.

D.

Configure Amazon S3 to allow uploads from CloudFront origin access control (OAC).

E.

Configure Amazon S3 to allow uploads from CloudFront. Configure an Amazon S3 website endpoint.

Questions 88

A company stores petabytes of historical medical information on premises. The company has a process to manage encryption of the data to comply with regulations. The company needs a cloud-based solution for data backup, recovery, and archiving. The company must retain control over the encryption key material.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Create an AWS Key Management Service (AWS KMS) key without key material. Import the company's key material into the KMS key.

B.

Create an AWS Key Management Service (AWS KMS) encryption key that contains key material generated by AWS KMS.

C.

Store the data in Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage. Use S3 Bucket Keys with AWS Key Management Service (AWS KMS) keys.

D.

Store the data in an Amazon S3 Glacier storage class. Use server-side encryption with customer-provided keys (SSE-C).

E.

Store the data in AWS Snowball devices. Use server-side encryption with AWS KMS keys (SSE-KMS).

Questions 89

A company is migrating an online marketplace application from a mainframe system to an Auto Scaling group of Amazon EC2 instances. The EC2 instances access an Amazon Aurora cluster. The application requires a scalable, persistent caching solution to store the results of in-progress transactions and SQL queries.

Which solution will meet these requirements?

Options:

A.

Use an Amazon ElastiCache (Redis OSS) cluster to serve transaction and query results.

B.

Use an Amazon CloudFront distribution with an Amazon S3 bucket as the origin to cache the transactions. Add an Amazon EC2 instance store volume to the EC2 instances for query result caching.

C.

Use an Amazon ElastiCache (Memcached) cluster to serve transaction and query results.

D.

Use an Amazon ElastiCache (Redis OSS) cluster to cache the transactions. Add an Amazon EC2 instance store volume to the EC2 instances for query result caching.

Questions 90

A company runs an online order management system on AWS. The company stores order and inventory data for the previous 5 years in an Amazon Aurora MySQL database. The company deletes inventory data after 5 years.

The company wants to optimize costs to archive data.

Which solution will meet this requirement?

Options:

A.

Create an AWS Glue crawler to export data to Amazon S3. Create an AWS Lambda function to compress the data.

B.

Use the SELECT INTO OUTFILE S3 query on the Aurora database to export the data to Amazon S3. Configure S3 Lifecycle rules on the S3 bucket.

C.

Create an AWS Glue DataBrew job to migrate data from Aurora to Amazon S3. Configure S3 Lifecycle rules on the S3 bucket.

D.

Use the AWS Schema Conversion Tool (AWS SCT) to replicate data from Aurora to Amazon S3. Use the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.

Questions 91

A company uses an Amazon DynamoDB table to store data that the company receives from devices. The DynamoDB table supports a customer-facing website to display recent activity on customer devices. The company configured the table with provisioned throughput for writes and reads.

The company wants to calculate performance metrics for customer device data on a daily basis. The solution must have minimal effect on the table's provisioned read and write capacity.

Which solution will meet these requirements?

Options:

A.

Use an Amazon Athena SQL query with the Amazon Athena DynamoDB connector to calculate performance metrics on a recurring schedule.

B.

Use an AWS Glue job with the AWS Glue DynamoDB export connector to calculate performance metrics on a recurring schedule.

C.

Use an Amazon Redshift COPY command to calculate performance metrics on a recurring schedule.

D.

Use an Amazon EMR job with an Apache Hive external table to calculate performance metrics on a recurring schedule.

Questions 92

A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon S3 as part of the application deployment.

The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM).

Which solution meets these requirements?

Options:

A.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to Amazon S3 or DynamoDB for the respective EKS Pods.

B.

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods for the UI.

C.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the AmazonS3FullAccess policy to the data services account and the AmazonDynamoDBFullAccess policy to the UI service account.

D.

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM Roles for Service Accounts (IRSA) to provide access to the EKS Pods for the UI to DynamoDB and the EKS Pods for the data services to Amazon S3.

Questions 93

A company is implementing a new policy to enhance the security of its AWS environment. The policy requires all administrative actions that users perform on the AWS Management Console to be secured by multi-factor authentication (MFA).

Which solution will allow the company to enforce this policy in the MOST operationally efficient way?

Options:

A.

Enable MFA on the root account. Ensure that all administrators use the root account to perform administrative actions.

B.

Create an IAM policy that requires MFA to be enabled for the IAM roles that administrators assume to perform administrative actions.

C.

Configure an Amazon CloudWatch alarm that sends an email notification when an administrator performs an administrative action without MFA.

D.

Use AWS Config to periodically audit IAM users and to automatically attach an IAM policy that requires MFA when AWS Config detects administrative actions.

Questions 94

A company stores data in an on-premises Oracle relational database. The company needs to make the data available in Amazon Aurora PostgreSQL for analysis. The company uses an AWS Site-to-Site VPN connection to connect its on-premises network to AWS.

The company must capture the changes that occur to the source database during the migration to Aurora PostgreSQL.

Which solution will meet these requirements?

Options:

A.

Use the AWS Schema Conversion Tool (AWS SCT) to convert the Oracle schema to Aurora PostgreSQL schema. Use the AWS Database Migration Service (AWS DMS) full-load migration task to migrate the data.

B.

Use AWS DataSync to migrate the data to an Amazon S3 bucket. Import the S3 data to Aurora PostgreSQL by using the Aurora PostgreSQL aws_s3 extension.

C.

Use the AWS Schema Conversion Tool (AWS SCT) to convert the Oracle schema to Aurora PostgreSQL schema. Use AWS Database Migration Service (AWS DMS) to migrate the existing data and replicate the ongoing changes.

D.

Use an AWS Snowball device to migrate the data to an Amazon S3 bucket. Import the S3 data to Aurora PostgreSQL by using the Aurora PostgreSQL aws_s3 extension.

Questions 95

A company is building a gaming application that needs to send unique events to multiple leaderboards, player matchmaking systems, and authentication services concurrently. The company requires an AWS-based event-driven system that delivers events in order and supports a publish-subscribe model. The gaming application must be the publisher, and the leaderboards, matchmaking systems, and authentication services must be the subscribers.

Which solution will meet these requirements?

Options:

A.

Amazon EventBridge event buses

B.

Amazon Simple Notification Service (Amazon SNS) FIFO topics

C.

Amazon Simple Notification Service (Amazon SNS) standard topics

D.

Amazon Simple Queue Service (Amazon SQS) FIFO queues
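
For reference, a minimal boto3 sketch of publishing to an SNS FIFO topic as in option B; MessageGroupId keeps events for one player in order, and MessageDeduplicationId prevents duplicates. The topic ARN and IDs are hypothetical:

    import json
    import boto3

    sns = boto3.client("sns")

    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:111122223333:game-events.fifo",
        Message=json.dumps({"event": "match_won", "player": "p-42"}),
        MessageGroupId="p-42",              # ordering scope (per player)
        MessageDeduplicationId="evt-0001",  # or enable content-based dedup
    )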

Questions 96

A company wants to deploy an AWS Lambda function that will read and write objects in an Amazon S3 bucket. The Lambda function must be connected to the company's VPC. The company must deploy the Lambda function only to private subnets in the VPC. The Lambda function must not be allowed to access the internet.

Which solutions will meet these requirements? (Select TWO.)

Options:

A.

Create a private NAT gateway to access the S3 bucket.

B.

Attach an Elastic IP address to the NAT gateway.

C.

Create a gateway VPC endpoint for the S3 bucket.

D.

Create an interface VPC endpoint for the S3 bucket.

E.

Create a public NAT gateway to access the S3 bucket.
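
As a sketch of the gateway endpoint in option C, the following boto3 snippet creates an S3 gateway VPC endpoint and attaches it to the private subnets' route table, so the Lambda function reaches S3 without any internet path. The VPC and route table IDs are hypothetical:

    import boto3

    ec2 = boto3.client("ec2")

    # S3 traffic from the private subnets stays on the AWS network.
    ec2.create_vpc_endpoint(
        VpcId="vpc-0abcd1234example",
        ServiceName="com.amazonaws.us-east-1.s3",
        VpcEndpointType="Gateway",
        RouteTableIds=["rtb-0private1234example"],
    )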

Questions 97

An ecommerce company runs a multi-tier application on AWS. The frontend and backend tiers both run on Amazon EC2 instances. The database tier runs on an Amazon RDS for MySQL DB instance. The backend tier communicates with the RDS DB instance.

The application makes frequent calls to return identical datasets from the database. The frequent calls on the database cause performance slowdowns. A solutions architect must improve the performance of the application backend.

Which solution will meet this requirement?

Options:

A.

Configure an Amazon Simple Notification Service (Amazon SNS) topic between the EC2 instances and the RDS DB instance.

B.

Configure an Amazon ElastiCache (Redis OSS) cache. Configure the backend EC2 instances to read from the cache.

C.

Configure an Amazon DynamoDB Accelerator (DAX) cluster. Configure the backend EC2 instances to read from the cluster.

D.

Configure Amazon Data Firehose to stream the calls to the database.

Questions 98

A company has an ordering application that stores customer information in Amazon RDS for MySQL. During regular business hours, employees run one-time queries for reporting purposes. Timeouts are occurring during order processing because the reporting queries are taking a long time to run. The company needs to eliminate the timeouts without preventing employees from performing queries.

Which solution will meet these requirements?

Options:

A.

Create a read replica. Move reporting queries to the read replica.

B.

Create a read replica. Distribute the ordering application to the primary DB instance and the read replica.

C.

Migrate the ordering application to Amazon DynamoDB with on-demand capacity.

D.

Schedule the reporting queries for non-peak hours.

Questions 99

A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.

B.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

C.

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.

D.

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

Questions 100

A company has set up hybrid connectivity between an on-premises data center and AWS by using AWS Site-to-Site VPN. The company is migrating a workload to AWS.

The company sets up a VPC that has two public subnets and two private subnets. The company wants to monitor the total packet loss and round-trip-time (RTT) between the data center and AWS.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use Amazon CloudWatch Network Monitor to set up Internet Control Message Protocol (ICMP) probe monitoring from each subnet to the on-premises destination.

B.

Create an Amazon EC2 instance in each subnet. Create a scheduled job to send Internet Control Message Protocol (ICMP) packets to the on-premises destination.

C.

Create an AWS Lambda function in each subnet. Write a script to perform Internet Control Message Protocol (ICMP) connectivity checks.

D.

Create an AWS Batch job in each subnet. Write a script to perform Internet Control Message Protocol (ICMP) connectivity checks.

Questions 101

A company is planning to migrate an on-premises online transaction processing (OLTP) database that uses MySQL to an AWS managed database management system. Several reporting and analytics applications use the on-premises database heavily on weekends and at the end of each month. The cloud-based solution must be able to handle read-heavy surges during weekends and at the end of each month.

Which solution will meet these requirements?

Options:

A.

Migrate the database to an Amazon Aurora MySQL cluster. Configure Aurora Auto Scaling to use replicas to handle surges.

B.

Migrate the database to an Amazon EC2 instance that runs MySQL. Use an EC2 instance type that has ephemeral storage. Attach Amazon EBS Provisioned IOPS SSD (io2) volumes to the instance.

C.

Migrate the database to an Amazon RDS for MySQL database. Configure the RDS for MySQL database for a Multi-AZ deployment, and set up auto scaling.

D.

Migrate from the database to Amazon Redshift. Use Amazon Redshift as the database for both OLTP and analytics applications.

Questions 102

A company is launching a new gaming application. The company will use Amazon EC2 Auto Scaling groups to deploy the application. The application stores user data in a relational database.

The company has office locations around the world that need to run analytics on the user data in the database. The company needs a cost-effective database solution that provides cross-Region disaster recovery with low-latency read performance across AWS Regions.

Which solution will meet these requirements?

Options:

A.

Create an Amazon ElastiCache for Redis cluster in the Region where the application is deployed. Create read replicas in Regions where the company offices are located. Ensure the company offices read from the read replica instances.

B.

Create Amazon DynamoDB global tables. Deploy the tables to the Regions where the company offices are located and to the Region where the application is deployed. Ensure that each company office reads from the tables that are in the same Region as the office.

C.

Create an Amazon Aurora global database. Configure the primary cluster to be in the Region where the application is deployed. Configure the secondary Aurora replicas to be in the Regions where the company offices are located. Ensure the company offices read from the Aurora replicas.

D.

Create an Amazon RDS Multi-AZ DB cluster deployment in the Region where the application is deployed. Ensure the company offices read from read replica instances.

Questions 103

An ecommerce company hosts a three-tier web application in a VPC. The web tier runs on Amazon EC2 instances in two Availability Zones. The company stores a product catalog and customer sales information in Amazon DynamoDB.

The company's finance team uses a reporting application to generate reports of daily product sales. When the finance team runs the daily reports, a sudden performance decrease affects website customers.

The company wants to improve the performance of the system.

Which solution will meet these requirements with MINIMAL changes to the current architecture?

Options:

A.

Migrate the application to larger EC2 instances. Migrate the database to Amazon RDS for MySQL. Configure a read replica of the database in a second Availability Zone.

B.

Increase the compute capacity of the EC2 instances. Migrate the database to Amazon ElastiCache (Memcached).

C.

Implement DynamoDB Accelerator (DAX).

D.

Configure DynamoDB streams.

Questions 104

A company is developing a SaaS solution for customers. The solution runs on Amazon EC2 instances that have Amazon Elastic Block Store (Amazon EBS) volumes attached.

Within the SaaS application, customers can request how much storage they need. The application needs to allocate the amount of block storage each customer requests.

A solutions architect must design an operationally efficient solution that meets the storage scaling requirement.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Migrate the data from the EBS volumes to an Amazon S3 bucket. Use the Amazon S3 Standard storage class.

B.

Migrate the data from the EBS volumes to an Amazon Elastic File System (Amazon EFS) file system. Use the EFS Standard storage class. Invoke an AWS Lambda function to increase the EFS volume capacity based on user input.

C.

Migrate the data from the EBS volumes to an Amazon FSx for Windows File Server file system. Invoke an AWS Lambda function to increase the capacity of the file system based on user input.

D.

Invoke an AWS Lambda function to increase the size of EBS volumes based on user input by using EBS Elastic Volumes.
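
To make option D concrete, here is a minimal boto3 sketch that uses EBS Elastic Volumes to grow a volume in place when a customer requests more storage; the volume stays attached during the resize. The volume ID and size are hypothetical, and the file system still has to be extended inside the instance afterward:

    import boto3

    ec2 = boto3.client("ec2")

    def grow_customer_volume(volume_id: str, requested_gib: int) -> None:
        # Elastic Volumes changes the size without detaching the volume.
        ec2.modify_volume(VolumeId=volume_id, Size=requested_gib)

    # Hypothetical request from the SaaS application.
    grow_customer_volume("vol-0abcd1234example", 500)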

Questions 105

A company hosts an application on AWS that stores files that users need to access. The application uses two Amazon EC2 instances. One instance is in Availability Zone A, and the second instance is in Availability Zone B. Both instances use Amazon Elastic Block Store (Amazon EBS) volumes. Users must be able to access the files at any time without delay.

Users report that the two instances occasionally contain different versions of the same file. Users occasionally receive HTTP 404 errors when they try to download files. The company must address the customer issues. The company cannot make changes to the application code.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.

Run the robocopy command on one of the EC2 instances on a schedule to copy files from the Availability Zone A instance to the Availability Zone B instance.

B.

Configure the application to store the files on both EBS volumes each time a user writes or updates a file.

C.

Mount an Amazon Elastic File System (Amazon EFS) file system to the EC2 instances. Copy the files from the EBS volumes to the EFS file system. Configure the application to store files in the EFS file system.

D.

Create an EC2 instance profile that allows the instance in Availability Zone A to access the S3 bucket. Re-associate the instance profile to the instance in Availability Zone B when needed.

Questions 106

A company is developing a social media application. The company anticipates rapid and unpredictable growth in users and data volume. The application needs to handle a continuous high volume of user requests. User requests include long-running processes that store large amounts of user-generated content and user profiles in a relational format. The processes must run in a specific order.

The company requires an architecture that can scale resources to meet demand spikes without downtime or performance degradation. The company must ensure that the components of the application can evolve independently without affecting other parts of the system.

Which combination of AWS services will meet these requirements?

Options:

A.

Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Queue Service (Amazon SQS) to decouple message processing between components.

B.

Deploy the application on Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.

C.

Use Amazon DynamoDB as the database. Use AWS Lambda functions to implement the application. Configure Amazon DynamoDB Streams to invoke the Lambda functions. Use AWS Step Functions to manage workflows between services.

D.

Use an AWS Elastic Beanstalk environment with auto scaling to deploy the application. Use Amazon RDS as the database. Use Amazon Simple Notification Service (Amazon SNS) to decouple message processing between components.

Questions 107

A company is developing a new online gaming application. The application will run on Amazon EC2 instances in multiple AWS Regions and will have a high number of globally distributed users. A solutions architect must design the application to optimize network latency for the users.

Which actions should the solutions architect take to meet these requirements? (Select TWO.)

Options:

A.

Configure AWS Global Accelerator. Create Regional endpoint groups in each Region where an EC2 fleet is hosted.

B.

Create a content delivery network (CDN) by using Amazon CloudFront. Enable caching for static and dynamic content, and specify a high expiration period.

C.

Integrate AWS Client VPN into the application. Instruct users to select which Region is closest to them after they launch the application. Establish a VPN connection to that Region.

D.

Create an Amazon Route 53 weighted routing policy. Configure the routing policy to give the highest weight to the EC2 instances in the Region that has the largest number of users.

E.

Configure an Amazon API Gateway endpoint in each Region where an EC2 fleet is hosted. Instruct users to select which Region is closest to them after they launch the application. Use the API Gateway endpoint that is closest to them.
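For illustration, the Global Accelerator pattern with one endpoint group per Region could be wired up roughly as follows. This is a minimal boto3 sketch: the accelerator name, port, Regions, and the load balancer ARN are placeholder assumptions.

```python
import boto3

# The Global Accelerator control-plane API is hosted in us-west-2.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

accelerator = ga.create_accelerator(Name="gaming-app", Enabled=True)
acc_arn = accelerator["Accelerator"]["AcceleratorArn"]

listener = ga.create_listener(
    AcceleratorArn=acc_arn,
    Protocol="TCP",
    PortRanges=[{"FromPort": 443, "ToPort": 443}],
)

# One endpoint group per Region that hosts an EC2 fleet; the endpoint ID
# (a load balancer ARN here) is a placeholder.
ga.create_endpoint_group(
    ListenerArn=listener["Listener"]["ListenerArn"],
    EndpointGroupRegion="eu-west-1",
    EndpointConfigurations=[{
        "EndpointId": "arn:aws:elasticloadbalancing:eu-west-1:123456789012:loadbalancer/net/game-nlb/abc123",
        "Weight": 128,
    }],
)
```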

Buy Now
Questions 108

A company runs its critical storage application in the AWS Cloud. The application uses Amazon S3 in two AWS Regions. The company wants the application to send remote user data to the nearest S3 bucket with no public network congestion. The company also wants the application to fail over with the least amount of management of Amazon S3.

Which solution will meet these requirements?

Options:

A.

Implement an active-active design between the two Regions. Configure the application to use the regional S3 endpoints closest to the user.

B.

Use an active-passive configuration with S3 Multi-Region Access Points. Create a global endpoint for each of the Regions.

C.

Send user data to the regional S3 endpoints closest to the user. Configure an S3 cross-account replication rule to keep the S3 buckets synchronized.

D.

Set up Amazon S3 to use Multi-Region Access Points in an active-active configuration with a single global endpoint. Configure S3 Cross-Region Replication.

Buy Now
Questions 109

A genomics research company is designing a scalable architecture for a loosely coupled workload. Tasks in the workload are independent and can be processed in parallel. The architecture needs to minimize management overhead and provide automatic scaling based on demand.

Which solution will meet these requirements?

Options:

A.

Use a cluster of Amazon EC2 instances. Use AWS Systems Manager to manage the workload.

B.

Implement a serverless architecture that uses AWS Lambda functions.

C.

Use AWS ParallelCluster to deploy a dedicated high-performance cluster.

D.

Implement vertical scaling for each workload task.

Buy Now
Questions 110

A company wants to restrict access to the content of its web application. The company needs to protect the content by using authorization techniques that are available on AWS. The company also wants to implement a serverless architecture for authorization and authentication that has low login latency.

The solution must integrate with the web application and serve web content globally. The application currently has a small user base, but the company expects the application's user base to increase.

Which solution will meet these requirements?

Options:

A.

Configure Amazon Cognito for authentication. Implement Lambda@Edge for authorization. Configure Amazon CloudFront to serve the web application globally.

B.

Configure AWS Directory Service for Microsoft Active Directory for authentication. Implement AWS Lambda for authorization. Use an Application Load Balancer to serve the web application globally.

C.

Configure Amazon Cognito for authentication. Implement AWS Lambda for authorization. Use Amazon S3 Transfer Acceleration to serve the web application globally.

D.

Configure AWS Directory Service for Microsoft Active Directory for authentication. Implement Lambda@Edge for authorization. Use AWS Elastic Beanstalk to serve the web application globally.
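As a sketch of how edge authorization can work, the following is a minimal Lambda@Edge viewer-request handler that rejects requests lacking an Authorization header. Real token validation (for example, verifying a Cognito JWT signature) is intentionally omitted here.

```python
# Minimal Lambda@Edge viewer-request handler sketch.
def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]

    if "authorization" not in headers:
        # Short-circuit at the edge with a 401 instead of hitting the origin.
        return {
            "status": "401",
            "statusDescription": "Unauthorized",
            "headers": {
                "www-authenticate": [{"key": "WWW-Authenticate", "value": "Bearer"}]
            },
        }

    # Token present: let the request continue to the origin.
    return request
```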

Buy Now
Questions 111

A company is building a serverless application to process large video files that users upload. The application performs multiple tasks to process each video file. Processing can take up to 30 minutes for the largest files.

The company needs a scalable architecture to support the processing application.

Which solution will meet these requirements?

Options:

A.

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure a schedule in Amazon EventBridge Scheduler to invoke an AWS Lambda function periodically to check for new files. Configure the Lambda function to perform all the processing tasks.

B.

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure an Amazon EFS event notification to start an AWS Step Functions workflow that uses AWS Fargate tasks to perform the processing tasks.

C.

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to send an event to Amazon EventBridge when a user uploads a new video file. Configure an AWS Step Functions workflow as a target for an EventBridge rule. Use the workflow to manage AWS Fargate tasks to perform the processing tasks.

D.

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to invoke an AWS Lambda function when a user uploads a new video file. Configure the Lambda function to perform all the processing tasks.
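To show how the S3-to-EventBridge-to-Step Functions wiring might look, here is a minimal boto3 sketch. It assumes the bucket already sends events to EventBridge and that the state machine and IAM role exist; all names and ARNs are placeholders.

```python
import boto3

events = boto3.client("events")

# Rule that matches new-object events from the upload bucket.
events.put_rule(
    Name="video-upload-rule",
    EventPattern="""{
      "source": ["aws.s3"],
      "detail-type": ["Object Created"],
      "detail": {"bucket": {"name": ["video-uploads-bucket"]}}
    }""",
)

# Target the Step Functions workflow that manages the Fargate tasks.
events.put_targets(
    Rule="video-upload-rule",
    Targets=[{
        "Id": "start-processing-workflow",
        "Arn": "arn:aws:states:us-east-1:123456789012:stateMachine:VideoPipeline",
        "RoleArn": "arn:aws:iam::123456789012:role/eventbridge-to-sfn",
    }],
)
```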

Buy Now
Questions 112

A company is designing a solution to capture customer activity on the company's web applications. The company wants to analyze the activity data to make predictions.

Customer activity on the web applications is unpredictable and can increase suddenly. The company requires a solution that integrates with other web applications. The solution must include an authorization step.

Which solution will meet these requirements?

Options:

A.

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Configure the applications to pass an authorization header to the GWLB.

B.

Deploy an Amazon API Gateway endpoint in front of an Amazon Kinesis data stream. Store the data in an Amazon S3 bucket. Use an AWS Lambda function to handle authorization.

C.

Deploy an Amazon API Gateway endpoint in front of an Amazon Data Firehose delivery stream. Store the data in an Amazon S3 bucket. Use an API Gateway Lambda authorizer to handle authorization.

D.

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Use an AWS Lambda function to handle authorization.
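For reference, an API Gateway Lambda authorizer is just a function that returns an IAM policy for the incoming call. Below is a minimal request-based authorizer sketch; the token check is a placeholder, and production code would validate a real credential such as a JWT.

```python
# Minimal request-based Lambda authorizer sketch for API Gateway.
def handler(event, context):
    token = event.get("headers", {}).get("authorization", "")
    effect = "Allow" if token == "expected-token" else "Deny"  # placeholder check
    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event["methodArn"],
            }],
        },
    }
```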

Buy Now
Questions 113

An international company needs to share data from an Amazon S3 bucket to employees who are located around the world. The company needs a secure solution to provide employees with access to the S3 bucket. The employees are already enrolled in AWS IAM Identity Center.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create a help desk application to generate an Amazon S3 presigned URL for each employee. Configure the presigned URLs to have short expirations. Instruct employees to contact the company help desk to receive a presigned URL to access the S3 bucket.

B.

Create a group for Amazon S3 access in IAM Identity Center. Add the employees who require access to the S3 bucket to the group. Create an IAM policy to allow Amazon S3 access from the group. Instruct employees to use the AWS access portal to access the AWS Management Console and navigate to the S3 bucket.

C.

Create an Amazon S3 File Gateway. Create one share for data uploads and a second share for data downloads. Set up an SFTP service on an Amazon EC2 instance. Mount the shares to the EC2 instance. Instruct employees to use the SFTP server.

D.

Configure AWS Transfer Family SFTP endpoints. Select the custom identity provider option. Use AWS Secrets Manager to manage the user credentials. Instruct employees to use Transfer Family SFTP.

Buy Now
Questions 114

A company is creating an application. The company stores data from tests of the application in multiple on-premises locations.

The company needs to connect the on-premises locations to VPCs in an AWS Region in the AWS Cloud. The number of accounts and VPCs will increase during the next year. The network architecture must simplify the administration of new connections and must provide the ability to scale.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Create a peering connection between the VPCs. Create a VPN connection between the VPCs and the on-premises locations.

B.

Launch an Amazon EC2 instance. On the instance, include VPN software that uses a VPN connection to connect all VPCs and on-premises locations.

C.

Create a transit gateway. Create VPC attachments for the VPC connections. Create VPN attachments for the on-premises connections.

D.

Create an AWS Direct Connect connection between the on-premises locations and a central VPC. Connect the central VPC to other VPCs by using peering connections.
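To make the transit gateway hub-and-spoke pattern concrete, here is a minimal boto3 sketch that creates the gateway, attaches one VPC, and adds a VPN attachment through an existing customer gateway. All resource IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

tgw = ec2.create_transit_gateway(Description="hub for VPCs and on-premises sites")
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# One attachment per VPC; new VPCs are added the same way as accounts grow.
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],
)

# A VPN attachment per on-premises location, via an existing customer gateway.
ec2.create_vpn_connection(
    Type="ipsec.1",
    CustomerGatewayId="cgw-0123456789abcdef0",
    TransitGatewayId=tgw_id,
)
```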

Buy Now
Questions 115

How can DynamoDB data be made available for long-term analytics with minimal operational overhead?

Options:

A.

Configure DynamoDB incremental exports to S3.

B.

Configure DynamoDB Streams to write records to S3.

C.

Configure EMR to copy DynamoDB data to S3.

D.

Configure EMR to copy DynamoDB data to HDFS.
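As a sketch of the incremental-export approach, the call below exports only the changes within a time window to S3. It assumes point-in-time recovery is enabled on the table, which incremental exports require; the ARNs, bucket name, and window are placeholders.

```python
import boto3
from datetime import datetime, timezone

dynamodb = boto3.client("dynamodb")

# Export only the changes between the two timestamps to S3.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/AppData",
    S3Bucket="analytics-exports",
    ExportFormat="DYNAMODB_JSON",
    ExportType="INCREMENTAL",
    IncrementalExportSpecification={
        "ExportFromTime": datetime(2025, 1, 1, tzinfo=timezone.utc),
        "ExportToTime": datetime(2025, 1, 2, tzinfo=timezone.utc),
        "ExportViewType": "NEW_AND_OLD_IMAGES",
    },
)
```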

Buy Now
Questions 116

A company is using microservices to build an ecommerce application on AWS. The company wants to preserve customer transaction information after customers submit orders. The company wants to store transaction data in an Amazon Aurora database. The company expects sales volumes to vary throughout each year.

Which solution will meet these requirements?

Options:

A.

Use an Amazon API Gateway REST API to invoke an AWS Lambda function to send transaction data to the Aurora database. Send transaction data to an Amazon Simple Queue Service (Amazon SQS) queue that has a dead-letter queue. Use a second Lambda function to read from the SQS queue and to update the Aurora database.

B.

Use an Amazon API Gateway HTTP API to send transaction data to an Application Load Balancer (ALB). Use the ALB to send the transaction data to Amazon Elastic Container Service (Amazon ECS) on Amazon EC2. Use ECS tasks to store the data in the Aurora database.

C.

Use an Application Load Balancer (ALB) to route transaction data to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon EKS to send the data to the Aurora database.

D.

Use Amazon Data Firehose to send transaction data to Amazon S3. Use AWS Database Migration Service (AWS DMS) to migrate the data from Amazon S3 to the Aurora database.

Buy Now
Questions 117

A company wants to create an Amazon EMR cluster that multiple teams will use. The company wants to ensure that each team's big data workloads can access only the AWS services that each team needs to interact with. The company does not want the workloads to have access to Instance Metadata Service Version 2 (IMDSv2) on the cluster's underlying EC2 instances.

Which solution will meet these requirements?

Options:

A.

Configure interface VPC endpoints for each AWS service that the teams need. Use the required interface VPC endpoints to submit the big data workloads.

B.

Create EMR runtime roles. Configure the cluster to use the runtime roles. Use the runtime roles to submit the big data workloads.

C.

Create an EC2 IAM instance profile that has the required permissions for each team. Use the instance profile to submit the big data workloads.

D.

Create an EMR security configuration that sets the EnableApplicationScopedIAMRole option to false. Use the security configuration to submit the big data workloads.

Buy Now
Questions 118

A company needs to migrate a MySQL database from an on-premises data center to AWS within 2 weeks. The database is 180 TB in size. The company cannot partition the database.

The company wants to minimize downtime during the migration. The company's internet connection speed is 100 Mbps.

Which solution will meet these requirements?

Options:

A.

Order an AWS Snowball Edge Storage Optimized device. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes. Send the Snowball Edge device back to AWS to finish the migration. Continue to replicate ongoing changes.

B.

Establish an AWS Site-to-Site VPN connection between the data center and AWS. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes.

C.

Establish a 10 Gbps dedicated AWS Direct Connect connection between the data center and AWS. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

D.

Use the company's existing internet connection. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

Buy Now
Questions 119

A gaming company is developing a game that requires significant compute resources to process game logic, player interactions, and real-time updates. The company needs a compute solution that can dynamically scale based on fluctuating player demand while maintaining high performance. The company must use a relational database that can run complex queries.

Which solution will meet these requirements?

Options:

A.

Deploy Amazon EC2 instances to supply compute capacity. Configure Auto Scaling groups to achieve dynamic scaling based on player count. Use Amazon RDS for MySQL as the database.

B.

Refactor the game logic into small, stateless functions. Use AWS Lambda to process the game logic. Use Amazon DynamoDB as the database.

C.

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster on AWS Fargate to supply compute capacity. Scale the ECS tasks based on player demand. Use Amazon Aurora Serverless v2 as the database.

D.

Use AWS ParallelCluster for high performance computing (HPC). Provision compute nodes that have GPU instances to process the game logic and player interactions. Use Amazon RDS for MySQL as the database.

Buy Now
Questions 120

A company is planning to deploy a managed MySQL database solution for its non-production applications. The company plans to run the system for several years on AWS.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Create an Amazon RDS for MySQL instance. Purchase a Reserved Instance.

B.

Create an Amazon RDS for MySQL instance. Use the instance on an on-demand basis.

C.

Create an Amazon Aurora MySQL cluster with writer and reader nodes. Use the cluster on an on-demand basis.

D.

Create an Amazon EC2 instance. Manually install and configure MySQL Server on the instance.

Buy Now
Questions 121

A gaming company is building an application that uses a database to store user data. The company wants the database to have an active-active configuration that allows data writes to a secondary AWS Region. The database must achieve a sub-second recovery point objective (RPO).

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon ElastiCache (Redis OSS) cluster. Configure a global data store for disaster recovery. Configure the ElastiCache cluster to cache data from an Amazon RDS database that is deployed in the primary Region.

B.

Deploy an Amazon DynamoDB table in the primary Region and the secondary Region. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function to write changes from the table in the primary Region to the table in the secondary Region.

C.

Deploy an Amazon Aurora MySQL database in the primary Region. Configure a global database for the secondary Region.

D.

Deploy an Amazon DynamoDB table in the primary Region. Configure global tables for the secondary Region.

Buy Now
Questions 122

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

Options:

A.

Configure an S3 gateway endpoint.

B.

Create an S3 bucket in a private subnet.

C.

Create an S3 bucket in the same AWS Region as the EC2 instances.

D.

Configure a NAT gateway in the same subnet as the EC2 instances.
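For context, creating a gateway endpoint for S3 is a single API call that adds a route to the AWS network instead of the internet. A minimal boto3 sketch follows; the VPC ID, route table ID, and Region are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# A gateway endpoint keeps S3 traffic on the AWS network.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],  # route is added automatically
)
```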

Buy Now
Questions 123

A company generates approximately 20 GB of data multiple times each day. The company uses AWS DataSync to copy all data from on-premises storage to Amazon S3 every 6 hours for further processing.

The analytics team wants to modify the copy process to copy only data relevant to the analytics team and ignore the rest of the data. The team wants to copy data as soon as possible and receive a notification when the copy process is finished.

Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

Options:

A.

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create a custom script to upload the manifest file to an S3 bucket.

B.

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create an AWS Lambda function to load the manifest file data into an Amazon DynamoDB table.

C.

Create an AWS Lambda function that Amazon EventBridge invokes when the manifest file is loaded into Amazon DynamoDB. Configure the Lambda function to copy the data from on-premises storage to the S3 bucket by using the manifest file.

D.

Create an AWS Lambda function that an S3 Event Notification invokes when the manifest file is uploaded. Configure the Lambda function to invoke the DataSync task by calling the StartTaskExecution API action with a manifest.

E.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an Amazon EventBridge rule to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.

F.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.
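For illustration, starting a DataSync task restricted to a manifest of objects might look like the sketch below. This is hedged: the ManifestConfig shape follows the StartTaskExecution API as we understand it, and all ARNs and paths are placeholders.

```python
import boto3

datasync = boto3.client("datasync")

# Start the existing task, transferring only the objects listed in a
# CSV manifest stored in S3.
datasync.start_task_execution(
    TaskArn="arn:aws:datasync:us-east-1:123456789012:task/task-0123456789abcdef0",
    ManifestConfig={
        "Action": "TRANSFER",
        "Format": "CSV",
        "Source": {
            "S3": {
                "S3BucketArn": "arn:aws:s3:::analytics-manifests",
                "ManifestObjectPath": "manifests/latest.csv",
                "BucketAccessRoleArn": "arn:aws:iam::123456789012:role/datasync-manifest-read",
            }
        },
    },
)
```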

Buy Now
Questions 124

A company is developing a serverless, bidirectional chat application that can broadcast messages to connected clients. The application is based on AWS Lambda functions. The Lambda functions receive incoming messages in JSON format.

The company needs to provide a frontend component for the application.

Which solution will meet this requirement?

Options:

A.

Use an Amazon API Gateway HTTP API to direct incoming JSON messages to backend destinations.

B.

Use an Amazon API Gateway REST API that is configured with a Lambda proxy integration.

C.

Use an Amazon API Gateway WebSocket API to direct incoming JSON messages to backend destinations.

D.

Use an Amazon CloudFront distribution that is configured with a Lambda function URL as a custom origin.
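For reference, a WebSocket API backend pushes messages to connected clients through the API Gateway management API. Below is a minimal broadcast sketch; the endpoint URL is a placeholder, and connection IDs are assumed to have been stored (typically in a table) when clients connected.

```python
import boto3

# Management endpoint of the deployed WebSocket API (placeholder URL).
apigw = boto3.client(
    "apigatewaymanagementapi",
    endpoint_url="https://abc123.execute-api.us-east-1.amazonaws.com/prod",
)

def broadcast(connection_ids, message_json):
    """Send one JSON message to every connected client."""
    for connection_id in connection_ids:
        apigw.post_to_connection(
            ConnectionId=connection_id,
            Data=message_json.encode("utf-8"),
        )
```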

Buy Now
Questions 125

An internal product team is deploying a new application to a private VPC in a company's AWS account. The application runs on Amazon EC2 instances that are in a security group named App1. The EC2 instances store application data in an Amazon S3 bucket and use AWS Secrets Manager to store application service credentials. The company's security policy prohibits applications in a private VPC from using public IP addresses to communicate.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Configure gateway endpoints for Amazon S3 and AWS Secrets Manager.

B.

Configure interface VPC endpoints for Amazon S3 and AWS Secrets Manager.

C.

Add routes to the endpoints in the VPC route table.

D.

Associate the App1 security group with the interface VPC endpoints. Configure a self-referencing security group rule to allow inbound traffic.

E.

Associate the App1 security group with the gateway endpoints. Configure a self-referencing security group rule to allow inbound traffic.
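To make the interface-endpoint pattern concrete, here is a minimal boto3 sketch that creates a Secrets Manager interface endpoint associated with the App1 security group and adds a self-referencing ingress rule (an S3 gateway endpoint would be created separately). All IDs are placeholders.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
APP1_SG = "sg-0123456789abcdef0"  # placeholder for the App1 security group ID

# Interface endpoint for Secrets Manager inside the private VPC.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.secretsmanager",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=[APP1_SG],
    PrivateDnsEnabled=True,
)

# Self-referencing rule: members of App1 can reach the endpoint ENIs on 443.
ec2.authorize_security_group_ingress(
    GroupId=APP1_SG,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "UserIdGroupPairs": [{"GroupId": APP1_SG}],
    }],
)
```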

Buy Now
Questions 126

A solutions architect is creating a data reporting application that will send traffic through third-party network firewalls in an AWS security account. The firewalls and application servers must be load balanced.

The application uses TCP connections to generate reports. The reports can run for several hours and can be idle for up to 1 hour. The reports must not time out during an idle period.

Which solution will meet these requirements?

Options:

A.

Use a Gateway Load Balancer (GWLB) for the firewalls. Use an Application Load Balancer (ALB) for the application servers. Set the ALB idle timeout period to 1 hour.

B.

Use a single firewall in the security account. Use an Application Load Balancer (ALB) for the application servers. Set the ALB idle timeout and firewall idle timeout periods to 1 hour.

C.

Use a Gateway Load Balancer (GWLB) for the firewalls. Use an Application Load Balancer (ALB) for the application servers. Set the idle timeout periods for the ALB, the GWLB, and the firewalls to 1 hour.

D.

Use a Gateway Load Balancer (GWLB) for the firewalls. Use an Application Load Balancer (ALB) for the application servers. Configure the ALB idle timeout period to 1 hour. Increase the application server capacity to finish the report generation faster.

Buy Now
Questions 127

A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions granted to IAM users to determine which users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.

Use Network Access Analyzer to review all access permissions in the company's AWS accounts.

B.

Create an AWS CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.

C.

Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company's resources and accounts.

D.

Use Amazon Inspector to find vulnerabilities in existing IAM policies.

Buy Now
Questions 128

A company runs multiple web applications on Amazon EC2 instances behind a single Application Load Balancer (ALB). The applications experience unpredictable traffic spikes throughout each day. The traffic spikes cause high latency and last less than 3 hours.

The company needs a solution to resolve the latency issue that the traffic spikes cause.

Which solution will meet these requirements?

Options:

A.

Use EC2 instances in an Auto Scaling group. Configure the ALB and Auto Scaling group to use a target tracking scaling policy.

B.

Use EC2 Reserved Instances in an Auto Scaling group. Configure the Auto Scaling group to use a scheduled scaling policy based on peak traffic hours.

C.

Use EC2 Spot Instances in an Auto Scaling group. Configure the Auto Scaling group to use a scheduled scaling policy based on peak traffic hours.

D.

Use EC2 Reserved Instances in an Auto Scaling group. Replace the ALB with a Network Load Balancer (NLB).
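For reference, a target tracking policy scales the group automatically around a metric target, which suits unpredictable spikes better than a schedule. A minimal boto3 sketch follows; the group name and target value are placeholders.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Keep average CPU near 50%; the group scales out and in automatically.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```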

Buy Now
Questions 129

A company plans to run a high performance computing (HPC) workload on Amazon EC2 instances. The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication.

Which solution will meet these requirements?

Options:

A.

Configure the EC2 instances to be part of a cluster placement group.

B.

Launch the EC2 instances with Dedicated Instance tenancy.

C.

Launch the EC2 instances as Spot Instances.

D.

Configure an On-Demand Capacity Reservation when the EC2 instances are launched.
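To show what a cluster placement group looks like in practice, here is a minimal boto3 sketch that creates the group and launches instances into it. The AMI ID and instance type are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Cluster strategy packs instances close together for low-latency,
# high-throughput node-to-node traffic.
ec2.create_placement_group(GroupName="hpc-cluster", Strategy="cluster")

ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="c5n.18xlarge",
    MinCount=4,
    MaxCount=4,
    Placement={"GroupName": "hpc-cluster"},
)
```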

Buy Now
Questions 130

A company hosts a public web application on AWS. The website has a three-tier architecture. The frontend web tier is composed of Amazon EC2 instances in an Auto Scaling group. The application tier is a second Auto Scaling group. The database tier is an Amazon RDS database.

The company has configured the Auto Scaling groups to handle the application's normal level of demand. During an unexpected spike in demand, the company notices a long delay in the startup time when the frontend and application layers scale out. The company needs to improve the scaling performance of the application without negatively affecting the user experience.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.

Decrease the minimum number of EC2 instances for both Auto Scaling groups. Increase the desired number of instances to meet the peak demand requirement.

B.

Configure the maximum number of instances for both Auto Scaling groups to be the number required to meet the peak demand. Create a warm pool.

C.

Increase the maximum number of EC2 instances for both Auto Scaling groups to meet the normal demand requirement. Create a warm pool.

D.

Reconfigure both Auto Scaling groups to use a scheduled scaling policy. Increase the size of the EC2 instance types and the RDS instance types.
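For context, a warm pool keeps pre-initialized instances on standby so that scale-out does not pay the full boot and bootstrap time. A minimal boto3 sketch follows; the group name and capacity values are placeholders.

```python
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_warm_pool(
    AutoScalingGroupName="frontend-asg",
    PoolState="Stopped",          # stopped instances keep warm-pool costs low
    MinSize=2,
    MaxGroupPreparedCapacity=10,  # warm capacity up to the expected peak
)
```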

Buy Now
Questions 131

A company hosts a database that runs on an Amazon RDS instance deployed to multiple Availability Zones. A periodic script negatively affects a critical application by querying the database.

How can the company improve application performance at minimal cost?

Options:

A.

Add functionality to the script to identify the instance with the fewest active connections and query that instance.

B.

Create a read replica of the database. Configure the script to query only the read replica.

C.

Instruct the development team to manually export new entries at the end of the day.

D.

Use Amazon ElastiCache to cache the common queries the script runs.
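As a sketch, offloading the script to a read replica is one API call plus pointing the script at the replica endpoint. The identifiers and instance class below are placeholders.

```python
import boto3

rds = boto3.client("rds")

# Create a replica for the periodic script to query, keeping the reporting
# load off the primary instance.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="app-db-replica",
    SourceDBInstanceIdentifier="app-db",
    DBInstanceClass="db.t3.medium",
)
```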

Buy Now
Questions 132

A company has an e-commerce site. The site is designed as a distributed web application hosted in multiple AWS accounts under one AWS Organizations organization. The web application is composed of multiple microservices. All microservices expose their services either through Amazon CloudFront distributions or public Application Load Balancers (ALBs).

The company wants to protect public endpoints from malicious attacks and monitor security configurations.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage rules in AWS WAF. Use AWS Config rules to monitor the Regional and global WAF configurations.

B.

Use AWS WAF to protect the public endpoints. Apply AWS WAF rules in each account. Use AWS Config rules and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.

C.

Use AWS WAF to protect the public endpoints. Use AWS Firewall Manager from a dedicated security account to manage the rules in AWS WAF. Use Amazon Inspector and AWS Security Hub to monitor the WAF configurations of the ALBs and the CloudFront distributions.

D.

Use AWS Shield Advanced to protect the public endpoints. Use AWS Config rules to monitor the Shield Advanced configuration for each account.

Buy Now
Questions 133

A company is building a critical data processing application that will run on Amazon EC2 instances. The company must not run any two nodes on the same underlying hardware. The company requires at least 99.99% availability for the application.

Which solution will meet these requirements?

Options:

A.

Deploy the application to one Availability Zone by using a cluster placement group strategy.

B.

Deploy the application to three Availability Zones by using a spread placement group strategy.

C.

Deploy the application to three Availability Zones by using a cluster placement group strategy.

D.

Deploy the application to one Availability Zone by using a partition placement group strategy.

Buy Now
Questions 134

A company is building a mobile gaming app. The company wants to serve users from around the world with low latency. The company needs a scalable solution to host the application and to route user requests to the location that is nearest to each user.

Which solution will meet these requirements?

Options:

A.

Use an Application Load Balancer to route requests to Amazon EC2 instances that are deployed across multiple Availability Zones.

B.

Use a Regional Amazon API Gateway REST API to route requests to AWS Lambda functions.

C.

Use an edge-optimized Amazon API Gateway REST API to route requests to AWS Lambda functions.

D.

Use an Application Load Balancer to route requests to containers in an Amazon ECS cluster.

Buy Now
Questions 135

An online gaming company hosts its platform on Amazon EC2 instances behind Network Load Balancers (NLBs) across multiple AWS Regions. The NLBs can route requests to targets over the internet. The company wants to improve the customer playing experience by reducing end-to-end load time for its global customer base.

Which solution will meet these requirements?

Options:

A.

Create Application Load Balancers (ALBs) in each Region to replace the existing NLBs. Register the existing EC2 instances as targets for the ALBs in each Region.

B.

Configure Amazon Route 53 to route equally weighted traffic to the NLBs in each Region.

C.

Create additional NLBs and EC2 instances in other Regions where the company has large customer bases.

D.

Create a standard accelerator in AWS Global Accelerator. Configure the existing NLBs as target endpoints.

Buy Now
Questions 136

A company runs an application as a task in an Amazon Elastic Container Service (Amazon ECS) cluster. The application must have read and write access to a specific group of Amazon S3 buckets. The S3 buckets are in the same AWS Region and AWS account as the ECS cluster. The company needs to grant the application access to the S3 buckets according to the principle of least privilege.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.

Add a tag to each bucket. Create an IAM policy that includes a StringEquals condition that matches the tags and values of the buckets.

B.

Create an IAM policy that lists the full Amazon Resource Name (ARN) for each S3 bucket.

C.

Attach the IAM policy to the instance role of the ECS task.

D.

Create an IAM policy that includes a wildcard Amazon Resource Name (ARN) that matches all combinations of the S3 bucket names.

E.

Attach the IAM policy to the task role of the ECS task.
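To illustrate the least-privilege shape, here is a sketch of a policy that lists each bucket ARN explicitly instead of a wildcard; it would be attached to the ECS task role, not the instance role. Bucket names and actions are placeholders.

```python
import json

# Least-privilege policy sketch for the ECS task role.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": [
            "arn:aws:s3:::team-bucket-1/*",
            "arn:aws:s3:::team-bucket-2/*",
        ],
    }],
}
print(json.dumps(policy, indent=2))
```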

Buy Now
Questions 137

A company wants to publish a private website for its on-premises employees. The website consists of several HTML pages and image files. The website must be available only through HTTPS and must be available only to on-premises employees. A solutions architect plans to store the website files in an Amazon S3 bucket.

Which solution will meet these requirements?

Options:

A.

Create an S3 bucket policy to deny access when the source IP address is not the public IP address of the on-premises environment. Set up an Amazon Route 53 alias record to point to the S3 bucket. Provide the alias record to the on-premises employees to grant the employees access to the website.

B.

Create an S3 access point to provide website access. Attach an access point policy to deny access when the source IP address is not the public IP address of the on-premises environment. Provide the S3 access point alias to the on-premises employees to grant the employees access to the website.

C.

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Use AWS Certificate Manager for SSL. Use AWS WAF with an IP set rule that allows access for the on-premises IP address. Set up an Amazon Route 53 alias record to point to the CloudFront distribution.

D.

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Create a CloudFront signed URL for the objects in the bucket. Set up an Amazon Route 53 alias record to point to the CloudFront distribution. Provide the signed URL to the on-premises employees to grant the employees access to the website.

Buy Now
Questions 138

A retail company runs its application on AWS. The application uses Amazon EC2 for web servers, Amazon RDS for database services, and Amazon CloudFront for global content distribution.

The company needs a solution to mitigate DDoS attacks.

Which solution will meet this requirement?

Options:

A.

Implement AWS WAF custom rules to limit the length of query requests. Configure CloudFront to work with AWS WAF.

B.

Enable AWS Shield Advanced. Configure CloudFront to work with Shield Advanced.

C.

Use Amazon Inspector to scan the EC2 instances. Enable Amazon GuardDuty.

D.

Enable Amazon Macie. Configure CloudFront Origin Shield.

Buy Now
Questions 139

A company receives data transfers from a small number of external clients that use SFTP software on an Amazon EC2 instance. The clients use an SFTP client to upload data. The clients use SSH keys for authentication. Every hour, an automated script transfers new uploads to an Amazon S3 bucket for processing.

The company wants to move the transfer process to an AWS managed service and to reduce the time required to start data processing. The company wants to retain the existing user management and SSH key generation process. The solution must not require clients to make significant changes to their existing processes.

Which solution will meet these requirements?

Options:

A.

Reconfigure the script that runs on the EC2 instance to run every 15 minutes. Create an S3 Event Notifications rule for all new object creation events. Set an Amazon Simple Notification Service (Amazon SNS) topic as the destination.

B.

Create an AWS Transfer Family SFTP server that uses the existing S3 bucket as a target. Use service-managed users to enable authentication.

C.

Require clients to add the AWS DataSync agent into their local environments. Create an IAM user for each client that has permission to upload data to the target S3 bucket.

D.

Create an AWS Transfer Family SFTP connector that has permission to access the target S3 bucket for each client. Store credentials in AWS Systems Manager. Create an IAM role to allow the SFTP connector to securely use the credentials.
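For reference, an AWS Transfer Family SFTP server with service-managed users lets clients keep their SSH keys while uploads land directly in S3. A minimal boto3 sketch follows; the role ARN, bucket path, user name, and key material are placeholders.

```python
import boto3

transfer = boto3.client("transfer")

# Managed SFTP endpoint that writes directly into S3.
server = transfer.create_server(
    Protocols=["SFTP"],
    IdentityProviderType="SERVICE_MANAGED",
    Domain="S3",
)

# One user per client, reusing the client's existing SSH public key.
transfer.create_user(
    ServerId=server["ServerId"],
    UserName="client-a",
    Role="arn:aws:iam::123456789012:role/transfer-s3-access",
    HomeDirectory="/upload-bucket/client-a",
    SshPublicKeyBody="ssh-rsa AAAA... client-a",
)
```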

Buy Now
Questions 140

A company has a single AWS account that contains resources belonging to several teams. The company needs to identify the costs associated with each team. The company wants to use a tag named CostCenter to identify resources that belong to each team.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.

Tag all resources that belong to each team with the user-defined CostCenter tag.

B.

Create a tag for each team, and set the value to CostCenter.

C.

Activate the CostCenter tag to track cost allocation.

D.

Configure AWS Billing and Cost Management to send monthly invoices to the company through email messages.

E.

Set up consolidated billing in the existing AWS account.
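As a sketch of the activation step: after resources are tagged, the tag still has to be activated for cost allocation before it appears in billing reports. The call below uses the Cost Explorer API as we understand it; activation can take up to 24 hours to take effect.

```python
import boto3

ce = boto3.client("ce")

# Activate the user-defined CostCenter tag for cost allocation.
ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[{"TagKey": "CostCenter", "Status": "Active"}]
)
```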

Buy Now
Exam Code: SAA-C03
Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
Last Update: Sep 30, 2025
Questions: 467