★ Pass on Your First TRY ★ 100% Money Back Guarantee ★ Realistic Practice Exam Questions

Free Instant Download NEW SAA-C03 Exam Dumps (PDF & VCE):
Available on: https://www.certleader.com/SAA-C03-dumps.html


Master the SAA-C03 AWS Certified Solutions Architect - Associate (SAA-C03) content and be ready for exam day success quickly with this Pass4sure SAA-C03 test question. We guarantee it! We make it a reality and give you real SAA-C03 questions in our Amazon-Web-Services SAA-C03 braindumps. The latest 100% VALID Amazon-Web-Services SAA-C03 exam question dumps are at the page below. You can use our Amazon-Web-Services SAA-C03 braindumps and pass your exam.

Also have SAA-C03 free dumps questions for you:

NEW QUESTION 1
A company uses an Amazon Aurora PostgreSQL DB cluster to store its critical data in the us-east-1 Region. The company wants to develop a disaster recovery plan to recover the database in the us-west-1 Region. The company has a recovery time objective (RTO) of 5 minutes and a recovery point objective (RPO) of 1 minute.
What should a solutions architect do to meet these requirements?

  • A. Create a read replica in us-west-1. Set the DB cluster to automatically fail over to the read replica if the primary instance is not responding.
  • B. Create an Aurora global database. Set us-west-1 as the secondary Region. Update connections to use the writer and reader endpoints as appropriate.
  • C. Set up a second Aurora DB cluster in us-west-1. Use logical replication to keep the databases synchronized. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to change the database endpoint if the primary DB cluster does not respond.
  • D. Use Aurora automated snapshots to store data in an Amazon S3 bucket. Enable S3 Versioning. Configure S3 Cross-Region Replication to us-west-1. Create a second Aurora DB cluster in us-west-1. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to restore the snapshot if the primary DB cluster does not respond.

Answer: B
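
An Aurora global database replicates storage-level changes to the secondary Region with typical lag of under a second, which is what makes the 1-minute RPO and 5-minute RTO achievable. A minimal boto3 sketch of the setup is below; the cluster identifiers, account ID, instance class, and secondary Region layout are illustrative assumptions, not values from the question.

import boto3

# Promote the existing us-east-1 cluster into a global database (identifiers are hypothetical).
rds_east = boto3.client("rds", region_name="us-east-1")
rds_east.create_global_cluster(
    GlobalClusterIdentifier="critical-data-global",
    SourceDBClusterIdentifier="arn:aws:rds:us-east-1:111122223333:cluster:critical-data",
)

# Add a secondary cluster and a reader instance in us-west-1.
rds_west = boto3.client("rds", region_name="us-west-1")
rds_west.create_db_cluster(
    DBClusterIdentifier="critical-data-us-west-1",
    Engine="aurora-postgresql",
    GlobalClusterIdentifier="critical-data-global",
)
rds_west.create_db_instance(
    DBInstanceIdentifier="critical-data-us-west-1-instance-1",
    DBInstanceClass="db.r6g.large",
    Engine="aurora-postgresql",
    DBClusterIdentifier="critical-data-us-west-1",
)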

NEW QUESTION 2
A solutions architect is designing a two-tier web application The application consists of a public-facing web tier hosted on Amazon EC2 in public subnets The database tier consists of Microsoft SQL Server running on Amazon EC2 in a private subnet Security is a high priority for the company
How should security groups be configured in this situation? (Select TWO )

  • A. Configure the security group for the web tier to allow inbound traffic on port 443 from 0.0.0.0/0.
  • B. Configure the security group for the web tier to allow outbound traffic on port 443 from 0.0.0.0/0.
  • C. Configure the security group for the database tier to allow inbound traffic on port 1433 from the security group for the web tier.
  • D. Configure the security group for the database tier to allow outbound traffic on ports 443 and 1433 to the security group for the web tier.
  • E. Configure the security group for the database tier to allow inbound traffic on ports 443 and 1433 from the security group for the web tier.

Answer: AC

Explanation:
"Security groups create an outbound rule for every inbound rule." Not completely right. Statefull does NOT mean that if you create an inbound (or outbound) rule, it will create an outbound (or inbound) rule. What it does mean is: suppose you create an inbound rule on port 443 for the X ip. When a request enters on port 443 from X ip, it will allow traffic out for that request in the port 443. However, if you look at the outbound rules, there will not be any outbound rule on port 443 unless explicitly create it. In ACLs, which are stateless, you would have to create an inbound rule to allow incoming requests and an outbound rule to allow your application responds to those incoming requests.
https://docs.aws.amazon.com/vpc/latest/userguide/VPC_SecurityGroups.html#SecurityGroupRules
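
A minimal boto3 sketch of the two selected rules is below; the security group IDs are placeholders, not values from the question.

import boto3

ec2 = boto3.client("ec2")
web_sg_id = "sg-0aaa1111bbbb22222"   # web tier security group (hypothetical ID)
db_sg_id = "sg-0ccc3333dddd44444"    # database tier security group (hypothetical ID)

# Option A: allow HTTPS from anywhere into the web tier.
ec2.authorize_security_group_ingress(
    GroupId=web_sg_id,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0"}],
    }],
)

# Option C: allow SQL Server traffic into the database tier only from the web tier's security group.
ec2.authorize_security_group_ingress(
    GroupId=db_sg_id,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 1433, "ToPort": 1433,
        "UserIdGroupPairs": [{"GroupId": web_sg_id}],
    }],
)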

NEW QUESTION 3
A gaming company wants to launch a new internet-facing application in multiple AWS Regions. The application will use the TCP and UDP protocols for communication. The company needs to provide high availability and minimum latency for global users.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

  • A. Create internal Network Load Balancers in front of the application in each Region
  • B. Create external Application Load Balancers in front of the application in each Region
  • C. Create an AWS Global Accelerator accelerator to route traffic to the load balancers in each Region
  • D. Configure Amazon Route 53 to use a geolocation routing policy to distribute the traffic
  • E. Configure Amazon CloudFront to handle the traffic and route requests to the application in each Region

Answer: AC
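
Global Accelerator fits here because it supports TCP and UDP and gives users two static anycast IP addresses that route to the closest healthy Regional endpoint. A rough boto3 sketch is below; the accelerator name, game port, Regions, and load balancer ARNs are assumptions for illustration, and note that the Global Accelerator API itself must be called in us-west-2.

import boto3

# The Global Accelerator control plane lives in us-west-2.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

acc = ga.create_accelerator(Name="game-accelerator", IpAddressType="IPV4", Enabled=True)
acc_arn = acc["Accelerator"]["AcceleratorArn"]

# One listener per protocol the game uses (port 7777 is a hypothetical game port).
for protocol in ("TCP", "UDP"):
    listener = ga.create_listener(
        AcceleratorArn=acc_arn,
        Protocol=protocol,
        PortRanges=[{"FromPort": 7777, "ToPort": 7777}],
    )
    # Point each listener at the Network Load Balancer in every Region (ARNs are placeholders).
    for region, nlb_arn in {
        "us-east-1": "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/net/game/abc",
        "eu-west-1": "arn:aws:elasticloadbalancing:eu-west-1:111122223333:loadbalancer/net/game/def",
    }.items():
        ga.create_endpoint_group(
            ListenerArn=listener["Listener"]["ListenerArn"],
            EndpointGroupRegion=region,
            EndpointConfigurations=[{"EndpointId": nlb_arn, "Weight": 128}],
        )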

NEW QUESTION 4
A company's web application consists of an Amazon API Gateway API in front of an AWS Lambda function and an Amazon DynamoDB database. The Lambda function handles the business logic, and the DynamoDB table hosts the data. The application uses Amazon Cognito user pools to identify the individual users of the application. A solutions architect needs to update the application so that only users who have a subscription can access premium content.
Which solution will meet this requirement?

  • A. Enable API caching and throttling on the API Gateway API
  • B. Set up AWS WAF on the API Gateway API. Create a rule to filter users who have a subscription.
  • C. Apply fine-grained IAM permissions to the premium content in the DynamoDB table.
  • D. Implement API usage plans and API keys to limit the access of users who do not have a subscription.

Answer: C
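
Fine-grained access control for DynamoDB is usually expressed as an IAM policy condition on the item's partition key. The sketch below shows the general shape of such a policy for a Cognito-federated identity; the table name, key design, and the idea that premium items are keyed per user are assumptions for illustration, not details from the question.

import json

# Hypothetical policy: the caller may only read items whose partition key equals
# their own Cognito identity ID.
fine_grained_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query"],
        "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/PremiumContent",
        "Condition": {
            "ForAllValues:StringEquals": {
                "dynamodb:LeadingKeys": ["${cognito-identity.amazonaws.com:sub}"]
            }
        },
    }],
}

print(json.dumps(fine_grained_policy, indent=2))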

NEW QUESTION 5
A company is migrating a distributed application to AWS. The application serves variable workloads. The legacy platform consists of a primary server that coordinates jobs across multiple compute nodes. The company wants to modernize the application with a solution that maximizes resiliency and scalability.
How should a solutions architect design the architecture to meet these requirements?

  • A. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling to use scheduled scaling.
  • B. Configure an Amazon Simple Queue Service (Amazon SQS) queue as a destination for the jobs. Implement the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure EC2 Auto Scaling based on the size of the queue.
  • C. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure AWS CloudTrail as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the primary server.
  • D. Implement the primary server and the compute nodes with Amazon EC2 instances that are managed in an Auto Scaling group. Configure Amazon EventBridge (Amazon CloudWatch Events) as a destination for the jobs. Configure EC2 Auto Scaling based on the load on the compute nodes.

Answer: B
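
Scaling the worker fleet on queue depth removes the primary server as a single point of failure and lets capacity follow the variable workload. One way to wire this up is a target tracking policy on the SQS ApproximateNumberOfMessagesVisible metric, sketched below; the queue name, Auto Scaling group name, and target value are assumptions.

import boto3

autoscaling = boto3.client("autoscaling")

# Keep roughly 100 visible messages in the queue by scaling the worker group in and out.
# (Group name, queue name, and target are hypothetical.)
autoscaling.put_scaling_policy(
    AutoScalingGroupName="job-workers",
    PolicyName="scale-on-queue-depth",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "MetricName": "ApproximateNumberOfMessagesVisible",
            "Namespace": "AWS/SQS",
            "Dimensions": [{"Name": "QueueName", "Value": "job-queue"}],
            "Statistic": "Average",
        },
        "TargetValue": 100.0,
    },
)

AWS's documented refinement of this pattern scales on a backlog-per-instance custom metric instead, but the mechanism is the same.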

NEW QUESTION 6
A hospital wants to create digital copies for its large collection of historical written records. The hospital will continue to add hundreds of new documents each day. The hospital's data team will scan the documents and will upload the documents to the AWS Cloud.
A solutions architect must implement a solution to analyze the documents, extract the medical information, and store the documents so that an application can run SQL queries on the data. The solution must maximize scalability and operational efficiency.
Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

  • A. Write the document information to an Amazon EC2 instance that runs a MySQL database.
  • B. Write the document information to an Amazon S3 bucket. Use Amazon Athena to query the data.
  • C. Create an Auto Scaling group of Amazon EC2 instances to run a custom application that processes the scanned files and extracts the medical information.
  • D. Create an AWS Lambda function that runs when new documents are uploaded. Use Amazon Rekognition to convert the documents to raw text. Use Amazon Transcribe Medical to detect and extract relevant medical information from the text.
  • E. Create an AWS Lambda function that runs when new documents are uploaded. Use Amazon Textract to convert the documents to raw text. Use Amazon Comprehend Medical to detect and extract relevant medical information from the text.

Answer: AE
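
Option E describes an event-driven OCR pipeline: Textract turns the scanned image into text and Comprehend Medical pulls out the medical entities. A minimal Lambda handler along those lines is sketched below; the S3 trigger wiring, the output handling, and the use of the synchronous Textract call are assumptions for illustration (large multi-page scans would use the asynchronous StartDocumentTextDetection flow).

import boto3

textract = boto3.client("textract")
comprehend_medical = boto3.client("comprehendmedical")

def handler(event, context):
    """Triggered by S3 object-created events for newly scanned documents."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # OCR the scanned document into raw text (synchronous call; fine for single-page scans).
        ocr = textract.detect_document_text(
            Document={"S3Object": {"Bucket": bucket, "Name": key}}
        )
        text = "\n".join(b["Text"] for b in ocr["Blocks"] if b["BlockType"] == "LINE")

        # Pull out medications, conditions, test results, and other medical entities.
        entities = comprehend_medical.detect_entities_v2(Text=text)["Entities"]

        # In the full solution these records would be written to S3 for Athena to query (option B).
        print(key, [(e["Type"], e["Text"]) for e in entities])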

NEW QUESTION 7
A company has a production workload that runs on 1,000 Amazon EC2 Linux instances. The workload is powered by third-party software. The company needs to patch the third-party software on all EC2 instances as quickly as possible to remediate a critical security vulnerability.
What should a solutions architect do to meet these requirements?

  • A. Create an AWS Lambda function to apply the patch to all EC2 instances.
  • B. Configure AWS Systems Manager Patch Manager to apply the patch to all EC2 instances.
  • C. Schedule an AWS Systems Manager maintenance window to apply the patch to all EC2 instances.
  • D. Use AWS Systems Manager Run Command to run a custom command that applies the patch to all EC2 instances.

Answer: D
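
Run Command is the fastest path because it pushes an arbitrary command to every managed instance at once, without waiting for a maintenance window or building out custom Lambda plumbing to reach 1,000 hosts. A sketch is below; the tag filter, patch command, and concurrency settings are assumptions.

import boto3

ssm = boto3.client("ssm")

# Push the vendor's patch command to every managed instance carrying a hypothetical tag.
response = ssm.send_command(
    Targets=[{"Key": "tag:Workload", "Values": ["production"]}],
    DocumentName="AWS-RunShellScript",
    Parameters={"commands": ["sudo /opt/thirdparty/bin/apply-security-patch.sh"]},
    MaxConcurrency="10%",   # roll through the fleet in waves
    MaxErrors="1%",         # stop early if patching starts failing
    Comment="Critical third-party security patch",
)
print(response["Command"]["CommandId"])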

NEW QUESTION 8
A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability.
The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes.
A solutions architect must recommend a replacement architecture that alleviates the application latency issue. The replacement architecture also must give the development team the ability to continue using the staging environment without delay.
Which solution meets these requirements?

  • A. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
  • B. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on-demand.
  • C. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.
  • D. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.

Answer: B
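
Aurora cloning is copy-on-write against the shared cluster volume, so creating the staging copy takes minutes and puts no read load on production. A boto3 sketch is below; the cluster identifiers and instance class are placeholders.

import boto3

rds = boto3.client("rds")

# Clone the production cluster copy-on-write at its latest restorable time (identifiers are hypothetical).
rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="staging-clone",
    SourceDBClusterIdentifier="production-aurora-mysql",
    RestoreType="copy-on-write",
    UseLatestRestorableTime=True,
)

# A clone still needs at least one DB instance before it can accept connections.
rds.create_db_instance(
    DBInstanceIdentifier="staging-clone-instance-1",
    DBClusterIdentifier="staging-clone",
    DBInstanceClass="db.r6g.large",
    Engine="aurora-mysql",
)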

NEW QUESTION 9
A company has more than 5 TB of file data on Windows file servers that run on premises. Users and applications interact with the data each day.
The company is moving its Windows workloads to AWS. As the company continues this process, the company requires access to AWS and on-premises file storage with minimum latency. The company needs a solution that minimizes operational overhead and requires no significant changes to the existing file access patterns. The company uses an AWS Site-to-Site VPN connection for connectivity to AWS.
What should a solutions architect do to meet these requirements?

  • A. Deploy and configure Amazon FSx for Windows File Server on AWS. Move the on-premises file data to FSx for Windows File Server. Reconfigure the workloads to use FSx for Windows File Server on AWS.
  • B. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to the S3 File Gateway. Reconfigure the on-premises workloads and the cloud workloads to use the S3 File Gateway.
  • C. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to Amazon S3. Reconfigure the workloads to use either Amazon S3 directly or the S3 File Gateway, depending on each workload's location.
  • D. Deploy and configure Amazon FSx for Windows File Server on AWS. Deploy and configure an Amazon FSx File Gateway on premises. Move the on-premises file data to the FSx File Gateway. Configure the cloud workloads to use FSx for Windows File Server on AWS. Configure the on-premises workloads to use the FSx File Gateway.

Answer: D

NEW QUESTION 10
A company is developing an internal application that uses a PostgreSQL database. The company has decided to host the database on Amazon Aurora. The application does not need to be highly available, but data must be stored in multiple Availability Zones to maximize durability.
Which database configuration meets these requirements MOST cost-effectively?

  • A. An Aurora PostgreSQL DB cluster with a single DB Instance
  • B. An Aurora PostgreSQL DB cluster with a primary DB instance and a read replica
  • C. An Aurora PostgreSQL DB cluster with Multi-AZ deployment enabled
  • D. An Aurora PostgreSQL global database cluster

Answer: B

NEW QUESTION 11
A company is designing a new web application that the company will deploy into a single AWS Region. The application requires a two-tier architecture that will include Amazon EC2 instances and an Amazon RDS DB instance. A solutions architect needs to design the application so that all components are highly available.
Which solution will meet these requirements?

  • A. Deploy EC2 instances in an additional Region. Create a DB instance with the Multi-AZ option activated.
  • B. Deploy all EC2 instances in the same Region and the same Availability Zone. Create a DB instance with the Multi-AZ option activated.
  • C. Deploy the EC2 instances across at least two Availability Zones within the same Region. Create a DB instance in a single Availability Zone.
  • D. Deploy the EC2 instances across at least two Availability Zones within the same Region. Create a DB instance with the Multi-AZ option activated.

Answer: D
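
High availability here means spreading the EC2 tier across at least two Availability Zones and enabling Multi-AZ on the RDS instance so a standby is kept in a second AZ. A boto3 sketch of both pieces is below; the subnet IDs, launch template, instance sizes, database engine, and credentials handling are assumptions, not details from the question.

import boto3

autoscaling = boto3.client("autoscaling")
rds = boto3.client("rds")

# Web tier: an Auto Scaling group that spans two Availability Zones (subnet IDs are placeholders).
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-tier",
    LaunchTemplate={"LaunchTemplateName": "web-tier-template", "Version": "$Latest"},
    MinSize=2,
    MaxSize=6,
    VPCZoneIdentifier="subnet-0aaa1111,subnet-0bbb2222",
)

# Data tier: a Multi-AZ RDS instance; the standby in the second AZ enables automatic failover.
rds.create_db_instance(
    DBInstanceIdentifier="app-db",
    Engine="mysql",
    DBInstanceClass="db.m6g.large",
    AllocatedStorage=100,
    MultiAZ=True,
    MasterUsername="admin",
    ManageMasterUserPassword=True,   # let Secrets Manager handle the password
)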

NEW QUESTION 12
A company runs an on-premises application that is powered by a MySQL database. The company is migrating the application to AWS to increase the application's elasticity and availability.
The current architecture shows heavy read activity on the database during times of normal operation. Every 4 hours, the company's development team pulls a full export of the production database to populate a database in the staging environment. During this period, users experience unacceptable application latency. The development team is unable to use the staging environment until the procedure completes.
A solutions architect must recommend a replacement architecture that alleviates the application latency issue. The replacement architecture also must give the development team the ability to continue using the staging environment without delay.
Which solution meets these requirements?

  • A. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.
  • B. Use Amazon Aurora MySQL with Multi-AZ Aurora Replicas for production. Use database cloning to create the staging database on-demand.
  • C. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Use the standby instance for the staging database.
  • D. Use Amazon RDS for MySQL with a Multi-AZ deployment and read replicas for production. Populate the staging database by implementing a backup and restore process that uses the mysqldump utility.

Answer: B

NEW QUESTION 13
A company runs an application on Amazon EC2 instances in an Auto Scaling group. The application uses an Amazon Aurora PostgreSQL database that is deployed in a single Availability Zone. The company wants the application to be highly available with minimum downtime and minimum loss of data.
Which solution will meet these requirements with the LEAST operational effort?

  • A. Place the EC2 instances in different AWS Regions. Use Amazon Route 53 health checks to redirect traffic. Use Aurora PostgreSQL Cross-Region Replication.
  • B. Configure the Auto Scaling group to use multiple Availability Zones. Configure the database as Multi-AZ. Configure an Amazon RDS Proxy instance for the database.
  • C. Configure the Auto Scaling group to use one Availability Zone. Generate hourly snapshots of the database. Recover the database from the snapshots in the event of a failure.
  • D. Configure the Auto Scaling group to use multiple AWS Regions. Write the data from the application to Amazon S3. Use S3 Event Notifications to launch an AWS Lambda function to write the data to the database.

Answer: B

NEW QUESTION 14
A company has thousands of edge devices that collectively generate 1 TB of status alerts each day.
Each alert is approximately 2 KB in size. A solutions architect needs to implement a solution to ingest and store the alerts for future analysis.
The company wants a highly available solution. However, the company needs to minimize costs and does not want to manage additional infrastructure. Additionally, the company wants to keep 14 days of data available for immediate analysis and archive any data older than 14 days.
What is the MOST operationally efficient solution that meets these requirements?

  • A. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.
  • B. Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier after 14 days.
  • C. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream to deliver the alerts to an Amazon Elasticsearch Service (Amazon ES) cluster. Set up the Amazon ES cluster to take manual snapshots every day and delete data from the cluster that is older than 14 days.
  • D. Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts, and set the message retention period to 14 days. Configure consumers to poll the SQS queue, check the age of the message, and analyze the message data as needed. If the message is 14 days old, the consumer should copy the message to an Amazon S3 bucket and delete the message from the SQS queue.

Answer: A

Explanation:
https://aws.amazon.com/kinesis/datafirehose/features/?nc=sn&loc=2#:~:text=into%20Amazon%20S3%2C%20Amazon%20Redshift%2C%20Amazon%20OpenSearch%20Service%2C%20Kinesis,Delivery%20streams
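
The Firehose-to-S3 pattern is fully managed: Firehose buffers and writes the alerts to S3, and an S3 Lifecycle rule moves objects to Glacier after 14 days. A boto3 sketch is below; the stream name, bucket, IAM role ARN, and buffering hints are assumptions.

import boto3

firehose = boto3.client("firehose")
s3 = boto3.client("s3")

# Ingest alerts with a fully managed delivery stream writing to S3 (names and role are hypothetical).
firehose.create_delivery_stream(
    DeliveryStreamName="edge-alerts",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-to-s3",
        "BucketARN": "arn:aws:s3:::edge-alerts-archive",
        "BufferingHints": {"SizeInMBs": 64, "IntervalInSeconds": 300},
    },
)

# Keep 14 days hot in S3, then transition objects to Glacier for the archive.
s3.put_bucket_lifecycle_configuration(
    Bucket="edge-alerts-archive",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-after-14-days",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Transitions": [{"Days": 14, "StorageClass": "GLACIER"}],
        }]
    },
)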

NEW QUESTION 15
A business's backup data totals 700 terabytes (TB) and is kept in network attached storage (NAS) at its data center. This backup data must be available in the event of occasional regulatory inquiries and preserved for a period of seven years. The organization has chosen to relocate its backup data from its on-premises data center to Amazon Web Services (AWS). Within one month, the migration must be completed. The company's public internet connection provides 500 Mbps of dedicated capacity for data transport.
What should a solutions architect do to ensure that data is migrated and stored at the LOWEST possible cost?

  • A. Order AWS Snowball devices to transfer the data. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
  • B. Deploy a VPN connection between the data center and Amazon VPC. Use the AWS CLI to copy the data from on premises to Amazon S3 Glacier.
  • C. Provision a 500 Mbps AWS Direct Connect connection and transfer the data to Amazon S3. Use a lifecycle policy to transition the files to Amazon S3 Glacier Deep Archive.
  • D. Use AWS DataSync to transfer the data and deploy a DataSync agent on premises. Use the DataSync task to copy files from the on-premises NAS storage to Amazon S3 Glacier.

Answer: A
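
The deciding factor is raw transfer time over the 500 Mbps link: even at 100% utilization, 700 TB takes far longer than the one-month window, which is why Snowball devices plus a lifecycle transition to Glacier Deep Archive win on both time and cost. The quick back-of-the-envelope calculation below uses decimal units and ignores protocol overhead.

# Back-of-the-envelope: how long would 700 TB take over a 500 Mbps link?
data_bits = 700e12 * 8          # 700 TB expressed in bits (decimal units)
link_bps = 500e6                # 500 Mbps of dedicated capacity

seconds = data_bits / link_bps
days = seconds / 86_400
print(f"{days:.0f} days at full utilization")   # roughly 130 days, far beyond the 1-month deadline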

NEW QUESTION 16
A company runs multiple Windows workloads on AWS. The company's employees use Windows file shares that are hosted on two Amazon EC2 instances. The file shares synchronize data between themselves and maintain duplicate copies. The company wants a highly available and durable storage solution that preserves how users currently access the files.
Which solution will meet these requirements?

  • A. Migrate all the data to Amazon S3. Set up IAM authentication for users to access files.
  • B. Set up an Amazon S3 File Gateway. Mount the S3 File Gateway on the existing EC2 instances.
  • C. Extend the file share environment to Amazon FSx for Windows File Server with a Multi-AZ configuration. Migrate all the data to FSx for Windows File Server.
  • D. Extend the file share environment to Amazon Elastic File System (Amazon EFS) with a Multi-AZ configuration. Migrate all the data to Amazon EFS.

Answer: C
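
FSx for Windows File Server keeps the SMB access pattern users already have, and the MULTI_AZ_1 deployment type maintains a standby file server in a second Availability Zone. A boto3 sketch is below; the storage capacity, throughput, subnet IDs, and Active Directory ID are assumptions.

import boto3

fsx = boto3.client("fsx")

# A Multi-AZ SMB file system joined to the company's Active Directory (all identifiers are placeholders).
fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=2048,                      # GiB
    SubnetIds=["subnet-0aaa1111", "subnet-0bbb2222"],
    WindowsConfiguration={
        "DeploymentType": "MULTI_AZ_1",
        "PreferredSubnetId": "subnet-0aaa1111",
        "ThroughputCapacity": 32,              # MB/s
        "ActiveDirectoryId": "d-1234567890",
    },
)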

NEW QUESTION 17
A company observes an increase in Amazon EC2 costs in its most recent bill.
The billing team notices unwanted vertical scaling of instance types for a couple of EC2 instances.
A solutions architect needs to create a graph comparing the last 2 months of EC2 costs and perform an in-depth analysis to identify the root cause of the vertical scaling.
How should the solutions architect generate the information with the LEAST operational overhead?

  • A. Use AWS Budgets to create a budget report and compare EC2 costs based on instance types
  • B. Use Cost Explorer's granular filtering feature to perform an in-depth analysis of EC2 costs based on instance types
  • C. Use graphs from the AWS Billing and Cost Management dashboard to compare EC2 costs based on instance types for the last 2 months
  • D. Use AWS Cost and Usage Reports to create a report and send it to an Amazon S3 bucket. Use Amazon QuickSight with Amazon S3 as a source to generate an interactive graph based on instance types.

Answer: B

Explanation:
AWS Cost Explorer is a tool that enables you to view and analyze your costs and usage. You can explore your usage and costs using the main graph, the Cost Explorer cost and usage reports, or the Cost Explorer RI reports. You can view data for up to the last 12 months, forecast how much you're likely to spend for the next 12 months, and get recommendations for what Reserved Instances to purchase. You can use Cost Explorer to identify areas that need further inquiry and see trends that you can use to understand your costs. https://docs.aws.amazon.com/cost-management/latest/userguide/ce-what-is.html
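
Cost Explorer can produce the same comparison programmatically through the Cost Explorer API, grouping EC2 cost by instance type over the last two months. A boto3 sketch is below; the date handling and metric choice are assumptions.

import boto3
from datetime import date, timedelta

ce = boto3.client("ce")

end = date.today()
start = end - timedelta(days=60)   # roughly the last 2 months

response = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    Filter={"Dimensions": {"Key": "SERVICE",
                           "Values": ["Amazon Elastic Compute Cloud - Compute"]}},
    GroupBy=[{"Type": "DIMENSION", "Key": "INSTANCE_TYPE"}],
)

for period in response["ResultsByTime"]:
    for group in period["Groups"]:
        print(period["TimePeriod"]["Start"], group["Keys"][0],
              group["Metrics"]["UnblendedCost"]["Amount"])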

NEW QUESTION 18
A company uses Amazon EC2 instances to host its internal systems. As part of a deployment operation, an administrator tries to use the AWS CLI to terminate an EC2 instance. However, the administrator receives a 403 (Access Denied) error message.
The administrator is using an IAM role that has the following IAM policy attached:
SAA-C03 dumps exhibit
What is the cause of the unsuccessful request?

  • A. The EC2 instance has a resource-based policy with a Deny statement.
  • B. The principal has not been specified in the policy statement.
  • C. The "Action" field does not grant the actions that are required to terminate the EC2 instance.
  • D. The request to terminate the EC2 instance does not originate from the CIDR blocks 192.0.2.0/24 or 203.0.113.0/24.

Answer: B
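
The policy exhibit is not reproduced in this extract. For context only, an identity-based policy that gates TerminateInstances on an aws:SourceIp condition (the kind of restriction option D refers to) has the general shape below; the JSON is an illustrative assumption, not the actual exhibit.

import json

# Hypothetical identity-based policy that only allows Terminate calls from specific source IP ranges.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ec2:TerminateInstances",
        "Resource": "*",
        "Condition": {
            "IpAddress": {"aws:SourceIp": ["192.0.2.0/24", "203.0.113.0/24"]}
        },
    }],
}

print(json.dumps(policy, indent=2))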

NEW QUESTION 19
A company hosts its multi-tier applications on AWS. For compliance, governance, auditing, and security, the company must track configuration changes on its AWS resources and record a history of API calls made to these resources.
What should a solutions architect do to meet these requirements?

  • A. Use AWS CloudTrail to track configuration changes and AWS Config to record API calls
  • B. Use AWS Config to track configuration changes and AWS CloudTrail to record API calls
  • C. Use AWS Config to track configuration changes and Amazon CloudWatch to record API calls
  • D. Use AWS CloudTrail to track configuration changes and Amazon CloudWatch to record API calls

Answer: B

NEW QUESTION 20
A company wants to migrate its existing on-premises monolithic application to AWS.
The company wants to keep as much of the front-end code and the backend code as possible. However, the company wants to break the application into smaller applications. A different team will manage each application. The company needs a highly scalable solution that minimizes operational overhead.
Which solution will meet these requirements?

  • A. Host the application on AWS Lambda. Integrate the application with Amazon API Gateway.
  • B. Host the application with AWS Amplify. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.
  • C. Host the application on Amazon EC2 instances. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.
  • D. Host the application on Amazon Elastic Container Service (Amazon ECS). Set up an Application Load Balancer with Amazon ECS as the target.

Answer: B

NEW QUESTION 21
......

Thanks for reading the newest SAA-C03 exam dumps! We recommend you to try the PREMIUM 2passeasy SAA-C03 dumps in VCE and PDF here: https://www.2passeasy.com/dumps/SAA-C03/ (0 Q&As Dumps)