[October-2021] Exam Pass 100%! Braindump2go SAA-C02 Exam PDF, SAA-C02 711Q Instant Download [Q724-Q745]

October 2021: the latest Braindump2go SAA-C02 exam dumps with PDF and VCE, free and updated today! The following are some new SAA-C02 real exam questions.

QUESTION 724
A company is building a new furniture inventory application. The company has deployed the application on a fleet of Amazon EC2 instances across multiple Availability Zones. The EC2 instances run behind an Application Load Balancer (ALB) in their VPC. A solutions architect has observed that incoming traffic seems to favor one EC2 instance, resulting in latency for some requests.
What should the solutions architect do to resolve this issue?
A. Disable session affinity (sticky sessions) on the ALB
B. Replace the ALB with a Network Load Balancer
C. Increase the number of EC2 instances in each Availability Zone
D. Adjust the frequency of the health checks on the ALB's target group
Answer: A

QUESTION 725
A startup company is using the AWS Cloud to develop a traffic control monitoring system for a large city. The system must be highly available and must provide near-real-time results for residents and city officials, even during peak events. Gigabytes of data will come in daily from IoT devices that run at intersections and freeway ramps across the city. The system must process the data sequentially to provide the correct timeline. However, results need to show only what has happened in the last 24 hours.
Which solution will meet these requirements MOST cost-effectively?
A. Deploy Amazon Kinesis Data Firehose to accept incoming data from the IoT devices and write the data to Amazon S3. Build a web dashboard to display the data from the last 24 hours.
B. Deploy an Amazon API Gateway API endpoint and an AWS Lambda function to process incoming data from the IoT devices and store the data in Amazon DynamoDB. Build a web dashboard to display the data from the last 24 hours.
C. Deploy an Amazon API Gateway API endpoint and an Amazon Simple Notification Service (Amazon SNS) topic to process incoming data from the IoT devices. Write the data to Amazon Redshift. Build a web dashboard to display the data from the last 24 hours.
D. Deploy an Amazon Simple Queue Service (Amazon SQS) FIFO queue and an AWS Lambda function to process incoming data from the IoT devices and store the data in an Amazon RDS DB instance. Build a web dashboard to display the data from the last 24 hours.
Answer: D
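For reference, the ordering guarantee behind answer D comes from SQS FIFO queues: messages that share a MessageGroupId are delivered in the order they were sent. Below is a minimal Python (boto3) sketch of that idea; the queue name, Region, and message payload are illustrative assumptions, not values from the question.

```python
import json
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

# FIFO queues require a name ending in ".fifo"; content-based deduplication
# removes the need to supply an explicit MessageDeduplicationId.
queue_url = sqs.create_queue(
    QueueName="traffic-events.fifo",  # hypothetical queue name
    Attributes={
        "FifoQueue": "true",
        "ContentBasedDeduplication": "true",
    },
)["QueueUrl"]

# Messages that share a MessageGroupId are delivered strictly in order,
# which is what provides the "correct timeline" the scenario asks for.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"intersection": "5th-and-main", "count": 42}),
    MessageGroupId="intersection-5th-and-main",
)
```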
QUESTION 726
A company has designed an application where users provide small sets of textual data by calling a public API. The application runs on AWS and includes a public Amazon API Gateway API that forwards requests to an AWS Lambda function for processing. The Lambda function then writes the data to an Amazon Aurora Serverless database for consumption. The company is concerned that it could lose some user data if a Lambda function fails to process the request properly or reaches a concurrency limit.
What should a solutions architect recommend to resolve this concern?
A. Split the existing Lambda function into two Lambda functions. Configure one function to receive API Gateway requests and put relevant items into Amazon Simple Queue Service (Amazon SQS). Configure the other function to read items from Amazon SQS and save the data into Aurora.
B. Configure the Lambda function to receive API Gateway requests and write relevant items to Amazon ElastiCache. Configure ElastiCache to save the data into Aurora.
C. Increase the memory for the Lambda function. Configure Aurora to use the Multi-AZ feature.
D. Split the existing Lambda function into two Lambda functions. Configure one function to receive API Gateway requests and put relevant items into Amazon Simple Notification Service (Amazon SNS). Configure the other function to read items from Amazon SNS and save the data into Aurora.
Answer: A

QUESTION 727
A developer has a script to generate daily reports that users previously ran manually. The script consistently completes in under 10 minutes. The developer needs to automate this process in a cost-effective manner.
Which combination of services should the developer use? (Select TWO.)
A. AWS Lambda
B. AWS CloudTrail
C. Cron on an Amazon EC2 instance
D. Amazon EC2 On-Demand Instance with user data
E. Amazon EventBridge (Amazon CloudWatch Events)
Answer: AE

QUESTION 728
A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information should be protected throughout the entire application stack, and access to the information should be restricted to certain applications.
Which action should the solutions architect take?
A. Configure a CloudFront signed URL.
B. Configure a CloudFront signed cookie.
C. Configure a CloudFront field-level encryption profile.
D. Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy.
Answer: C

QUESTION 729
A company has an Amazon S3 bucket that contains confidential information in its production AWS account. The company has turned on AWS CloudTrail for the account. The account sends a copy of its logs to Amazon CloudWatch Logs. The company has configured the S3 bucket to log read and write data events. A company auditor discovers that some objects in the S3 bucket have been deleted. A solutions architect must provide the auditor with information about who deleted the objects.
What should the solutions architect do to provide this information?
A. Create a CloudWatch Logs filter to extract the S3 write API calls against the S3 bucket.
B. Query the CloudTrail logs with Amazon Athena to identify the S3 write API calls against the S3 bucket.
C. Use AWS Trusted Advisor to perform security checks for S3 write API calls that deleted the content.
D. Use AWS Config to track configuration changes on the S3 bucket. Use these details to track the S3 write API calls that deleted the content.
Answer: B
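For reference, answer B can be carried out by pointing Amazon Athena at the CloudTrail logs in S3 and filtering for S3 delete calls. The sketch below assumes a CloudTrail table named cloudtrail_logs has already been created in Athena, and the bucket name and query output location are placeholders, not values from the question.

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Find who issued DeleteObject calls against the confidential bucket.
# Column names follow the standard CloudTrail table definition for Athena.
query = """
SELECT useridentity.arn, eventtime, requestparameters
FROM cloudtrail_logs
WHERE eventsource = 's3.amazonaws.com'
  AND eventname IN ('DeleteObject', 'DeleteObjects')
  AND requestparameters LIKE '%confidential-bucket-example%'
ORDER BY eventtime DESC;
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "default"},
    ResultConfiguration={"OutputLocation": "s3://athena-query-results-example/"},
)
print(response["QueryExecutionId"])
```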
QUESTION 730
A company has three AWS accounts: Management, Development, and Production. These accounts use AWS services only in the us-east-1 Region. All accounts have a VPC with VPC Flow Logs configured to publish data to an Amazon S3 bucket in each separate account. For compliance reasons, the company needs an ongoing method to aggregate all the VPC flow logs across all accounts into one destination S3 bucket in the Management account.
What should a solutions architect do to meet these requirements with the LEAST operational overhead?
A. Add S3 Same-Region Replication rules in each S3 bucket that stores VPC flow logs to replicate objects to the destination S3 bucket. Configure the destination S3 bucket to allow objects to be received from the S3 buckets in other accounts.
B. Set up an IAM user in the Management account. Grant permissions to the IAM user to access the S3 buckets that contain the VPC flow logs. Run the aws s3 sync command in the AWS CLI to copy the objects to the destination S3 bucket.
C. Use an S3 inventory report to specify which objects in the S3 buckets to copy. Perform an S3 batch operation to copy the objects into the destination S3 bucket in the Management account with a single request.
D. Create an AWS Lambda function in the Management account. Grant S3 GET permissions on the source S3 buckets. Grant S3 PUT permissions on the destination S3 bucket. Configure the function to invoke when objects are loaded in the source S3 buckets.
Answer: A

QUESTION 731
A company is running a multi-tier web application on AWS. The application runs its database on Amazon Aurora MySQL. The application and database tiers are in the us-east-1 Region. A database administrator who monitors the Aurora DB cluster finds that an intermittent increase in read traffic is creating high CPU utilization on the read replica. The result is increased read latency for the application. The memory and disk utilization of the DB instance are stable throughout the event of increased latency.
What should a solutions architect do to improve the read scalability?
A. Reboot the DB cluster.
B. Create a cross-Region read replica.
C. Configure Aurora Auto Scaling for the read replica.
D. Increase the provisioned read IOPS for the DB instance.
Answer: C

QUESTION 732
A developer is creating an AWS Lambda function to perform dynamic updates to a database when an item is added to an Amazon Simple Queue Service (Amazon SQS) queue. A solutions architect must recommend a solution that tracks any usage of database credentials in AWS CloudTrail. The solution also must provide auditing capabilities.
Which solution will meet these requirements?
A. Store the encrypted credentials in a Lambda environment variable.
B. Create an Amazon DynamoDB table to store the credentials. Encrypt the table.
C. Store the credentials as a secure string in AWS Systems Manager Parameter Store.
D. Use an AWS Key Management Service (AWS KMS) key store to store the credentials.
Answer: C
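For reference, storing the credentials as a SecureString in Systems Manager Parameter Store (answer C) means every put and get of the credential is an API call that CloudTrail records, which gives the required audit trail. A minimal boto3 sketch follows; the parameter name and value are placeholders.

```python
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

# Store the credential encrypted with the account's default KMS key.
ssm.put_parameter(
    Name="/app/db/password",            # hypothetical parameter name
    Value="example-password-placeholder",
    Type="SecureString",
    Overwrite=True,
)

# The Lambda function retrieves (and decrypts) it at run time; this call,
# like the put above, is recorded in CloudTrail for auditing.
password = ssm.get_parameter(
    Name="/app/db/password",
    WithDecryption=True,
)["Parameter"]["Value"]
```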
QUESTION 733
A company has a service that reads and writes large amounts of data from an Amazon S3 bucket in the same AWS Region. The service is deployed on Amazon EC2 instances within the private subnet of a VPC. The service communicates with Amazon S3 over a NAT gateway in the public subnet. However, the company wants a solution that will reduce the data output costs.
Which solution will meet these requirements MOST cost-effectively?
A. Provision a dedicated EC2 NAT instance in the public subnet. Configure the route table for the private subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
B. Provision a dedicated EC2 NAT instance in the private subnet. Configure the route table for the public subnet to use the elastic network interface of this instance as the destination for all S3 traffic.
C. Provision a VPC gateway endpoint. Configure the route table for the private subnet to use the gateway endpoint as the route for all S3 traffic.
D. Provision a second NAT gateway. Configure the route table for the private subnet to use this NAT gateway as the destination for all S3 traffic.
Answer: C

QUESTION 734
A company has an application that uses an Amazon DynamoDB table for storage. A solutions architect discovers that many requests to the table are not returning the latest data. The company's users have not reported any other issues with database performance. Latency is in an acceptable range.
Which design change should the solutions architect recommend?
A. Add read replicas to the table.
B. Use a global secondary index (GSI).
C. Request strongly consistent reads for the table.
D. Request eventually consistent reads for the table.
Answer: C

QUESTION 735
A company wants to share data that is collected from self-driving cars with the automobile community. The data will be made available from within an Amazon S3 bucket. The company wants to minimize its cost of making this data available to other AWS accounts.
What should a solutions architect do to accomplish this goal?
A. Create an S3 VPC endpoint for the bucket.
B. Configure the S3 bucket to be a Requester Pays bucket.
C. Create an Amazon CloudFront distribution in front of the S3 bucket.
D. Require that the files be accessible only with the use of the BitTorrent protocol.
Answer: B

QUESTION 736
A company recently announced the deployment of its retail website to a global audience. The website runs on multiple Amazon EC2 instances behind an Elastic Load Balancer. The instances run in an Auto Scaling group across multiple Availability Zones. The company wants to provide its customers with different versions of content based on the devices that the customers use to access the website.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)
A. Configure Amazon CloudFront to cache multiple versions of the content.
B. Configure a host header in a Network Load Balancer to forward traffic to different instances.
C. Configure a Lambda@Edge function to send specific objects to users based on the User-Agent header.
D. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up host-based routing to different EC2 instances.
E. Configure AWS Global Accelerator. Forward requests to a Network Load Balancer (NLB). Configure the NLB to set up path-based routing to different EC2 instances.
Answer: AC
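For reference, option C in QUESTION 736 relies on a Lambda@Edge function attached to the CloudFront distribution. The sketch below shows one plausible viewer-request handler that rewrites the URI for mobile devices based on the User-Agent header; the /mobile path prefix and the local smoke test are hypothetical, not something stated in the question.

```python
def handler(event, context):
    # Lambda@Edge viewer-request event: the request is under Records[0].cf.request.
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]

    # CloudFront lower-cases header names; User-Agent arrives as "user-agent".
    user_agent = headers.get("user-agent", [{"value": ""}])[0]["value"].lower()

    if "mobile" in user_agent:
        # Route mobile devices to the mobile build of the page (hypothetical path).
        request["uri"] = "/mobile" + request["uri"]

    return request


if __name__ == "__main__":
    # Local smoke test with a fabricated CloudFront event (illustrative only).
    event = {"Records": [{"cf": {"request": {
        "uri": "/index.html",
        "headers": {"user-agent": [{"key": "User-Agent", "value": "Mobile Safari"}]},
    }}}]}
    print(handler(event, None)["uri"])  # -> /mobile/index.html
```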
QUESTION 737
A company has developed a new content-sharing application that runs on Amazon Elastic Container Service (Amazon ECS). The application runs on Amazon Linux Docker tasks that use the Amazon EC2 launch type. The application requires a storage solution that has the following characteristics:
- Accessibility for multiple ECS tasks through bind mounts
- Resiliency across Availability Zones
- Burstable throughput of up to 3 Gbps
- Ability to be scaled up over time
Which storage solution meets these requirements?
A. Launch an Amazon FSx for Windows File Server Multi-AZ instance. Configure the ECS task definitions to mount the Amazon FSx instance volume at launch.
B. Launch an Amazon Elastic File System (Amazon EFS) instance. Configure the ECS task definitions to mount the EFS instance volume at launch.
C. Create a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach set to enabled. Attach the EBS volume to the ECS EC2 instance. Configure ECS task definitions to mount the EBS instance volume at launch.
D. Launch an EC2 instance with several Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes attached in a RAID 0 configuration. Configure the EC2 instance as an NFS storage server. Configure ECS task definitions to mount the volumes at launch.
Answer: B

QUESTION 738
An airline that is based in the United States provides services for routes in North America and Europe. The airline is developing a new read-intensive application that customers can use to find flights on either continent. The application requires strong read consistency and needs scalable database capacity to accommodate changes in user demand. The airline needs the database service to synchronize with the least possible latency between the two continents and to provide a simple failover mechanism to a second AWS Region.
Which solution will meet these requirements?
A. Deploy Microsoft SQL Server on Amazon EC2 instances in a Region in North America. Use SQL Server binary log replication on an EC2 instance in a Region in Europe.
B. Create an Amazon DynamoDB global table. Add a Region from North America and a Region from Europe to the table. Query data with strongly consistent reads.
C. Use an Amazon Aurora MySQL global database. Deploy the read-write node in a Region in North America, and deploy read-only endpoints in Regions in North America and Europe. Query data with global read consistency.
D. Create a subscriber application that uses Amazon Kinesis Data Streams for an Amazon Redshift cluster in a Region in North America. Create a second subscriber application for the Amazon Redshift cluster in a Region in Europe. Process all database modifications through Kinesis Data Streams.
Answer: C

QUESTION 739
A company has a production web application in which users upload documents through a web interface or a mobile app. According to a new regulatory requirement, new documents cannot be modified or deleted after they are stored.
What should a solutions architect do to meet this requirement?
A. Store the uploaded documents in an Amazon S3 bucket with S3 Versioning and S3 Object Lock enabled.
B. Store the uploaded documents in an Amazon S3 bucket. Configure an S3 Lifecycle policy to archive the documents periodically.
C. Store the uploaded documents in an Amazon S3 bucket with S3 Versioning enabled. Configure an ACL to restrict all access to read-only.
D. Store the uploaded documents on an Amazon Elastic File System (Amazon EFS) volume. Access the data by mounting the volume in read-only mode.
Answer: A
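For reference, answer A to QUESTION 739 depends on enabling S3 Object Lock when the bucket is created (which also turns on S3 Versioning for the bucket). The boto3 sketch below is illustrative only; the bucket name and the 7-year compliance-mode retention period are assumptions, not values from the question.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

# Object Lock can only be enabled at bucket creation time; it implicitly
# enables S3 Versioning on the bucket.
s3.create_bucket(
    Bucket="regulated-documents-example",   # hypothetical bucket name
    ObjectLockEnabledForBucket=True,
)

# Apply a default retention rule so every new object is locked in compliance
# mode and cannot be modified or deleted during the retention period.
s3.put_object_lock_configuration(
    Bucket="regulated-documents-example",
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}},
    },
)
```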
QUESTION 740
A company has a Microsoft .NET application that runs on an on-premises Windows Server. The application stores data by using an Oracle Database Standard Edition server. The company is planning a migration to AWS and wants to minimize development changes while moving the application. The AWS application environment should be highly available.
Which combination of actions should the company take to meet these requirements? (Select TWO.)
A. Refactor the application as serverless with AWS Lambda functions running .NET Core.
B. Rehost the application in AWS Elastic Beanstalk with the .NET platform in a Multi-AZ deployment.
C. Replatform the application to run on Amazon EC2 with the Amazon Linux Amazon Machine Image (AMI).
D. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Amazon DynamoDB in a Multi-AZ deployment.
E. Use AWS Database Migration Service (AWS DMS) to migrate from the Oracle database to Oracle on Amazon RDS in a Multi-AZ deployment.
Answer: BE

QUESTION 741
A company wants to enforce strict security guidelines on accessing AWS Cloud resources as the company migrates production workloads from its data centers. Company management wants all users to receive permissions according to their job roles and functions.
Which solution meets these requirements with the LEAST operational overhead?
A. Create an AWS Single Sign-On deployment. Connect to the on-premises Active Directory to centrally manage users and permissions across the company.
B. Create an IAM role for each job function. Require each employee to call the sts:AssumeRole action in the AWS Management Console to perform their job role.
C. Create individual IAM user accounts for each employee. Create an IAM policy for each job function, and attach the policy to all IAM users based on their job role.
D. Create individual IAM user accounts for each employee. Create IAM policies for each job function. Create IAM groups, and attach associated policies to each group. Assign the IAM users to a group based on their job role.
Answer: D

QUESTION 742
A company provides machine learning solutions. The company's users need to download large datasets from the company's Amazon S3 bucket. These downloads often take a long time, especially when the users are running many simulations on a subset of those datasets. Users download the datasets to Amazon EC2 instances in the same AWS Region as the S3 bucket. Multiple users typically use the same datasets at the same time.
Which solution will reduce the time that is required to access the datasets?
A. Configure the S3 bucket to use the S3 Standard storage class with S3 Transfer Acceleration activated.
B. Configure the S3 bucket to use the S3 Intelligent-Tiering storage class with S3 Transfer Acceleration activated.
C. Create an Amazon Elastic File System (Amazon EFS) network file system. Migrate the datasets by using AWS DataSync.
D. Move the datasets onto a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume. Attach the volume to all the EC2 instances.
Answer: C

QUESTION 743
A company needs to retain its AWS CloudTrail logs for 3 years. The company is enforcing CloudTrail across a set of AWS accounts by using AWS Organizations from the parent account. The CloudTrail target S3 bucket is configured with S3 Versioning enabled. An S3 Lifecycle policy is in place to delete current objects after 3 years. After the fourth year of use of the S3 bucket, the S3 bucket metrics show that the number of objects has continued to rise. However, the number of new CloudTrail logs that are delivered to the S3 bucket has remained consistent.
Which solution will delete objects that are older than 3 years in the MOST cost-effective manner?
A. Configure the organization's centralized CloudTrail trail to expire objects after 3 years.
B. Configure the S3 Lifecycle policy to delete previous versions as well as current versions.
C. Create an AWS Lambda function to enumerate and delete objects from Amazon S3 that are older than 3 years.
D. Configure the parent account as the owner of all objects that are delivered to the S3 bucket.
Answer: B
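For reference, answer B to QUESTION 743 amounts to adding a NoncurrentVersionExpiration action to the existing Lifecycle rule so that previous (noncurrent) object versions are removed as well as current ones. The boto3 sketch below uses a placeholder bucket name and applies both expirations at 1,095 days (3 years); the rule ID is illustrative.

```python
import boto3

s3 = boto3.client("s3", region_name="us-east-1")

s3.put_bucket_lifecycle_configuration(
    Bucket="org-cloudtrail-logs-example",   # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-cloudtrail-logs-after-3-years",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                # Current versions are expired (replaced by delete markers) after 3 years...
                "Expiration": {"Days": 1095},
                # ...and noncurrent versions are permanently removed 3 years after
                # they become noncurrent, so the object count stops growing.
                "NoncurrentVersionExpiration": {"NoncurrentDays": 1095},
            }
        ]
    },
)
```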
QUESTION 744
A company has a website hosted on AWS. The website is behind an Application Load Balancer (ALB) that is configured to handle HTTP and HTTPS separately. The company wants to forward all requests to the website so that the requests will use HTTPS.
What should a solutions architect do to meet this requirement?
A. Update the ALB's network ACL to accept only HTTPS traffic.
B. Create a rule that replaces the HTTP in the URL with HTTPS.
C. Create a listener rule on the ALB to redirect HTTP traffic to HTTPS.
D. Replace the ALB with a Network Load Balancer configured to use Server Name Indication (SNI).
Answer: C

QUESTION 745
A company is deploying an application that processes large quantities of data in batches as needed. The company plans to use Amazon EC2 instances for the workload. The network architecture must support a highly scalable solution and prevent groups of nodes from sharing the same underlying hardware.
Which combination of network solutions will meet these requirements? (Select TWO.)
A. Create Capacity Reservations for the EC2 instances to run in a placement group.
B. Run the EC2 instances in a spread placement group.
C. Run the EC2 instances in a cluster placement group.
D. Place the EC2 instances in an EC2 Auto Scaling group.
E. Run the EC2 instances in a partition placement group.
Answer: DE

Resources From:
1. 2021 Latest Braindump2go SAA-C02 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/saa-c02.html
2. 2021 Latest Braindump2go SAA-C02 PDF and SAA-C02 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1_5IK3H_eM74C6AKwU7sKaLn1rrn8xTfm?usp=sharing
3. 2021 Free Braindump2go SAA-C02 Exam Questions Download:
https://www.braindump2go.com/free-online-pdf/SAA-C02-PDF-Dumps(750-774).pdf
https://www.braindump2go.com/free-online-pdf/SAA-C02-VCE-Dumps(724-749).pdf
Free resources from Braindump2go. We are devoted to helping you 100% pass all exams!