AWS Solution Architect Associate: Q&A

We have published 100+ questions with their answers. Click on a question to see its answer. Practice here, and navigate to the quiz set if you want more practice. The questions are divided across separate pages, so use the link at the end of each page to continue.

1) As a solution architect, I am planning to launch an Amazon Redshift cluster for processing and analyzing a large amount of data. The Redshift cluster will be deployed into a VPC with multiple subnets. Which construct is used when provisioning the cluster to specify the set of subnets in the VPC that the cluster will be deployed into?
• Cluster Subnet Group
• Availability Zone (AZ)
• Subnet Group
• DB Subnet Group

Ans: Cluster Subnet Group
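
For reference, the cluster subnet group is created before the cluster itself and lists the VPC subnets Redshift may deploy into. A minimal boto3 sketch; the group name, subnet IDs, and credentials below are placeholders:

```python
import boto3

redshift = boto3.client("redshift")

# Create a cluster subnet group listing the VPC subnets
# the cluster is allowed to be deployed into.
redshift.create_cluster_subnet_group(
    ClusterSubnetGroupName="analytics-subnet-group",   # hypothetical name
    Description="Subnets for the analytics Redshift cluster",
    SubnetIds=["subnet-0aaa1111", "subnet-0bbb2222"],  # placeholder subnet IDs
)

# Reference the subnet group when provisioning the cluster.
redshift.create_cluster(
    ClusterIdentifier="analytics-cluster",
    NodeType="ra3.xlplus",
    MasterUsername="admin",
    MasterUserPassword="ChangeMe123!",                 # placeholder credential
    ClusterSubnetGroupName="analytics-subnet-group",
)
```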

2) Consider you are a company owner and you need to restrict the ability of most users to change their own passwords, while continuing to allow a select group of users within specific user groups to do so.
What is the best way to achieve this? (choose 2)
• Create an IAM Policy that grants users the ability to change their own password and attach it to the individual user accounts
• Under the IAM Password Policy deselect the option to allow users to change their own passwords
• Disable the ability for all users to change their own passwords using the AWS Security Token Service
• Create an IAM Role that grants users the ability to change their own password and attach it to the groups that contain the users
• Create an IAM Policy that grants users the ability to change their own password and attach it to the groups that contain the users
 

Ans: Under the IAM Password Policy, deselect the option to allow users to change their own passwords, AND
create an IAM Policy that grants users the ability to change their own password and attach it to the groups that contain the users
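
As a sketch of the two-part answer: the account password policy turns off self-service password changes for everyone, and a group-attached IAM policy restores that ability for the selected group. The policy name and group name below are placeholders:

```python
import json
import boto3

iam = boto3.client("iam")

# Account-wide: deselect "allow users to change their own password".
# Note: this call replaces the whole password policy, so set the other
# fields you care about in the same call.
iam.update_account_password_policy(
    MinimumPasswordLength=12,
    AllowUsersToChangePassword=False,
)

# Group-level: allow members of a specific group to change their own password.
policy = iam.create_policy(
    PolicyName="AllowSelfPasswordChange",  # hypothetical name
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "iam:ChangePassword",
            # ${aws:username} scopes the permission to the caller's own user.
            "Resource": "arn:aws:iam::*:user/${aws:username}",
        }],
    }),
)
iam.attach_group_policy(
    GroupName="password-admins",           # placeholder group
    PolicyArn=policy["Policy"]["Arn"],
)
```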

3) Suppose a client is working on streaming data, and there is a new requirement:
You have been asked to implement a solution for capturing, transforming and loading streaming data into an Amazon Redshift cluster. The solution will capture data from Amazon Kinesis Data Streams. Which AWS services would you utilize in this scenario? (choose 2)
• Lambda for transforming the data
• EMR for transforming the data
• Kinesis Video Streams for capturing the data and loading it into Redshift
• Kinesis Data Firehose for capturing the data and loading it into Redshift
• AWS Data Pipeline for transforming the data



Ans: Kinesis Data Firehose for capturing the data and loading it into Redshift, AND
Lambda for transforming the data.
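
In this pattern, Kinesis Data Firehose reads records from the Kinesis data stream and invokes a Lambda function to transform each batch before delivering it to Redshift. A minimal sketch of the transformation handler, where the uppercasing step is purely illustrative:

```python
import base64

def lambda_handler(event, context):
    """Firehose data-transformation handler: receives a batch of records,
    returns each record transformed along with a per-record status."""
    output = []
    for record in event["records"]:
        # Firehose delivers record payloads base64-encoded.
        payload = base64.b64decode(record["data"]).decode("utf-8")

        transformed = payload.upper()  # illustrative transformation only

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",            # or "Dropped" / "ProcessingFailed"
            "data": base64.b64encode(transformed.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}
```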

4) A research company is developing a data lake solution in Amazon S3 to analyze very large datasets. The solution makes only infrequent SQL queries. In addition, the company wants to minimize infrastructure costs.
Which AWS service should be used to meet these requirements?
• Amazon Athena
• Amazon Aurora

Ans: Amazon Athena
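
Athena is serverless, so the company pays per query (data scanned) rather than for always-on infrastructure, which fits infrequent SQL queries against an S3 data lake. A boto3 sketch of running a query; the database, table, and results bucket are placeholders:

```python
import boto3

athena = boto3.client("athena")

# Run SQL directly against data in S3; no cluster to provision or pay for.
response = athena.start_query_execution(
    QueryString="SELECT category, COUNT(*) FROM events GROUP BY category",
    QueryExecutionContext={"Database": "research_lake"},  # placeholder database
    ResultConfiguration={
        "OutputLocation": "s3://my-athena-results/",      # placeholder bucket
    },
)
print(response["QueryExecutionId"])
```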

5) There is a new requirement to implement in-memory caching for a services application because of an increasing read-heavy load. The data must be stored persistently, and automatic failover across AZs is also required.
Which two items from the list below are required to deliver these requirements? (choose 2)
• Multi-AZ with Cluster mode and Automatic Failover enabled
• ElastiCache with the Redis engine

Ans: Both.
• The Redis engine stores data persistently
• The Memcached engine does not store data persistently
• The Redis engine supports Multi-AZ using read replicas in another AZ in the same region
• You can have a fully automated, fault-tolerant ElastiCache for Redis implementation by enabling both cluster mode and Multi-AZ failover
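
A boto3 sketch of provisioning such a deployment (the identifiers and node type are placeholders): NumNodeGroups greater than 1 implies cluster mode, and AutomaticFailoverEnabled together with MultiAZEnabled covers failover across AZs:

```python
import boto3

elasticache = boto3.client("elasticache")

elasticache.create_replication_group(
    ReplicationGroupId="app-cache",                  # placeholder
    ReplicationGroupDescription="Read-heavy app cache",
    Engine="redis",
    CacheNodeType="cache.r6g.large",                 # placeholder node type
    NumNodeGroups=2,             # >1 shard => cluster mode enabled
    ReplicasPerNodeGroup=1,      # replica per shard, placed in another AZ
    AutomaticFailoverEnabled=True,
    MultiAZEnabled=True,
    SnapshotRetentionLimit=7,    # snapshots back the persistence requirement
)
```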

6) An application you manage stores encrypted data in S3 buckets. You would like to be able to query the encrypted data using SQL and write the encrypted results back to the S3 bucket. Because the data is sensitive, you would like to implement fine-grained control over access to the S3 bucket.
Which combination of services represents the BEST options to support these requirements? (choose 2)
• Use IAM policies to restrict access to the bucket
• Use Athena for querying the data and writing the results back to the bucket
• Use bucket ACLs to restrict access to the bucket

Ans: Use IAM policies to restrict access to the bucket, AND
use Athena for querying the data and writing the results back to the bucket
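
For the fine-grained access part, a sketch of an IAM policy scoped to a single prefix of the bucket; the bucket name, prefix, and policy name below are hypothetical:

```python
import json
import boto3

iam = boto3.client("iam")

# Fine-grained access: only the analytics prefix of the sensitive bucket,
# and only the read/write actions Athena users need for query and results.
iam.create_policy(
    PolicyName="SensitiveBucketAnalytics",  # hypothetical name
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::sensitive-data-bucket/analytics/*",  # placeholder
        }],
    }),
)
```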

7) As a user, I have to build an application that will collect information about system behavior. The application will rapidly ingest large amounts of dynamic data and requires very low latency. The database must be scalable without incurring downtime. Which database would be recommended?
• DynamoDB
• RDS with Microsoft SQL

Ans: DynamoDB
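
DynamoDB delivers single-digit-millisecond latency and scales without downtime; with on-demand capacity there is no capacity planning at all. A sketch of creating a table for this workload; the table and attribute names are placeholders:

```python
import boto3

dynamodb = boto3.client("dynamodb")

dynamodb.create_table(
    TableName="system-behavior",   # placeholder
    AttributeDefinitions=[
        {"AttributeName": "device_id", "AttributeType": "S"},
        {"AttributeName": "timestamp", "AttributeType": "N"},
    ],
    KeySchema=[
        {"AttributeName": "device_id", "KeyType": "HASH"},   # partition key
        {"AttributeName": "timestamp", "KeyType": "RANGE"},  # sort key
    ],
    BillingMode="PAY_PER_REQUEST",  # scales with ingest, no downtime to resize
)
```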

8) As a team member, you would like to share documents with your teammates. Suppose
you would like to share some documents with public users accessing an S3 bucket over the Internet. What are two valid methods of granting public read permissions so you can share the documents? (choose 2)
• Use the AWS Policy Generator to create a bucket policy for your Amazon S3 bucket granting read access to public anonymous users
• Share the documents using CloudFront and a static website
• Grant public read on all objects using the S3 bucket ACL
• Grant public read access to the objects when uploading
• Share the documents using a bastion host in a public subnet

Ans: Use the AWS Policy Generator to create a bucket policy for your Amazon S3 bucket granting read access to public anonymous users, AND
grant public read access to the objects when uploading
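
Both valid methods, sketched with boto3; the bucket and object names are placeholders. Note that on buckets with S3 Block Public Access enabled, these grants are overridden:

```python
import json
import boto3

s3 = boto3.client("s3")

# Method 1: bucket policy granting read access to anonymous users.
s3.put_bucket_policy(
    Bucket="shared-docs-bucket",   # placeholder
    Policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": "*",                  # public anonymous users
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::shared-docs-bucket/*",
        }],
    }),
)

# Method 2: grant public read on the object at upload time.
s3.put_object(
    Bucket="shared-docs-bucket",
    Key="handbook.pdf",            # placeholder
    Body=b"...",
    ACL="public-read",
)
```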

9) A Solutions Architect must give another AWS account programmatic access to upload objects to his bucket. The Solutions Architect must make sure that he retains full control of the objects uploaded to the bucket. How can this be done?

Ans: The Architect can use a resource-based bucket policy that grants cross-account access and includes a conditional statement that only allows uploads if full control access is granted to the Architect

DETAILS:
• You can use a resource-based bucket policy to allow another AWS account to upload objects to your bucket, and use a conditional statement to ensure that full control permissions are granted to a specific account identified by an ID (e.g. email address)
• You cannot use a resource-based ACL with an IAM policy, as this configuration does not support conditional statements
• Taking ownership of objects is not a valid concept in Amazon S3. Asking the user in the other AWS account to grant access when uploading is not a good method either, as technical controls to enforce this behaviour are preferred
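
A sketch of such a bucket policy (the account ID and bucket name are placeholders); the condition means only uploads that grant the bucket owner full control are allowed:

```python
import json
import boto3

s3 = boto3.client("s3")

s3.put_bucket_policy(
    Bucket="architect-bucket",   # placeholder
    Policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "CrossAccountUploadWithFullControl",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111122223333:root"},  # placeholder account
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::architect-bucket/*",
            # Reject any upload that does not grant the bucket owner
            # full control of the object.
            "Condition": {
                "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
            },
        }],
    }),
)
```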

10) A manufacturing company captures data from machines running at customer sites. Currently, thousands of machines send data every five minutes, and this is expected to grow to many thousands of machines in the near future. The data is logged with the intent of analyzing it in the future.
What is the SIMPLEST method to store this streaming data?

Ans: Create an Amazon Kinesis Data Firehose delivery stream to store the data in Amazon S3
• Kinesis Data Firehose is the easiest way to load streaming data into data stores and analytics tools. It captures, transforms, and loads streaming data, and you can deliver the data to “destinations” including Amazon S3 buckets for later analysis
• Writing data into RDS via a series of EC2 instances and a load balancer is more complex and more expensive. RDS is also not an ideal data store for this data
• Using an SQS queue to store the data is not possible, as the data needs to be stored long-term and SQS queues have a maximum retention time of 14 days
• Storing the data in EBS would be expensive, and as EBS volumes cannot be shared by multiple instances you would have a bottleneck of a single EC2 instance writing the data
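
A sketch of creating the delivery stream with boto3 (the stream name, role ARN, and bucket ARN are placeholders); the machines write records directly to Firehose, which buffers them and delivers batches to S3:

```python
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="machine-telemetry",   # placeholder
    DeliveryStreamType="DirectPut",           # producers write straight to Firehose
    ExtendedS3DestinationConfiguration={
        # Role Firehose assumes to write into the bucket (placeholder ARN).
        "RoleARN": "arn:aws:iam::111122223333:role/firehose-s3-role",
        "BucketARN": "arn:aws:s3:::machine-telemetry-archive",  # placeholder
        # Buffer up to 5 MB or 5 minutes before each delivery to S3.
        "BufferingHints": {"IntervalInSeconds": 300, "SizeInMBs": 5},
    },
)
```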

Click on the next page number to continue.