agent: |
Verify whether a CUR report has been set up with the proper prerequisites
This task checks for the existence of a specified S3 bucket and exports the names and base paths of any existing CUR reports. If these prerequisites are missing, it automates the CUR configuration, including S3 bucket creation, bucket policy updates for cost data delivery, and CUR report setup.
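A minimal sketch of that prerequisite check, assuming the same _get_creds/cred_label vault helper and BUCKET_NAME input used by the tasks below (those names are carried over from the tasks, not defined here), could look like this:

import boto3
from botocore.exceptions import ClientError

# Sketch only: _get_creds, cred_label and BUCKET_NAME are assumed to be provided by the runbook
creds = _get_creds(cred_label)['creds']
s3_client = boto3.client('s3', aws_access_key_id=creds['username'], aws_secret_access_key=creds['password'])
cur_client = boto3.client('cur', aws_access_key_id=creds['username'], aws_secret_access_key=creds['password'],
                          region_name='us-east-1')  # the CUR API is only served from us-east-1

# Does the target bucket already exist?
try:
    s3_client.head_bucket(Bucket=BUCKET_NAME)
    bucket_exists = True
except ClientError:
    bucket_exists = False

# Export the names and base paths (S3 prefixes) of any CUR reports already defined
definitions = cur_client.describe_report_definitions().get('ReportDefinitions', [])
report_names = [d['ReportName'] for d in definitions]
base_paths = [d['S3Prefix'] for d in definitions]
print(f"Bucket exists: {bucket_exists}; reports: {report_names}; base paths: {base_paths}")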
1
End to End Configuration of an AWS Cost and Usage Report (CUR) to an S3 Bucket
This runbook provides a comprehensive guide for setting up and configuring AWS Cost and Usage Reports (CUR) to be delivered to an S3 bucket. It covers the process from creating a new S3 bucket and updating its policy for CUR delivery to configuring the CUR settings to target the created bucket.
1.1
Create a New AWS S3 Bucket
This task involves creating a new Amazon S3 bucket in a specified AWS region. It's the initial step in setting up a destination for storing Cost and Usage Reports.
import boto3
from botocore.exceptions import ClientError

# Retrieve AWS credentials from the vault
creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def create_s3_bucket(bucket_name, region):
    """
    Creates an S3 bucket in a specified region.

    :param bucket_name: Name of the S3 bucket to create.
    :param region: Region to create the bucket in.
    """
    s3_client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        if region == 'us-east-1':  # buckets in the default region must be created without a LocationConstraint
            s3_client.create_bucket(Bucket=bucket_name)
        else:
            s3_client.create_bucket(Bucket=bucket_name,
                                    CreateBucketConfiguration={'LocationConstraint': region})
        print(f"S3 bucket '{bucket_name}' created in {region}.")
    except ClientError as e:
        print(f"Error creating S3 bucket: {e}")

# Example usage
# bucket_name = 'test-this-cur-logging-bucket-1234'  # Replace with your desired bucket name
# region_name = 'us-east-1'                          # Replace with your desired region, e.g., 'us-east-1'
# print(f"bucket received from upstream task {BUCKET_NAME}")
# print(f"region name received from upstream task {region_name}")
create_s3_bucket(BUCKET_NAME, region_name)
1.2
Update the AWS S3 Bucket Policy to Allow CUR Logging
In this task, the S3 bucket's policy is updated to grant necessary permissions for AWS Cost and Usage Reports to deliver log files to the bucket, ensuring secure and compliant data storage.
import boto3
import json
from botocore.exceptions import ClientError

# Retrieve AWS credentials from the vault
creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

# Initialize STS client and get account ID
sts_client = boto3.client('sts', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
account_id = sts_client.get_caller_identity()["Account"]

def update_s3_bucket_policy_for_cur(bucket_name, account_id, region):
    """
    Updates the S3 bucket policy to allow AWS CUR to deliver log files.

    :param bucket_name: Name of the S3 bucket.
    :param account_id: AWS account ID.
    :param region: AWS region.
    """
    policy = {
        "Version": "2008-10-17",
        "Id": "Policy1335892530063",
        "Statement": [
            {
                "Sid": "Stmt1335892150622",
                "Effect": "Allow",
                "Principal": {"Service": "billingreports.amazonaws.com"},
                "Action": ["s3:GetBucketAcl", "s3:GetBucketPolicy"],
                "Resource": f"arn:aws:s3:::{bucket_name}",
                "Condition": {
                    "StringEquals": {
                        "aws:SourceAccount": account_id,
                        # CUR report definitions always live in us-east-1, even if region_name is different
                        "aws:SourceArn": f"arn:aws:cur:us-east-1:{account_id}:definition/*"
                    }
                }
            },
            {
                "Sid": "Stmt1335892526596",
                "Effect": "Allow",
                "Principal": {"Service": "billingreports.amazonaws.com"},
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
                "Condition": {
                    "StringEquals": {
                        "aws:SourceAccount": account_id,
                        # CUR report definitions always live in us-east-1, even if region_name is different
                        "aws:SourceArn": f"arn:aws:cur:us-east-1:{account_id}:definition/*"
                    }
                }
            }
        ]
    }
    s3_client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        s3_client.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(policy))
        print(f"Bucket policy updated to allow CUR deliveries for '{bucket_name}'.")
    except ClientError as e:
        print(f"Error updating bucket policy: {e}")

# Example usage
# bucket_name = 'test-this-cur-logging-bucket-1234'  # Replace with the name of your existing bucket
# region_name = 'us-east-1'                          # Replace with your region, e.g., 'us-east-1'
update_s3_bucket_policy_for_cur(BUCKET_NAME, account_id, region_name)
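As an optional sanity check (a sketch, not part of the original task), the policy can be read back to confirm that the billingreports.amazonaws.com principal was granted access; it reuses the credentials and BUCKET_NAME loaded above:

# Verification sketch: read the policy back and look for the CUR service principal
s3_check = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
try:
    applied_policy = s3_check.get_bucket_policy(Bucket=BUCKET_NAME)['Policy']
    print(f"CUR principal present: {'billingreports.amazonaws.com' in applied_policy}")
except ClientError as e:
    print(f"Could not read back bucket policy: {e}")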
1.3
Configure AWS Cost and Usage Report Delivery to an S3 Bucket
This task involves configuring AWS Cost and Usage Reports (CUR) to direct the reports to the newly created and configured S3 bucket, finalizing the setup for report generation and storage.
import boto3
from botocore.exceptions import ClientError

# Retrieve AWS credentials from the vault
creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def configure_cur_report(bucket_name, report_name, region_name):
    """
    Configures AWS Cost and Usage Report to be delivered to an S3 bucket
    with Parquet format for Athena.

    :param bucket_name: Name of the S3 bucket for report delivery.
    :param report_name: Name of the report.
    :param region_name: AWS region where the S3 bucket is located.
    """
    # The CUR API is only available in us-east-1
    cur_client = boto3.client('cur', aws_access_key_id=access_key,
                              aws_secret_access_key=secret_key, region_name='us-east-1')
    report_definition = {
        'ReportName': report_name,
        'TimeUnit': 'HOURLY',
        'Format': 'Parquet',
        'Compression': 'Parquet',
        'S3Bucket': bucket_name,
        'S3Prefix': f"{report_name}/{report_name}/date-range/",
        'S3Region': region_name,
        'AdditionalSchemaElements': ['RESOURCES'],
        'ReportVersioning': 'OVERWRITE_REPORT',
        'AdditionalArtifacts': ['ATHENA'],  # Enable integration for Athena
        'RefreshClosedReports': True
    }
    try:
        cur_client.put_report_definition(ReportDefinition=report_definition)
        print(f"CUR report '{report_name}' configured for delivery to '{bucket_name}'.")
    except ClientError as e:
        print(f"Error configuring CUR report: {e}")

# Example usage
# bucket_name = 'dagknows-cur-logging-bucket-athena-test-188379622596'
# report_name = 'My-CUR-report-Athena-Test-1234'
# region_name = 'us-east-1'  # Replace with your region, e.g., 'us-east-1'
configure_cur_report(BUCKET_NAME, report_name, region_name)
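To confirm the definition was registered (a verification sketch using the credentials and report_name above, not part of the original task), the CUR API can be queried for the report name:

# Verification sketch: list report definitions and check that report_name is among them
cur_check = boto3.client('cur', aws_access_key_id=access_key, aws_secret_access_key=secret_key,
                         region_name='us-east-1')  # the CUR API is only served from us-east-1
registered = [d['ReportName'] for d in cur_check.describe_report_definitions().get('ReportDefinitions', [])]
print(f"Report '{report_name}' registered: {report_name in registered}")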
2
CUR Integration Using AWS Athena, S3, and Glue
This task involves setting up an integration to analyze AWS Cost and Usage Report (CUR) data with AWS Athena, storing the data in S3 and organizing it with an AWS Glue crawler; an illustrative Athena query against the resulting table appears at the end of this section.
2.1
Create CloudFormation Stack for Athena CUR Setup
Develop a CloudFormation stack to automate the deployment of necessary resources for querying CUR data with Athena, ensuring all components are correctly configured and linked.
import boto3
from botocore.exceptions import NoCredentialsError, PartialCredentialsError, ClientError

# Retrieve AWS credentials from the vault
creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

bucket_name = BUCKET_NAME

# Parameters
# bucket_name = 'dagknows-cur-logging-bucket-athena-test-188379622596'
# report_name = 'My-CUR-report-Athena-Test-1234'
stack_name = 'MyCURReportAthenaStack'
# region_name = 'us-east-1'

def create_cloudformation_stack(stack_name, template_body, region='us-east-1'):
    """
    Create a CloudFormation stack in AWS using the provided template.

    :param stack_name: The name of the CloudFormation stack to be created.
    :param template_body: String containing the YAML formatted CloudFormation template.
    :param region: AWS region where the stack will be created. Default is 'us-east-1'.
    :return: None
    """
    cf_client = boto3.client('cloudformation', aws_access_key_id=access_key,
                             aws_secret_access_key=secret_key, region_name=region)
    try:
        print(f"Initiating stack creation: {stack_name}")
        response = cf_client.create_stack(
            StackName=stack_name,
            TemplateBody=template_body,
            Capabilities=['CAPABILITY_NAMED_IAM'],
            OnFailure='DELETE',
            TimeoutInMinutes=10,
            EnableTerminationProtection=False
        )
        print(f"Stack creation initiated successfully, stack ID: {response['StackId']}")
    except cf_client.exceptions.LimitExceededException:
        print("Error: You have exceeded your AWS CloudFormation limits.")
    except cf_client.exceptions.AlreadyExistsException:
        print("Error: A stack with the same name already exists.")
    except cf_client.exceptions.TokenAlreadyExistsException:
        print("Error: A client request token already exists.")
    except cf_client.exceptions.InsufficientCapabilitiesException:
        print("Error: Insufficient capabilities to execute this operation. Check required permissions.")
    except ClientError as e:
        print(f"AWS ClientError: {e.response['Error']['Message']}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

def load_template_from_s3(bucket_name, report_name):
    """
    Load a CloudFormation template from an S3 bucket.

    :param bucket_name: The name of the S3 bucket.
    :param report_name: The base directory name of the CUR report.
    :return: String of the file's contents or None if there's an error.
    """
    s3_client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    # If there is an S3 key error, check this step for the crawler-cfn.yml S3 URI
    key = f"{report_name}/{report_name}/date-range/{report_name}/crawler-cfn.yml"
    try:
        response = s3_client.get_object(Bucket=bucket_name, Key=key)
        print("Template fetched successfully from S3.")
        return response['Body'].read().decode('utf-8')
    except NoCredentialsError:
        print("Error: No AWS credentials were provided.")
    except PartialCredentialsError:
        print("Error: Incomplete AWS credentials were provided.")
    except ClientError as e:
        print(f"Error accessing S3: {e.response['Error']['Message']}")
    except Exception as e:
        print(f"An unexpected error occurred while loading the template from S3: {e}")
    return None

def stack_exists(stack_name, region):
    """
    Check if a CloudFormation stack exists.

    :param stack_name: The name of the CloudFormation stack.
    :param region: AWS region where the stack is located.
    :return: Boolean indicating if the stack exists.
    """
    cf_client = boto3.client('cloudformation', aws_access_key_id=access_key,
                             aws_secret_access_key=secret_key, region_name=region)
    try:
        response = cf_client.describe_stacks(StackName=stack_name)
        return len(response['Stacks']) > 0
    except cf_client.exceptions.ClientError as e:
        if 'does not exist' in str(e):
            return False
        else:
            print(f"Error checking if stack exists: {e}")
            return True

def main():
    print("Loading the CloudFormation template from S3...")
    template_body = load_template_from_s3(bucket_name, report_name)
    if template_body:
        print("Template loaded successfully.")
        if stack_exists(stack_name, region_name):
            print(f"CloudFormation stack '{stack_name}' already exists. Skipping creation.")
        else:
            create_cloudformation_stack(stack_name, template_body, region_name)
    else:
        print("Failed to load template. Exiting.")

main()
2.2
Verify whether the CloudFormation stack has been created
Poll the CloudFormation stack's status until creation completes or fails, confirming that the Athena CUR resources were provisioned successfully before moving on.
import boto3
import time
from botocore.exceptions import ClientError, EndpointConnectionError, NoCredentialsError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def check_stack_creation_status(stack_name, region='us-east-1'):
    """
    Polls the CloudFormation stack status until it is in a completed state or failed.

    :param stack_name: The name of the CloudFormation stack to check.
    :param region: AWS region where the stack is located. Default is 'us-east-1'.
    :return: None
    """
    try:
        cf_client = boto3.client('cloudformation', aws_access_key_id=access_key,
                                 aws_secret_access_key=secret_key, region_name=region)
    except NoCredentialsError:
        print("Error: No valid AWS credentials found.")
        return
    except EndpointConnectionError:
        print(f"Error: Unable to connect to the CloudFormation service endpoint in {region}.")
        return

    while True:
        try:
            response = cf_client.describe_stacks(StackName=stack_name)
            stack_status = response['Stacks'][0]['StackStatus']
            print(f"Current status of the stack '{stack_name}': {stack_status}")
            if stack_status == 'CREATE_COMPLETE':
                print(f"Stack {stack_name} has been created successfully.")
                break
            elif 'FAILED' in stack_status or 'ROLLBACK' in stack_status or stack_status == 'DELETE_COMPLETE':
                print(f"Stack {stack_name} creation failed with status: {stack_status}")
                break
        except ClientError as e:
            if e.response['Error']['Code'] == 'ValidationError':
                print("Error: The specified stack name does not exist.")
            else:
                print(f"Failed to get stack status: {e.response['Error']['Message']}")
            break
        except Exception as e:
            print(f"An unexpected error occurred: {e}")
            break
        time.sleep(10)  # Wait for 10 seconds before checking the stack status again

# stack_name = 'MyCURReportAthenaStack'  # Replace with your stack name
# region_name = "us-east-1"              # Specify the AWS region
check_stack_creation_status(stack_name, region_name)
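Once the stack reaches CREATE_COMPLETE, the resources it created can be listed as a quick sanity check. This is a sketch reusing the stack_name, region_name and credentials above; the exact resources depend on the CUR-generated crawler-cfn.yml template:

# Sanity-check sketch: list the resources provisioned by the stack (e.g. the Glue database and crawler)
cf_check = boto3.client('cloudformation', aws_access_key_id=access_key,
                        aws_secret_access_key=secret_key, region_name=region_name)
for resource in cf_check.describe_stack_resources(StackName=stack_name)['StackResources']:
    print(resource['ResourceType'], resource['LogicalResourceId'], resource['ResourceStatus'])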
2.3
Create an S3 bucket to store Athena CUR query results, skipping creation if it already exists
Establish an S3 bucket dedicated to storing results from Athena queries on CUR data; this process includes a check to skip creation if the bucket already exists.
import boto3
from botocore.exceptions import ClientError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def get_account_number():
    """
    Fetches the AWS account number using boto3 and STS.

    :return: AWS account number as a string.
    """
    sts_client = boto3.client('sts', aws_access_key_id=access_key,
                              aws_secret_access_key=secret_key, region_name="us-east-1")
    try:
        account_id = sts_client.get_caller_identity()["Account"]
        print(f"AWS account number fetched: {account_id}")
        return account_id
    except ClientError as e:
        print(f"Error fetching AWS account number: {e}")
        return None

def bucket_exists(bucket_name):
    """
    Check if an S3 bucket exists.

    :param bucket_name: Name of the S3 bucket.
    :return: True if the bucket exists, False otherwise.
    """
    s3_client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        s3_client.head_bucket(Bucket=bucket_name)
        return True
    except ClientError:
        return False

def create_s3_bucket(bucket_name, region='us-east-1'):
    """
    Creates an S3 bucket in a specified region, handling the 'us-east-1' special case.

    :param bucket_name: Name of the S3 bucket to create.
    :param region: Region to create the bucket in.
    """
    s3_client = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        create_bucket_params = {'Bucket': bucket_name}
        # Add the LocationConstraint for regions other than 'us-east-1'/Account home region
        if region != 'us-east-1':
            create_bucket_params['CreateBucketConfiguration'] = {'LocationConstraint': region}
        s3_client.create_bucket(**create_bucket_params)
        print(f"S3 bucket '{bucket_name}' created in {region}.")
    except ClientError as e:
        if e.response['Error']['Code'] in ['BucketAlreadyExists', 'BucketAlreadyOwnedByYou']:
            print(f"Bucket {bucket_name} already exists in {region}.")
        else:
            print(f"Error creating S3 bucket: {e.response['Error']['Message']}")
    except BotoCoreError as e:
        print(f"Unexpected error occurred while creating bucket: {e}")

account_number = get_account_number()
if account_number:
    bucket_name = f'dagknows-cur-logging-bucket-athena-query-results-{account_number}'
    # region_name = 'us-east-1'  # Specify the AWS region
    # Check if the bucket exists and create if not
    if bucket_exists(bucket_name):
        print(f"Bucket {bucket_name} already exists in {region_name}. Skipping creation.")
    else:
        create_s3_bucket(bucket_name, region_name)
else:
    print("Failed to fetch account number. Exiting.")
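With the crawler stack in place and a bucket available for query results, CUR data can be analyzed through Athena. The sketch below is illustrative only and reuses the credentials, region_name and query-results bucket_name from the task above; the database and table names are created by the CUR-generated CloudFormation template (commonly following an athenacurcfn_<report_name> naming convention) and are assumptions here, so substitute whatever the Glue crawler actually registers in your account:

# Illustrative sketch: run an Athena query against the crawled CUR table.
# 'athenacurcfn_my_cur_report' and 'my_cur_table' are hypothetical names; check the Glue Data Catalog for the real ones.
athena = boto3.client('athena', aws_access_key_id=access_key,
                      aws_secret_access_key=secret_key, region_name=region_name)
query = athena.start_query_execution(
    QueryString=("SELECT line_item_product_code, SUM(line_item_unblended_cost) AS cost "
                 "FROM my_cur_table GROUP BY line_item_product_code ORDER BY cost DESC LIMIT 10"),
    QueryExecutionContext={'Database': 'athenacurcfn_my_cur_report'},
    ResultConfiguration={'OutputLocation': f"s3://{bucket_name}/athena-query-results/"}
)
print(f"Started Athena query: {query['QueryExecutionId']}")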