
AWS S3 Bucket Security Audits


• Use Case: Ensure that S3 buckets are configured securely and do not expose sensitive data to the public.

• DagKnows can automate the scanning of S3 buckets to identify publicly accessible buckets or objects.

• The platform can trigger automated remediation actions, such as adjusting bucket policies or encrypting sensitive data (a minimal scan-and-remediate sketch follows below).
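
As a rough illustration of that scan-and-remediate loop, the sketch below lists every bucket and applies a full Block Public Access configuration to any bucket that lacks one. This is a minimal sketch, not one of the runbook tasks below: it assumes boto3 credentials are available from the local environment, and it applies the bluntest possible remediation.

import boto3
from botocore.exceptions import ClientError

def scan_and_remediate():
    """List every bucket and turn on Block Public Access wherever it is missing."""
    s3 = boto3.client('s3')  # assumes credentials from the local AWS configuration
    for bucket in s3.list_buckets().get('Buckets', []):
        name = bucket['Name']
        try:
            s3.get_public_access_block(Bucket=name)
        except ClientError as e:
            if e.response['Error']['Code'] != 'NoSuchPublicAccessBlockConfiguration':
                raise
            # No Block Public Access configuration at all: apply the strictest one
            s3.put_public_access_block(
                Bucket=name,
                PublicAccessBlockConfiguration={
                    'BlockPublicAcls': True,
                    'IgnorePublicAcls': True,
                    'BlockPublicPolicy': True,
                    'RestrictPublicBuckets': True,
                })
            print(f"Remediated '{name}': Block Public Access enabled.")

scan_and_remediate()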

1. AWS S3 Bucket Public Write Access Audit: SOC2 Compliance


    This runbook audits S3 buckets in AWS to ensure they do not allow unauthorized public write access. The audit reviews Block Public Access settings, bucket policies, and ACLs against SOC2's strict data security standards, aiming to identify and rectify any configuration that could compromise data integrity or confidentiality.

1.1 List the names of all S3 buckets


      This task involves retrieving and listing the names of all the S3 buckets that are currently associated with your AWS account. By fetching this list, you gain an overview of the existing S3 buckets under your account, which can aid in resource management, access control, and tracking. This information is valuable for maintaining an organized and well-structured AWS environment, ensuring efficient storage utilization, and facilitating easy navigation of your stored data.

import json

cmd = "aws s3api list-buckets"
output = _exe(None, cmd, cred_label=cred_label)

# Parse the JSON response
response_data = json.loads(output)

# Extract the bucket names
bucket_names = [bucket["Name"] for bucket in response_data["Buckets"]]

# Print the extracted bucket names
for bucket_name in bucket_names:
    print(bucket_name)
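
The `_exe` and `_get_creds` helpers and the `cred_label` variable used throughout these tasks come from the DagKnows runtime. For readers running outside that environment, a plain boto3 equivalent (a sketch assuming credentials are configured locally) is:

import boto3

s3 = boto3.client('s3')  # picks up credentials from the local AWS configuration
for bucket in s3.list_buckets().get('Buckets', []):
    print(bucket['Name'])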
1.2 Check which buckets allow AWS S3 Bucket Public Write Access


      The task involves auditing AWS S3 buckets to identify those that permit public write access. This process helps ensure data security by flagging buckets that might be vulnerable to unauthorized modifications.

import json

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def is_write_public(bucket_policy):
    """Determine whether the bucket policy allows public write access."""
    try:
        policy_document = json.loads(bucket_policy['Policy'])
    except json.JSONDecodeError:
        print("Error parsing the bucket policy JSON.")
        return False
    for statement in policy_document.get('Statement', []):
        actions = statement.get('Action', [])
        actions = [actions] if isinstance(actions, str) else actions
        principals = statement.get('Principal', {})
        # A Principal of '*' (or {'AWS': '*'}) grants access to everyone
        is_public_principal = principals == '*' or (
            isinstance(principals, dict) and principals.get('AWS') == '*')
        # Check for 's3:Put*' or 's3:*' actions
        public_write_actions = any(
            action in ['s3:Put*', 's3:*'] or action.startswith('s3:Put')
            for action in actions)
        if is_public_principal and public_write_actions:
            return True
    return False

def is_acl_public_write(bucket_acl):
    """Determine whether the bucket ACL allows public write access."""
    for grant in bucket_acl['Grants']:
        grantee = grant['Grantee']
        if (grantee.get('Type') == 'Group'
                and grantee.get('URI') == 'http://acs.amazonaws.com/groups/global/AllUsers'):
            if 'WRITE' in grant['Permission']:
                return True
    return False

def check_s3_buckets_public_write():
    """Check all S3 buckets in the account to ensure they do not allow public write access."""
    try:
        s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
        buckets = s3.list_buckets().get('Buckets', [])
        if not buckets:
            print("No S3 buckets found in the account.")
            return
        for bucket in buckets:
            bucket_name = bucket['Name']
            is_compliant = True
            # Check Block Public Access settings
            try:
                public_access_block = s3.get_public_access_block(Bucket=bucket_name)
                if not public_access_block['PublicAccessBlockConfiguration'].get('BlockPublicAcls', False):
                    print(f"Bucket '{bucket_name}' is non-compliant: Public Access Block allows public write.")
                    is_compliant = False
            except ClientError as e:
                if e.response['Error']['Code'] != 'NoSuchPublicAccessBlockConfiguration':
                    raise
            # Check the bucket policy
            try:
                bucket_policy = s3.get_bucket_policy(Bucket=bucket_name)
                if is_write_public(bucket_policy):
                    print(f"Bucket '{bucket_name}' is non-compliant: Policy allows public write access.")
                    is_compliant = False
            except ClientError as e:
                if e.response['Error']['Code'] != 'NoSuchBucketPolicy':
                    raise
            # Check the bucket ACL
            bucket_acl = s3.get_bucket_acl(Bucket=bucket_name)
            if is_acl_public_write(bucket_acl):
                print(f"Bucket '{bucket_name}' is non-compliant: ACL allows public write access.")
                is_compliant = False
            if is_compliant:
                print(f"Bucket '{bucket_name}' is compliant: No public write access detected.")
        print("Public write access check complete for all S3 buckets.")
    except NoCredentialsError:
        print("No AWS credentials found. Please configure your credentials.")
    except BotoCoreError as e:
        print(f"An error occurred accessing AWS S3 service: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

check_s3_buckets_public_write()
context.skip_sub_tasks = True
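
For reference, this is the shape of bucket policy that `is_write_public` flags. The policy below is a hypothetical example, not taken from any real bucket, and reuses `is_write_public` from the block above:

import json

# Hypothetical policy document granting everyone PutObject on the bucket
public_write_policy = {
    'Policy': json.dumps({
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Principal': '*',
            'Action': 's3:PutObject',
            'Resource': 'arn:aws:s3:::example-bucket/*'  # hypothetical bucket
        }]
    })
}
print(is_write_public(public_write_policy))  # True: public principal + put action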
1.2.1 Enforce S3 Bucket Write Protection using Public Access Block Settings


        This task programmatically tightens security on a specified AWS S3 bucket by disabling public write access. It modifies the bucket's Block Public Access settings, ensuring compliance with data security standards. This preventive measure is critical in safeguarding sensitive data from unauthorized modifications.

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def disable_public_write_access(bucket_name):
    """Disable public write access for a specified S3 bucket by updating its
    Block Public Access settings."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    # Update Block Public Access settings to block all public access
    try:
        s3.put_public_access_block(
            Bucket=bucket_name,
            PublicAccessBlockConfiguration={
                'BlockPublicAcls': True,
                'IgnorePublicAcls': True,
                'BlockPublicPolicy': True,
                'RestrictPublicBuckets': True
            }
        )
        print(f"Updated Block Public Access settings for '{bucket_name}'.")
    except ClientError as e:
        print(f"Failed to update Block Public Access settings for '{bucket_name}': {e}")
        raise

try:
    if bucket_name:
        # bucket_name = 'your-bucket-name'
        disable_public_write_access(bucket_name)
    else:
        print("Please provide a bucket name to restrict public access.")
except NoCredentialsError:
    print("No AWS credentials found. Please configure your credentials.")
except BotoCoreError as e:
    print(f"An error occurred accessing AWS S3 service: {e}")
except Exception as e:
    print(f"An unexpected error occurred: {e}")
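
After remediation, the new configuration can be read back to confirm that all four flags are set. A brief sketch, assuming the same credentials and the same `bucket_name` input as the task above:

import boto3

s3 = boto3.client('s3')  # assumes local credentials; bucket_name as in the task above
config = s3.get_public_access_block(Bucket=bucket_name)['PublicAccessBlockConfiguration']
if all(config.values()):
    print(f"'{bucket_name}' now fully blocks public access.")
else:
    print(f"Block Public Access is still incomplete: {config}")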
2. AWS S3 Bucket Public Read Access Audit: SOC2 Compliance


    This runbook reviews S3 bucket configurations thoroughly to ensure they align with SOC2 standards by prohibiting public read access. It checks Block Public Access settings, analyzes bucket policies, and inspects ACLs to prevent unauthorized data exposure; this review is essential for maintaining data integrity and confidentiality.

2.1 List the names of all S3 buckets


      This task involves retrieving and listing the names of all the S3 buckets that are currently associated with your AWS account. By fetching this list, you gain an overview of the existing S3 buckets under your account, which can aid in resource management, access control, and tracking. This information is valuable for maintaining an organized and well-structured AWS environment, ensuring efficient storage utilization, and facilitating easy navigation of your stored data.

import json

cmd = "aws s3api list-buckets"
output = _exe(None, cmd, cred_label=cred_label)

# Parse the JSON response
response_data = json.loads(output)

# Extract the bucket names
bucket_names = [bucket["Name"] for bucket in response_data["Buckets"]]

# Print the extracted bucket names
for bucket_name in bucket_names:
    print(bucket_name)
2.2 Check which buckets allow AWS S3 Bucket Public Read Access


      The task involves scanning AWS S3 buckets to detect any that permit public read access, highlighting potential vulnerabilities in data privacy and security.

import json

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def is_read_public(bucket_policy):
    """Determine whether the bucket policy allows public read access."""
    try:
        policy_document = json.loads(bucket_policy['Policy'])
    except json.JSONDecodeError:
        print("Error parsing the bucket policy JSON.")
        return False
    for statement in policy_document.get('Statement', []):
        actions = statement.get('Action', [])
        actions = [actions] if isinstance(actions, str) else actions
        principals = statement.get('Principal', {})
        # A Principal of '*' (or {'AWS': '*'}) grants access to everyone
        is_public_principal = principals == '*' or (
            isinstance(principals, dict) and principals.get('AWS') == '*')
        # Check for 's3:Get*' or 's3:*' actions
        public_read_actions = any(
            action in ['s3:Get*', 's3:*'] or action.startswith('s3:Get')
            for action in actions)
        if is_public_principal and public_read_actions:
            return True
    return False

def is_acl_public_read(bucket_acl):
    """Determine whether the bucket ACL allows public read access."""
    for grant in bucket_acl['Grants']:
        grantee = grant['Grantee']
        if (grantee.get('Type') == 'Group'
                and grantee.get('URI') == 'http://acs.amazonaws.com/groups/global/AllUsers'):
            if 'READ' in grant['Permission']:
                return True
    return False

def check_s3_buckets_public_read():
    """Check all S3 buckets in the account to ensure they do not allow public read access."""
    try:
        s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
        buckets = s3.list_buckets().get('Buckets', [])
        if not buckets:
            print("No S3 buckets found in the account.")
            return
        for bucket in buckets:
            bucket_name = bucket['Name']
            is_compliant = True
            # Check Block Public Access settings
            try:
                public_access_block = s3.get_public_access_block(Bucket=bucket_name)
                if not public_access_block['PublicAccessBlockConfiguration'].get('BlockPublicAcls', False):
                    print(f"Bucket '{bucket_name}' is non-compliant: Public Access Block allows public read.")
                    is_compliant = False
            except ClientError as e:
                if e.response['Error']['Code'] != 'NoSuchPublicAccessBlockConfiguration':
                    raise
            # Check the bucket policy
            try:
                bucket_policy = s3.get_bucket_policy(Bucket=bucket_name)
                if is_read_public(bucket_policy):
                    print(f"Bucket '{bucket_name}' is non-compliant: Policy allows public read access.")
                    is_compliant = False
            except ClientError as e:
                if e.response['Error']['Code'] != 'NoSuchBucketPolicy':
                    raise
            # Check the bucket ACL
            bucket_acl = s3.get_bucket_acl(Bucket=bucket_name)
            if is_acl_public_read(bucket_acl):
                print(f"Bucket '{bucket_name}' is non-compliant: ACL allows public read access.")
                is_compliant = False
            if is_compliant:
                print(f"Bucket '{bucket_name}' is compliant: No public read access detected.")
        print("Public read access check complete for all S3 buckets.")
    except NoCredentialsError:
        print("No AWS credentials found. Please configure your credentials.")
    except BotoCoreError as e:
        print(f"An error occurred accessing AWS S3 service: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

check_s3_buckets_public_read()
context.skip_sub_tasks = True
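
And this is the ACL shape that `is_acl_public_read` flags: a grant to the global AllUsers group with READ permission. The fragment below is a hypothetical `get_bucket_acl` response, reusing `is_acl_public_read` from the block above:

# Hypothetical get_bucket_acl response fragment with a public READ grant
sample_acl = {
    'Grants': [{
        'Grantee': {
            'Type': 'Group',
            'URI': 'http://acs.amazonaws.com/groups/global/AllUsers'
        },
        'Permission': 'READ'
    }]
}
print(is_acl_public_read(sample_acl))  # True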
2.2.1 Enforce S3 Bucket Read Protection using Public Access Block Settings


        This task strengthens data security by restricting public read access to a specified AWS S3 bucket. It updates the bucket's Block Public Access settings, ensuring data confidentiality. This action aligns with security compliance standards to protect sensitive information.

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def disable_public_read_access(bucket_name):
    """Disable public read access for a specified S3 bucket by updating its
    Block Public Access settings."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    # Update Block Public Access settings to block all public access
    try:
        s3.put_public_access_block(
            Bucket=bucket_name,
            PublicAccessBlockConfiguration={
                'BlockPublicAcls': True,
                'IgnorePublicAcls': True,
                'BlockPublicPolicy': True,
                'RestrictPublicBuckets': True
            }
        )
        print(f"Updated Block Public Access settings for '{bucket_name}'.")
    except ClientError as e:
        print(f"Failed to update Block Public Access settings for '{bucket_name}': {e}")
        raise

try:
    if bucket_name:
        # bucket_name = 'your-bucket-name'
        disable_public_read_access(bucket_name)
    else:
        print("Please provide a bucket name to restrict public access.")
except NoCredentialsError:
    print("No AWS credentials found. Please configure your credentials.")
except BotoCoreError as e:
    print(f"An error occurred accessing AWS S3 service: {e}")
except Exception as e:
    print(f"An unexpected error occurred: {e}")
3. AWS S3 Bucket Server-Side Encryption Audit: SOC2 Compliance


    This runbook methodically assesses and verifies server-side encryption configurations, identifying buckets that do not adhere to AES-256 or AWS KMS encryption standards. It aims to ensure all S3 buckets within an AWS environment meet stringent SOC2 encryption requirements, enhancing data security and compliance.

3.1 List the names of all S3 buckets


      This task involves retrieving and listing the names of all the S3 buckets that are currently associated with your AWS account. By fetching this list, you gain an overview of the existing S3 buckets under your account, which can aid in resource management, access control, and tracking. This information is valuable for maintaining an organized and well-structured AWS environment, ensuring efficient storage utilization, and facilitating easy navigation of your stored data.

import json

cmd = "aws s3api list-buckets"
output = _exe(None, cmd, cred_label=cred_label)

# Parse the JSON response
response_data = json.loads(output)

# Extract the bucket names
bucket_names = [bucket["Name"] for bucket in response_data["Buckets"]]

# Print the extracted bucket names
for bucket_name in bucket_names:
    print(bucket_name)
3.2 Check which AWS S3 buckets have Server Side Encryption enabled


      This task assesses whether AWS S3 buckets have default server-side encryption activated or if their bucket policies explicitly deny any put-object requests that lack server-side encryption, specifically using AES-256 or AWS KMS. It designates S3 buckets as NON_COMPLIANT if they are not set to be encrypted by default.

# Compliance rule: s3-bucket-server-side-encryption-enabled
# This rule checks each S3 bucket against two criteria:
#   1. Default encryption: the bucket must have server-side encryption enabled
#      by default, using either AES-256 or AWS KMS.
#   2. Policy compliance: the bucket policy must explicitly deny put-object
#      requests that are not accompanied by server-side encryption using
#      AES-256 or AWS KMS.
# A bucket is considered NON_COMPLIANT if it does not have default encryption enabled.

import json

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def is_encryption_enabled(bucket_name):
    """Check if the specified S3 bucket has server-side encryption enabled."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        # If this call succeeds, a default encryption configuration exists
        s3.get_bucket_encryption(Bucket=bucket_name)
        return True
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            return False
        raise

def is_policy_compliant(bucket_name):
    """Check if the bucket policy explicitly denies put-object requests
    that lack server-side encryption."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        policy = s3.get_bucket_policy(Bucket=bucket_name)
        policy_document = json.loads(policy['Policy'])
        for statement in policy_document.get('Statement', []):
            if statement.get('Effect') == 'Deny':
                actions = statement.get('Action', [])
                if isinstance(actions, str):
                    actions = [actions]
                if any(action.startswith('s3:Put') for action in actions):
                    # The canonical deny pattern uses StringNotEquals so that
                    # uploads lacking an approved SSE algorithm are rejected
                    conditions = statement.get('Condition', {}).get('StringNotEquals', {})
                    encryption_condition = conditions.get('s3:x-amz-server-side-encryption')
                    if encryption_condition in ['AES256', 'aws:kms']:
                        return True
        return False
    except ClientError as e:
        if e.response['Error']['Code'] == 'NoSuchBucketPolicy':
            return False
        raise

def check_all_buckets_for_encryption():
    """Check all S3 buckets in the account for default server-side encryption
    and for a bucket policy that enforces it."""
    try:
        s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
        buckets = s3.list_buckets().get('Buckets', [])
        if not buckets:
            print("No S3 buckets found in the account.")
            return
        for bucket in buckets:
            bucket_name = bucket['Name']
            encrypted = is_encryption_enabled(bucket_name)
            policy_compliant = is_policy_compliant(bucket_name)
            # A bucket is NON_COMPLIANT if default encryption is not enabled
            if encrypted:
                if policy_compliant:
                    print(f"Bucket '{bucket_name}' is COMPLIANT with server-side encryption and has a compliant policy.")
                else:
                    print(f"Bucket '{bucket_name}' is COMPLIANT with server-side encryption but does not have a compliant policy.")
            else:
                print(f"Bucket '{bucket_name}' is NON_COMPLIANT with server-side encryption.")
    except NoCredentialsError:
        print("No AWS credentials found. Please configure your credentials.")
    except BotoCoreError as e:
        print(f"An error occurred accessing AWS S3 service: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

check_all_buckets_for_encryption()
context.skip_sub_tasks = True
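
For buckets whose policy is not compliant, a deny statement of the kind `is_policy_compliant` looks for can be attached. A minimal sketch, with a hypothetical bucket name and AES-256 as the required algorithm:

import json

import boto3

bucket = 'example-bucket'  # hypothetical
deny_unencrypted_uploads = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'DenyUnencryptedUploads',
        'Effect': 'Deny',
        'Principal': '*',
        'Action': 's3:PutObject',
        'Resource': f'arn:aws:s3:::{bucket}/*',
        'Condition': {
            'StringNotEquals': {'s3:x-amz-server-side-encryption': 'AES256'}
        }
    }]
}
s3 = boto3.client('s3')  # assumes local credentials
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(deny_unencrypted_uploads))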
3.2.1 AWS S3 Bucket Encryption Setup and Status Verification Process


        This task involves enabling AES-256 server-side encryption on S3 buckets and verifying its activation, ensuring data security by encrypting the contents of the buckets. All newly created buckets are encrypted by default, but this task is beneficial for legacy buckets without encryption enabled.

import boto3
from botocore.exceptions import ClientError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def enable_and_verify_bucket_encryption(bucket_name):
    """Enable default AES-256 server-side encryption on the specified S3 bucket
    and verify the encryption status."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    encryption_configuration = {
        'Rules': [{'ApplyServerSideEncryptionByDefault': {'SSEAlgorithm': 'AES256'}}]
    }
    try:
        s3.put_bucket_encryption(
            Bucket=bucket_name,
            ServerSideEncryptionConfiguration=encryption_configuration
        )
        # Read the configuration back to verify it took effect
        response = s3.get_bucket_encryption(Bucket=bucket_name)
        if response['ResponseMetadata']['HTTPStatusCode'] == 200:
            print(f"Encryption successfully enabled on bucket '{bucket_name}'.")
        else:
            print(f"Failed to verify encryption on bucket '{bucket_name}'.")
    except ClientError as e:
        print(f"AWS ClientError: {e.response['Error']['Message']}")
    except BotoCoreError as e:
        print(f"BotoCoreError: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

# bucket_name = 'test-sse-encryption-bucket-123'
enable_and_verify_bucket_encryption(bucket_name)
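
If SSE-KMS is preferred over AES-256 (the audit in task 3.2 accepts either), only the default-encryption rule changes. A sketch, with a hypothetical KMS key ARN:

# Default encryption with a customer-managed KMS key (the key ARN is hypothetical)
kms_configuration = {
    'Rules': [{
        'ApplyServerSideEncryptionByDefault': {
            'SSEAlgorithm': 'aws:kms',
            'KMSMasterKeyID': 'arn:aws:kms:us-east-1:111122223333:key/EXAMPLE-KEY-ID'
        },
        'BucketKeyEnabled': True  # reduces the volume of KMS requests
    }]
}
# s3.put_bucket_encryption(Bucket=bucket_name,
#                          ServerSideEncryptionConfiguration=kms_configuration)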
4. AWS S3 Bucket Logging Enabled Audit: SOC2 Compliance


    This runbook automates the assessment and activation of Server Access Logging for Amazon S3 buckets. It aligns with SOC2 compliance guidelines by ensuring that every S3 bucket has logging enabled, contributing to better security and traceability of actions performed on the buckets.

4.1 List the names of all S3 buckets


      This task involves retrieving and listing the names of all the S3 buckets that are currently associated with your AWS account. By fetching this list, you gain an overview of the existing S3 buckets under your account, which can aid in resource management, access control, and tracking. This information is valuable for maintaining an organized and well-structured AWS environment, ensuring efficient storage utilization, and facilitating easy navigation of your stored data.

import json

cmd = "aws s3api list-buckets"
output = _exe(None, cmd, cred_label=cred_label)

# Parse the JSON response
response_data = json.loads(output)

# Extract the bucket names
bucket_names = [bucket["Name"] for bucket in response_data["Buckets"]]

# Print the extracted bucket names
for bucket_name in bucket_names:
    print(bucket_name)
4.2 Check which AWS S3 buckets have Server Access Logging enabled


      This task involves checking AWS S3 buckets to determine if Server Access Logging is enabled. It's crucial for monitoring and diagnosing security incidents, as it records requests made to the S3 bucket, enhancing transparency and aiding compliance with security protocols.

# SOC2 compliance guideline: S3 bucket logging

import boto3
from botocore.exceptions import ClientError, NoCredentialsError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def is_logging_enabled(bucket_name):
    """Check if server access logging is enabled for the specified S3 bucket."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
    try:
        logging_config = s3.get_bucket_logging(Bucket=bucket_name)
        # Logging is enabled if the 'LoggingEnabled' key is present in the response
        return 'LoggingEnabled' in logging_config
    except ClientError as e:
        print(f"Error checking logging for bucket '{bucket_name}': {e}")
        raise

def check_all_buckets_for_logging():
    """Check all S3 buckets in the account to ensure logging is enabled."""
    try:
        s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)
        buckets = s3.list_buckets().get('Buckets', [])
        if not buckets:
            print("No S3 buckets found in the account.")
            return
        for bucket in buckets:
            bucket_name = bucket['Name']
            if is_logging_enabled(bucket_name):
                print(f"Bucket '{bucket_name}' is COMPLIANT with logging enabled.")
            else:
                print(f"Bucket '{bucket_name}' is NON_COMPLIANT with logging disabled.")
    except NoCredentialsError:
        print("No AWS credentials found. Please configure your credentials.")
    except BotoCoreError as e:
        print(f"An error occurred accessing AWS S3 service: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

check_all_buckets_for_logging()
context.skip_sub_tasks = True
4.2.1 AWS S3 Bucket Logging Setup and Verification


        This task involves setting up and verifying Server Access Logging for AWS S3 buckets. It ensures that logging is active for a bucket, providing detailed records of access requests. This is crucial for security monitoring, compliance with data governance standards, and effective management of AWS resources.

import boto3
from botocore.exceptions import ClientError, BotoCoreError

creds = _get_creds(cred_label)['creds']
access_key = creds['username']
secret_key = creds['password']

def enable_and_verify_logging(bucket_name, log_bucket, log_prefix):
    """Enable server access logging for an S3 bucket and verify that it took effect."""
    s3 = boto3.client('s3', aws_access_key_id=access_key, aws_secret_access_key=secret_key)

    # Check that all required parameters were provided
    if not bucket_name or not log_bucket or not log_prefix:
        print("Error: Bucket name, logging bucket, or log prefix is missing.")
        return

    try:
        # Enable logging
        s3.put_bucket_logging(
            Bucket=bucket_name,
            BucketLoggingStatus={
                'LoggingEnabled': {
                    'TargetBucket': log_bucket,
                    'TargetPrefix': log_prefix
                }
            }
        )
        print(f"Logging enabled for bucket '{bucket_name}'.")

        # Verify logging
        response = s3.get_bucket_logging(Bucket=bucket_name)
        if 'LoggingEnabled' in response:
            print("Logging Status: Enabled")
            print(f"HTTP Status Code: {response['ResponseMetadata']['HTTPStatusCode']}")
            print(f"Target Bucket: {response['LoggingEnabled']['TargetBucket']}")
            print(f"Target Prefix: {response['LoggingEnabled']['TargetPrefix']}")
        else:
            print("Logging is not enabled.")
    except ClientError as e:
        print(f"AWS ClientError: {e.response['Error']['Message']}")
    except BotoCoreError as e:
        print(f"BotoCoreError: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

# bucket_name = 'encryption-test-bucket-789'
# log_bucket = 'encryption-test-bucket-789'  # can be the same as bucket_name, but not recommended
# log_prefix = 'log-prefix/whatever'
enable_and_verify_logging(bucket_name, log_bucket, log_prefix)
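
Note that `put_bucket_logging` only delivers logs if the target bucket permits log delivery, which in current AWS setups is granted to the `logging.s3.amazonaws.com` service principal through the target bucket's policy. A sketch with hypothetical bucket names and account ID:

import json

import boto3

log_bucket = 'example-log-bucket'        # hypothetical
source_bucket = 'example-source-bucket'  # hypothetical
account_id = '111122223333'              # hypothetical

log_delivery_policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'S3ServerAccessLogsPolicy',
        'Effect': 'Allow',
        'Principal': {'Service': 'logging.s3.amazonaws.com'},
        'Action': 's3:PutObject',
        'Resource': f'arn:aws:s3:::{log_bucket}/*',
        'Condition': {
            'ArnLike': {'aws:SourceArn': f'arn:aws:s3:::{source_bucket}'},
            'StringEquals': {'aws:SourceAccount': account_id}
        }
    }]
}
s3 = boto3.client('s3')  # assumes local credentials
s3.put_bucket_policy(Bucket=log_bucket, Policy=json.dumps(log_delivery_policy))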