AWS Serverless Application Model (SAM) was released a couple of months ago. The headline feature, to my mind, is the ability to version your Lambda function code and your CloudFormation template next to each other: a completely packaged serverless application that deploys from a single repository.
I spent an afternoon playing around with AWS SAM, and I'm already a pretty big fan. It makes deploying Lambda functions a lot easier, especially when you want to use the same function in different accounts.
The example below creates a Lambda function that tags EBS volumes as they become available.
```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  EbsVolumeAvailableTagger:
    Type: AWS::Serverless::Function
    Properties:
      Handler: ebs_available_date_tagger.lambda_handler
      Role: !GetAtt EbsCleanerIAMRole.Arn
      Runtime: python2.7

  IAMEbsVolumeListTagPolicy:
    Type: "AWS::IAM::Policy"
    DependsOn: EbsCleanerIAMRole
    Properties:
      PolicyName: !Sub "Role=EBSCleaner,Env=${AccountParameter},Service=Lambda,Rights=RW"
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Action:
              - "ec2:CreateTags"
              - "ec2:DeleteTags"
              - "ec2:DescribeTags"
              - "ec2:DescribeVolumeAttribute"
              - "ec2:DescribeVolumeStatus"
              - "ec2:DescribeVolumes"
            Resource:
              - "*"
      Roles:
        - !Ref EbsCleanerIAMRole

  EbsCleanerIAMRole:
    Type: "AWS::IAM::Role"
    Properties:
      RoleName: !Sub "Role=EbsCleaner,Env=${AccountParameter},Service=Lambda"
      AssumeRolePolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Principal:
              Service:
                - "lambda.amazonaws.com"
            Action:
              - "sts:AssumeRole"
      Path: "/"

Parameters:
  AccountParameter:
    Type: String
    Default: NoPHI
    AllowedValues:
      - Prod
      - Staging
      - Corporate
    Description: Enter the account where this lambda function is being created. Will be used to properly name the created IAM role.
```

And then the Python that it runs:
```python
import boto3
import logging
import time


def lambda_handler(event, context):
    # Get a CloudWatch logger
    logger = logging.getLogger('EbsVolumeCleanup')
    logger.setLevel(logging.DEBUG)

    # Obtain boto3 client
    logger.info('Getting boto3 client')
    ec2Client = boto3.client('ec2')

    availableVolumes = ec2Client.describe_volumes(
        Filters=[{'Name': 'status', 'Values': ['available']}])

    availableVolumesToTag = []
    for volume in availableVolumes['Volumes']:
        logger.info(volume)
        if 'Tags' in volume:
            tags = volume['Tags']
            # next() with a default avoids a StopIteration when the tag is absent
            availableDate = next(
                (tag for tag in tags if tag['Key'] == 'volumeAvailableDate'), None)
            if availableDate:
                logger.info('Volume was available ' + availableDate['Value'])
            else:
                logger.info('Volume not yet tagged')
                availableVolumesToTag.append(volume['VolumeId'])
        else:
            availableVolumesToTag.append(volume['VolumeId'])

    logger.info('Volumes to be tagged available: ' +
                str(len(availableVolumesToTag)) + ' ' +
                '|'.join(availableVolumesToTag))

    if availableVolumesToTag:
        ec2Client.create_tags(
            Resources=availableVolumesToTag,
            Tags=[{'Key': 'volumeAvailableDate',
                   'Value': time.strftime("%d/%m/%Y")}])

    return 0
```

If you put the two of these into a directory together, you can use the aws cloudformation package and deploy CLI commands to push them to your account.
```shell
aws cloudformation package \
    --template-file ec2-management-cft.yml \
    --output-template-file instance-management-cft-staging.yml \
    --s3-bucket cft-deployment-bucket \
    --s3-prefix "lambda/ec2-management"

aws cloudformation deploy \
    --template-file instance-management-cft-staging.yml \
    --stack-name Staging-InstanceManagement \
    --capabilities CAPABILITY_NAMED_IAM \
    --parameter-overrides AccountParameter=Staging
```

Those commands will package your Lambda function, uploading the code to S3 and inserting the correct CodeUri property into the output template.
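One thing the template above leaves out is a trigger, so as written the function only runs when invoked manually. If you want it to sweep on a schedule, AWS::Serverless::Function supports a Schedule event source. A hedged sketch of what that could look like under the function's Properties block (the event name and the one-hour rate are my assumptions, not part of the original template):

```yaml
# Hypothetical addition under EbsVolumeAvailableTagger's Properties.
# HourlySweep and rate(1 hour) are assumed values.
      Events:
        HourlySweep:
          Type: Schedule
          Properties:
            Schedule: rate(1 hour)
```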
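One detail in the handler worth calling out: looking up the volumeAvailableDate tag with a bare generator `.next()` call raises StopIteration when no tag matches, crashing the function on any untagged-but-labeled volume. Using the built-in next() with a default returns None instead. A minimal sketch of that lookup pattern, with made-up tag data:

```python
def find_tag(tags, key):
    """Return the tag dict whose Key matches, or None if absent."""
    # next() with a default never raises StopIteration
    return next((tag for tag in tags if tag['Key'] == key), None)


# Made-up tag data for illustration
tags = [{'Key': 'Name', 'Value': 'db-backup'},
        {'Key': 'volumeAvailableDate', 'Value': '01/11/2016'}]

print(find_tag(tags, 'volumeAvailableDate'))  # {'Key': 'volumeAvailableDate', 'Value': '01/11/2016'}
print(find_tag(tags, 'missing'))              # None
```

The same two-argument next() works in both Python 2.7 (the template's runtime) and Python 3.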