Suggestion
Resolution: Unresolved
It would be handy to have commands available to the pipeline for management of deployment and repository variables. I realise there is an API to handle this but a simple one line command would be much more practical.
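For reference, updating a repository variable through the existing REST API looks roughly like this (the endpoint path and payload reflect my understanding of the Bitbucket Cloud 2.0 API; `{workspace}`, `{repo_slug}` and `{variable_uuid}` are placeholders you have to look up separately, which is part of the friction):

```shell
curl -X PUT \
  -u "$BITBUCKET_USER:$BITBUCKET_APP_PASSWORD" \
  -H "Content-Type: application/json" \
  -d '{"key": "EB_ENVIRONMENT_ID", "value": "e-abc123"}' \
  "https://api.bitbucket.org/2.0/repositories/{workspace}/{repo_slug}/pipelines_config/variables/{variable_uuid}"
```

Needing to discover the variable's UUID before you can update it is exactly the kind of ceremony a one-line command would remove.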
My use case is that I have pipelines that create a number of AWS CloudFormation stacks and deploy applications to AWS. I want to store some information about these stacks and applications somewhere it can be passed to future pipelines, e.g. the Elastic Beanstalk EnvironmentId, ApplicationName, EnvironmentName, StackId, etc. These values change due to blue/green deployments and stacks being spun up and terminated. Changing them manually is error prone, and writing code to make API calls for a simple variable update seems like overkill.
I have worked around this by storing the information in AWS SSM ParameterStore which works well.
An example of retrieving a value from here is as simple as:
```bash
export EB_APP_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/application-name --query 'Parameter.Value' --output text)
export EB_ENVIRONMENT_ID=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-id --query 'Parameter.Value' --output text)
export EB_ENVIRONMENT_NAME=$(aws ssm get-parameter --name /eb/$1/$ENV/environment-name --query 'Parameter.Value' --output text)
```
And overwriting an existing parameter using Boto3:
```python
ssmClient.put_parameter(
    Name='/eb/' + application_name + '/' + env_type + '/environment-name',
    Value=env['EnvironmentName'],
    Description='Elastic beanstalk environment name for ' + args.env_type,
    Type='String',
    Overwrite=True
)
```
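The write side can also be wrapped in a small helper so every pipeline shares one naming convention. A minimal sketch, assuming the `/eb/<application>/<env>/<key>` layout from the examples above (`param_path` and `put_stack_output` are hypothetical names; the caller passes in a boto3 SSM client):

```python
def param_path(application, env_type, key, prefix='/eb'):
    """Build the hierarchical SSM parameter name used above,
    e.g. ('myapp', 'prod', 'environment-name') -> '/eb/myapp/prod/environment-name'."""
    return '/'.join([prefix, application, env_type, key])

def put_stack_output(ssm_client, application, env_type, key, value, description=''):
    """Write (or overwrite) a single stack output; ssm_client is a boto3 SSM client."""
    ssm_client.put_parameter(
        Name=param_path(application, env_type, key),
        Value=value,
        Description=description,
        Type='String',
        Overwrite=True,  # safe to call on every deployment
    )
```

Keeping the path construction in one place avoids the subtle drift you get when each pipeline concatenates the parameter name by hand.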
In short, a command line tool to complement the API would be useful, much like the AWS CLI complements their APIs.
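To illustrate the kind of one-liner I have in mind (hypothetical syntax only; no such command exists today):

```
bitbucket pipelines variable set EB_ENVIRONMENT_ID "e-abc123" --deployment production
```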