Details
Type: Suggestion
Resolution: Unresolved
Description
Based on what I have read on the forum, each step of a Bitbucket Pipeline runs in a separate container. Also, there is currently no easy way to share environment variables between steps.
Here is an idea that would provide this without much work, assuming I'm not too far off about how pipelines are implemented: automate the following approach that I currently use.
At the end of each step, I have some lines like this, which save variables into an artifact "shared_vars.sh", so that this artifact can be sourced in subsequent steps:
    # share variables with subsequent step(s):
    - echo export IMAGE_TAG=$IMAGE_TAG > shared_vars.sh
    - echo export IMAGE_NAME=$IMAGE_NAME >> shared_vars.sh
Then in subsequent steps I just need one line at the top of the script element, like this:
    # get vars from previous step:
    - source shared_vars.sh
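For context, here is a minimal sketch of the complete manual workaround as a bitbucket-pipelines.yml (the step names, image name, and variable values are illustrative, not from a real project):

    pipelines:
      default:
        - step:
            name: Build
            script:
              - export IMAGE_NAME=my-app            # illustrative value
              - export IMAGE_TAG=$BITBUCKET_COMMIT
              # share variables with subsequent step(s):
              - echo export IMAGE_TAG=$IMAGE_TAG > shared_vars.sh
              - echo export IMAGE_NAME=$IMAGE_NAME >> shared_vars.sh
            artifacts:
              - shared_vars.sh
        - step:
            name: Deploy
            script:
              # get vars from previous step:
              - source shared_vars.sh
              - echo "deploying $IMAGE_NAME:$IMAGE_TAG"

Note the artifacts declaration on the first step: that is what actually carries shared_vars.sh out of one step's container and into the next.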
This seems very "automatable":
- have a section above pipelines where you can define variable names, eg
    image: ...
    sharedVars:
      - IMAGE_NAME
      - IMAGE_TAG
    pipelines:
      ...
- the Bitbucket Pipelines engine automatically appends instructions at the end of the list of lines under script, exporting the value of each variable appearing in sharedVars into a file, just as I did; the file could be a temporary "hidden" artifact, or simply a file that gets docker cp'd out of one container and into the next.
- Bitbucket pipelines engine automatically prepends one line at the start of the list of lines under script, to source that "hidden" env-var-sharing artifact, just like I did.
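The two automated rewrites above (append the export lines, prepend the source line) can be sketched in a few lines of Python. The function and file names here are my own invention for illustration, not anything Bitbucket actually exposes:

```python
# Sketch of the proposed engine-side transformation: rewrite a step's
# script list so shared variables survive into the next step.
# SHARED_FILE and transform_script are hypothetical names.

SHARED_FILE = "shared_vars.sh"

def transform_script(script, shared_vars, first_step=False):
    """Prepend a 'source' line (skipped on the first step, where the
    file does not exist yet) and append one export line per shared
    variable, using '>' for the first to truncate stale contents."""
    prologue = [] if first_step else [f"source {SHARED_FILE}"]
    epilogue = [
        f"echo export {v}=${v} {'>' if i == 0 else '>>'} {SHARED_FILE}"
        for i, v in enumerate(shared_vars)
    ]
    return prologue + list(script) + epilogue
```

Applied to a script of ["make build"] with sharedVars IMAGE_NAME and IMAGE_TAG, this yields exactly the manual lines shown earlier, with the source line first and the two echo/export lines last.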
And voilà! You have a feature that everyone has been asking for, for almost no work!
Attachments
Issue Links
- duplicates BCLOUD-20293 "Proposal for how to share data between steps" (Closed)