Commit bd0761f (parent 6d3705d): Various README enhancements

1 file changed: README.md (+54, -52 lines)
# GitHub Action: Deploy StackStorm

[![LICENSE](https://img.shields.io/badge/license-MIT-green)](LICENSE.md)

GitHub action to deploy [StackStorm](https://stackstorm.com/) to an AWS VM (EC2) with [Terraform](operations/deployment/terraform/modules) and [Ansible](https://github.com/stackstorm/ansible-st2).

## Prerequisites
- An [AWS account](https://aws.amazon.com/premiumsupport/knowledge-center/create-and-activate-aws-account/) and [Access Keys](https://docs.aws.amazon.com/powershell/latest/userguide/pstools-appendix-sign-up.html)
- The following secrets should be added to your GitHub Actions secrets:
  - `AWS_ACCESS_KEY_ID`
  - `AWS_SECRET_ACCESS_KEY`
  - `ST2_AUTH_USERNAME`
  - `ST2_AUTH_PASSWORD`

## Example usage

Create a GitHub Actions workflow `.github/workflows/deploy.yaml` with the following to build on push to the `main` branch.

```yaml
name: Deploy ST2 Single VM with GHA
on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - id: deploy
        name: Deploy StackStorm
        # TODO: pin to a specific version (best practice)
        uses: bitovi/github-actions-deploy-stackstorm@main
        with:
          aws_default_region: us-east-1
          aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          st2_auth_username: ${{ secrets.ST2_AUTH_USERNAME }}
          st2_auth_password: ${{ secrets.ST2_AUTH_PASSWORD }}
```
This will create the following resources in AWS:
- An EC2 instance
- Route53 records
- A load balancer
- Security groups (ports `80`, `443`, `22`)
- Optionally, a VPC with subnets (see `aws_create_vpc`)

> For more details about what is created, see [operations/deployment/terraform/modules](operations/deployment/terraform/modules/)

## Customizing

### Inputs

The following inputs can be used as `steps.with` keys:

| Name | Type | Default | Description |
|------------------|---------|-------------|------------------------------------|
| `checkout` | bool | `true` | Specifies whether this action should check out the code (i.e. whether to run the `uses: actions/checkout@v3` action prior to deploying so that the deployment has access to the repo files) |
| **AWS configuration** | | | |
| `aws_access_key_id` | string | | AWS access key ID (Required) |
| `aws_secret_access_key` | string | | AWS secret access key (Required) |
| `aws_session_token` | string | | AWS session token |
| `aws_default_region` | string | `us-east-1` | AWS default region (Required) |
| `ec2_instance_type` | string | `t2.medium` | The AWS EC2 instance type |
| `ec2_instance_profile` | string | | The [AWS IAM instance profile](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html) to use for the EC2 instance. Use it to pass an AWS role with specific permissions to the instance |
| `aws_resource_identifier` | string | `${org}-{repo}-{branch}` | Auto-generated by default so it is unique per org/repo/branch. Set to override the unique AWS resource identifier for the deployment with custom naming |
| `aws_create_vpc` | bool | `false` | Whether an AWS VPC should be created by the action. Otherwise, the existing default VPC will be used |
| `infrastructure_only` | bool | `false` | Provisions the infrastructure (i.e. Terraform) but **not** the deployment (i.e. Ansible) |
| **Terraform configuration** | | | |
| `tf_state_bucket` | string | `${org}-${repo}-{branch}-tf-state` | AWS S3 bucket to use for Terraform state. By default, a new bucket is created for each unique branch. Hardcode this if you want to share resource state between several branches |
| **StackStorm configuration** | | | |
| `st2_auth_username` | string | | Username used by StackStorm standalone authentication. Set it as a secret in GitHub Actions |
| `st2_auth_password` | string | | Password used by StackStorm standalone authentication. Set it as a secret in GitHub Actions |
| `st2_packs` | string | `"st2"` | Comma-separated list of packs to install. If you modify this option, be sure to also include `st2` in the list |
| **Cleanup** | | | |
| `stack_destroy` | bool | `false` | Set to `true` to destroy the AWS infrastructure created for this deployment |
| `tf_state_bucket_destroy` | bool | `false` | Force purge and deletion of the defined `tf_state_bucket`. Any file contained there will be destroyed. `stack_destroy` must also be `true`. Use this to clean up all resources |
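For example, a teardown run might set the cleanup inputs from the table above together with the required credentials. This is an illustrative sketch only; the `workflow_dispatch` trigger and step layout are assumptions, not part of this action:

```yaml
name: Destroy ST2 Single VM
on: workflow_dispatch  # run manually from the Actions tab

jobs:
  destroy:
    runs-on: ubuntu-latest
    steps:
      - id: destroy
        name: Destroy StackStorm infrastructure
        uses: bitovi/github-actions-deploy-stackstorm@main
        with:
          aws_default_region: us-east-1
          aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          st2_auth_username: ${{ secrets.ST2_AUTH_USERNAME }}
          st2_auth_password: ${{ secrets.ST2_AUTH_PASSWORD }}
          stack_destroy: true
          tf_state_bucket_destroy: true
```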
## Note about AWS resource identifiers

Most resources will contain the tag `GITHUB_ORG-GITHUB_REPO-GITHUB_BRANCH` to make them unique. Because some AWS resources have a length limit, we shorten identifiers to a maximum of `60` characters.

We use the Kubernetes abbreviation style for this. For example, `Kubernetes` -> `k(# of characters)s` -> `k8s`. So you may see this kind of compression in resource names.

For some specific resources, we have a `32` character limit. If the identifier length exceeds this number after compression, we remove the middle part and replace it with a hash made up of the string itself.
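The shortening described above can be sketched roughly as follows. This is a hypothetical illustration of the scheme, not the action's actual code; the hash length and separators are assumptions:

```python
import hashlib

def abbreviate(word: str) -> str:
    """Kubernetes-style abbreviation: first letter, count of middle
    characters, last letter (e.g. "kubernetes" -> "k8s")."""
    if len(word) <= 3:
        return word
    return f"{word[0]}{len(word) - 2}{word[-1]}"

def compress(identifier: str, limit: int = 32) -> str:
    """If the identifier still exceeds the limit, drop the middle and
    replace it with a short hash derived from the full string."""
    if len(identifier) <= limit:
        return identifier
    digest = hashlib.sha256(identifier.encode()).hexdigest()[:8]
    keep = limit - len(digest) - 2  # room for the hash and two dashes
    head, tail = identifier[:keep // 2], identifier[-(keep - keep // 2):]
    return f"{head}-{digest}-{tail}"
```

Because the hash is derived from the full string, two different long identifiers compress to different names, while the same identifier always compresses the same way.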

### S3 buckets naming

Bucket names can be made of up to 63 characters. If the length allows us to add `-tf-state`, we will do so. If not, a simple `-tf` will be added.
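A minimal sketch of that suffix rule (illustrative only; `bucket_name` is a hypothetical helper, not part of the action):

```python
def bucket_name(resource_id: str) -> str:
    # S3 bucket names max out at 63 characters: append "-tf-state"
    # when it fits, otherwise fall back to the shorter "-tf".
    suffix = "-tf-state" if len(resource_id) + len("-tf-state") <= 63 else "-tf"
    return resource_id + suffix
```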

## Made with BitOps
[BitOps](https://bitops.sh/) allows you to define Infrastructure-as-Code for multiple tools in a central place. This action uses the BitOps [Operations Repository Structure](https://bitops.sh/operations-repo-structure/) to organize the necessary Terraform and Ansible steps, create the infrastructure, and deploy to it.

## Future
In the future, this action may support more cloud providers (via [BitOps Plugins](https://bitops.sh/plugins/) like [AWS](https://github.com/bitops-plugins/aws)) such as:
- [Google Cloud Platform](https://cloud.google.com/gcp)
- [Microsoft Azure](https://azure.microsoft.com/en-us/)
- [Nutanix](https://www.nutanix.com/)
- [OpenStack](https://www.openstack.org/)
- [VMware](https://www.vmware.com/)
- etc.

This action may also support multiple deployment types such as:
- [Kubernetes](https://github.com/StackStorm/stackstorm-k8s)
- Multi-VM

This action is still in its early stages, so we welcome your feedback! [Open an issue](issues/) if you have a feature request.

## Contributing
We would love for you to contribute to [bitovi/github-actions-deploy-stackstorm](/). [Issues](issues/) and [Pull Requests](pulls/) are welcome!

## Provided by Bitovi
[Bitovi](https://www.bitovi.com/) is a proud supporter of Open Source software.

## Need help?
Bitovi has consultants that can help. Drop into [Bitovi's Community Slack](https://www.bitovi.com/community/slack), and talk to us in the `#devops` channel!

Need DevOps Consulting Services? Head over to https://www.bitovi.com/devops-consulting, and book a free consultation.
