How does Pipeline integrate with Amazon S3 buckets?


Customers have three options for integrating their AWS account with Tapestri Pipeline. We encourage you to create a new S3 bucket dedicated to Mission Bio files only and connect it with one of the methods below.

Specific permissions must be granted before integrating an S3 bucket with the Pipeline application. The integration script performs the following actions, each of which requires the corresponding permission:

  • Create an S3 bucket (s3:CreateBucket)
  • Wait until the bucket exists (s3:ListBucket)
  • Create a KMS key (kms:CreateKey)
  • Create a KMS alias (kms:CreateAlias)
  • Configure the S3 bucket to block public access (s3:PutPublicAccessBlock)
  • Configure the S3 bucket to use server-side encryption by default (s3:PutEncryptionConfiguration)
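The actions above can be sketched with roughly equivalent AWS CLI calls. This is an illustration only, not the actual integration script; the bucket name and region are placeholders you would replace.

```shell
# Illustrative sketch only -- the actual Tapestri integration script may differ.
# Replace <your-bucket-name> and adjust the region before running.
BUCKET=<your-bucket-name>
REGION=us-east-1   # assumption: your region may differ

# Create the bucket and wait until it exists (s3:CreateBucket, s3:ListBucket)
aws s3api create-bucket --bucket "$BUCKET" --region "$REGION"
aws s3api wait bucket-exists --bucket "$BUCKET"

# Create a KMS key and give it an alias (kms:CreateKey, kms:CreateAlias)
KEY_ID=$(aws kms create-key --query KeyMetadata.KeyId --output text)
aws kms create-alias --alias-name alias/missionbio --target-key-id "$KEY_ID"

# Block all public access to the bucket (s3:PutPublicAccessBlock)
aws s3api put-public-access-block --bucket "$BUCKET" \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true

# Default to SSE-KMS server-side encryption (s3:PutEncryptionConfiguration)
aws s3api put-bucket-encryption --bucket "$BUCKET" \
  --server-side-encryption-configuration "{\"Rules\":[{\"ApplyServerSideEncryptionByDefault\":{\"SSEAlgorithm\":\"aws:kms\",\"KMSMasterKeyID\":\"$KEY_ID\"}}]}"
```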

This article covers:

How do I create a new S3 bucket from Tapestri Pipeline?

How do I integrate my S3 bucket into the Pipeline application?

Using the AWS Management Console

Using the AWS Command Line Interface (CLI)

Using Terraform

How do I create a new S3 bucket from Tapestri Pipeline?

  1. Log in to the Mission Bio Portal.
  2. Launch Tapestri Pipeline v2.
  3. Go to the top-right corner and click your name. A drop-down menu will appear. Click Cloud Connector.
  4. In the Amazon S3 Settings section, input your 12-digit AWS Account ID or the alias for the account.
  5. For the S3 Bucket Name, enter a name that is not being used by an existing bucket. We recommend using this bucket only for files you intend to import into Tapestri Pipeline.
  6. Click Generate S3 Bucket Policy. 

How do I integrate my S3 bucket into the Pipeline application?

Customers have three choices for integrating their S3 bucket into the Pipeline application: the AWS Management Console, the AWS Command Line Interface (CLI), and Terraform. Instructions for each procedure are below.

Note: If customers choose the AWS Management Console, they create the bucket themselves in AWS. If customers choose the AWS Command Line Interface or Terraform, the bucket is created automatically by the downloaded script or configuration.

Using the AWS Management Console

After clicking Generate S3 Bucket Policy, customers are given a panel of options on the left side. Click Use AWS Management Console.


Tapestri Pipeline will give you two policies. Copy the first policy; the second will be used later. Below is a sample of what your first policy should look like:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<customer_account_id>:root"
      },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<missionbio_account_id>:root"
      },
      "Action": [
        "kms:Decrypt",
        "kms:DescribeKey",
        "kms:Encrypt",
        "kms:GenerateDataKey*",
        "kms:ReEncrypt*"
      ],
      "Resource": "*"
    }
  ]
}

Note: Your policy will be auto-populated with fields specific to your account, meaning <customer_account_id> will differ depending on your account. 
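If you want to confirm that the auto-populated account ID matches the AWS account you are working in, one quick check (assuming the AWS CLI is configured for that account) is:

```shell
# Prints the 12-digit ID of the account your CLI credentials belong to;
# it should match <customer_account_id> in the generated policy.
aws sts get-caller-identity --query Account --output text
```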

From your AWS Management Console, perform the following:

  1. Search for and open the Key Management Service console.
  2. Click Customer managed keys.
  3. Click Create key and choose Symmetric as the key type in Step 1.
  4. Enter missionbio as the value for Alias in Step 2. 
  5. Skip Steps 3 and 4. 
  6. In Step 5, review and edit the key policy. Paste the first policy received from Tapestri Pipeline. 
  7. Open the S3 Management Console, and click Create bucket.
  8. Input the same unique name you used to label your bucket in Tapestri Pipeline.
  9. In the Default Encryption section:
          1. Select the Enable option for server-side encryption.
          2. Select AWS Key Management Service key (SSE-KMS) in Encryption key type.
          3. Select Choose from your KMS master keys in AWS KMS key.
          4. Select the KMS key created earlier (alias set to missionbio).
  10. Select the S3 bucket just created.
  11. Open the Permissions tab on that bucket.
  12. Click Edit in the Bucket Policy section.
  13. Copy and paste the second policy from Tapestri Pipeline, then click Save changes. Below is a sample of what your second policy should look like; as before, it will be auto-populated with fields specific to your account.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<missionbio-account-id>:root"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::<customer-bucket-name>"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<missionbio-account-id>:root"
      },
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:DeleteObject",
        "s3:ListMultipartUploadParts",
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::<customer-bucket-name>/*"
    }
  ]
}
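If you prefer, steps 12 and 13 can also be performed from the AWS CLI. A hedged sketch, assuming you have saved the second policy from Tapestri Pipeline locally as policy.json:

```shell
# Attach the bucket policy saved from Tapestri Pipeline
# (<customer-bucket-name> is the placeholder from the policy sample)
aws s3api put-bucket-policy --bucket <customer-bucket-name> --policy file://policy.json

# Verify the policy is now attached
aws s3api get-bucket-policy --bucket <customer-bucket-name>
```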

Using the AWS Command Line Interface (CLI)

After clicking Generate S3 Bucket Policy, customers are given a panel of options on the left side. Click Use AWS CLI. 



Click script to download the script file needed to connect Tapestri Pipeline to AWS.

Run the script, replacing <S3_BUCKET_NAME> with the unique bucket name you created earlier (you may need to make the downloaded file executable first, e.g. with chmod +x):

$ tapestripipeline-aws-integration.sh <S3_BUCKET_NAME>
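After the script finishes, you can optionally verify the result with standard AWS CLI calls. These checks are an illustration, not part of the official procedure:

```shell
# Succeeds silently if the bucket exists and your credentials can access it
aws s3api head-bucket --bucket <S3_BUCKET_NAME>

# Shows the default encryption configuration (should report aws:kms)
aws s3api get-bucket-encryption --bucket <S3_BUCKET_NAME>
```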

Using Terraform

After clicking Generate S3 Bucket Policy, customers are given a panel of options on the left side. Click Use Terraform. 



Click configuration to download the Terraform configuration file needed to connect Tapestri Pipeline to AWS.

To connect your bucket, run the downloaded configuration using the following commands:

  1. Run terraform init
  2. Run terraform plan
  3. When prompted, enter the unique name for your bucket created earlier: S3 bucket name: <s3-bucket-name>
  4. Run terraform apply and confirm when prompted; plan only previews the changes, while apply actually creates the resources in your account.
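The interactive prompt can also be avoided by passing the bucket name as a variable on the command line. A sketch, assuming the variable is named s3-bucket-name as shown in the prompt:

```shell
terraform init
# Pass the bucket name non-interactively and save the plan
# (variable name "s3-bucket-name" is assumed from the prompt text)
terraform plan -var 's3-bucket-name=<s3-bucket-name>' -out=tfplan
# Apply the saved plan to create the resources
terraform apply tfplan
```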