Run the Pipeline Container (AWS)
This page covers:

* Configure Disdat with an ECR prefix and the name of your AWS Batch job queue
* How to execute your container on Batch
* How to look at your resulting bundles
Have you set up your AWS credentials? You will also need to set up AWS Batch. That means creating 1.) a compute environment and 2.) a Batch job queue.
You then need to add your AWS Batch job queue name to the Disdat configuration file.
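As a sketch, assuming the default configuration location of `~/.config/disdat/disdat.cfg` and the `[docker]` and `[run]` section and key names used by recent Disdat versions (check the file Disdat generated for your exact keys), the relevant entries look roughly like this, with placeholder values:

```bash
$ cat ~/.config/disdat/disdat.cfg
[docker]
# ECR registry prefix that dockerized images are pushed to
# (placeholder account ID and region)
registry = 123456789012.dkr.ecr.us-west-2.amazonaws.com

[run]
# Name of the AWS Batch job queue that jobs are submitted to
aws_batch_queue = disdat-batch-queue
```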
We're still assuming you're running in (switched into) your examples context, which has a remote attached:
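A quick way to double-check, as a sketch (`examples` is the context name from the earlier sections):

```bash
# List local contexts; the active one is marked, and the remote
# binding shows the S3 remote attached earlier.
$ dsdt context

# Switch into the examples context if you aren't already in it.
$ dsdt switch examples
```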
Let's push our container up to AWS ECR. We use the same dockerize command, but pass --no-build because you built the image in the prior step.
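A sketch of the push, assuming your pipeline's setup directory is the current directory (substitute the same arguments you used when you built the image):

```bash
# Skip the local build and push the existing image to the ECR
# registry configured in disdat.cfg.
$ dsdt dockerize --no-build --push .
```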
You should see a bunch of transfer status updates as Docker moves the container to ECR.
If you log in to your AWS account, the ECR console should show the newly pushed repository.
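With the image in ECR, the next step is executing it on Batch via `dsdt run` with the AWSBatch backend. The positional arguments vary by Disdat version (see `dsdt run --help`); the setup directory and pipeline class below are placeholders standing in for the ones from the earlier examples, so treat this as a sketch:

```bash
# Submit the containerized pipeline to the AWS Batch job queue
# configured in disdat.cfg; --backend AWSBatch selects Batch
# instead of running the container locally. The setup directory
# (.) and pipeline class are placeholders.
$ dsdt run --backend AWSBatch . pipelines.simple.SimplePipeline
```

You can then watch the job in the AWS Batch console as it moves through the SUBMITTED, RUNNABLE, and RUNNING states.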
Once the job moves to the SUCCEEDED state, the container has run successfully. If it didn't, click on the job in the Batch console and follow the links through to its CloudWatch logs.
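Finally, to look at the resulting bundles, pull them from the S3 remote back into your local context and list them; a sketch:

```bash
# Copy bundle metadata (and, with --localize, the data itself)
# from the S3 remote into the local context.
$ dsdt pull --localize

# List bundles verbosely to see the new outputs.
$ dsdt ls -v
```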
Yay! We pushed a container, ran it on AWS Batch, and then grabbed all of our results!
There's a lot more to know about how Disdat manages bundles when it runs pipelines remotely, and about how to control the size (CPU and memory) of your job's container instance.
* I need to build containers with Unix, R, or other dependencies
* I need to know more details about writing pipelines (using dependencies, return types, etc.)
* I need more information on