7 changes: 7 additions & 0 deletions job_bundles/README.md
@@ -109,6 +109,13 @@ S3 prefix, then distributes the hashing and data copies across a number of workers
uses content-addressed storage for data files, users that later submit jobs with these files attached will not have to
upload them.

### FFmpeg movie from job output

The [ffmpeg_movie_from_job_output](ffmpeg_movie_from_job_output) job bundle downloads the rendered output of another
completed job in the same queue and uses FFmpeg to encode the image sequence into an MP4 video. It is useful as a
post-processing utility: after a render job completes, submit this job with the source Job ID to automatically
assemble the frames into a movie with configurable frame rate, quality, and resolution settings.

### SSH via SSM Managed Node

The [ssh_to_smf](ssh_to_smf/README.md) job bundle registers a Deadline Cloud worker as an
1 change: 1 addition & 0 deletions job_bundles/ffmpeg_movie_from_job_output/.gitignore
@@ -0,0 +1 @@
s3_settings.json
98 changes: 98 additions & 0 deletions job_bundles/ffmpeg_movie_from_job_output/README.md
@@ -0,0 +1,98 @@
# FFmpeg Movie from Job Output

## Introduction

This job bundle downloads the rendered output of another completed job in the same queue
and uses FFmpeg to encode the image sequence into an MP4 video file. It is useful as a
post-processing utility: for example, after a Blender or Maya render job completes, you
can submit this job to automatically assemble the frames into a movie.

See also [ffmpeg_encode_video](../ffmpeg_encode_video) for a simpler sample that encodes
a local image sequence without downloading from another job.

## How it works

A [pre-submission hook](https://github.com/aws-deadline/deadline-cloud/blob/mainline/docs/submission-hooks.md)
(`inject_s3_settings.py`) runs at submission time on your workstation and looks up the
queue's job attachment S3 bucket configuration. It writes the settings to a JSON file that
gets uploaded as a job attachment, so the worker can access S3 without needing any
Deadline Cloud API permissions.
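
For illustration, the metadata that the pre-submission hook receives on stdin looks roughly like this (the field names match those read by `inject_s3_settings.py`; the IDs and path are placeholders):

```json
{
  "farmId": "farm-0123456789abcdef0123456789abcdef",
  "queueId": "queue-0123456789abcdef0123456789abcdef",
  "jobBundleDir": "/path/to/ffmpeg_movie_from_job_output"
}
```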

On the worker, the job installs the `deadline` Python library via pip in a job environment,
then runs a single step that:

1. Uses the `deadline.job_attachments` Python API to download the output files from the
source job's job attachments in S3.
2. Auto-detects the image format from the downloaded files, sorts them alphabetically, and
encodes them into an H.264 MP4 video using FFmpeg with BT.709 color space metadata.
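
The encode in step 2 can be sketched as assembling an FFmpeg command line from the job parameters. This is a hypothetical reconstruction using standard FFmpeg options, not the bundle's exact script; the glob pattern and defaults are assumptions based on the parameter table below.

```python
def build_ffmpeg_args(pattern, out, fps=24, crf=18, preset="medium", pix_fmt="yuv420p"):
    """Assemble an FFmpeg argv list that encodes an image sequence to H.264 MP4."""
    return [
        "ffmpeg",
        "-framerate", str(fps),     # input frame rate of the image sequence
        "-pattern_type", "glob",    # interpret the input pattern as a shell glob
        "-i", pattern,
        "-c:v", "libx264",          # H.264 encoder
        "-preset", preset,          # speed/compression tradeoff
        "-crf", str(crf),           # constant-quality target (lower = better)
        "-pix_fmt", pix_fmt,
        # Tag the output with BT.709 color space metadata
        "-colorspace", "bt709",
        "-color_primaries", "bt709",
        "-color_trc", "bt709",
        out,
    ]

print(" ".join(build_ffmpeg_args("frames/*.png", "output.mp4", fps=30)))
```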

## Prerequisites

### Software

The job requires FFmpeg (from conda-forge) and the Deadline Cloud Python library (installed
via pip at runtime). On service-managed fleets, set the conda queue environment channel to
`conda-forge`. The job's `CondaPackages` parameter defaults to `ffmpeg`.

### Submission hooks

This job bundle uses a pre-submission hook to inject S3 settings. Enable bundle hooks
before submitting (one-time setup):

```bash
deadline config set settings.allow_bundle_hooks true
```

The hook runs on your local machine at submission time using your existing AWS credentials.
No additional IAM permissions are needed on the queue role.

### Source job requirements

- The source job must have completed and produced output files via job attachments.
- Both jobs must be in the same queue (they share the same job attachments S3 bucket).

## Parameters

| Parameter | Description | Default |
|-----------|-------------|---------|
| Source Job ID | The Job ID of the completed source job | (required) |
| Source Step ID | Restrict download to a specific step's output | (empty = all) |
| Frame Rate | Video frame rate in fps | 24 |
| Pixel Format | Output pixel format (`yuv420p` or `yuv444p`) | yuv420p |
| Encoding Preset | FFmpeg speed/compression tradeoff | medium |
| Constant Rate Factor | H.264 quality (0 = lossless, 51 = worst, 17-18 ≈ visually lossless) | 18 |
| Output Resolution | Optional WIDTHxHEIGHT override (e.g. `1920x1080`) | (empty = source) |
| Output Filename | Name of the output video file | output.mp4 |
| Output Directory | Where to save the video | output |

## Example submission

```bash
# Enable bundle hooks (one-time setup)
deadline config set settings.allow_bundle_hooks true

# Submit via GUI
deadline bundle gui-submit ffmpeg_movie_from_job_output/

# Submit via CLI
deadline bundle submit ffmpeg_movie_from_job_output/ \
-p SourceJobId=job-0123456789abcdef0123456789abcdef \
-p FrameRate=30 \
-p OutputFilename=my_render.mp4

# Download only a specific step's output
deadline bundle submit ffmpeg_movie_from_job_output/ \
-p SourceJobId=job-0123456789abcdef0123456789abcdef \
-p SourceStepId=step-0123456789abcdef0123456789abcdef
```

## Typical workflow

1. Submit a render job (e.g. Blender, Maya) to your queue.
2. Wait for the render job to complete.
3. Copy the Job ID from Deadline Cloud Monitor.
4. Submit this job bundle with the source Job ID.
5. Download the output video from Deadline Cloud Monitor.

You can also automate this by scripting the submission after the render job completes
using `deadline job wait` followed by `deadline bundle submit`.
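
The same automation can be sketched directly against the boto3 `deadline` client. This is a minimal illustration, assuming the `GetJob` API's `taskRunStatus` field; the poll interval and terminal-status set are assumptions, and the IDs are placeholders:

```python
import subprocess
import time

# taskRunStatus values after which no more tasks will run (illustrative set)
TERMINAL = {"SUCCEEDED", "FAILED", "CANCELED"}


def is_terminal(task_run_status):
    """Return True once a job's taskRunStatus is final."""
    return task_run_status in TERMINAL


def wait_then_encode(farm_id, queue_id, job_id, bundle_dir):
    """Poll the source render job, then submit this bundle once it succeeds."""
    import boto3  # imported lazily so is_terminal() is usable without boto3

    client = boto3.client("deadline")
    while True:
        job = client.get_job(farmId=farm_id, queueId=queue_id, jobId=job_id)
        status = job["taskRunStatus"]
        if is_terminal(status):
            break
        time.sleep(30)
    if status == "SUCCEEDED":
        subprocess.run(
            ["deadline", "bundle", "submit", bundle_dir,
             "-p", f"SourceJobId={job_id}"],
            check=True,
        )
```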
5 changes: 5 additions & 0 deletions job_bundles/ffmpeg_movie_from_job_output/hooks.yaml
@@ -0,0 +1,5 @@
version: "1.0"
Contributor: Do we mean to check this in?

Contributor Author: Yes! Based on feedback on the hooks PR, hooks files include a version field: https://github.com/aws-deadline/deadline-cloud/blob/mainline/docs/submission-hooks.md

preSubmission:
- command: python3
args: [inject_s3_settings.py]
timeout: 30
38 changes: 38 additions & 0 deletions job_bundles/ffmpeg_movie_from_job_output/inject_s3_settings.py
@@ -0,0 +1,38 @@
"""Pre-submission hook that writes job attachment S3 settings to a file in the bundle."""
import json
import os
import sys

from deadline.client import api

metadata = json.load(sys.stdin)
farm_id = metadata["farmId"]
queue_id = metadata["queueId"]
bundle_dir = metadata["jobBundleDir"]

print(f"Looking up job attachment settings for queue {queue_id}...", file=sys.stderr)
deadline = api.get_boto3_client("deadline")
queue = deadline.get_queue(farmId=farm_id, queueId=queue_id)
ja = queue.get("jobAttachmentSettings", {})

if not ja:
print("ERROR: Queue has no job attachment settings configured.", file=sys.stderr)
sys.exit(1)

bucket = ja["s3BucketName"]
prefix = ja["rootPrefix"]
print(f"S3 bucket: {bucket}, prefix: {prefix}", file=sys.stderr)

# Write settings file into the bundle so it gets uploaded as a job attachment
settings_path = os.path.join(bundle_dir, "s3_settings.json")
with open(settings_path, "w") as f:
json.dump({"s3BucketName": bucket, "rootPrefix": prefix}, f)

# Output asset reference so the file gets uploaded
print(json.dumps({
"attachments": {
"assetReferences": {
"inputFilenames": [settings_path]
}
}
}))