
PugetBench for Premiere Pro

Our benchmarks are designed in partnership with hardware and software industry leaders, as well as end users and influencers who use Premiere Pro daily, to ensure that they are representative of real-world workflows.

Quickly Jump To: Key Features | Hardware Requirements | Test Breakdown | Scoring | Update Log | Legacy & Beta Versions

Adobe Premiere Pro is one of the leading NLE (non-linear editing) applications for video editing. From simple cuts to advanced color grading and effects, Premiere Pro is an industry-leading solution for both beginners and experienced editors.

PugetBench for Premiere Pro runs on top of your installed copy of Adobe Premiere Pro, providing benchmark data directly from the application.

Key Features


Realistic Testing

Interfaces with Adobe Premiere Pro and benchmarks real-world codecs & effects.


Comprehensive

Provides a detailed analysis of results across multiple tests outlined below.


Public Database

Compare your scores to thousands of other user-submitted results.

Hardware Requirements

Windows

  • Adobe Premiere Pro 23.3 or newer
  • 32GB of RAM recommended – you are likely to receive a lower-than-normal score with less RAM
  • Discrete GPU with >4GB of VRAM (Standard preset) or >8GB of VRAM (Extended preset)
  • Compatible with both Windows 10 & 11
  • Premiere Pro/OS language must be set to English

MacOS

  • Adobe Premiere Pro 23.3 or newer
  • 32GB of RAM recommended – you are likely to receive a lower-than-normal score with less RAM
  • MacOS 13
  • Premiere Pro/OS language must be set to English

Test Breakdown

Our Premiere Pro benchmark is broken down into three categories:

  1. Encoding – performance when exporting to various codecs
  2. Processing – performance when working with different types of source media
  3. GPU Effects – performance for the most common GPU-accelerated effects in Premiere Pro

Currently, we use the export/render process in Premiere Pro as the basis for all of our tests. In the past, we included Live Playback tests directly, but they proved too inconsistent to use as the basis for a benchmark. For all tests, the measured FPS is calculated as the number of frames rendered divided by the render time recorded in the Premiere Pro log file.
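
As a simple illustration of that calculation, here is a minimal sketch with made-up numbers (the real frame count and render time are parsed from the Premiere Pro log file):

# Hypothetical example of the FPS calculation described above (Python)
frames_rendered = 1798        # e.g. a 30-second sequence at 59.94 FPS
render_time_seconds = 42.6    # export duration as recorded in the Premiere Pro log

measured_fps = frames_rendered / render_time_seconds
print(f"Measured FPS: {measured_fps:.2f}")  # -> Measured FPS: 42.21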

Encoding Tests

To evaluate the performance of your system when exporting to various codecs, our benchmark is designed to make the encoding portion of a render as large a bottleneck as possible.

To do this, we use a source DNxHR LB 480×270 clip, scaled up into a UHD (3840×2160) timeline. DNxHR LB is a very easy codec to process (especially at such a low resolution) and doesn’t have hardware decoding support on any platform, making it a good baseline codec to use. To make sure that each pixel is unique and prevent codecs like H.264 or HEVC from getting too “smart”, we also apply a simple fractal noise PNG image (NOT the fractal noise effect) on top of the clip so that there are no repeating pixels.

From this UHD timeline, we export to the range of codecs we want to test encoding performance for and calculate the FPS based on how long the export takes relative to the number of frames rendered. While not a 100% pure encoding test (some resources are still used to process the DNxHR clip, scale it, and apply the noise overlay), this is about as close to a pure encoding test as we can get in Premiere Pro without resorting to highly artificial setups.

In total, we are currently testing the following export codecs:

  • H.264 50Mbps 8-bit UHD
  • HEVC 50Mbps 8-bit UHD
  • HEVC 60Mbps 10-bit UHD*
  • H.264 50Mbps 8-bit (Software Encoding) UHD
  • HEVC 50Mbps 8-bit (Software Encoding) UHD
  • HEVC 60Mbps 10-bit (Software Encoding) UHD*
  • DNxHR LB UHD
  • DNxHR SQ UHD
  • DNxHR HQX UHD*
  • ProRes 422 Proxy UHD
  • ProRes 422HQ UHD
  • ProRes 4444 UHD*

*Extended Preset Only

If you wish to examine our specific settings, you can download the export presets here.

Processing Tests

On the opposite side of encoding, we also evaluate how quickly your system is able to decode different codecs in Premiere Pro. Unfortunately, the best way to test this in a real-world manner is with live playback, which we can’t currently do for reasons explained in our PugetBench for Premiere Pro 0.98 Testing Methodology post.

For these processing tests, we use a wide range of codecs at different resolutions, creating a timeline at each clip’s native resolution. We then export each timeline to DNxHR LB at HD (1920×1080) resolution. As with the encoding tests, we standardized on DNxHR LB for the export codec because it is a very easy codec to process and doesn’t have hardware decoding support on any platform.

The only additional thing we do is disable the ImporterMPEG.UseSWDecodeWhileEncoding setting, which in recent versions of Premiere Pro disables hardware decoding when exporting. This setting exists to address stability issues when using hardware decoding and encoding at the same time (which our benchmark does not do). However, since we specifically want to test how well the system handles media with hardware decoding support, we disable the setting when running our tests. It is automatically set back to the default when the tests are finished.

This method’s benchmark results are highly consistent and accurately reflect the relative performance between two systems when working with different codecs. The raw FPS results do tend to be higher than what you will see during live playback or preview since it does not include all the additional steps required for Premiere Pro to show the video in the preview monitor, but we found that the relative performance between two CPUs or GPUs largely aligns with relative performance during live playback.

The base media we are currently testing is:

  • 4K H.264 150Mbps 4:2:0 8-bit
  • 4K HEVC 100Mbps 4:2:2 10-bit*
  • 8K HEVC 100Mbps 4:2:0 8-bit*
  • 4K ProRes 422 Proxy
  • 4K ProRes 422
  • 4K DNxHR LB
  • 4K DNxHR SQ
  • 4K Cinema RAW Light ST*
  • 4K ARRIRAW*
  • 5K Sony X-OCN*
  • 4K RED
  • 8K RED*

*Extended Preset Only

GPU Effects Tests

The last category of tests looks at performance for GPU-accelerated effects. Many of the effects in Premiere Pro are relatively easy to process individually, so applying just a single instance isn’t going to stress even a low-end GPU. To address this, we apply each effect anywhere from four to forty times.

As a base, we are using a DNxHR LB UHD (3840×2160) clip, and export to DNxHR LB HD (1920×1080) – just like the “4K DNxHR LB” processing test. The difference here is that we also apply the following effects for each test:

  • Lumetri Color x40
  • Gaussian Blur x40
  • Sharpen x40
  • VR Digital Glitch x20*
  • VR De-Noise x4*

*Extended Preset Only

How Does the Scoring Work?

All of the scores in our Premiere Pro benchmark are calculated using geometric means rather than averages or performance relative to a reference result. This helps normalize the scores so that larger or smaller results are not unfairly weighted. It also makes the benchmark more flexible and better able to handle large performance shifts due to either application optimizations or the launch of more powerful hardware.

For the actual score calculations, we start by dividing the tests by codec and whether it is an encoding- or processing-based test. A score is generated for each group by taking the geometric mean of all the test results. This score is not logged, and is only used behind the scenes to calculate the major scores.

Test Group Scores (not logged)

LongGOP Encoding Score = geomean (LongGOP Encoding test1, LongGOP Encoding test2, ...)
LongGOP Processing Score = geomean (LongGOP Processing test1, LongGOP Processing test2, ...)
IntraFrame Encoding Score = geomean (IntraFrame Encoding test1, IntraFrame Encoding test2, ...)
IntraFrame Processing Score = geomean (IntraFrame Processing test1, IntraFrame Processing test2, ...)
RAW Processing Score = geomean (RAW Processing test1, RAW Processing test2, ...)
GPU Effects Processing Score = geomean (GPU test1, GPU test2, ...)
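
As a minimal sketch of how one of these group scores could be computed (the FPS values below are made up purely for illustration):

import math

def geomean(values):
    # Geometric mean: the n-th root of the product of n values
    return math.prod(values) ** (1 / len(values))

# Hypothetical FPS results for the LongGOP Encoding tests
longgop_encoding_results = [61.3, 48.7, 45.2, 30.1, 22.8, 19.5]
longgop_encoding_score = geomean(longgop_encoding_results)  # ~34.9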

Major Scores

The group scores are combined into the LongGOP, IntraFrame, RAW, and GPU Effects scores. For LongGOP and IntraFrame, this is done by calculating the geometric mean of the encoding and processing scores. RAW and GPU Effects scores, however, are used as-is since there are no encoding-based tests for RAW or GPU Effects. Each score is multiplied by a different scoring coefficient to bring the scores across all our benchmarks roughly in line with each other.

LongGOP Score = geomean (LongGOP Encoding Score, LongGOP Processing Score) * 0.95
IntraFrame Score = geomean (IntraFrame Encoding Score, IntraFrame Processing Score) * 0.91
RAW Score = RAW Processing Score * 2.12
GPU Effects Score = GPU Effects Processing Score * 1.18

Overall Score

These major scores are then combined into the Overall Score using a geometric mean and multiplied by 100 to differentiate the Overall Score from the Major Scores. Currently, we do not weight any of the major scores more than the others, so each contributes equally to the Overall Score.

Overall Score = geomean (LongGOP Score, IntraFrame Score, RAW Score, GPU Effects Score) * 100
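
Putting it all together, here is a minimal sketch of the full scoring pipeline using the coefficients above (all group scores below are hypothetical placeholders, not real results):

import math

def geomean(values):
    return math.prod(values) ** (1 / len(values))

# Hypothetical group scores (see "Test Group Scores" above)
longgop_encoding, longgop_processing = 34.9, 88.4
intraframe_encoding, intraframe_processing = 95.6, 112.3
raw_processing = 47.9
gpu_effects_processing = 83.1

# Major Scores (scoring coefficients from above)
longgop_score     = geomean([longgop_encoding, longgop_processing]) * 0.95
intraframe_score  = geomean([intraframe_encoding, intraframe_processing]) * 0.91
raw_score         = raw_processing * 2.12
gpu_effects_score = gpu_effects_processing * 1.18

# Overall Score
overall_score = geomean([longgop_score, intraframe_score, raw_score, gpu_effects_score]) * 100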

This method results in an Overall Score with a typical run-to-run variance of about 1-2%, and Major Scores with a variance of about 5%, assuming there is no thermal or other performance throttling occurring on the system.

Benchmark Update Log

Version 1.0.0 (Major Update)

  • Major update – Results cannot be compared to previous benchmark versions.
  • Increased scoring coefficient for the Overall Score to differentiate the score from previous benchmark versions.
  • Added scoring coefficient modifiers to bring the Overall Score in-line with other 1.x PugetBench benchmarks.
  • Added support for Premiere Pro 24.x
  • Integrated into “PugetBench for Creators” benchmark application. The Premiere Pro benchmark is no longer available as a stand-alone Premiere Pro plugin.

Version 0.98.0 BETA (Major Update)

  • Major update – Results cannot be compared to previous benchmark versions
  • Minimum application support changed to Premiere Pro 23.3 due to changes in the internal API and significant application performance optimizations.
  • Dropping live playback tests due to significant issues with run-to-run variation.
  • Dropping “CPU Effects” tests as the majority of commonly used effects are now GPU-accelerated.
  • No longer necessary for the Preview Window to be in a specific quadrant on Windows.
  • No longer necessary to change Security & Privacy settings on MacOS or allow Premiere Pro to control system events.
  • Changing the “Export” tests to a new “Encoding” test set.
    • These new tests focus on the encoding portion of an export.
    • The tests are based on a DNxHR LB 480×270 clip upscaled to UHD with an image overlay to ensure each pixel is unique.
    • Timeline is exported to a range of codecs at UHD.
    • Current export codecs include variations of H.264/HEVC (with both software and hardware encoding), DNxHR, and ProRes.
  • Adding new “Processing” tests. These are intended to measure how good the system is at processing (decoding) different types of codecs.
    • Tests are based on a timeline with a range of codecs at their native resolution with no effects.
    • Timelines are exported to DNxHR LB at 1920×1080. This allows the export to be primarily bottlenecked by how fast Premiere Pro is able to decode the source media.
    • Current source codecs include variations of H.264, HEVC, DNxHR, ProRes, RED, Cinema RAW Light, ARRIRAW, and X-OCN.
    • Note: This test is indicative of how good your system is at working with these codecs, but the FPS results tend to be higher than what you will see during live playback / preview since it does not include all the additional steps required for Premiere Pro to show the video in the preview monitor.
  • Disabling “ImporterMPEG.UseSWDecodeWhileEncoding” setting for the Processing tests. This flag disables hardware decoding when exporting due to current stability issues in Premiere Pro when using hardware encoding and decoding at the same time. Our benchmark does not use hardware encoding and decoding simultaneously, so those issues are not a concern, and disabling the flag makes our tests more in line with the behavior when playing media in the preview window.
  • Adjusting export FPS measurements to be based on the export time recorded in the Premiere Pro log files. This should be a more “pure” measurement than simply timing how long the export functions in the plugin API take.
    • This does require us to enable Premiere Pro logging, which we prompt the user to allow, and requires Premiere Pro to restart.
  • Adjusting GPU Effects tests to isolate specific effects rather than applying a melting pot of effects to a single clip. This doesn’t affect the relative scores for GPUs significantly, but allows us to drill down on specific effects during benchmark analysis.
  • Changing the major scores (Live Playback, Export, Effects) to be based primarily on the codec being tested rather than the type of test, as that is what most often dictates what type of hardware will be best for different workflows. New major scores:
    • LongGOP (H.264/HEVC)
    • IntraFrame (DNxHR/ProRes)
    • RAW (RED/Cinema RAW Light/ARRIRAW/X-OCN)
    • GPU Effects
  • Changing all scores to be based on the geometric mean of the individual results (or major scores in the case of the Overall Score). This method helps to normalize the scores so that larger or smaller results are not unfairly weighted.

Version 0.95.7 BETA

  • Updated CLI utility to support changes to plugin paths when installing via Creative Cloud

Version 0.95.6 BETA

  • Added plugin and CLI support for Premiere Pro 23.0
  • Updated project files to support Premiere Pro 23.0
  • Updated benchmark upload/view URLs to match web hosting changes

Version 0.95.5 BETA

  • Added filter to remove “(C)”, “(R)”, and “(TM)” strings from the systems specifications
  • Added additional granularity to the “Custom” preset. You can now select individual tests to run rather than only being limited to specific groups of tests.
  • Added the ability to specify which tests to run with the “Custom” preset when using the CLI
    • Include the “/custom_tests” argument when launching the CLI along with a comma-separated list of test numbers.
    • The test number associated with each test can be seen if you launch the benchmark plugin manually. It will be shown before the name of each selectable test.
    • Example: /custom_tests “8,21”
  • Added “/cooldown” CLI argument to specify how many seconds to wait at the end of the benchmark run to allow the system to cool down. By default, the CLI waits for 20 seconds
    • Example: /cooldown 60
  • Added the ability to launch custom .bat files at various points during the benchmark run. This was requested multiple times in order to automate additional logging using utilities such as Windows Performance Recorder.
    • If running the benchmark manually, there is now a “Run custom commands” checkbox in the automation settings
    • If using the CLI, include the command line argument of “/custom_tests”
    • Four different .bat files are launched at the start/end of the benchmark, and at the start/end of each individual test.
    • More information on how to make the .bat files, where to save them, and a sample set of files is available in the user guide
    • Note that if the commands require admin privileges, you will need to launch the application or CLI as admin.
  • Required application information for the CLI moved to an .ini file that resides alongside the CLI in the plugin folder. This information was previously baked into the CLI itself, but by having it in an editable file, end users can add support for things like beta and pre-release versions of Premiere Pro. The .ini requires the following entries (a hypothetical example is shown after this list):
    • Main section title: The major version of Premiere Pro. This is used by the CLI for the “/app_version” argument
    • prefLoc: Location of the application preferences folder
    • prefFile: In many cases, this is the same as prefLoc, but some benchmarks need a specific preferences file defined
    • appLoc: Path to the Premiere Pro application
    • appEXE: Name of the Premiere Pro .exe when it has been launched
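    • For reference, a hypothetical .ini section might look something like the following (the section name, paths, and version are illustrative only – check the user guide for the exact format shipped with the CLI):
      ; Hypothetical example – actual values depend on your installation
      [22.0]
      prefLoc=C:\Users\YourName\Documents\Adobe\Premiere Pro\22.0\
      prefFile=C:\Users\YourName\Documents\Adobe\Premiere Pro\22.0\
      appLoc=C:\Program Files\Adobe\Adobe Premiere Pro 2022\
      appEXE=Adobe Premiere Pro.exe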

Version 0.95.4 BETA

  • Remade Benchmark_Project_22.prproj file in order to fix bug with Apple M1 devices.
  • Fixed issue with “Benchmark Complete” dialog box showing the Standard Overall Score when the Extended preset is run.
  • Updated spec gathering to include GPU core count for M1 devices.
  • Fixed bug that prevented the benchmark from being run multiple times (we still highly recommend restarting Premiere Pro between benchmark runs, however).
  • Switching to “qe.project.getActiveSequence().player.totalFrames” to get the number of total frames processed for Live Playback tests. Previously we just used the total number of frames in the sequence, but Pr 22.0 at times will process a frame multiple times. Especially on low-end systems, this can throw off the results since it makes it possible for Pr to drop a single frame more than once.

Version 0.95.3 BETA

  • Added project files and CLI support for Premiere Pro version 22.x
  • In order to run the benchmark, you must have both the latest plugin version and the latest test assets so that the project files for Premiere Pro 22.x are present.

Version 0.95.1 BETA

  • Added project files and CLI support for Premiere Pro 2021 (15.x)
  • In order to run the benchmark, you must have both the latest plugin version and the latest test assets so that the project files for Premiere Pro 2021 are present.

Version 0.95 BETA (Major Update)

  • Major update – Results cannot be compared to previous benchmark versions
  • Requires new project files due to significant changes to what is tested
  • Added additional failsafes for when gathering of system specs fails
  • Added spec gathering support for Apple M1 systems
  • In some cases, playback can report more dropped frames than exist in the sequence. When this happens, the playback FPS will be reported as “0”
  • Standard and Extended tests adjusted. Old test sets were too easy given recent hardware/application updates, resulting in many of the live playback tests in particular running at full FPS on most systems
  • Test changes also significantly reduced the amount of time it takes to run the benchmark
  • Scoring changed to be relative to a reference system (AMD Ryzen 5900X w/ GeForce RTX 3080)
  • Test changes:
    • Dropped half resolution Live Playback tests
    • Added 2x and 4x speed forward playback tests
    • Increased ProRes multicam test to 6 streams
    • Shortened all test sequences from 30 sec to 20 sec in length
    • Export tests now use in/out points so that we can shorten some tests (CPU effects test primarily)
    • H.264 export tests now count towards the GPU score since it uses GPU encoding by default
    • CPU & GPU Effects sequences no longer do live playback tests, and only export to ProRes. Live Playback results are often not consistent with such a focused load, and this change should make the scoring more accurate
    • Shortened CPU Effects export test to 1/3 the length since this test can take a long time
    • GPU/CPU effects tests now have their own “Effects” category and sub-score
    • Overall score is now the average of the “Live Playback”, “Export”, and “Effects” sub-scores
    • Scoring now based on performance relative to a reference system (Ryzen 5900X w/ RTX 3080) rather than relative to the media FPS
    • Dropping 23.976 FPS media from the Extended tests. The Extended preset now only adds sequences with 8K 59.94 FPS H.265 and R3D media
  • “Custom” preset is now a licensed feature and allows you to select individual tests to run. Custom preset can still be set via the CLI utility, but requires you to manually configure and save the settings at least once before running the benchmark from the command line.

Version 0.92 BETA

  • Moved to using node-wmi to gather system specs on Windows systems. This should be significantly faster and more reliable than the previous command line method
  • Added GPU driver and motherboard BIOS to the system specs for Windows systems
  • CLI utility now updates the Pr preferences so that workspaces are imported when opening a project (default behavior was changed in Pr 14.3.1)
  • Misc bug fixes

Version 0.91 BETA

  • Improved system spec gathering reliability
  • System specs on MacOS now gather properly even if the system drive’s name has been changed from the default “Macintosh HD”

Version 0.9 BETA (Major Update)

  • Results are now uploaded to our online database. This is required for the free version, but opt-in if you have a commercial license.
  • Removed the temporary .csv logging from the free version now that the full results can be viewed on our benchmark listing page.
  • Complete plugin UI revamp. Added status bar/text, and improved the configuration options.
  • Improved method used to gather system specs. This should break less often on unusual or older system configurations.
  • License validation moved from the CLI utility to the plugin itself.
  • Added the ability to do a “Custom” test which does the same tests as normal, only on whatever sequences are on the root-level of the currently opened project.
  • Added ability to set the location for any disk-based tests (exporting).
  • Added tooltips for the various settings that can now be configured.
  • Status logs and configuration settings moved to “~\Documents\PugetBench\Premiere Pro\TIMESTAMP” since we cannot log directly to the plugin folder.
  • Dropped 8K H.265 tests. This test was simply too difficult for most systems and resulted in Premiere Pro hanging more often than not.
  • Dropped ability to run tests in individual codecs. The intention is that the “Custom” test type will replace this.
  • Improved CPU and GPU effects tests. Added “Advanced” and “Extreme” tests to differentiate between what someone might be doing in their own projects versus a worst-case scenario. Also dropped exporting to H.264 for these tests since we want to focus on the system’s performance for processing CPU and GPU effects.
  • Scoring has been adjusted based on the test changes. Due to this, the Overall Score will not be consistent with previous versions.
  • General bug fixes and stability improvements.

Version 0.88 BETA

  • Slight change in benchmark plugin folder name. Be sure to uninstall any existing “PugetBench for Premiere Pro” plugins before installing the new 0.88 version.
  • Fixed issue where the benchmark would not run due to a change in the API when using confirm boxes.
  • Added “GPU Score” that is the average of the results from the “GPU Effects” tests.

Version 0.86 BETA

  • Dropped the “Benchmark Results” score screen due to Premiere Pro 2020 breaking MGRT scripting functionality
  • Adding back .csv log file support until we can get the “Benchmark Results” screen working
  • No longer generating the test sequences on the fly – switching to pre-made projects for each test codec. This greatly increases the overall stability of the benchmark
  • Due to using pre-made projects, Premiere Pro 2019 is no longer supported
  • Improved system information gathering methods. This should make it much more reliable on MacOS
  • Added option to continue the benchmark if the system information gathering does fail for any reason
  • Miscellaneous bug fixes and improvements

Version 0.8 BETA

  • Renamed benchmark to “PugetBench for Premiere Pro”
  • Added support for Premiere Pro 2020
  • Dropping the 29.97 FPS CPU and GPU effects tests. These are already somewhat artificial, so there is no reason to run them at both 29.97 and 59.94 FPS. This means that the Extended test scores will be slightly different than previous versions, but it shouldn’t be by a huge amount
  • There is now a “Benchmark Results” screen that comes up at the end of the benchmark that displays a bunch of useful information including: benchmark version, cores, results for each individual test, and system information like CPU, RAM, OS, GPU, and Premiere Pro version
  • The benchmark also makes a PNG of the “Benchmark Results” screen for easy sharing
  • Removed .csv log file support in the free edition (log files will be a feature in the commercial use version)
  • Removed “start” .exe applications (automation will be a feature in the commercial use version)
  • More code to ensure consistency between benchmark runs
  • Miscellaneous bug fixes and improvements

Version 0.3 BETA

  • Replaced H.264 media with footage straight from a Panasonic GH5 (Thanks Neil Purcell!)
  • Replaced “Quick” preset with “Standard” preset. Changed tested media to 59.94FPS instead of 29.97FPS.
    • This was done with the intention of having a benchmark preset that is geared towards measuring the overall performance of a system in as little time as possible.
  • Separated media files into two groups: a “Standard” set and a “Full” set to make the download much more manageable for people just doing the Standard preset.
  • Due to the change in the test media, the overall scores are not interchangeable with previous benchmark versions

Version 0.2 BETA

  • First release.

Legacy Benchmark Versions

Modern versions of our Premiere Pro benchmark are available via the PugetBench for Creators application. However, older versions of the benchmark are available via ZXP plugins. Note that these versions may not have support for recent versions of Premiere Pro.

Beta Benchmark Branch

In addition to the main benchmark versions, licensed users also have access to a beta branch. This is where we trial tests, often for the purpose of taking advantage of new features that have been added into the base applications. These benchmark builds should not be compared to the main branch, as they include different tests and can have differences in how the scoring is calculated.

Current notes on the beta branch:

  • No beta builds currently available