# VMAF - Video Multi-Method Assessment Fusion

[![Build Status](https://travis-ci.com/Netflix/vmaf.svg?branch=master)](https://travis-ci.com/Netflix/vmaf)
[![libvmaf](https://github.com/Netflix/vmaf/workflows/libvmaf/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3Alibvmaf)
[![Windows](https://github.com/Netflix/vmaf/workflows/Windows/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3AWindows)
[![ffmpeg](https://github.com/Netflix/vmaf/workflows/ffmpeg/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3Affmpeg)
[![Docker](https://github.com/Netflix/vmaf/workflows/Docker/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3ADocker)

VMAF is a perceptual video quality assessment algorithm developed by Netflix. This software package includes a stand-alone C library `libvmaf` and a Python library that wraps it. The Python library also provides a set of tools that allows a user to train and test a custom VMAF model.

Read [this](https://medium.com/netflix-techblog/toward-a-practical-perceptual-video-quality-metric-653f208b9652) tech blog post for an overview, [this](https://medium.com/netflix-techblog/vmaf-the-journey-continues-44b51ee9ed12) post for tips on best practices, and [this](https://netflixtechblog.com/toward-a-better-quality-metric-for-the-video-community-7ed94e752a30) post for our latest efforts on speed optimization, a new API design, and the introduction of a codec evaluation-friendly NEG mode.

Also included in `libvmaf` are implementations of several other metrics: PSNR, PSNR-HVS, SSIM, MS-SSIM and CIEDE2000.

![vmaf logo](resource/images/vmaf_logo.jpg)

## News

- (2021-10-7) We are open-sourcing CAMBI (Contrast Aware Multiscale Banding Index) - Netflix's detector for banding (aka contouring) artifacts. Check out the [tech blog](https://netflixtechblog.medium.com/cambi-a-banding-artifact-detector-96777ae12fe2) for an overview and the [technical paper](resource/doc/papers/CAMBI_PCS2021.pdf) published in PCS 2021 (note that the paper describes an initial version of CAMBI that no longer matches the code exactly, but it is still a good introduction). Also check out the [usage](resource/doc/cambi.md) page.
- (2020-12-7) Check out our [latest tech blog](https://netflixtechblog.com/toward-a-better-quality-metric-for-the-video-community-7ed94e752a30) on speed optimization, new API design and the introduction of a codec evaluation-friendly NEG mode.
- (2020-12-3) We are releasing `libvmaf v2.0.0`. It has a new fixed-point and x86 SIMD-optimized (AVX2, AVX-512) implementation that achieves a 2x speed-up compared to the previous floating-point version. It also has a [new API](libvmaf/README.md) that is more flexible and extensible.
- (2020-7-13) We have created a [memo](https://docs.google.com/document/d/1dJczEhXO0MZjBSNyKmd3ARiCTdFVMNPBykH4_HMPoyY/edit?usp=sharing) to share our thoughts on VMAF's behavior in the presence of image enhancement operations, its impact on codec evaluation, and our solutions.
- (2020-2-27) We have changed VMAF's license from Apache 2.0 to [BSD+Patent](https://opensource.org/licenses/BSDplusPatent), a more permissive license compared to Apache that also includes an express patent grant.

## Documentation

There is an [overview of the documentation](resource/doc/index.md) with links to specific pages, covering FAQs, available models and metrics, software usage guides, and a list of resources.

## Usage

The software package offers a number of ways to interact with the VMAF implementation.

  - The command-line tool [`vmaf`](libvmaf/tools/README.md) provides a complete algorithm implementation, such that one can easily deploy VMAF in a production environment. Additionally, the `vmaf` tool provides a number of auxiliary metrics such as PSNR, SSIM and MS-SSIM.
  - The [C library `libvmaf`](libvmaf/README.md) provides an interface to incorporate VMAF into your code, and tools to integrate other feature extractors into the library (a minimal sketch of the API call sequence follows this list).
  - The [Python library](resource/doc/python.md) offers a full array of wrapper classes and scripts for software testing, VMAF model training and validation, dataset processing, data visualization, etc.
  - VMAF is now included as a filter in FFmpeg; enable it by configuring FFmpeg with `./configure --enable-libvmaf`. Refer to the [Using VMAF with FFmpeg](resource/doc/ffmpeg.md) page.
  - The [VMAF Dockerfile](Dockerfile) generates a Docker image from the [Python library](resource/doc/python.md). Refer to [this](resource/doc/docker.md) document for detailed usage.
  - To build VMAF on Windows, follow [these](resource/doc/windows.md) instructions.
  - AOM CTC: [AOM](http://aomedia.org/) has specified `vmaf` as the standard implementation of the metrics defined in the AOM common test conditions (CTC). Refer to [this page](resource/doc/aom_ctc.md) for usage compliant with AOM CTC.

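For the C-library route, the snippet below is a minimal sketch of the `libvmaf` v2 call sequence (create a context, load a model and its feature extractors, feed reference/distorted frame pairs, flush, pool a score). It feeds synthetic flat-gray frames purely to illustrate the flow; treat the struct fields, feature names, and built-in model version string shown here as assumptions to verify against [libvmaf/README.md](libvmaf/README.md) and your installed version.

```c
/* Sketch only: score n_frames of synthetic (flat gray) 320x240 frames with
 * the built-in vmaf_v0.6.1 model. Build roughly as:
 *   cc vmaf_demo.c $(pkg-config --cflags --libs libvmaf) -o vmaf_demo */
#include <stdio.h>
#include <string.h>
#include <libvmaf/libvmaf.h>

int main(void) {
    VmafConfiguration cfg = {
        .log_level   = VMAF_LOG_LEVEL_INFO,
        .n_threads   = 4,
        .n_subsample = 1,
    };
    VmafContext *vmaf;
    if (vmaf_init(&vmaf, cfg)) return 1;

    /* Load a built-in model and register the feature extractors it needs.
     * Auxiliary metrics can be requested as extra features, e.g.
     * vmaf_use_feature(vmaf, "psnr", NULL). */
    VmafModel *model;
    VmafModelConfig model_cfg = { .name = "vmaf", .flags = VMAF_MODEL_FLAGS_DEFAULT };
    if (vmaf_model_load(&model, &model_cfg, "vmaf_v0.6.1")) return 1;
    if (vmaf_use_features_from_model(vmaf, model)) return 1;

    const unsigned w = 320, h = 240, n_frames = 10;
    for (unsigned i = 0; i < n_frames; i++) {
        VmafPicture ref, dist;
        vmaf_picture_alloc(&ref, VMAF_PIX_FMT_YUV420P, 8, w, h);
        vmaf_picture_alloc(&dist, VMAF_PIX_FMT_YUV420P, 8, w, h);
        for (int p = 0; p < 3; p++) {   /* placeholder content: flat gray planes */
            memset(ref.data[p], 128, ref.stride[p] * ref.h[p]);
            memset(dist.data[p], 128, dist.stride[p] * dist.h[p]);
        }
        /* the context takes ownership of both pictures */
        if (vmaf_read_pictures(vmaf, &ref, &dist, i)) return 1;
    }
    vmaf_read_pictures(vmaf, NULL, NULL, 0);   /* flush */

    double score;
    vmaf_score_pooled(vmaf, model, VMAF_POOL_METHOD_MEAN, &score, 0, n_frames - 1);
    printf("pooled VMAF score: %.2f\n", score);

    vmaf_model_destroy(model);
    vmaf_close(vmaf);
    return 0;
}
```

In practice the `vmaf` command-line tool and the FFmpeg `libvmaf` filter wrap this same library, so most users never need to call the API directly.
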
## Contribution Guide

Refer to the [contribution](CONTRIBUTING.md) page. Also refer to this [slide deck](https://docs.google.com/presentation/d/1Gr4-MvOXu9HUiH4nnqLGWupJYMeh6nl2MNz6Qy9153c/edit#slide=id.gc20398b4b7_0_132) for an overview of the contribution guide.