| Name | Date | Size | #Lines | LOC |
|------|------|------|--------|-----|
| .github/workflows/ | 16-Oct-2021 | - | 274 | 252 |
| libvmaf/ | 03-May-2022 | - | 42,587 | 32,384 |
| matlab/ | 16-Oct-2021 | - | 13,126 | 10,247 |
| model/ | 03-May-2022 | - | 43,335 | 43,236 |
| python/ | 03-May-2022 | - | 42,212 | 34,980 |
| resource/ | 16-Oct-2021 | - | 15,164 | 14,428 |
| third_party/libsvm/ | 16-Oct-2021 | - | 18,388 | 15,463 |
| workspace/ | 16-Oct-2021 | - | 40 | 32 |
| .dockerignore | 16-Oct-2021 | 34 | 5 | 4 |
| .gitattributes | 16-Oct-2021 | 173 | 7 | 5 |
| .gitignore | 16-Oct-2021 | 173 | 17 | 16 |
| .travis.yml | 16-Oct-2021 | 1.2 KiB | 56 | 50 |
| CHANGELOG.md | 16-Oct-2021 | 17.3 KiB | 479 | 334 |
| CONTRIBUTING.md | 16-Oct-2021 | 13 KiB | 126 | 104 |
| Dockerfile | 16-Oct-2021 | 926 | 46 | 35 |
| LICENSE | 16-Oct-2021 | 2.8 KiB | 42 | 31 |
| Makefile | 16-Oct-2021 | 418 | 16 | 12 |
| OSSMETADATA | 16-Oct-2021 | 19 | 1 | 1 |
| README.md | 16-Oct-2021 | 5 KiB | 44 | 29 |
| unittest | 16-Oct-2021 | 168 | 10 | 6 |
README.md
# VMAF - Video Multi-Method Assessment Fusion

[![Build Status](https://travis-ci.com/Netflix/vmaf.svg?branch=master)](https://travis-ci.com/Netflix/vmaf)
[![libvmaf](https://github.com/Netflix/vmaf/workflows/libvmaf/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3Alibvmaf)
[![Windows](https://github.com/Netflix/vmaf/workflows/Windows/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3AWindows)
[![ffmpeg](https://github.com/Netflix/vmaf/workflows/ffmpeg/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3Affmpeg)
[![Docker](https://github.com/Netflix/vmaf/workflows/Docker/badge.svg)](https://github.com/Netflix/vmaf/actions?query=workflow%3ADocker)

VMAF is a perceptual video quality assessment algorithm developed by Netflix. This software package includes a stand-alone C library `libvmaf` and a Python library that wraps it. The Python library also provides a set of tools that allow a user to train and test a custom VMAF model.

Read [this](https://medium.com/netflix-techblog/toward-a-practical-perceptual-video-quality-metric-653f208b9652) tech blog post for an overview, [this](https://medium.com/netflix-techblog/vmaf-the-journey-continues-44b51ee9ed12) post for best-practice tips, and [this](https://netflixtechblog.com/toward-a-better-quality-metric-for-the-video-community-7ed94e752a30) post for our latest efforts on speed optimization, a new API design, and the introduction of a codec-evaluation-friendly NEG mode.

Also included in `libvmaf` are implementations of several other metrics: PSNR, PSNR-HVS, SSIM, MS-SSIM, and CIEDE2000.

![vmaf logo](resource/images/vmaf_logo.jpg)

## News

- (2021-10-07) We are open-sourcing CAMBI (Contrast Aware Multiscale Banding Index), Netflix's detector for banding (aka contouring) artifacts. Check out the [tech blog](https://netflixtechblog.medium.com/cambi-a-banding-artifact-detector-96777ae12fe2) for an overview and the [technical paper](resource/doc/papers/CAMBI_PCS2021.pdf) published in PCS 2021 (note that the paper describes an initial version of CAMBI that no longer matches the code exactly, but it is still a good introduction). Also check out the [usage](resource/doc/cambi.md) page.
- (2020-12-07) Check out our [latest tech blog](https://netflixtechblog.com/toward-a-better-quality-metric-for-the-video-community-7ed94e752a30) on speed optimization, a new API design, and the introduction of a codec-evaluation-friendly NEG mode.
- (2020-12-03) We are releasing `libvmaf v2.0.0`. It has a new fixed-point and x86 SIMD-optimized (AVX2, AVX-512) implementation that achieves a 2x speedup over the previous floating-point version. It also has a [new API](libvmaf/README.md) that is more flexible and extensible.
- (2020-07-13) We have created a [memo](https://docs.google.com/document/d/1dJczEhXO0MZjBSNyKmd3ARiCTdFVMNPBykH4_HMPoyY/edit?usp=sharing) to share our thoughts on VMAF's behavior in the presence of image enhancement operations, its impact on codec evaluation, and our solutions.
- (2020-02-27) We have changed VMAF's license from Apache 2.0 to [BSD+Patent](https://opensource.org/licenses/BSDplusPatent), a more permissive license than Apache that also includes an express patent grant.
## Documentation

There is an [overview of the documentation](resource/doc/index.md) with links to specific pages, covering FAQs, available models and metrics, software usage guides, and a list of resources.

## Usage

The software package offers a number of ways to interact with the VMAF implementation.

 - The command-line tool [`vmaf`](libvmaf/tools/README.md) provides a complete algorithm implementation, such that one can easily deploy VMAF in a production environment. Additionally, the `vmaf` tool provides a number of auxiliary metrics such as PSNR, SSIM, and MS-SSIM.
 - The [C library `libvmaf`](libvmaf/README.md) provides an interface to incorporate VMAF into your code, and tools to integrate other feature extractors into the library.
 - The [Python library](resource/doc/python.md) offers a full array of wrapper classes and scripts for software testing, VMAF model training and validation, dataset processing, data visualization, etc.
 - VMAF is now included as a filter in FFmpeg, and can be enabled with `./configure --enable-libvmaf`. Refer to the [Using VMAF with FFmpeg](resource/doc/ffmpeg.md) page.
 - The [VMAF Dockerfile](Dockerfile) generates a Docker image from the [Python library](resource/doc/python.md). Refer to [this](resource/doc/docker.md) document for detailed usage.
 - To build VMAF on Windows, follow [these](resource/doc/windows.md) instructions.
 - AOM CTC: [AOM](http://aomedia.org/) has specified `vmaf` as the standard metrics implementation under the AOM common test conditions (CTC). Refer to [this page](resource/doc/aom_ctc.md) for usage compliant with AOM CTC.
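
The command-line and FFmpeg workflows above can be sketched as follows. The file names (`ref.y4m`, `dist.y4m`, `ref.mp4`, `dist.mp4`, `vmaf_scores.json`) are placeholders, and exact flags may vary between versions, so check `vmaf --help` and the linked usage pages:

```shell
# Score a distorted video against its reference with the vmaf CLI.
# Both inputs are .y4m here, so resolution and pixel format are read
# from the file headers; raw .yuv inputs would additionally need
# width/height/pixel-format/bitdepth flags.
vmaf --reference ref.y4m --distorted dist.y4m \
     --output vmaf_scores.json --json

# The same comparison through FFmpeg's libvmaf filter (requires an
# FFmpeg build configured with --enable-libvmaf); the pooled VMAF
# score is printed to the log. Note the input order: the distorted
# video is the first input, the reference the second.
ffmpeg -i dist.mp4 -i ref.mp4 -lavfi libvmaf -f null -
```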

## Contribution Guide

Refer to the [contribution](CONTRIBUTING.md) page. Also refer to this [slide deck](https://docs.google.com/presentation/d/1Gr4-MvOXu9HUiH4nnqLGWupJYMeh6nl2MNz6Qy9153c/edit#slide=id.gc20398b4b7_0_132) for an overview of the contribution process.