<p align="center"><img width="40%" src="docs/ONNX_logo_main.png" /></p>

[![Build Status](https://img.shields.io/travis/onnx/onnx/master.svg?label=Linux)](https://travis-ci.org/onnx/onnx)
[![Build status](https://img.shields.io/appveyor/ci/onnx/onnx/master.svg?label=Windows)](https://ci.appveyor.com/project/onnx/onnx)
[![Build Status](https://img.shields.io/jenkins/s/http/powerci.osuosl.org/onnx-ppc64le-nightly-build.svg?label=Linux%20ppc64le)](http://powerci.osuosl.org/job/onnx-ppc64le-nightly-build/)

[Open Neural Network Exchange (ONNX)](http://onnx.ai) is an open ecosystem that empowers AI developers
to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard
data types. Currently we focus on the capabilities needed for inferencing (scoring).

ONNX is [widely supported](http://onnx.ai/supported-tools) and can be found in many frameworks, tools, and hardware. Enabling interoperability between different frameworks and streamlining the path from research to production will increase the speed of innovation in the AI community. We invite the community to join us and further evolve ONNX.

# Use ONNX
* [Supported Frameworks, Tools, and Hardware](http://onnx.ai/supported-tools)
* [Tutorials for creating and using ONNX models](https://github.com/onnx/tutorials)

# Learn about the ONNX spec
* [Overview](docs/Overview.md)
* [ONNX intermediate representation spec](docs/IR.md)
* [Versioning principles of the spec](docs/Versioning.md)
* [Operators documentation](docs/Operators.md)
* [Python API Overview](docs/PythonAPIOverview.md)

# Programming utilities for working with ONNX Graphs
* [Shape and Type Inference](docs/ShapeInference.md)
* [Graph Optimization](docs/Optimizer.md)
* [Opset Version Conversion](docs/VersionConverter.md)

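These utilities are exposed through the `onnx` Python package. As a minimal sketch (the model file name and target opset version below are placeholders, not taken from this README), shape inference and opset version conversion can be applied to a loaded model like this:

```
import onnx
from onnx import shape_inference, version_converter

# Load a serialized ModelProto from disk (the path is a placeholder).
model = onnx.load("model.onnx")

# Infer shapes/types for intermediate values; returns a new ModelProto.
inferred_model = shape_inference.infer_shapes(model)

# Convert the model to another opset version (the target here is arbitrary).
converted_model = version_converter.convert_version(model, 9)

onnx.save(converted_model, "model_converted.onnx")
```
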
# Contribute
ONNX is a community project. We encourage you to join the effort and contribute feedback, ideas, and code.
You can join [one of the working groups](https://github.com/onnx/onnx/wiki/%5BAnnouncement%5D-ONNX-working-groups-established) and help shape the future of ONNX.

Check out our [contribution guide](https://github.com/onnx/onnx/blob/master/docs/CONTRIBUTING.md)
and [call for contributions](https://github.com/onnx/onnx/issues/426) to get started.

If you think an operator should be added to the ONNX specification, please read
[this document](docs/AddNewOp.md).

# Discuss
We encourage you to open [Issues](https://github.com/onnx/onnx/issues), or use Gitter for more real-time discussion:
[![Join the chat at https://gitter.im/onnx/Lobby](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/onnx/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

# Follow Us
Stay up to date with the latest ONNX news. [[Facebook](https://www.facebook.com/onnxai/)] [[Twitter](https://twitter.com/onnxai)]

# Installation

## Binaries

A binary build of ONNX is available from [Conda](https://conda.io), in [conda-forge](https://conda-forge.org/):

```
conda install -c conda-forge onnx
```

## Source

You will need protobuf and numpy installed to build ONNX. One easy
way to get these dependencies is via
[Anaconda](https://www.anaconda.com/download/):

```
# Use conda-forge protobuf, as the default package doesn't come with protoc
conda install -c conda-forge protobuf numpy
```

You can then install ONNX from PyPI (note: set the environment variable `ONNX_ML=1` if you want the onnx-ml variant):

```
pip install onnx
```

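For example, in a POSIX shell the variable can be set inline for that same command (a small sketch of the note above; the variable is read when the package is built):

```
# Install the onnx-ml variant by setting ONNX_ML for the build.
ONNX_ML=1 pip install onnx
```
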
You can also build and install ONNX locally from source code:

```
git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
python setup.py install
```

Note: When installing in a non-Anaconda environment, make sure to install the Protobuf compiler before running the pip installation of onnx. For example, on Ubuntu:

```
sudo apt-get install protobuf-compiler libprotoc-dev
pip install onnx
```

After installation, run

```
python -c "import onnx"
```

to verify it works. Note that this command does not work from
a source checkout directory; in this case you'll see:

```
ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'
```

Change into another directory to fix this error.

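For a slightly stronger smoke test (a minimal sketch, not part of the official instructions; the graph and names below are made up for illustration), you can construct a tiny model with `onnx.helper` and validate it with `onnx.checker`:

```
import onnx
from onnx import helper, TensorProto

# A one-node graph computing Y = Relu(X) on 1x4 float tensors.
node = helper.make_node("Relu", inputs=["X"], outputs=["Y"])
graph = helper.make_graph(
    [node],
    "smoke_test_graph",
    inputs=[helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 4])],
    outputs=[helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 4])],
)
model = helper.make_model(graph, producer_name="smoke-test")

# Raises an exception if the model is malformed.
onnx.checker.check_model(model)
print(helper.printable_graph(model.graph))
```
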
# Testing

ONNX uses [pytest](https://docs.pytest.org) as its test driver. To run the tests, first install pytest and nbval:

```
pip install pytest nbval
```

After installing pytest, run

```
pytest
```

to execute the tests.

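To run only part of the suite, pytest's standard keyword filter can be used (a generic pytest feature; the keyword below is only an example and may not match real test names):

```
# Run only tests whose names match the given keyword expression.
pytest -k "shape_inference"
```
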
# Development

Check out the [contributor guide](https://github.com/onnx/onnx/blob/master/docs/CONTRIBUTING.md) for instructions.

# License

[MIT License](LICENSE)

# Code of Conduct

[ONNX Open Source Code of Conduct](http://onnx.ai/codeofconduct.html)