.. _intro-install:

==================
Installation guide
==================

.. _faq-python-versions:

Supported Python versions
=========================

Scrapy requires Python 3.6+, either the CPython implementation (default) or
the PyPy 7.2.0+ implementation (see :ref:`python:implementations`).
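
If you are unsure which interpreter or version your environment provides, a
quick check such as the following (purely informational, not required for
installation) will tell you::

    python --version
    python -c "import platform; print(platform.python_implementation())"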

.. _intro-install-scrapy:

Installing Scrapy
=================

If you're using `Anaconda`_ or `Miniconda`_, you can install the package from
the `conda-forge`_ channel, which has up-to-date packages for Linux, Windows
and macOS.

To install Scrapy using ``conda``, run::

  conda install -c conda-forge scrapy

Alternatively, if you're already familiar with installing Python packages,
you can install Scrapy and its dependencies from PyPI with::

    pip install Scrapy

We strongly recommend that you install Scrapy in :ref:`a dedicated virtualenv <intro-using-virtualenv>`,
to avoid conflicting with your system packages.

Note that sometimes this may require solving compilation issues for some Scrapy
dependencies depending on your operating system, so be sure to check the
:ref:`intro-install-platform-notes`.

For more detailed, platform-specific instructions, as well as
troubleshooting information, read on.

Things that are good to know
----------------------------

Scrapy is written in pure Python and depends on a few key Python packages (among others):

* `lxml`_, an efficient XML and HTML parser
* `parsel`_, an HTML/XML data extraction library written on top of lxml
* `w3lib`_, a multi-purpose helper for dealing with URLs and web page encodings
* `twisted`_, an asynchronous networking framework
* `cryptography`_ and `pyOpenSSL`_, to deal with various network-level security needs

The minimum versions that Scrapy is tested against are:

* Twisted 14.0
* lxml 3.4
* pyOpenSSL 0.14

Scrapy may work with older versions of these packages
but it is not guaranteed it will continue working
because it's not being tested against them.
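
If you want to verify which versions of these dependencies are installed in
your environment, one way (assuming they were installed with ``pip``) is::

    pip show lxml Twisted pyOpenSSL cryptography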

Some of these packages themselves depend on non-Python packages
that might require additional installation steps depending on your platform.
Please check the :ref:`platform-specific guides below <intro-install-platform-notes>`.

In case of any trouble related to these dependencies,
please refer to their respective installation instructions:

* `lxml installation`_
* :doc:`cryptography installation <cryptography:installation>`

.. _lxml installation: https://lxml.de/installation.html


.. _intro-using-virtualenv:

Using a virtual environment (recommended)
-----------------------------------------

TL;DR: We recommend installing Scrapy inside a virtual environment
on all platforms.

Python packages can be installed either globally (a.k.a. system-wide),
or in user-space. We do not recommend installing Scrapy system-wide.

Instead, we recommend that you install Scrapy within a so-called
"virtual environment" (:mod:`venv`).
Virtual environments let you avoid conflicts with already-installed Python
system packages (which could break some of your system tools and scripts),
and still install packages normally with ``pip`` (without ``sudo`` and the like).

See :ref:`tut-venv` on how to create your virtual environment.

Once you have created a virtual environment, you can install Scrapy inside it with ``pip``,
just like any other Python package.
(See :ref:`platform-specific guides <intro-install-platform-notes>`
below for non-Python dependencies that you may need to install beforehand.)
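
For example, on Linux or macOS the whole sequence might look like this
(``scrapy-env`` is just an illustrative directory name)::

    python3 -m venv scrapy-env
    source scrapy-env/bin/activate
    pip install Scrapy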


.. _intro-install-platform-notes:

Platform-specific installation notes
====================================

.. _intro-install-windows:

Windows
-------

Though it's possible to install Scrapy on Windows using pip, we recommend
installing `Anaconda`_ or `Miniconda`_ and using the package from the
`conda-forge`_ channel, which will avoid most installation issues.

Once you've installed `Anaconda`_ or `Miniconda`_, install Scrapy with::

  conda install -c conda-forge scrapy

To install Scrapy on Windows using ``pip``:

.. warning::
    This installation method requires “Microsoft Visual C++” to build some
    Scrapy dependencies, which demands significantly more disk space than Anaconda.

#. Download and execute `Microsoft C++ Build Tools`_ to install the Visual Studio Installer.

#. Run the Visual Studio Installer.

#. Under the Workloads section, select **C++ build tools**.

#. Check the installation details and make sure the following packages are selected as optional components:

    * **MSVC** (e.g. MSVC v142 - VS 2019 C++ x64/x86 build tools (v14.23))

    * **Windows SDK** (e.g. Windows 10 SDK (10.0.18362.0))

#. Install the Visual Studio Build Tools.

Now, you should be able to :ref:`install Scrapy <intro-install-scrapy>` using ``pip``.
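
For example, once the build tools are installed, a typical sequence in a
Command Prompt might look like this (``scrapy-env`` is again just an
illustrative directory name)::

    py -m venv scrapy-env
    scrapy-env\Scripts\activate
    pip install Scrapy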

.. _intro-install-ubuntu:

Ubuntu 14.04 or above
---------------------

Scrapy is currently tested with recent-enough versions of lxml,
twisted and pyOpenSSL, and is compatible with recent Ubuntu distributions.
It should also support older versions of Ubuntu, like Ubuntu 14.04,
albeit with potential issues with TLS connections.

**Don't** use the ``python-scrapy`` package provided by Ubuntu; it is
typically too old and slow to catch up with the latest Scrapy.


To install Scrapy on Ubuntu (or Ubuntu-based) systems, you need to install
these dependencies::

    sudo apt-get install python3 python3-dev python3-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev

- ``python3-dev``, ``zlib1g-dev``, ``libxml2-dev`` and ``libxslt1-dev``
  are required for ``lxml``
- ``libssl-dev`` and ``libffi-dev`` are required for ``cryptography``

Inside a :ref:`virtualenv <intro-using-virtualenv>`,
you can then install Scrapy with ``pip``::

    pip install scrapy

.. note::
    The same non-Python dependencies can be used to install Scrapy in Debian
    Jessie (8.0) and above.


.. _intro-install-macos:

macOS
-----

Building Scrapy's dependencies requires the presence of a C compiler and
development headers. On macOS this is typically provided by Apple’s Xcode
development tools. To install the Xcode command line tools, open a terminal
window and run::

    xcode-select --install

There's a `known issue <https://github.com/pypa/pip/issues/2468>`_ that
prevents ``pip`` from updating system packages. This has to be addressed to
successfully install Scrapy and its dependencies. Here are some proposed
solutions:

* *(Recommended)* **Don't** use the system Python. Install a new, updated
  version that doesn't conflict with the rest of your system. Here's how to
  do it using the `homebrew`_ package manager:

  * Install `homebrew`_ following the instructions in https://brew.sh/

  * Update your ``PATH`` variable so that Homebrew packages are used before
    system packages (change ``.bashrc`` to ``.zshrc`` accordingly if you're
    using `zsh`_ as your default shell)::

      echo "export PATH=/usr/local/bin:/usr/local/sbin:$PATH" >> ~/.bashrc

  * Reload ``.bashrc`` to ensure the changes have taken effect::

      source ~/.bashrc

  * Install Python::

      brew install python

  * Latest versions of Python have ``pip`` bundled with them, so you won't
    need to install it separately. If this is not the case, upgrade Python::

      brew update; brew upgrade python

* *(Optional)* :ref:`Install Scrapy inside a Python virtual environment
  <intro-using-virtualenv>`.

  This method is a workaround for the above macOS issue, but it's an overall
  good practice for managing dependencies and can complement the first method.

After any of these workarounds you should be able to install Scrapy::

  pip install Scrapy


PyPy
----

We recommend using the latest PyPy version. The version tested is 5.9.0.
For PyPy3, only Linux installation was tested.

Most Scrapy dependencies now have binary wheels for CPython, but not for PyPy.
This means that these dependencies will be built during installation.
On macOS, you are likely to face an issue when building the ``cryptography``
dependency. A solution to this problem is described
`here <https://github.com/pyca/cryptography/issues/2692#issuecomment-272773481>`_:
``brew install openssl`` and then export the flags that this command
recommends (only needed when installing Scrapy; see the sketch after this
paragraph). Installing on Linux has no special issues besides installing
build dependencies.
Installing Scrapy with PyPy on Windows is not tested.
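
For example, if Homebrew installed OpenSSL under its default prefix, the
exported flags might look something like this (paths vary by system, so check
the output that ``brew install openssl`` prints for the exact values)::

    brew install openssl
    export CPPFLAGS="-I$(brew --prefix openssl)/include"
    export LDFLAGS="-L$(brew --prefix openssl)/lib"
    pip install scrapy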

You can check that Scrapy is installed correctly by running ``scrapy bench``.
If this command gives errors such as
``TypeError: ... got 2 unexpected keyword arguments``, this means
that setuptools was unable to pick up one PyPy-specific dependency.
To fix this issue, run ``pip install 'PyPyDispatcher>=2.1.0'``.


.. _intro-install-troubleshooting:

Troubleshooting
===============

AttributeError: 'module' object has no attribute 'OP_NO_TLSv1_1'
----------------------------------------------------------------

After you install or upgrade Scrapy, Twisted or pyOpenSSL, you may get an
exception with the following traceback::

    […]
      File "[…]/site-packages/twisted/protocols/tls.py", line 63, in <module>
        from twisted.internet._sslverify import _setAcceptableProtocols
      File "[…]/site-packages/twisted/internet/_sslverify.py", line 38, in <module>
        TLSVersion.TLSv1_1: SSL.OP_NO_TLSv1_1,
    AttributeError: 'module' object has no attribute 'OP_NO_TLSv1_1'

The reason you get this exception is that your system or virtual environment
has a version of pyOpenSSL that your version of Twisted does not support.

To install a version of pyOpenSSL that your version of Twisted supports,
reinstall Twisted with the :code:`tls` extra option::

    pip install twisted[tls]

For details, see `Issue #2473 <https://github.com/scrapy/scrapy/issues/2473>`_.
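
If you want to confirm which versions are currently installed before or after
reinstalling, a quick check like the following should work in most
environments::

    python -c "import OpenSSL, twisted; print(OpenSSL.__version__, twisted.__version__)"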

.. _Python: https://www.python.org/
.. _pip: https://pip.pypa.io/en/latest/installing/
.. _lxml: https://lxml.de/index.html
.. _parsel: https://pypi.org/project/parsel/
.. _w3lib: https://pypi.org/project/w3lib/
.. _twisted: https://twistedmatrix.com/trac/
.. _cryptography: https://cryptography.io/en/latest/
.. _pyOpenSSL: https://pypi.org/project/pyOpenSSL/
.. _setuptools: https://pypi.python.org/pypi/setuptools
.. _homebrew: https://brew.sh/
.. _zsh: https://www.zsh.org/
.. _Anaconda: https://docs.anaconda.com/anaconda/
.. _Miniconda: https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html
.. _Visual Studio: https://docs.microsoft.com/en-us/visualstudio/install/install-visual-studio
.. _Microsoft C++ Build Tools: https://visualstudio.microsoft.com/visual-cpp-build-tools/
.. _conda-forge: https://conda-forge.org/