Scrapy is not recognized as an internal or external command, operable program or batch file

In addition to having Scrapy installed via pip, you must have your Python installation's Scripts directory on the PATH; that is where the scrapy executable is placed by default.


First, check the path where Python is installed. If you have several versions of the interpreter, look for the one where Scrapy is installed. For example, the default installation directory of Python 3.6 on Windows 10 would be:

C:\Users\YourUser\AppData\Local\Programs\Python\Python36

Then add the path to its Scripts subdirectory to PATH; for the above example it would be:

C:\Users\YourUser\AppData\Local\Programs\Python\Python36\Scripts\
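If you are unsure which interpreter Scrapy was installed for, you can ask Python itself where its scripts directory lives. This is a stdlib-only sketch; the exact path it prints depends on your installation:

```python
import shutil
import sysconfig

# Directory where pip places console scripts (scrapy.exe on Windows)
scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)

# shutil.which returns the executable's full path if it is on PATH, else None
print(shutil.which("scrapy"))
```

Run this with the same interpreter you used for pip install scrapy; if you have several Pythons, each reports a different Scripts directory.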


Once you have added the directory to PATH, do not forget to restart the terminal or open a new CMD instance for the change to take effect.
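To verify the change took effect in the new terminal, here is a quick stdlib-only check that compares each PATH entry against the interpreter's Scripts directory:

```python
import os
import sysconfig

scripts_dir = sysconfig.get_path("scripts")
entries = os.environ.get("PATH", "").split(os.pathsep)

# Normalize case and trailing separators so the comparison works on Windows
on_path = any(
    os.path.normcase(os.path.normpath(e)) == os.path.normcase(os.path.normpath(scripts_dir))
    for e in entries if e
)
print("Scripts dir on PATH:", on_path)
```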

The other option is to change into the Scripts directory with the terminal and launch Scrapy from there, or to call the executable by its absolute path:

$ C:\Users\YourUser\AppData\Local\Programs\Python\Python36\Scripts\scrapy runspider...

In this case it does not matter, but several scrapy commands (such as crawl) require a properly configured project.
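As another PATH-independent workaround, recent Scrapy versions can also be launched as a module through the interpreter they were installed for. This sketch uses only subprocess from the stdlib and simply reports failure if scrapy is not installed:

```python
import subprocess
import sys

# "python -m scrapy" resolves scrapy through this interpreter, bypassing PATH entirely
result = subprocess.run(
    [sys.executable, "-m", "scrapy", "version"],
    capture_output=True, text=True,
)
if result.returncode == 0:
    print(result.stdout.strip())
else:
    print("scrapy is not installed for", sys.executable)
```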

pim

Jun 21, 2012, 6:23:18 AM

Hi,

Sounds like you haven't added the Python directory to your environment variables. Check out this link for more information (scroll to the bottom).

Grtz, Pim

Sent from my iPhone

Roberto Fuentes

Jun 21, 2012, 6:54:58 AM

Pim,

I deleted scrapy from my system and all its dependencies. When I tried to reinstall it, I got:

ERROR: 'xslt-config' is not recognized as an internal or external command, operable program or batch file.

** make sure the development packages of libxml2 and libxslt are installed **

Using build configuration of libxslt

In file included from src/lxml/lxml.etree.c:239:0:

src/lxml/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory

compilation terminated.

error: Setup script exited with error: command 'cc' failed with exit status 1

I found this website

downloaded the 

After the above I ran

python setup.py install 

And everything worked out fine.
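After a rebuild like this, a quick stdlib-only check confirms whether lxml (the package whose compilation failed above) is now importable:

```python
import importlib.util

# find_spec returns None when the package cannot be located
spec = importlib.util.find_spec("lxml")
print("lxml installed" if spec is not None else "lxml still missing")
```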

Cheers



Same problem here; I checked different installation guides and ran the commands below in the Anaconda prompt. If updating conda doesn't bring the expected result, try to install scrapy in the current environment with pip install scrapy.

I installed Scrapy in my Python 2.7 environment on Windows 7, but when I try to start a new Scrapy project using scrapy startproject newProject, the command prompt shows this message:

'scrapy' is not recognized as an internal or external command, operable program or batch file.

Scrapy should be in your environment variables. You can check if it's there with the following on Windows:

echo %PATH%  # To print only the path
set          # For all

You should see a "successfully installed" message, and some info about the path, like this:

> "WARNING: The script scrapy.exe is installed in 'C:\Users\username\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\Scripts' which is not on PATH"

If you try running it again, call it by its full path:

C:\Users\username\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.8_qbz5n2kfra8p0\LocalCache\local-packages\Python38\Scripts\scrapy
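The long per-user path in that warning can also be recovered programmatically, which helps when copying it into PATH. This is a sketch using the stdlib sysconfig user schemes; the hashed folder name varies per machine:

```python
import os
import sysconfig

# "nt_user" is the scheme pip uses for --user installs on Windows,
# "posix_user" on Linux/macOS
scheme = "nt_user" if os.name == "nt" else "posix_user"
user_scripts = sysconfig.get_path("scripts", scheme)
print(user_scripts)
```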

Run the command below in the command line:

conda update -n base -c defaults conda


Building Scrapy’s dependencies requires the presence of a C compiler and development headers. On OS X this is typically provided by Apple’s Xcode development tools; to install the Xcode command line tools, open a terminal window and run the installer.

There’s a known issue that prevents pip from updating system packages. This has to be addressed to successfully install Scrapy and its dependencies; some solutions are proposed below.

OpenSSL comes preinstalled in all operating systems, except Windows, where the Python installer ships it bundled.

Close the command prompt window and reopen it so changes take effect, then run the following command and check it shows the expected Python version:

conda install -c scrapinghub scrapy

C:\Python27\;C:\Python27\Scripts\;

c:\python27\python.exe c:\python27\tools\scripts\win_add2path.py


You can follow the generic instructions or install Scrapy from the AUR Scrapy package. If you already have Anaconda or Miniconda installed, the company Scrapinghub maintains official conda packages for Linux, Windows and OS X.

You can install Scrapy using pip. To install using pip run:

To install Scrapy using conda, run:

conda install -c scrapinghub scrapy

If you prefer to build the python dependencies locally instead of relying on system packages you’ll need to install their required non-python dependencies first:

sudo apt-get install python-dev python-pip libxml2-dev libxslt1-dev zlib1g-dev libffi-dev libssl-dev

Update your PATH variable to state that Homebrew packages should be used before system packages (change .bashrc to .zshrc accordingly if you’re using zsh as your default shell):

echo "export PATH=/usr/local/bin:/usr/local/sbin:$PATH" >> ~/.bashrc

Reload .bashrc to ensure the changes have taken place:

source ~/.bashrc

Latest versions of Python have pip bundled with them, so you won’t need to install it separately. If this is not the case, upgrade Python:

brew update; brew upgrade python


We strongly recommend that you install Scrapy in a dedicated virtualenv, to avoid conflicting with your system packages. Once you have created a virtualenv, you can install Scrapy inside it with pip, just like any other Python package. (See platform-specific guides below for non-Python dependencies that you may need to install beforehand.)

$ [sudo] pip install virtualenv



How do you use Scrapy in CMD?

Using the scrapy tool: you can start by running the Scrapy tool with no arguments, and it will print some usage help and the available commands:

Scrapy X.Y - no active project
Usage:
  scrapy <command> [options] [args]
Available commands:
  crawl    Run a spider
  fetch    Fetch a URL using the Scrapy downloader
  [...]

Where is Scrapy installed?

TL;DR: We recommend installing Scrapy inside a virtual environment on all platforms. Python packages can be installed either globally (a.k.a system wide), or in user-space. We do not recommend installing Scrapy system wide. Instead, we recommend that you install Scrapy within a so-called “virtual environment” ( venv ).
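The recommended virtual-environment setup can be sketched with nothing but the stdlib venv module (the directory name here is arbitrary; in real use, pass with_pip=True so pip is available inside the environment):

```python
import os
import tempfile
import venv

# Create an isolated environment; scrapy installed here won't touch system packages
env_dir = os.path.join(tempfile.mkdtemp(), "scrapy-env")
venv.create(env_dir, with_pip=False)  # with_pip=True also bootstraps pip (slower)

# Console scripts (and the scrapy executable, once installed) land here
bin_dir = os.path.join(env_dir, "Scripts" if os.name == "nt" else "bin")
print(bin_dir)
```

After activating the environment (or calling its pip directly), pip install scrapy puts the scrapy executable into that bin/Scripts directory, no global PATH changes needed.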

How do you install using Scrapy?

Follow these steps to install Scrapy on Windows:

1. Install Python 2.7.
2. Adjust the PATH environment variable to include paths to the Python executable and additional scripts. The following paths need to be added to PATH: C:\Python27;C:\Python27\Scripts;
3. Install pywin32 from here.
4. Install Scrapy: pip install Scrapy.

How do you install Scrapy in Anaconda?

Create an environment via your terminal; on Windows, this is the command prompt. You can look up https://conda.io/docs/using/envs.html for details. Once you have that, use pip or conda to install Scrapy into that environment, and then run the scrapy script in a directory of your choice.