unknown command: crawl error

Nits · Apr 12, 2012 · Viewed 36.7k times

I am a newbie to Python. I am running the 32-bit build of Python 2.7.3 on a 64-bit OS (I tried the 64-bit build, but it didn't work out).

I followed the tutorial and installed Scrapy on my machine, and I created one project, demoz. But when I enter scrapy crawl demoz it shows an error. I noticed that when I run the scrapy command under C:\python27\scripts, it shows:

C:\Python27\Scripts>scrapy
Scrapy 0.14.2 - no active project

Usage:
  scrapy <command> [options] [args]

Available commands:
  fetch         Fetch a URL using the Scrapy downloader
  runspider     Run a self-contained spider (without creating a project)
  settings      Get settings values
  shell         Interactive scraping console
  startproject  Create new project
  version       Print Scrapy version
  view          Open URL in browser, as seen by Scrapy

Use "scrapy <command> -h" to see more info about a command

C:\Python27\Scripts>

I guess there is something missing in the installation. Can anybody help, please? Thanks in advance.

Answer

warvariuc · Apr 12, 2012

You should run the scrapy crawl spider_name command from inside a Scrapy project folder, the one where the scrapy.cfg file resides.
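
For instance, assuming the project was created with scrapy startproject demoz (the directory name and location here are illustrative), you would change into that directory first and then run the crawl command:

C:\>cd demoz
C:\demoz>scrapy crawl demoz

Run from outside the project, Scrapy reports "no active project" and does not even list crawl among the available commands, which is exactly what your output above shows.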

From the docs:

Crawling

To put our spider to work, go to the project’s top level directory and run:

scrapy crawl dmoz
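
As a side note, the name you pass to scrapy crawl must match the name attribute of a spider defined inside the project. Here is a minimal sketch of such a spider, assuming the BaseSpider API of Scrapy 0.14; the file path, domain, and URLs are placeholders:

# demoz/spiders/demoz_spider.py  (illustrative path and names)
from scrapy.spider import BaseSpider

class DemozSpider(BaseSpider):
    name = "demoz"  # this is the name that "scrapy crawl demoz" looks up
    allowed_domains = ["example.com"]
    start_urls = ["http://www.example.com/"]

    def parse(self, response):
        # just log the visited URL; real parsing logic goes here
        self.log("Visited %s" % response.url)

With a file like this in the project's spiders package and the command run from the directory containing scrapy.cfg, scrapy crawl demoz should find and start the spider.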