<p>Fortunately, Scrapy is open source, so you can follow how the <a href="https://github.com/scrapy/scrapy/blob/master/scrapy/commands/crawl.py" rel="noreferrer">crawl command</a> works and do the same thing in your own code:</p>
<pre><code>...
# Create a crawler from the running CrawlerProcess
crawler = self.crawler_process.create_crawler()
# Instantiate the spider by name, passing along any spider arguments
spider = crawler.spiders.create(spname, **opts.spargs)
# Schedule the spider on the crawler
crawler.crawl(spider)
# Start the reactor; this blocks until all crawls finish
self.crawler_process.start()
</code></pre>
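<p>Note that this excerpt relies on Scrapy internals (<code>crawler.spiders.create</code> was removed in later releases). In current Scrapy versions, the documented way to achieve the same thing from a script is <code>CrawlerProcess</code>. A minimal sketch, assuming a hypothetical spider class (the <code>data:</code> start URL keeps the example self-contained, with no network access):</p>
<pre><code>from scrapy.crawler import CrawlerProcess
from scrapy.spiders import Spider

collected = []  # scraped items end up here

class TitleSpider(Spider):
    # Hypothetical minimal spider; a data: URL avoids network access
    name = "title"
    start_urls = ["data:text/html,&lt;title&gt;hello&lt;/title&gt;"]

    def parse(self, response):
        item = {"title": response.css("title::text").get()}
        collected.append(item)
        yield item

process = CrawlerProcess(settings={"LOG_ENABLED": False})
process.crawl(TitleSpider)  # schedule the spider class
process.start()             # start the reactor; blocks until the crawl is done
</code></pre>
<p>As in the internal version above, <code>start()</code> runs the Twisted reactor and returns only once every scheduled crawl has finished, so call it exactly once per process.</p>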