Scrapy installation problems on Ubuntu


I'm a recent convert to Linux and I'm interested in using Scrapy.

jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ uname -a
Linux jeremy-Lenovo-G580 3.5.0-52-generic #79~precise1-Ubuntu SMP Fri Jul 4 21:03:49 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux

For this I have Python 2.7 installed:

$ python -V
Python 2.7.3

I then installed pip (with sudo easy_install pip) and used it to install Scrapy 0.24:

sudo pip install scrapy

Scrapy worked for a while; I ran through the tutorial at http://doc.scrapy.org/en/latest/intro/tutorial.html fine. Whenever Scrapy ran it complained about service_identity, so I installed that with pip (there was no output, unless it went to a log somewhere). For some reason (the service_identity install, maybe?) Scrapy then broke:

$ scrapy -V
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 122, in execute
    cmds = _get_commands_dict(settings, inproject)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 46, in _get_commands_dict
    cmds = _get_commands_from_module('scrapy.commands', inproject)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 29, in _get_commands_from_module
    for cmd in _iter_command_classes(module):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 20, in _iter_command_classes
    for module in walk_modules(module_name):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 68, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/bench.py", line 3, in <module>
    from scrapy.tests.mockserver import MockServer
  File "/usr/local/lib/python2.7/dist-packages/scrapy/tests/mockserver.py", line 6, in <module>
    from twisted.internet import reactor, defer, ssl
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/ssl.py", line 223, in <module>
    from twisted.internet._sslverify import (
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 184, in <module>
    verifyHostname, VerificationError = _selectVerifyImplementation()
  File "/usr/local/lib/python2.7/dist-packages/twisted/internet/_sslverify.py", line 159, in _selectVerifyImplementation
    from service_identity import VerificationError
  File "/usr/local/lib/python2.7/dist-packages/service_identity/__init__.py", line 11, in <module>
    from . import pyopenssl
  File "/usr/local/lib/python2.7/dist-packages/service_identity/pyopenssl.py", line 12, in <module>
    from pyasn1_modules.rfc2459 import GeneralNames
  File "/usr/local/lib/python2.7/dist-packages/pyasn1_modules/rfc2459.py", line 72, in <module>
    class AttributeValue(univ.Any): pass
AttributeError: 'module' object has no attribute 'Any'
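
For what it's worth, the traceback bottoms out in pyasn1: pyasn1_modules.rfc2459 subclasses univ.Any, and very old pyasn1 releases don't provide it. A quick way to see which pyasn1 is actually being imported (just a sketch, assuming the same Python 2.7 environment) is:

$ python -c "import pyasn1; print(pyasn1.__file__)"
$ python -c "from pyasn1.type import univ; print(hasattr(univ, 'Any'))"

If the second command prints False, the pyasn1 being imported (often an old copy under /usr/lib/python2.7/dist-packages) is too old for the pyasn1_modules that pip installed.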

So, following http://doc.scrapy.org/en/latest/topics/ubuntu.html#topics-ubuntu, I tried uninstalling first (sudo pip uninstall scrapy) and then:

sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 627220E7
echo 'deb http://archive.scrapy.org/ubuntu scrapy main' | sudo tee /etc/apt/sources.list.d/scrapy.list
sudo apt-get update && sudo apt-get install scrapy-0.24

I've also uninstalled/reinstalled Scrapy with pip a few times, tried easy_install scrapy, and then tried upgrading with

 sudo pip install -U scrapy

as well as

sudo pip install --upgrade scrapy

(When I first tried installing, I found that an old version of Scrapy was the one actually running while a newer version had also been installed; removing the old one was the only time I ever got Scrapy to run, so I suspected an upgrade might fix things again.)
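
When apt and pip installs get mixed like this, it can help to check which copy of Scrapy is actually being run (a generic sketch, not specific to this box):

$ which -a scrapy
$ python -c "import scrapy; print(scrapy.__version__)"
$ python -c "import scrapy; print(scrapy.__file__)"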

$sudo pip install -U scrapy

Requirement already up-to-date: scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already up-to-date: Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking lxml from https://pypi.python.org/packages/source/l/lxml/lxml-3.3.5.tar.gz#md5=88c75f4c73fc8f59c9ebb17495044f2f (from scrapy)
  Downloading lxml-3.3.5.tar.gz (3.5MB): 3.5MB downloaded
  Running setup.py (path:/tmp/pip_build_root/lxml/setup.py) egg_info for package lxml
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
    Building lxml version 3.3.5.
    Building without Cython.
    ERROR: /bin/sh: 1: xslt-config: not found

    ** make sure the development packages of libxml2 and libxslt are installed **

    Using build configuration of libxslt

    warning: no previously-included files found matching '*.py'
Downloading/unpacking pyOpenSSL from https://pypi.python.org/packages/source/p/pyOpenSSL/pyOpenSSL-0.14.tar.gz#md5=8579ff3a1d858858acfba5f046a4ddf7 (from scrapy)
  Downloading pyOpenSSL-0.14.tar.gz (128kB): 128kB downloaded
  Running setup.py (path:/tmp/pip_build_root/pyOpenSSL/setup.py) egg_info for package pyOpenSSL

    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    no previously-included directories found matching 'doc/_build'
Requirement already up-to-date: cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already up-to-date: six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Downloading/unpacking zope.interface>=3.6.0 from https://pypi.python.org/packages/source/z/zope.interface/zope.interface-4.1.1.tar.gz#md5=edcd5f719c5eb2e18894c4d06e29b6c6 (from Twisted>=10.0.0->scrapy)
  Downloading zope.interface-4.1.1.tar.gz (864kB): 864kB downloaded
  Running setup.py (path:/tmp/pip_build_root/zope.interface/setup.py) egg_info for package zope.interface

    warning: no previously-included files matching '*.dll' found anywhere in distribution
    warning: no previously-included files matching '*.pyc' found anywhere in distribution
    warning: no previously-included files matching '*.pyo' found anywhere in distribution
    warning: no previously-included files matching '*.so' found anywhere in distribution
Downloading/unpacking cryptography>=0.2.1 (from pyOpenSSL->scrapy)
  Downloading cryptography-0.5.1.tar.gz (319kB): 319kB downloaded
  Running setup.py (path:/tmp/pip_build_root/cryptography/setup.py) egg_info for package cryptography

    Installed /tmp/pip_build_root/cryptography/cffi-0.8.6-py2.7-linux-x86_64.egg
    Searching for pycparser
    Reading http://pypi.python.org/simple/pycparser/
    Best match: pycparser 2.10
    Downloading https://pypi.python.org/packages/source/p/pycparser/pycparser-2.10.tar.gz#md5=d87aed98c8a9f386aa56d365fe4d515f
    Processing pycparser-2.10.tar.gz
    Running pycparser-2.10/setup.py -q bdist_egg --dist-dir /tmp/easy_install-LtjYh9/pycparser-2.10/egg-dist-tmp-1dc4kT
    zip_safe flag not set; analyzing archive contents...

    Installed /tmp/pip_build_root/cryptography/pycparser-2.10-py2.7.egg

    building '_Cryptography_cffi_684bb40axf342507b' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_684bb40axf342507b.so
    building '_Cryptography_cffi_8f86901cxc1767c5a' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.o -o /tmp/pip_build_root/cryptography/cryptography/hazmat/primitives/__pycache__/_Cryptography_cffi_8f86901cxc1767c5a.so
    building '_Cryptography_cffi_79a5b0a3x3a8a382' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/include/python2.7 -c cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.c -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o
    gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.o -lcrypto -lssl -o /tmp/pip_build_root/cryptography/cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_79a5b0a3x3a8a382.so
    no previously-included directories found matching 'docs/_build'
    warning: no previously-included files matching '*' found under directory 'vectors'
Downloading/unpacking setuptools from https://pypi.python.org/packages/3.4/s/setuptools/setuptools-5.4.1-py2.py3-none-any.whl#md5=5b7b07029ad2285d1cbf809a8ceaea08 (from zope.interface>=3.6.0->Twisted>=10.0.0->scrapy)
  Downloading setuptools-5.4.1-py2.py3-none-any.whl (528kB): 528kB downloaded
Downloading/unpacking cffi>=0.8 (from cryptography>=0.2.1->pyOpenSSL->scrapy)
  Downloading cffi-0.8.6.tar.gz (196kB): 196kB downloaded
  Running setup.py (path:/tmp/pip_build_root/cffi/setup.py) egg_info for package cffi

Downloading/unpacking pycparser (from cffi>=0.8->cryptography>=0.2.1->pyOpenSSL->scrapy)
  Downloading pycparser-2.10.tar.gz (206kB): 206kB downloaded
  Running setup.py (path:/tmp/pip_build_root/pycparser/setup.py) egg_info for package pycparser

Installing collected packages: lxml, pyOpenSSL, zope.interface, cryptography, setuptools, cffi, pycparser
  Found existing installation: lxml 2.3.2
    Uninstalling lxml:
      Successfully uninstalled lxml
  Running setup.py install for lxml
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
      warnings.warn(msg)
    Building lxml version 3.3.5.
    Building without Cython.
    ERROR: /bin/sh: 1: xslt-config: not found

    ** make sure the development packages of libxml2 and libxslt are installed **

    Using build configuration of libxslt
    building 'lxml.etree' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
    In file included from src/lxml/lxml.etree.c:346:0:
    /tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
    compilation terminated.
    error: command 'gcc' failed with exit status 1
    Complete output from command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile:
    /usr/lib/python2.7/distutils/dist.py:267: UserWarning: Unknown distribution option: 'bugtrack_url'
  warnings.warn(msg)
Building lxml version 3.3.5.
Building without Cython.
ERROR: /bin/sh: 1: xslt-config: not found

** make sure the development packages of libxml2 and libxslt are installed **

Using build configuration of libxslt
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-2.7
creating build/lib.linux-x86_64-2.7/lxml
copying src/lxml/builder.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/sax.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/__init__.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/cssselect.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml
creating build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-2.7/lxml/includes
creating build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/builder.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/defs.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/clean.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/diff.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-2.7/lxml/html
copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml/html
creating build/lib.linux-x86_64-2.7/lxml/isoschematron
copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-2.7/lxml/isoschematron
copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-2.7/lxml
copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-2.7/lxml/includes
copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-2.7/lxml/includes
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
running build_ext
building 'lxml.etree' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/lxml
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/tmp/pip_build_root/lxml/src/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
In file included from src/lxml/lxml.etree.c:346:0:
/tmp/pip_build_root/lxml/src/lxml/includes/etree_defs.h:9:31: fatal error: libxml/xmlversion.h: No such file or directory
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
  Rolling back uninstall of lxml
Cleaning up...
Command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip_build_root/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-Iog1QC-record/install-record.txt --single-version-externally-managed --compile failed with error code 1 in /tmp/pip_build_root/lxml
Storing debug log for failure in /home/jeremy/.pip/pip.log

I just noticed something weird (weird to me, anyway): plain pip doesn't run at all, but sudo pip does:

jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ pip install scrapy
bash: /usr/bin/pip: No such file or directory
jeremy@jeremy-Lenovo-G580:~/Dropbox/projects/scrapy_stuff$ sudo pip install scrapy
Requirement already satisfied (use --upgrade to upgrade): scrapy in /usr/local/lib/python2.7/dist-packages
Requirement already satisfied (use --upgrade to upgrade): Twisted>=10.0.0 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): queuelib in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): lxml in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in /usr/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): cssselect>=0.9 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/local/lib/python2.7/dist-packages (from scrapy)
Requirement already satisfied (use --upgrade to upgrade): zope.interface>=3.6.0 in /usr/lib/python2.7/dist-packages (from Twisted>=10.0.0->scrapy)
Cleaning up...
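
One possible explanation for the pip oddity: bash caches the location of commands it has already run, so if pip once lived at /usr/bin/pip and was later removed (easy_install puts its pip under /usr/local/bin), an interactive shell can keep trying the stale path while sudo, which does a fresh lookup, finds the right one. A quick way to check, assuming that layout:

$ type -a pip
$ ls -l /usr/bin/pip /usr/local/bin/pip
$ hash -r          # clear bash's cached lookup, then try plain pip again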

So, I'll try installing the libxml2 and libxslt development packages, but at this point I'm not optimistic. I'm just going down a rabbit hole, with no clear idea of what broke a working setup in the first place... Thanks for any help; my brown hairs are going grey, the grey ones white, and the white ones are catching fire.

Maybe I should just try running Python and Scrapy on the Windows VM I set up (which, by the way, was easy), but that rather defeats the purpose of switching to Linux (which I did to get closer to the source code and the open side of many of the projects I'm interested in).

OK, so I tried:

sudo apt-get install libxml2-dev
sudo apt-get install libxslt1-dev 
sudo apt-get install python2.7-dev

but Scrapy still dies horribly with AttributeError: 'module' object has no attribute 'Any'.


2 Answers

I ran into this problem on Ubuntu 14.04. I made sure all of the requirements listed for service_identity 16.0.0 were installed; I was missing pyasn1 and pyasn1-modules (installed via pip) but had overlooked attrs. The package finally came up once I removed and reinstalled service_identity:

yes | pip uninstall service_identity; pip install service_identity

Using pip's upgrade option on service_identity, however, did not fix it.
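
If you want to confirm which of those dependencies are actually importable before reinstalling, a one-liner along these lines (module names taken from service_identity's requirements; note that attrs imports as attr and pyOpenSSL as OpenSSL) can save some guessing:

$ python -c "import pyasn1, pyasn1_modules, attr, OpenSSL; print('all importable')"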

Reinstalling Ubuntu solved the problem for me; drastic, perhaps, but effective. I installed Ubuntu 14 LTS instead of the 12 LTS I had originally started from. The only hiccup was

$sudo apt-get install python-pip
$sudo pip install scrapy
...
twisted/runner/portmap.c:10:20: fatal error: Python.h: No such file or directory

but

 sudo apt-get install build-essential python-dev

took care of it. Hopefully Scrapy will keep working for more than an hour this time. Whether that was a good use of two days is another question.
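
For reference, Python.h comes from the python-dev package; a quick way to confirm it is present after the install (assuming the default include path) is:

$ ls /usr/include/python2.7/Python.h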
