On Ubuntu 16.04 LTS I installed Scrapy via Anaconda2. A program that ran fine on Windows now throws an error, and even running `scrapy shell` fails. I've googled for a long time without finding an answer. Could someone help? Thanks!

2016-04-28 22:31:28 +08:00
 bigbearme
Traceback (most recent call last):
  File "/home/peter/anaconda2/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/commands/shell.py", line 61, in run
    crawler.engine = crawler._create_engine()
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/core/engine.py", line 69, in __init__
    self.scraper = Scraper(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/core/scraper.py", line 70, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 56, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/media.py", line 33, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/images.py", line 57, in from_settings
    return cls(store_uri)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/files.py", line 160, in __init__
    self.store = self._get_store(store_uri)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/files.py", line 180, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
KeyError: 'd'
4624 clicks
Node: Python
11 replies
skyrem
2016-04-29 00:20:54 +08:00
If even `scrapy shell` fails, it may be a problem with Scrapy itself. Try reinstalling it with pip.
bigbearme
2016-04-29 06:33:45 +08:00
@skyrem I've reinstalled it twice with no effect. I wonder if it's an issue with the Ubuntu version.
wlsnx
2016-04-29 10:24:18 +08:00
wlsnx
2016-04-29 10:37:49 +08:00
It just occurred to me: you're not trying to save files to the D drive, are you? Linux has no D drive.
bigbearme
2016-04-29 11:20:51 +08:00
@wlsnx But running `scrapy shell` throws this same error. I do know that Linux has no D drive...
leavic
2016-04-29 12:43:03 +08:00
If even `scrapy shell` fails, then Scrapy is definitely not installed properly. It has nothing to do with your code.
pc10201
2016-04-29 13:16:58 +08:00
I ran into a pitfall once: after installing Scrapy via pip there was no scrapy.exe, so I ended up downloading the source manually and installing it with python setup.py install.
bigbearme
2016-04-29 13:22:05 +08:00
@leavic Yeah, I also figured it's unrelated to the code. It's probably a bad install. Installing Scrapy is exhausting.
wlsnx
2016-04-29 14:16:21 +08:00
I don't know why `scrapy shell` fails, but given the error KeyError: 'd', you should check whether settings.py contains something like FILES_STORE="d:\some\file\path".
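[Editor's note] This diagnosis matches how Scrapy picks a storage backend. The sketch below paraphrases the logic behind `scrapy/pipelines/files.py`'s `_get_store` (the frame in the traceback); names are simplified, and it is written in Python 3 for illustration even though the thread uses Python 2. On Windows, `os.path.isabs("d:\\images")` is True, so the URI is treated as a local path; on Linux it is False, so the string falls through to `urlparse()`, which reads the drive letter `d:` as a URL scheme that is not among the known store schemes, hence `KeyError: 'd'`.

```python
import os
from urllib.parse import urlparse  # the urlparse module on Python 2

# Simplified stand-in for the pipeline's STORE_SCHEMES mapping.
STORE_SCHEMES = {"": "FSFilesStore", "file": "FSFilesStore", "s3": "S3FilesStore"}

def resolve_scheme(store_uri):
    """Roughly how the files pipeline chooses a storage backend."""
    if os.path.isabs(store_uri):       # True for "d:\\images" only on Windows
        return "file"
    return urlparse(store_uri).scheme  # on Linux, "d:\\images" yields scheme "d"

print(resolve_scheme(r"d:\images"))        # on Linux: 'd' (not in STORE_SCHEMES)
print(resolve_scheme("/home/peter/imgs"))  # on Linux: 'file'
```

So the same setting works on Windows and breaks on Linux, which is exactly the "ran fine on Windows" symptom in the question.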
Neveroldmilk
2016-04-29 15:51:38 +08:00
16.04 is too new. Wait a while and try again.
bigbearme
2016-04-29 17:25:47 +08:00
@wlsnx OK, I'll check when I get home. Thanks!
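[Editor's note] For completeness, the fix that follows from @wlsnx's diagnosis is to replace the Windows drive path in settings.py with an absolute POSIX path. Since the traceback goes through images.py, the setting involved is likely IMAGES_STORE (or FILES_STORE for the files pipeline); the paths below are hypothetical examples, not taken from the thread.

```python
# settings.py -- hypothetical example paths

# A Windows-style value like this is what triggers KeyError: 'd' on Linux:
# IMAGES_STORE = "d:\\images"

# Use an absolute POSIX path instead, which maps to the 'file' store:
IMAGES_STORE = "/home/peter/images"
```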

https://www.v2ex.com/t/275197