I decided to use the Python logging module because the messages Twisted generates on standard error are too long, and I want meaningful INFO-level messages (such as those generated by the StatsCollector) to be written to a separate log file while keeping the on-screen messages.
    from twisted.python import log
    import logging

    logging.basicConfig(level=logging.INFO, filemode='w', filename='buyerlog.txt')
    observer = log.PythonLoggingObserver()
    observer.start()
Well, this is fine, and I get my messages, but the downside is that I do not know which spider generated the message! Here is my log file, with "twisted" being shown by %(name)s:
    INFO:twisted:Log opened.
    INFO:twisted:Scrapy 0.12.0.2543 started (bot: property)
    INFO:twisted:scrapy.telnet.TelnetConsole starting on 6023
    INFO:twisted:scrapy.webservice.WebService starting on 6080
    INFO:twisted:Spider opened
    INFO:twisted:Spider opened
    INFO:twisted:Received SIGINT, shutting down gracefully. Send again to force unclean shutdown
    INFO:twisted:Closing spider (shutdown)
    INFO:twisted:Closing spider (shutdown)
    INFO:twisted:Dumping spider stats:
    {'downloader/exception_count': 3,
     'downloader/exception_type_count/scrapy.exceptions.IgnoreRequest': 3,
     'downloader/request_bytes': 9973,
Compare this with the messages Twisted produces on standard error:
    2019-12-16 17:34:56+0800 [expats] DEBUG: number of rules: 4
    2019-12-16 17:34:56+0800 [scrapy] DEBUG: Telnet console listening on 0.0.0.0:6023
    2019-12-16 17:34:56+0800 [scrapy] DEBUG: Web service listening on 0.0.0.0:6080
    2019-12-16 17:34:56+0800 [iproperty] INFO: Spider opened
    2019-12-16 17:34:56+0800 [iproperty] DEBUG: Redirecting (301) to <GET http://www.iproperty.com.sg/> from <GET http://iproperty.com.sg>
    2019-12-16 17:34:57+0800 [iproperty] DEBUG: Crawled (200) <
I have tried %(name)s, %(module)s and so on, but I cannot seem to display the spider name. Does anyone know the answer?
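For reference, here is a minimal sketch of how those format specifiers would be wired up, assuming the same basicConfig call as above. Because PythonLoggingObserver forwards every Twisted message to a single stdlib logger (named "twisted" by default, via its loggerName argument), %(name)s can only ever print "twisted" here:

    import logging
    from twisted.python import log

    logging.basicConfig(
        level=logging.INFO,
        filemode='w',
        filename='buyerlog.txt',
        # %(name)s is the stdlib logger name, not the spider name;
        # the observer routes all Twisted messages through one logger
        format='%(name)s %(levelname)s %(message)s',
    )
    observer = log.PythonLoggingObserver(loggerName='twisted')  # "twisted" is the default
    observer.start()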
EDIT: The problem with using the LOG_FILE and LOG_LEVEL settings is that the lower-level messages will no longer be shown on standard error.
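For completeness, this is what the settings-based approach looks like; a minimal sketch assuming a standard Scrapy project's settings.py (the file name is hypothetical):

    # settings.py -- sketch; the log file name is hypothetical
    LOG_FILE = 'buyerlog.txt'   # with LOG_FILE set, messages go to the file instead of standard error
    LOG_LEVEL = 'INFO'          # messages below INFO are dropped entirely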
You want to use ScrapyFileLogObserver.
    import logging
    from scrapy.log import ScrapyFileLogObserver

    logfile = open('testlog.log', 'w')
    log_observer = ScrapyFileLogObserver(logfile, level=logging.DEBUG)
    log_observer.start()
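To tell the spiders apart, one option is to start an observer per spider with the spider's name in the log file name. A minimal sketch against the same old scrapy.log API shown above (the spider class and name are hypothetical):

    import logging
    from scrapy.spider import BaseSpider
    from scrapy.log import ScrapyFileLogObserver

    class IPropertySpider(BaseSpider):  # hypothetical spider
        name = 'iproperty'

        def __init__(self, *args, **kwargs):
            # one log file per spider, named after the spider
            logfile = open('%s.log' % self.name, 'w')
            ScrapyFileLogObserver(logfile, level=logging.DEBUG).start()
            super(IPropertySpider, self).__init__(*args, **kwargs)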
I am glad you asked this question; I have been wanting to do this myself.
It is very easy to redirect the output with the following command:

    scrapy some-scrapy's-args 2>&1 | tee -a logname
This way, all output going to stdout and stderr will be redirected to the logname file while also being copied to the screen.
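For instance, a concrete invocation might look like this (the spider name and log file name are hypothetical; the -a flag makes tee append rather than overwrite):

    scrapy crawl iproperty 2>&1 | tee -a scrapy.log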