Generating an exe from a Scrapy project

I am trying to generate an exe from a project that uses Scrapy, using PyInstaller (more specifically, the auto-py-to-exe GUI).

The main file runs two spiders in sequence:

from scrapy.crawler import CrawlerRunner
from twisted.internet import reactor, defer
from spiderDummy1 import SpiderDummy1
from spiderDummy2 import SpiderDummy2

# ... some log configurations

@defer.inlineCallbacks
def crawl(runner):
    yield runner.crawl(SpiderDummy1)
    yield runner.crawl(SpiderDummy2, start_urls=["https://google.com"])
    reactor.stop()

runner = CrawlerRunner(
    {
        'LOG_STDOUT': True,
        'LOG_ENABLED': True,
        'FEEDS': {
            'items.jl': {
                'format': 'jsonlines',
                'encoding': 'utf8',
            }
        },
    }
)
crawl(runner)
reactor.run()
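
One caveat with this pattern: if either crawl fails, execution never reaches reactor.stop() and the process hangs. A slightly more defensive variant of the same crawl() (a sketch; behaviour on the happy path is identical):

from twisted.internet import reactor, defer
from spiderDummy1 import SpiderDummy1
from spiderDummy2 import SpiderDummy2

@defer.inlineCallbacks
def crawl(runner):
    try:
        yield runner.crawl(SpiderDummy1)
        yield runner.crawl(SpiderDummy2, start_urls=["https://google.com"])
    finally:
        # Stop the reactor even if one of the crawls errors out, so the
        # process terminates instead of blocking forever in reactor.run().
        reactor.stop()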

Here is my SpiderDummy1:

import scrapy

class SpiderDummy1(scrapy.Spider):
    name = "Spider Dummy 1"

    def start_requests(self):
        url = "https://google.com"
        yield scrapy.Request(url, self.parse)

    def parse(self, response):
        yield {"foo1": "bar1"}

And here is my SpiderDummy2:

import scrapy

class SpiderDummy2(scrapy.Spider):
    name = "Spider Dummy 2"

    def parse(self, response):
        yield {"foo2": "bar2"}

Running this with python main.py generates items.jl with the expected content:

{"foo1": "bar1"}
{"foo2": "bar2"}

Here is the log from that run:

2020-08-13 17:28:37,552 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:28:37,575 INFO scrapy.extensions.telnet Telnet Password: ec57f644dc444404
2020-08-13 17:28:37,622 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.telnet.TelnetConsole', 'scrapy.extensions.feedexport.FeedExporter', 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:28:37,942 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware', 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware', 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware', 'scrapy.downloadermiddlewares.retry.RetryMiddleware', 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware', 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware', 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware', 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware', 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:28:37,949 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware', 'scrapy.spidermiddlewares.referer.RefererMiddleware', 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:28:37,950 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:28:37,950 INFO scrapy.core.engine Spider opened
2020-08-13 17:28:37,954 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:28:37,955 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:28:38,206 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:28:38,490 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:28:38,593 DEBUG scrapy.core.scraper Scraped from <200 https://www.google.com/>

{'foo1': 'bar1'}
2020-08-13 17:28:38,604 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:28:38,606 INFO scrapy.extensions.feedexport Stored jsonlines feed (1 items) in: items.jl
2020-08-13 17:28:38,608 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424, 'downloader/request_count': 2, 'downloader/request_method_count/GET': 2, 'downloader/response_bytes': 7252, 'downloader/response_count': 2, 'downloader/response_status_count/200': 1, 'downloader/response_status_count/301': 1, 'elapsed_time_seconds': 0.651448, 'finish_reason': 'finished', 'finish_time': datetime.datetime(2020, 8, 13, 20, 28, 38, 605292), 'item_scraped_count': 1, 'log_count/DEBUG': 3, 'log_count/INFO': 11, 'response_received_count': 1, 'scheduler/dequeued': 2, 'scheduler/dequeued/memory': 2, 'scheduler/enqueued': 2, 'scheduler/enqueued/memory': 2, 'start_time': datetime.datetime(2020, 8, 13, 20, 28, 37, 953844)}
2020-08-13 17:28:38,609 INFO scrapy.core.engine Spider closed (finished)
2020-08-13 17:28:38,646 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:28:38,647 INFO scrapy.extensions.telnet Telnet Password: 717ee8d5719577dd
2020-08-13 17:28:38,649 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:28:38,652 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:28:38,653 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:28:38,653 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:28:38,653 INFO scrapy.core.engine Spider opened
2020-08-13 17:28:38,654 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:28:38,655 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:28:38,890 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:28:39,149 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:28:39,252 DEBUG scrapy.core.scraper Scraped from <200 https://www.google.com/>

{'foo2': 'bar2'}
2020-08-13 17:28:39,254 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:28:39,255 INFO scrapy.extensions.feedexport Stored jsonlines feed (1 items) in: items.jl
2020-08-13 17:28:39,256 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424, 'downloader/response_bytes': 7242, 'elapsed_time_seconds': 0.601099, 'finish_time': datetime.datetime(2020, 8, 13, 20, 28, 39, 255262), 'start_time': datetime.datetime(2020, 8, 13, 20, 28, 38, 654163)}
2020-08-13 17:28:39,256 INFO scrapy.core.engine Spider closed (finished)

However, after generating the exe with auto-py-to-exe using the following command:

pyinstaller --noconfirm --onedir --console --add-data "C:/Users/nayra/Desktop/scrapy/log;log/"  "C:/Users/nayra/Desktop/scrapy/main.py"

it does not generate the items.jl file and produces the following log:

2020-08-13 17:36:14,927 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:36:14,941 INFO scrapy.extensions.telnet Telnet Password: 50e993c30b2eb0fd
2020-08-13 17:36:14,959 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:36:15,323 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:36:15,326 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:36:15,327 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:36:15,327 INFO scrapy.core.engine Spider opened
2020-08-13 17:36:15,330 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:36:15,331 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:36:15,570 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:36:15,825 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:36:15,926 ERROR scrapy.core.scraper Spider error processing <GET https://www.google.com/> (referer: None)
Traceback (most recent call last):
  File "twisted\internet\defer.py",line 1418,in _inlineCallbacks
StopIteration: <200 https://www.google.com/>

During handling of the above exception,another exception occurred:

Traceback (most recent call last):
  File "scrapy\utils\defer.py",line 55,in mustbe_deferred
  File "scrapy\core\spidermw.py",line 60,in process_spider_input
  File "scrapy\core\scraper.py",line 152,in call_spider
  File "scrapy\utils\misc.py",line 212,in warn_on_generator_with_return_value
  File "scrapy\utils\misc.py",line 197,in is_generator_with_return_value
  File "inspect.py",line 985,in getsource
  File "inspect.py",line 967,in getsourcelines
  File "inspect.py",line 798,in findsource
OSError: could not get source code
2020-08-13 17:36:16,028 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:36:16,029 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424, 'downloader/response_bytes': 7275, 'elapsed_time_seconds': 0.698791, 'finish_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 28306), 'log_count/DEBUG': 2, 'log_count/ERROR': 1, 'log_count/INFO': 10, 'spider_exceptions/OSError': 1, 'start_time': datetime.datetime(2020, 8, 13, 20, 36, 15, 329515)}
2020-08-13 17:36:16,029 INFO scrapy.core.engine Spider closed (finished)
2020-08-13 17:36:16,061 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:36:16,062 INFO scrapy.extensions.telnet Telnet Password: 345875f820220cf6
2020-08-13 17:36:16,065 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:36:16,070 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware', 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:36:16,071 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware', 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:36:16,071 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:36:16,072 INFO scrapy.core.engine Spider opened
2020-08-13 17:36:16,073 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:36:16,074 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:36:16,308 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:36:16,573 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:36:16,675 ERROR scrapy.core.scraper Spider error processing <GET https://www.google.com/> (referer: None)
Traceback (most recent call last):
  File "twisted\internet\defer.py", line 1418, in _inlineCallbacks
2020-08-13 17:36:16,776 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:36:16,777 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424, 'downloader/response_bytes': 7272, 'elapsed_time_seconds': 0.704001, 'finish_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 776310), 'start_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 72309)}
2020-08-13 17:36:16,777 INFO scrapy.core.engine Spider closed (finished)
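
Reading the traceback, the failure is inside Scrapy's warn_on_generator_with_return_value() check (scrapy\utils\misc.py), which calls inspect.getsource() on the spider callback; inspect raises OSError in a PyInstaller bundle because only bytecode is shipped, so the callback errors out before yielding anything, and items.jl is never written. A workaround I have seen suggested (a sketch, not verified against this exact setup) is to stub that check out at the very top of main.py:

# Workaround sketch: Scrapy's warn_on_generator_with_return_value()
# calls inspect.getsource() on spider callbacks, which raises OSError
# in a frozen app with no source files. Stubbing it out only skips a
# lint-style warning about generators that contain return statements.
import scrapy.utils.misc
import scrapy.core.scraper

def warn_on_generator_with_return_value_stub(spider, callable):
    pass

scrapy.utils.misc.warn_on_generator_with_return_value = warn_on_generator_with_return_value_stub
# scrapy.core.scraper imported the name directly, so patch its copy too.
scrapy.core.scraper.warn_on_generator_with_return_value = warn_on_generator_with_return_value_stub

With the stub in place the inspect call is never made; whether the feed exporter then writes items.jl from the bundle would still need to be confirmed by rebuilding.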

The auto-py-to-exe log is as follows:

Running auto-py-to-exe v2.7.5
Building directory: C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3
Provided command: pyinstaller --noconfirm --onedir --console --add-data "C:/Users/nayra/Desktop/scrapy/log;log/"  "C:/Users/nayra/Desktop/scrapy/main.py"
Recursion Limit is set to 5000
Executing: pyinstaller --noconfirm --onedir --console --add-data C:/Users/nayra/Desktop/scrapy/log;log/ C:/Users/nayra/Desktop/scrapy/main.py --distpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\application --workpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build --specpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3

200432 INFO: PyInstaller: 4.0
200437 INFO: Python: 3.8.3 (conda)
200443 INFO: Platform: Windows-10-10.0.18362-SP0
200449 INFO: wrote C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\main.spec
200459 INFO: UPX is not available.
200477 INFO: Extending PYTHONPATH with paths
['C:\\Users\\nayra\\Desktop\\scrapy', 'C:\\Users\\nayra\\AppData\\Local\\Temp\\tmp4p2fxau3']
200514 INFO: checking Analysis
200520 INFO: Building Analysis because Analysis-00.toc is non existent
200528 INFO: Initializing module dependency graph...
200539 INFO: Caching module graph hooks...
200565 INFO: Analyzing base_library.zip ...
203878 INFO: Processing pre-find module path hook distutils from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-distutils.py'.
203884 INFO: distutils: retargeting to non-venv dir 'd:\\miniconda3\\lib'
208951 INFO: Caching module dependency graph...
209154 INFO: running Analysis Analysis-00.toc
209189 INFO: Adding Microsoft.Windows.Common-Controls to dependent assemblies of final executable
  required by d:\miniconda3\python.exe
209627 INFO: Analyzing C:\Users\nayra\Desktop\scrapy\main.py
212480 INFO: Processing pre-safe import module hook six.moves from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-six.moves.py'.
216208 INFO: Processing module hooks...
216212 INFO: Loading module hook 'hook-cryptography.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
216579 INFO: Loading module hook 'hook-lxml.etree.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
216585 INFO: Loading module hook 'hook-pywintypes.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
217108 INFO: Loading module hook 'hook-distutils.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217114 INFO: Loading module hook 'hook-encodings.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217232 INFO: Loading module hook 'hook-lib2to3.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217287 INFO: Loading module hook 'hook-scrapy.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
221723 INFO: Processing pre-find module path hook site from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-site.py'.
221730 INFO: site: retargeting to fake-dir 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\fake-modules'
223193 INFO: Processing pre-safe import module hook setuptools.extern.six.moves from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-setuptools.extern.six.moves.py'.
227991 INFO: Loading module hook 'hook-setuptools.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229014 INFO: Loading module hook 'hook-sqlite3.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229160 INFO: Loading module hook 'hook-sysconfig.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229168 INFO: Loading module hook 'hook-xml.dom.domreg.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229178 INFO: Loading module hook 'hook-xml.etree.cElementTree.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229184 INFO: Loading module hook 'hook-xml.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229191 INFO: Loading module hook 'hook-_tkinter.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229421 INFO: checking Tree
229429 INFO: Building Tree because Tree-00.toc is non existent
229438 INFO: Building Tree Tree-00.toc
229556 INFO: checking Tree
229565 INFO: Building Tree because Tree-01.toc is non existent
229582 INFO: Building Tree Tree-01.toc
229614 INFO: Loading module hook 'hook-eel.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
229836 INFO: Loading module hook 'hook-pycparser.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
229844 INFO: Loading module hook 'hook-gevent.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
230387 INFO: Determining a mapping of distributions to packages...
250993 WARNING: Unable to find package for requirement zope.event from package gevent.
251003 WARNING: Unable to find package for requirement zope.interface from package gevent.
251019 WARNING: Unable to find package for requirement greenlet from package gevent.
251026 INFO: Packages required by gevent:
['setuptools', 'cffi']
252462 INFO: Loading module hook 'hook-pkg_resources.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
252903 INFO: Processing pre-safe import module hook win32com from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\pre_safe_import_module\\hook-win32com.py'.
253530 WARNING: Hidden import "pkg_resources.py2_warn" not found!
253542 WARNING: Hidden import "pkg_resources.markers" not found!
253556 INFO: Excluding import '__main__'
253572 INFO:   Removing import of __main__ from module pkg_resources
253588 INFO: Loading module hook 'hook-pythoncom.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
254138 INFO: Loading module hook 'hook-win32com.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
254787 INFO: Looking for ctypes DLLs
254945 INFO: Analyzing run-time hooks ...
254968 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth__tkinter.py'
254979 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_multiprocessing.py'
254996 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgres.py'
255008 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_win32comgenpy.py'
255022 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\rthooks\\pyi_rth_twisted.py'
255061 INFO: Looking for dynamic libraries
255771 INFO: Looking for eggs
255779 INFO: Using Python library d:\miniconda3\python38.dll
255794 INFO: Found binding redirects: 
[]
255821 INFO: Warnings written to C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\warn-main.txt
256088 INFO: Graph cross-reference written to C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\xref-main.html
256165 INFO: Appending 'datas' from .spec
256179 INFO: checking PYZ
256189 INFO: Building PYZ because PYZ-00.toc is non existent
256203 INFO: Building PYZ (ZlibArchive) C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\PYZ-00.pyz
258584 INFO: Building PYZ (ZlibArchive) C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\PYZ-00.pyz completed successfully.
258634 INFO: checking PKG
258643 INFO: Building PKG because PKG-00.toc is non existent
258658 INFO: Building PKG (CArchive) PKG-00.pkg
258716 INFO: Building PKG (CArchive) PKG-00.pkg completed successfully.
258727 INFO: Bootloader d:\miniconda3\lib\site-packages\PyInstaller\bootloader\Windows-64bit\run.exe
258740 INFO: checking EXE
258756 INFO: Building EXE because EXE-00.toc is non existent
258770 INFO: Building EXE from EXE-00.toc
258792 INFO: Appending archive to EXE C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\main.exe
258846 INFO: Building EXE from EXE-00.toc completed successfully.
258870 INFO: checking COLLECT
258886 INFO: Building COLLECT because COLLECT-00.toc is non existent
258911 INFO: Building COLLECT COLLECT-00.toc
262077 INFO: Building COLLECT COLLECT-00.toc completed successfully.

Moving project to: C:\Users\nayra\Desktop\scrapy\dist
Complete.
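
If a workaround like the stub above is applied, the build command can stay the same; naming the spider modules explicitly is only an extra precaution (a sketch; the --hidden-import flags should be redundant here, since main.py imports both spiders directly):

pyinstaller --noconfirm --onedir --console ^
    --add-data "C:/Users/nayra/Desktop/scrapy/log;log/" ^
    --hidden-import spiderDummy1 ^
    --hidden-import spiderDummy2 ^
    "C:/Users/nayra/Desktop/scrapy/main.py"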

Is there a package or configuration that PyInstaller is missing that would produce a fully working exe?
