Spark installation - Error: Could not find or load main class org.apache.spark.launcher.Main

Date: 2023-10-08
This article explains how to handle "Spark installation - Error: Could not find or load main class org.apache.spark.launcher.Main". It should be a useful reference if you are facing the same problem; follow along below to work through the fix.

Problem description

After installing Spark 2.3 and setting the following environment variables in .bashrc (using Git Bash):

1. HADOOP_HOME
2. SPARK_HOME
3. PYSPARK_PYTHON
4. JDK_HOME
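
For reference, here is a minimal sketch of what those exports might look like in .bashrc under Git Bash. All of the paths are hypothetical; substitute the locations where you actually installed each component:

    # Hypothetical install locations -- adjust to your own setup.
    export HADOOP_HOME="/c/hadoop"
    export SPARK_HOME="/c/spark-2.3.0-bin-hadoop2.7"
    export PYSPARK_PYTHON="/c/Python36/python.exe"
    export JDK_HOME="/c/Program Files/Java/jdk1.8.0_161"
    export PATH="$SPARK_HOME/bin:$PATH"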

Executing $SPARK_HOME/bin/spark-submit displays the following error:

                  Error: Could not find or load main class org.apache.spark.launcher.Main

I did some research on Stack Overflow and other sites, but could not figure out the problem.

Environment

1. Windows 10 Enterprise
2. Spark version: 2.3
3. Python version: 3.6.4

Could you provide some guidance?

Recommended answer

I had that error message. It can have several root causes, but this is how I investigated and solved the problem (on Linux):

• Instead of launching spark-submit directly, run bash -x spark-submit to see which line fails (a tracing sketch follows the command below).
• Repeat that several times (since spark-submit calls nested scripts) until you find the underlying process being called; in my case it was something like:

                  /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp '/opt/spark-2.2.0-bin-hadoop2.7/conf/:/opt/spark-2.2.0-bin-hadoop2.7/jars/*' -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name 'Spark shell' spark-shell
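
A hedged sketch of that tracing step: bash -x echoes each command to stderr as the script runs, and --version is just a cheap way to exercise the script. On Spark 2.x the nested script that spark-submit hands off to is bin/spark-class, which builds the final java command:

    # Trace spark-submit first, spot the nested script it calls,
    # then trace that script until the java invocation (with its
    # -cp classpath) shows up.
    bash -x "$SPARK_HOME/bin/spark-submit" --version 2>&1 | tail -n 40
    bash -x "$SPARK_HOME/bin/spark-class" org.apache.spark.deploy.SparkSubmit --version 2>&1 | tail -n 40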

So, spark-submit launches a java process but cannot find the org.apache.spark.launcher.Main class using the files in /opt/spark-2.2.0-bin-hadoop2.7/jars/* (see the -cp option above). I ran ls in that jars folder and counted 4 files instead of the whole Spark distribution (~200 files). It was probably a problem during the installation process, so I reinstalled Spark, checked the jars folder, and it worked like a charm.
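
A quick way to reproduce that check, assuming $SPARK_HOME points at your install. In a healthy Spark 2.x distribution the jars folder holds on the order of 200 files, and the missing class ships in spark-launcher_*.jar:

    # Count the jars -- far fewer than ~200 suggests a broken install.
    ls "$SPARK_HOME/jars" | wc -l
    # org.apache.spark.launcher.Main lives in the launcher jar:
    ls "$SPARK_HOME"/jars/spark-launcher_*.jar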

So, you should:

• check the java command (the -cp option);
• check your jars folder (does it contain at least all of the spark-*.jar files?)

Hope it helps.

That's all for this article on "Spark installation - Error: Could not find or load main class org.apache.spark.launcher.Main". We hope the recommended answer helps, and thank you for supporting html5模板网!
