spark-sql on CDH 6.3.2 fails with ClassNotFoundException
Environment: CDH 6.3.2
Spark version: 2.4.0
The spark-sql wrapper script:
```shell
#!/bin/bash
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf
# Resolve the script's real location, following symlinks if any
SOURCE="${BASH_SOURCE[0]}"
BIN_DIR="$( dirname "$SOURCE" )"
while [ -h "$SOURCE" ]; do
  SOURCE="$(readlink "$SOURCE")"
  [[ $SOURCE != /* ]] && SOURCE="$BIN_DIR/$SOURCE"
  BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
done
BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
LIB_DIR=$BIN_DIR/../lib
export HADOOP_HOME=$LIB_DIR/hadoop
# Autodetect JAVA_HOME if not defined
. $LIB_DIR/bigtop-utils/bigtop-detect-javahome
# Delegate to spark-submit, running the SQL CLI driver with the caller's arguments
exec $LIB_DIR/spark2/bin/spark-submit --class org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver "$@"
```
Running the spark-sql command on the server:
```shell
spark-sql --master yarn --driver-memory 4G --executor-memory 2G --driver-cores 1 --executor-cores 2 -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
```
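Here `-f` executes the statements in the given file, and `-d` (short for `--define`) defines a substitution variable for that run, so the partition value does not have to be hard-coded in the SQL. A minimal sketch of the mechanism, using a hypothetical table since the contents of xxx.sql are not shown:

```shell
# Hypothetical example: -d defines dt, and the SQL text references it through
# Hive-style variable substitution (${hivevar:dt}; plain ${dt} is also accepted
# in Hive scripts). Table and column names here are made up for illustration.
spark-sql --master yarn \
  -d dt=20220727 \
  -e 'SELECT count(*) FROM dwd.example_table WHERE dt = "${hivevar:dt}"'
```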
The run fails with the following error:
```java
[root@cdh01 ~]# spark-sql --master yarn --driver-memory 4G --executor-memory 2G --driver-cores 1 --executor-cores 2 -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.checked.expressions does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.no.partition.filter does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vector.serde.deserialize does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.orderby.no.limit does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.adaptor.usage.mode does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vectorized.input.format does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.vectorized.input.format.excludes does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.bucketing does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.type.safety does not exist
22/07/28 17:00:47 WARN conf.HiveConf: HiveConf of name hive.strict.checks.cartesian.product does not exist
22/07/28 17:00:47 INFO hive.metastore: Trying to connect to metastore with URI thrift://cdh01:9083
22/07/28 17:00:47 INFO hive.metastore: Connected to metastore.
22/07/28 17:00:48 INFO session.SessionState: Created local directory: /tmp/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e_resources
22/07/28 17:00:48 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e
22/07/28 17:00:48 INFO session.SessionState: Created local directory: /tmp/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e
22/07/28 17:00:48 INFO session.SessionState: Created HDFS directory: /tmp/hive/root/ec4047fc-051f-4f24-9ab5-1e4dc9a2351e/_tmp_space.db
22/07/28 17:00:48 INFO spark.SparkContext: Running Spark version 2.4.0
22/07/28 17:00:48 INFO spark.SparkContext: Submitted application: SparkSQL::192.168.1.20
22/07/28 17:00:48 INFO spark.SecurityManager: Changing view acls to: root
22/07/28 17:00:48 INFO spark.SecurityManager: Changing modify acls to: root
22/07/28 17:00:48 INFO spark.SecurityManager: Changing view acls groups to:
22/07/28 17:00:48 INFO spark.SecurityManager: Changing modify acls groups to:
22/07/28 17:00:48 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
22/07/28 17:00:48 INFO util.Utils: Successfully started service 'sparkDriver' on port 37899.
22/07/28 17:00:48 INFO spark.SparkEnv: Registering MapOutputTracker
22/07/28 17:00:48 INFO spark.SparkEnv: Registering BlockManagerMaster
22/07/28 17:00:48 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/07/28 17:00:48 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/07/28 17:00:48 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-3e62cbd5-f9c7-43e4-8acb-e1d10a30da9b
22/07/28 17:00:48 INFO memory.MemoryStore: MemoryStore started with capacity 2004.6 MB
22/07/28 17:00:48 INFO spark.SparkEnv: Registering OutputCommitCoordinator
22/07/28 17:00:48 INFO util.log: Logging initialized @2977ms
22/07/28 17:00:48 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
22/07/28 17:00:48 INFO server.Server: Started @3042ms
22/07/28 17:00:48 INFO server.AbstractConnector: Started ServerConnector@30364216{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/07/28 17:00:48 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4a734c04{/jobs,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2127e66e{/jobs/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1229a2b7{/jobs/job,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51c959a4{/jobs/job/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4fc3c165{/stages,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@10a0fe30{/stages/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b6860f9{/stages/stage,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@262816a8{/stages/stage/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1effd53c{/stages/pool,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46c269e0{/stages/pool/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6920614{/storage,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6069dd38{/storage/json,null,AVAILABLE,@Spark}
22/07/28 17:00:48 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5fa23c{/storage/rdd,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@558756be{/storage/rdd/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@433348bc{/environment,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6d1dcdff{/environment/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@102ecc22{/executors,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7ff35a3f{/executors/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@26dc9bd5{/executors/threadDump,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@252dc8c4{/executors/threadDump/json,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43045f9f{/static,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65b97f47{/,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@255eaa6b{/api,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43e9089{/jobs/job/kill,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3c5dbdf8{/stages/stage/kill,null,AVAILABLE,@Spark}
22/07/28 17:00:49 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://cdh01:4040
22/07/28 17:00:49 INFO util.Utils: Using initial executors = 0, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/07/28 17:00:49 INFO yarn.Client: Requesting a new application from cluster with 5 NodeManagers
22/07/28 17:00:49 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (38357 MB per container)
22/07/28 17:00:49 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
22/07/28 17:00:49 INFO yarn.Client: Setting up container launch context for our AM
22/07/28 17:00:49 INFO yarn.Client: Setting up the launch environment for our AM container
22/07/28 17:00:49 INFO yarn.Client: Preparing resources for our AM container
22/07/28 17:00:49 INFO yarn.Client: Uploading resource file:/tmp/spark-e18790ed-7bf0-494c-a1b3-1b1e6c45ab66/__spark_conf__8487614089148696127.zip -> hdfs://cdh01:8020/user/root/.sparkStaging/application_1658995826987_0005/__spark_conf__.zip
22/07/28 17:00:49 INFO spark.SecurityManager: Changing view acls to: root
22/07/28 17:00:49 INFO spark.SecurityManager: Changing modify acls to: root
22/07/28 17:00:49 INFO spark.SecurityManager: Changing view acls groups to:
22/07/28 17:00:49 INFO spark.SecurityManager: Changing modify acls groups to:
22/07/28 17:00:49 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
22/07/28 17:00:50 INFO yarn.Client: Submitting application application_1658995826987_0005 to ResourceManager
22/07/28 17:00:50 INFO impl.YarnClientImpl: Submitted application application_1658995826987_0005
22/07/28 17:00:50 INFO cluster.SchedulerExtensionServices: Starting Yarn extension services with app application_1658995826987_0005 and attemptId None
22/07/28 17:00:51 INFO yarn.Client: Application report for application_1658995826987_0005 (state: ACCEPTED)
22/07/28 17:00:51 INFO yarn.Client:
client token: N/A
diagnostics: AM container is launched, waiting for AM container to Register with RM
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: root.users.root
start time: 1658998850473
final status: UNDEFINED
tracking URL: http://cdh01:8088/proxy/application_1658995826987_0005/
user: root
22/07/28 17:00:52 INFO yarn.Client: Application report for application_1658995826987_0005 (state: ACCEPTED)
22/07/28 17:00:53 INFO cluster.YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> cdh01,cdh02, PROXY_URI_BASES -> http://cdh01:8088/proxy/application_1658995826987_0005,http://cdh02:8088/proxy/application_1658995826987_0005, RM_HA_URLS -> cdh01:8088,cdh02:8088), /proxy/application_1658995826987_0005
22/07/28 17:00:53 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 7767897012427002665
java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages$RetrieveDelegationTokens$
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2011)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1875)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1692)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:647)
at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:181)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103)
at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:750)
22/07/28 17:00:53 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /jobs, /jobs/json, /jobs/job, /jobs/job/json, /stages, /stages/json, /stages/stage, /stages/stage/json, /stages/pool, /stages/pool/json, /storage, /storage/json, /storage/rdd, /storage/rdd/json, /environment, /environment/json, /executors, /executors/json, /executors/threadDump, /executors/threadDump/json, /static, /, /api, /jobs/job/kill, /stages/stage/kill.
22/07/28 17:00:53 INFO yarn.Client: Application report for application_1658995826987_0005 (state: RUNNING)
22/07/28 17:00:53 INFO yarn.Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 192.168.1.20
ApplicationMaster RPC port: -1
queue: root.users.root
start time: 1658998850473
final status: UNDEFINED
tracking URL: http://cdh01:8088/proxy/application_1658995826987_0005/
user: root
22/07/28 17:00:53 INFO cluster.YarnClientSchedulerBackend: Application application_1658995826987_0005 has started running.
22/07/28 17:00:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38426.
22/07/28 17:00:53 INFO netty.NettyBlockTransferService: Server created on cdh01:38426
22/07/28 17:00:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
22/07/28 17:00:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager cdh01:38426 with 2004.6 MB RAM, BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:53 INFO storage.BlockManager: external shuffle service port = 7337
22/07/28 17:00:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, cdh01, 38426, None)
22/07/28 17:00:54 INFO ui.JettyUtils: Adding filter org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter to /metrics/json.
22/07/28 17:00:54 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3dc39459{/metrics/json,null,AVAILABLE,@Spark}
22/07/28 17:00:54 INFO scheduler.EventLoggingListener: Logging events to hdfs://cdh01:8020/user/spark/applicationHistory/application_1658995826987_0005
22/07/28 17:00:54 INFO util.Utils: Using initial executors = 0, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
22/07/28 17:00:54 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
22/07/28 17:00:54 INFO server.AbstractConnector: Stopped Spark@30364216{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
22/07/28 17:00:54 INFO ui.SparkUI: Stopped Spark web UI at http://cdh01:4040
22/07/28 17:00:54 WARN cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Interrupting monitor thread
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Shutting down all executors
22/07/28 17:00:54 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
22/07/28 17:00:54 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
22/07/28 17:00:54 INFO cluster.YarnClientSchedulerBackend: Stopped
22/07/28 17:00:54 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
22/07/28 17:00:54 INFO memory.MemoryStore: MemoryStore cleared
22/07/28 17:00:54 INFO storage.BlockManager: BlockManager stopped
22/07/28 17:00:54 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
22/07/28 17:00:54 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
22/07/28 17:00:54 INFO spark.SparkContext: Successfully stopped SparkContext
22/07/28 17:00:54 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Exception when registering SparkListener
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2398)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:555)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2682)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2680)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2680)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2387)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2386)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2386)
... 22 more
22/07/28 17:00:54 INFO spark.SparkContext: SparkContext already stopped.
Exception in thread "main" org.apache.spark.SparkException: Exception when registering SparkListener
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2398)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:555)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:48)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:315)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:166)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:238)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2682)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2680)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2680)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2387)
at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2386)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2386)
... 22 more
22/07/28 17:00:54 INFO util.ShutdownHookManager: Shutdown hook called
22/07/28 17:00:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-8b86611e-65b4-4793-badb-99f77cbdfe9f
22/07/28 17:00:54 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-e18790ed-7bf0-494c-a1b3-1b1e6c45ab66
```
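The run dies in `SparkContext.setupAndStartListenerBus`, and the root cause is at the bottom of both stack traces: `java.lang.ClassNotFoundException: com.cloudera.spark.lineage.NavigatorAppListener`. On CDH clusters, `spark-defaults.conf` typically registers Cloudera Navigator lineage listeners through `spark.extraListeners` and `spark.sql.queryExecutionListeners`; those classes ship only with CDH's own Spark build, so a vanilla Apache Spark 2.4.0 driver cannot load them (the earlier `ClassNotFoundException: ...RetrieveDelegationTokens$` RPC error points at the same CDH-versus-Apache mismatch). A common workaround is to blank out the lineage settings for the submission; the sketch below assumes the usual CDH property names, so verify them against your cluster's spark-defaults.conf:

```shell
# Sketch: clear the CDH-injected lineage listeners so the vanilla Spark 2.4.0
# driver does not try to load Cloudera-only classes. spark.lineage.enabled is
# CDH-specific; check spark-defaults.conf on your cluster for the exact keys.
spark-sql --master yarn \
  --driver-memory 4G --executor-memory 2G --driver-cores 1 --executor-cores 2 \
  --conf spark.extraListeners= \
  --conf spark.sql.queryExecutionListeners= \
  --conf spark.lineage.enabled=false \
  -d dt=20220727 -f /opt/sparksql/dwd/xxx.sql
```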