How to fix the "unresolved dependency: org.apache.hbase#hbase-client;1.1.1: not found" exception that occurs when the tasks run in a particular order
I have a Spring Boot application (an API) in which I use Spark to connect to a remote Hive instance and cache some data in memory.
My program has three tasks:
- Create the SparkSession
- Run spark.sql and cache the data
- Start the API (after looking up some data in the cache, it returns a response)
If I execute these steps in the order listed, I get an exception at step 3 (starting the API). However, if I run them in the order 1 -> 3 -> 2, it works fine.
Below is the code, together with the exception:
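For context, the SparkConnect helper referenced below is not shown in the post; a minimal sketch of what it presumably looks like, given the log messages and comments (the class shape, app name, and query are assumptions, not the actual implementation):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Hypothetical reconstruction of the SparkConnect helper used in main();
// the real implementation is not included in the question.
public class SparkConnect {

    // Step 1: build a SparkSession with Hive support enabled
    public SparkSession getSparkSession() {
        return SparkSession.builder()
                .appName("api-cache")   // assumed app name
                .enableHiveSupport()    // talk to the remote Hive metastore
                .getOrCreate();
    }

    // Step 2: run the query, cache it, and force materialization with an action
    public void cacheDataset(SparkSession spark) {
        Dataset<Row> ds = spark.sql("SELECT * FROM some_table"); // assumed query
        ds.cache();
        ds.count(); // count() is the action that actually triggers caching
    }
}
```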
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hbase#hbase-client;1.1.1: not found]
import org.apache.spark.sql.SparkSession;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ApiStart {

    private static final Logger logger = LoggerFactory.getLogger(ApiStart.class);

    public static void main(String[] args) throws Exception {
        logger.info(">>> Creating SparkSession");
        // This method will return a SparkSession with Hive support enabled
        SparkConnect sparkConnect = new SparkConnect();
        SparkSession spark = sparkConnect.getSparkSession();
        logger.info(">>> SparkSession has been created");

        logger.info(">>> Beginning to cache the Dataset");
        // This method will run spark.sql("Query"), cache it, and perform a count to trigger caching
        sparkConnect.cacheDataset(spark);
        logger.info(">>> Dataset has been cached");

        logger.info(">>> Starting the API");
        // This will start the API
        SpringApplication.run(ApiStart.class, args);
        logger.info(">>> API Started");
        // If sparkConnect.cacheDataset(spark) is called here instead (after starting the API), it works perfectly fine
    }
}
<properties>
    <spark.version>2.4.0</spark.version>
    <hadoopclient.version>2.7.3</hadoopclient.version>
    <hivemetastore.version>2.1.0</hivemetastore.version>
    <hiveexec.version>2.2.0</hiveexec.version>
</properties>

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.0.5.RELEASE</version>
</parent>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-actuator</artifactId>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoopclient.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-metastore</artifactId>
        <version>${hivemetastore.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>${hiveexec.version}</version>
    </dependency>
    <!-- THE WORKING SCENARIO I MENTIONED ABOVE WORKS WITHOUT THIS AS WELL -->
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>1.1.1</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <configuration>
                <addResources>true</addResources>
            </configuration>
            <executions>
                <execution>
                    <id>build-info</id>
                    <goals>
                        <goal>build-info</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>