<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: cdh hive 2.1.1 upgraded to 2.3.4</title>
	<atom:link href="http://blog.51yip.com/hadoop/2391.html/feed" rel="self" type="application/rss+xml" />
	<link>http://blog.51yip.com/hadoop/2391.html</link>
	<description>-- One step, two steps, three steps, N steps, two lines of footprints</description>
	<lastBuildDate>Tue, 07 Jun 2022 01:26:20 +0000</lastBuildDate>
	<generator>http://wordpress.org/?v=2.9.1</generator>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
		<item>
		<title>By: ccc</title>
		<link>http://blog.51yip.com/hadoop/2391.html/comment-page-1#comment-25484</link>
		<dc:creator>ccc</dc:creator>
		<pubDate>Wed, 11 May 2022 03:44:40 +0000</pubDate>
		<guid isPermaLink="false">http://blog.51yip.com/?p=2391#comment-25484</guid>
		<description>After the upgrade, sqoop data transfers via hcatalog fail with an error. Have you run into this?</description>
		<content:encoded><![CDATA[<p>After the upgrade, sqoop data transfers via hcatalog fail with an error. Have you run into this?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: xuanqing</title>
		<link>http://blog.51yip.com/hadoop/2391.html/comment-page-1#comment-18727</link>
		<dc:creator>xuanqing</dc:creator>
		<pubDate>Fri, 25 Jun 2021 02:49:29 +0000</pubDate>
		<guid isPermaLink="false">http://blog.51yip.com/?p=2391#comment-18727</guid>
		<description>Do you mean replacing spark-hive_2.11-2.4.0-cdh6.3.1.jar with some other spark-hive-*.jar?
----------------------------------------
hive on spark, spark on yarn: has the hive jar under the spark directory been replaced? spark-hive_2.11-2.4.0-cdh6.3.1.jar</description>
		<content:encoded><![CDATA[<p>Do you mean replacing spark-hive_2.11-2.4.0-cdh6.3.1.jar with some other spark-hive-*.jar?<br />
----------------------------------------<br />
hive on spark, spark on yarn: has the hive jar under the spark directory been replaced? spark-hive_2.11-2.4.0-cdh6.3.1.jar</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: smarctor</title>
		<link>http://blog.51yip.com/hadoop/2391.html/comment-page-1#comment-18090</link>
		<dc:creator>smarctor</dc:creator>
		<pubDate>Wed, 06 May 2020 08:17:02 +0000</pubDate>
		<guid isPermaLink="false">http://blog.51yip.com/?p=2391#comment-18090</guid>
		<description>It had not been changed before! Now, after copying hive-exec-2.3.4.jar in, a different error appears: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: xxx_table

There is also another error: set hive.plan.serialization.format=javaXML;
Error while processing statement: hive configuration hive.plan.serialization.format does not exists

How can errors like these be resolved? Thanks!</description>
		<content:encoded><![CDATA[<p>It had not been changed before! Now, after copying hive-exec-2.3.4.jar in, a different error appears: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: xxx_table</p>
<p>There is also another error: set hive.plan.serialization.format=javaXML;<br />
Error while processing statement: hive configuration hive.plan.serialization.format does not exists</p>
<p>How can errors like these be resolved? Thanks!</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: 张映</title>
		<link>http://blog.51yip.com/hadoop/2391.html/comment-page-1#comment-18089</link>
		<dc:creator>张映</dc:creator>
		<pubDate>Wed, 06 May 2020 06:04:22 +0000</pubDate>
		<guid isPermaLink="false">http://blog.51yip.com/?p=2391#comment-18089</guid>
		<description>hive on spark, spark on yarn: has the hive jar under the spark directory been replaced? spark-hive_2.11-2.4.0-cdh6.3.1.jar</description>
		<content:encoded><![CDATA[<p>hive on spark, spark on yarn: has the hive jar under the spark directory been replaced? spark-hive_2.11-2.4.0-cdh6.3.1.jar</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: smarctor</title>
		<link>http://blog.51yip.com/hadoop/2391.html/comment-page-1#comment-18088</link>
		<dc:creator>smarctor</dc:creator>
		<pubDate>Wed, 06 May 2020 03:30:44 +0000</pubDate>
		<guid isPermaLink="false">http://blog.51yip.com/?p=2391#comment-18088</guid>
		<description>Hello, I followed your method from [cdh hive 2.1.1 upgraded to 2.3.4]. Running through the hive client works, but through beeline it just hangs and the job cannot be submitted to yarn (my environment is CDH6.2, hive on spark, spark on yarn, hive 2.1.1 upgraded to 2.3.4). The backend exception is as follows:
Caused by: java.lang.RuntimeException: java.lang.InterruptedException
	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-14.0.1.jar:?]
	at org.apache.hive.spark.client.SparkClientImpl.(SparkClientImpl.java:125) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.4.jar:2.3.4]
	... 22 more
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method) ~[?:1.8.0_181]
	at java.lang.Object.wait(Object.java:502) ~[?:1.8.0_181]
	at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:232) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
	at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:34) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:32) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]
	at org.apache.hive.spark.client.SparkClientImpl.(SparkClientImpl.java:109) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.4.jar:2.3.4]
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.4.jar:2.3.4]
	... 22 more

--Have you run into this problem?</description>
		<content:encoded><![CDATA[<p>Hello, I followed your method from [cdh hive 2.1.1 upgraded to 2.3.4]. Running through the hive client works, but through beeline it just hangs and the job cannot be submitted to yarn (my environment is CDH6.2, hive on spark, spark on yarn, hive 2.1.1 upgraded to 2.3.4). The backend exception is as follows:<br />
Caused by: java.lang.RuntimeException: java.lang.InterruptedException<br />
	at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-14.0.1.jar:?]<br />
	at org.apache.hive.spark.client.SparkClientImpl.(SparkClientImpl.java:125) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	... 22 more<br />
Caused by: java.lang.InterruptedException<br />
	at java.lang.Object.wait(Native Method) ~[?:1.8.0_181]<br />
	at java.lang.Object.wait(Object.java:502) ~[?:1.8.0_181]<br />
	at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:232) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]<br />
	at io.netty.util.concurrent.DefaultPromise.await(DefaultPromise.java:34) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]<br />
	at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:32) ~[netty-all-4.0.52.Final.jar:4.0.52.Final]<br />
	at org.apache.hive.spark.client.SparkClientImpl.(SparkClientImpl.java:109) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:101) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.(RemoteHiveSparkClient.java:97) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:73) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:62) ~[hive-exec-2.3.4.jar:2.3.4]<br />
	... 22 more</p>
<p>--Have you run into this problem?</p>
]]></content:encoded>
	</item>
</channel>
</rss>
