
org.apache.spark.sql.api.java.JavaSQLContext

<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-assembly_2.10 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-assembly_2.10</artifactId>
    <version>1.6.0-cdh5.7.0</version>
</dependency>

The Maven dependency is as shown above.
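If only the SQL module is needed, depending on the smaller spark-sql artifact rather than the whole assembly is the more common choice. A sketch of that dependency, assuming the same Scala and CDH version strings as the assembly above:

```xml
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.0-cdh5.7.0</version>
</dependency>
```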

org.apache.spark.sql.api.java.JavaSQLContext

This class cannot be found anywhere in the jar.

The reason is as follows:

Prior to Spark 1.3 there were separate Java-compatible classes (JavaSQLContext and JavaSchemaRDD)
that mirrored the Scala API. In Spark 1.3 the Java API and Scala API were unified. Users of either language should use SQLContext and DataFrame.
In general, these classes try to use types that are usable from both languages (i.e. Array instead
of language-specific collections). In some cases where no common type exists (e.g., for passing in closures or Maps),
function overloading is used instead.
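The unified API described above can be sketched from Java as follows. This is a minimal example, assuming Spark 1.6 is on the classpath; the application name and the `people.json` path are illustrative:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class UnifiedApiExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("unified-api").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Since Spark 1.3, Java code uses SQLContext directly -- there is no JavaSQLContext.
        SQLContext sqlContext = new SQLContext(sc);

        // DataFrame is shared between Scala and Java (it replaces JavaSchemaRDD).
        DataFrame df = sqlContext.read().json("people.json"); // illustrative input path
        df.printSchema();
        df.select("name").show();

        sc.stop();
    }
}
```

Note that `SQLContext` accepts a `JavaSparkContext` directly, so no unwrapping is needed on the Java side.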

Additionally, the Java-specific types API has been removed. Users of both Scala and Java should use the classes in org.apache.spark.sql.types to
describe a schema programmatically.
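A short sketch of describing a schema programmatically with the shared types package, assuming Spark 1.6 on the classpath; the field names here are illustrative:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class SchemaExample {
    public static void main(String[] args) {
        // org.apache.spark.sql.types replaces the removed Java-specific types API;
        // the same classes are used from both Scala and Java.
        List<StructField> fields = Arrays.asList(
                DataTypes.createStructField("name", DataTypes.StringType, false),
                DataTypes.createStructField("age", DataTypes.IntegerType, true));
        StructType schema = DataTypes.createStructType(fields);
        System.out.println(schema.treeString());
    }
}
```

Such a schema can then be passed to, for example, `sqlContext.createDataFrame(rowRDD, schema)`.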

Reference: https://*.com/questions/31648248/which-jar-contains-org-apache-spark-sql-api-java-javasqlcontext
