Reading data from a BigQuery external table with PySpark to create a DataFrame

I have created a newline-delimited JSON file in GCS. I have also created an external table on top of that JSON file, and I can query it from the BigQuery UI.

I want to access the external table's data with PySpark, create a DataFrame from it, and run the job on Dataproc. Here is the code snippet I wrote:

#!/usr/bin/python
import sys
import json
from pyspark.sql.functions import udf, lit, when, date_sub
from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField, StringType, BooleanType, DateType
from pyspark import SparkContext, SparkConf, SQLContext
from pyspark.sql import SparkSession
from pyspark.sql import Row
from datetime import datetime

TargetTableUri = sys.argv[1]

spark = SparkSession \
  .builder \
  .master('yarn') \
  .appName('spark-bigquery-demo') \
  .getOrCreate()

bucket1 = "gs://first-bucket-arpan/output1"
spark.conf.set('temporaryGcsBucket', bucket1)

src_tbl = spark.read.format('bigquery') \
  .option('table', 'turing-thought-277215:first_dataset.ext_employee_details') \
  .load()
src_tbl.createOrReplaceTempView('src_tbl')

src_tbl_df = spark.sql('SELECT EMPID, EMPNAME, STREETADRESS, REGION, STATE, COUNTRY FROM src_tbl')
src_tbl_df.show()
src_tbl_df.printSchema()

When I run the job from the Dataproc cluster, I get the following error: ": java.lang.UnsupportedOperationException: The type of table turing-thought-277215.first_dataset.ext_employee_details is currently not supported: EXTERNAL"

Does the spark-bigquery connector not support BigQuery external tables? The full log is below.

20/08/13 16:44:25 INFO org.spark_project.jetty.util.log: Logging initialized @4863ms
20/08/13 16:44:25 INFO org.spark_project.jetty.server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/08/13 16:44:25 INFO org.spark_project.jetty.server.Server: Started @5045ms
20/08/13 16:44:25 INFO org.spark_project.jetty.server.AbstractConnector: Started ServerConnector@5cf22b28{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
20/08/13 16:44:25 WARN org.apache.spark.scheduler.FairSchedulableBuilder: Fair Scheduler configuration file not found so jobs will be scheduled in FIFO order. To use fair scheduling, configure pools in fairscheduler.xml or set spark.scheduler.allocation.file to a file that contains the configuration.
20/08/13 16:44:27 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at my-dataproc-cluster-m/10.148.0.40:8032
20/08/13 16:44:27 INFO org.apache.hadoop.yarn.client.AHSProxy: Connecting to Application History server at my-dataproc-cluster-m/10.148.0.40:10200
20/08/13 16:44:31 INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl: Submitted application application_1596957621647_0022
Traceback (most recent call last):
  File "/tmp/job-scd2curation-6/scdtype2curation.py", line 25, in <module>
    .option('table', 'turing-thought-277215:first_dataset.ext_employee_details') \
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 172, in load
  File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
  File "/usr/lib/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o59.load.

: java.lang.UnsupportedOperationException: The type of table turing-thought-277215.first_dataset.ext_employee_details is currently not supported: EXTERNAL

    at com.google.cloud.spark.bigquery.BigQueryRelationProvider.createRelationInternal(BigQueryRelationProvider.scala:83)
    at com.google.cloud.spark.bigquery.BigQueryRelationProvider.createRelation(BigQueryRelationProvider.scala:40)
    at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:341)
    at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:239)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:227)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:164)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)

Solution

The connector only supports tables and views, not external tables. See the source:

    val table = Option(bigquery.getTable(opts.tableId))
      .getOrElse(sys.error(s"Table $tableName not found"))
    table.getDefinition[TableDefinition].getType match {
      case TABLE => new DirectBigQueryRelation(opts, table)(sqlContext)
      case VIEW | MATERIALIZED_VIEW => if (opts.viewsEnabled) {
        new DirectBigQueryRelation(opts, table)(sqlContext)
      } else {
        sys.error(
          s"""Views were not enabled. You can enable views by setting
             |'${SparkBigQueryOptions.ViewsEnabledOption}' to true.
             |Notice additional cost may occur."""
            .stripMargin.replace('\n', ' '))
      }
      case unsupported => throw new UnsupportedOperationException(
        s"The type of table $tableName is currently not supported: $unsupported")
    }
