How do I convert an RDD with a case class to a DataFrame?

Problem description:

The Spark documentation shows how to create a DataFrame from an RDD, using Scala case classes to infer a schema. I am trying to reproduce this concept using sqlContext.createDataFrame(RDD, CaseClass), but my DataFrame ends up empty. Here's my Scala code:

// sc is the SparkContext, while sqlContext is the SQLContext.

// Define the case class and raw data
case class Dog(name: String)
val data = Array(
    Dog("Rex"),
    Dog("Fido")
)

// Create an RDD from the raw data
val dogRDD = sc.parallelize(data)

// Print the RDD for debugging (this works, shows 2 dogs)
dogRDD.collect().foreach(println)

// Create a DataFrame from the RDD
val dogDF = sqlContext.createDataFrame(dogRDD, classOf[Dog])

// Print the DataFrame for debugging (this fails, shows 0 dogs)
dogDF.show()

The output I see is:

Dog(Rex)
Dog(Fido)
++
||
++
||
||
++

What am I missing?

Thanks!

All you need is:

val dogDF = sqlContext.createDataFrame(dogRDD)

The second parameter is part of the Java API and expects your class to follow the JavaBeans convention (getters/setters). Your case class doesn't follow this convention, so no properties are detected, which leads to an empty DataFrame with no columns.
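You can see the mismatch with plain reflection, no Spark required: a compiled Scala case class exposes a `name()` accessor but no JavaBeans-style `getName()` getter, so bean introspection finds nothing to turn into columns. A minimal sketch (the `BeanCheck` object name is just for illustration):

```scala
// Sketch: why the Java-bean overload of createDataFrame sees no columns.
// A compiled Scala case class has a `name()` accessor but no `getName()`.
case class Dog(name: String)

object BeanCheck {
  // All public method names on the compiled Dog class.
  val methodNames: Set[String] = classOf[Dog].getMethods.map(_.getName).toSet

  def main(args: Array[String]): Unit = {
    println(methodNames.contains("name"))     // true: Scala accessor
    println(methodNames.contains("getName"))  // false: no bean-style getter
  }
}
```

As an aside, with `import sqlContext.implicits._` in scope you can also write `dogRDD.toDF()`, which uses the same case-class schema inference as the single-argument `createDataFrame(dogRDD)`.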