

Reading Spark table filter conditions from a Map


Environment

Scala 2.11.8
Spark 2.3.1

What I want to do

  • Write the conditions for a table read in JSON.
  • Read the table without using spark.sql.

If the contents of the loaded JSON are:

{
    "tableName": "tableA",
    "filterCondition": {
        "date": "20180101",
        "type": "A"
    }
}

then,

val df = spark.read.table("tableA").filter($"date" === filterCondition("date") && $"type" === filterCondition("type"))

I want to generate something like this dynamically.
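As a sketch of how that JSON might be deserialized into a table name and a filter Map, assuming json4s (which Spark bundles); `TableConfig` and `fromJson` are names I'm introducing here, not part of any API:

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Hypothetical config class mirroring the JSON structure above.
case class TableConfig(tableName: String, filterCondition: Map[String, String])

object LoadConfig {
  implicit val formats: Formats = DefaultFormats

  // Parse a JSON string and extract it into the config case class.
  def fromJson(json: String): TableConfig = parse(json).extract[TableConfig]
}
```

With this, `config.filterCondition` is exactly the `Map[String, String]` that the function below consumes.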

Solution

When applying map (confusingly, the method) to a Map, you need to use case: each element is a key-value tuple, and destructuring it in the lambda requires the partial-function syntax.
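To see why case is needed, here is the same pattern in plain Scala, with string clauses standing in for Spark Columns purely for illustration:

```scala
val filterCondition = Map("date" -> "20180101", "type" -> "A")

// filterCondition.map((key, value) => ...) does not compile in Scala 2:
// each element is a (String, String) tuple, so we destructure it with
// a partial function, which is what the `case` syntax provides.
val clauses = filterCondition.map { case (key, value) => s"$key = '$value'" }

println(clauses.mkString(" AND "))  // date = '20180101' AND type = 'A'
```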

  import org.apache.spark.sql.{DataFrame, SparkSession}
  import org.apache.spark.sql.functions.{col, lit}

  def loadTable(spark: SparkSession, tableName: String, filterCondition: Map[String, String]): DataFrame = {
    if (filterCondition.isEmpty) {
      spark.read.table(tableName)
    }
    else {
      // Build one Column per (key, value) pair, then AND them all together.
      spark.read.table(tableName).filter(
        filterCondition.map { case (key, value) => col(key) === lit(value) }.reduce(_ && _)
      )
    }
  }
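The isEmpty branch matters because reduce throws on an empty collection. A plain-Scala sketch, with Booleans standing in for Spark Columns, shows the failure mode and one branch-free alternative:

```scala
import scala.util.Try

// reduce(_ && _) throws UnsupportedOperationException on an empty collection,
// which is why loadTable special-cases the empty Map.
val reduceFailed = Try(List.empty[Boolean].reduce(_ && _)).isFailure  // true

// Folding from the identity `true` handles the empty case without a branch;
// in Spark the same idea would start the fold from lit(true).
val combined = List(true, true).foldLeft(true)(_ && _)
println(combined)  // true
```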

Reference: https://stackoverflow.com/questions/8812338/scala-mismatch-while-mapping-map
