Save JSON via Spark DataFrame on AWS S3


// Pass AWS credentials to the Hadoop s3n connector used by Spark.
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "[access key id]")
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "[secret access key]")

// Write the DataFrame as JSON to the target S3 path, overwriting any existing output.
val bucketName = "[bucket name]"
val s3path = s"s3n://$bucketName/some_directory/data.json"
df.write.mode("overwrite").json(s3path)
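
The snippet above assumes an existing SparkSession (spark) and a DataFrame (df). As a minimal, self-contained sketch, the same write could be wrapped up as follows; the sample data, app name, and bucket placeholder are hypothetical, and on newer Hadoop versions the s3a connector (fs.s3a.access.key / fs.s3a.secret.key with an s3a:// path) is typically used instead of s3n.

import org.apache.spark.sql.SparkSession

object WriteJsonToS3 {
  def main(args: Array[String]): Unit = {
    // Build a SparkSession (local mode here, just for illustration).
    val spark = SparkSession.builder()
      .appName("write-json-to-s3")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // Credentials for the s3n connector, as in the snippet above.
    spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "[access key id]")
    spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "[secret access key]")

    // A small sample DataFrame to write (hypothetical data).
    val df = Seq(("alice", 30), ("bob", 25)).toDF("name", "age")

    val bucketName = "[bucket name]"
    val s3path = s"s3n://$bucketName/some_directory/data.json"

    // "overwrite" replaces the target directory if it already exists;
    // Spark writes the JSON as part files under this path.
    df.write.mode("overwrite").json(s3path)

    spark.stop()
  }
}

Note that Spark writes the output as a directory of part files under data.json rather than a single file.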
