// Supply AWS credentials for the s3n filesystem connector (placeholder values kept as-is)
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "[access key id]")
spark.sparkContext.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "[secret access key]")

val bucketName = "[bucket name]"
val s3path = s"s3n://$bucketName/some_directory/data.json"

// Write the DataFrame as JSON, overwriting any existing output at the path
df.write.mode("overwrite").json(s3path)
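As an alternative to hard-coding credentials in the application, the same Hadoop properties can be supplied through `spark-defaults.conf`: Spark forwards any property with the `spark.hadoop.` prefix into the Hadoop configuration. A sketch, with the same placeholder values:

```
spark.hadoop.fs.s3n.awsAccessKeyId      [access key id]
spark.hadoop.fs.s3n.awsSecretAccessKey  [secret access key]
```

Keeping credentials out of source code also keeps them out of version control.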