I am trying to connect to S3 provided by MinIO using Spark, but it says the bucket minikube does not exist (I have already created the bucket).

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("AliceProcessingTwentyDotTwo")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .master("local[1]")
  .getOrCreate()
val sc = spark.sparkContext

sc.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
sc.hadoopConfiguration.set("fs.s3a.endpoint", "http://localhost:9000")
sc.hadoopConfiguration.set("fs.s3a.access.key", "minioadmin")
sc.hadoopConfiguration.set("fs.s3a.secret.key", "minioadmin")
sc.hadoopConfiguration.set("fs.s3a.path.style.access", "true")
sc.hadoopConfiguration.set("fs.s3a.connection.ssl.enabled", "false")
sc.textFile("s3a://minikube/data.json").collect()
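As a variant, the same S3A settings can also be supplied through the SparkSession builder with the spark.hadoop. prefix, which Spark copies into the Hadoop configuration before any job runs. A minimal sketch of that approach (same endpoint and credentials as above):

import org.apache.spark.sql.SparkSession

// Sketch only: spark.hadoop.* properties are forwarded to sc.hadoopConfiguration by Spark.
val sparkAlt = SparkSession.builder()
  .appName("AliceProcessingTwentyDotTwo")
  .master("local[1]")
  .config("spark.hadoop.fs.s3a.endpoint", "http://localhost:9000")
  .config("spark.hadoop.fs.s3a.access.key", "minioadmin")
  .config("spark.hadoop.fs.s3a.secret.key", "minioadmin")
  .config("spark.hadoop.fs.s3a.path.style.access", "true")
  .config("spark.hadoop.fs.s3a.connection.ssl.enabled", "false")
  .getOrCreate()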

I am using the following guide to connect.

https://github.com/minio/cookbook/blob/master/docs/apache-spark-with-minio.md

These are the dependencies I used in Scala:

"org.apache.spark" %% "spark-core" % "2.4.0", "org.apache.spark" %% "spark-sql" % "2.4.0", "com.amazonaws" % "aws-java-sdk" % "1.11.712", "org.apache.hadoop" % "hadoop-aws" % "2.7.3",

1) At which point does the error appear? Be more precise. 2) This bit: ("fs.s3a.path.style.access", "true") has backticks in it; is that in your original code as well? They don't really belong there. – frandroid Mar 9, 2020 at 22:16

1. Cut "fs.s3a.impl": that's just some superstition passed down by others. 2. Use the same version of the AWS SDK as the five-year-old version of Hadoop you are using: mvnrepository.com/artifact/org.apache.hadoop/hadoop-aws/2.7.0 – stevel Mar 11, 2020 at 10:49
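Following stevel's comment, a minimal sbt sketch that keeps hadoop-aws and the AWS SDK at matching versions. The SDK version 1.7.4 is taken from the hadoop-aws 2.7.x dependency tree and should be verified on mvnrepository before relying on it:

// build.sbt sketch: pin the AWS SDK to the version hadoop-aws 2.7.x was built against,
// instead of the much newer 1.11.712 used in the question.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"   % "2.4.0",
  "org.apache.spark" %% "spark-sql"    % "2.4.0",
  "org.apache.hadoop" % "hadoop-aws"   % "2.7.3",
  "com.amazonaws"     % "aws-java-sdk" % "1.7.4"  // assumed match for hadoop-aws 2.7.x; verify
)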

Try Spark 2.4.3 built without Hadoop and use Hadoop 2.8.2 or 3.1.2. After trying the steps in the link below, I was able to connect to MinIO using the CLI.

https://www.jitsejan.com/setting-up-spark-with-minio-as-object-storage.html
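For the Hadoop 3.1.2 route suggested above, one way to approximate "Spark without Hadoop" in an sbt project is to exclude Spark's transitive hadoop-client and add the newer Hadoop artifacts yourself. A sketch assuming Hadoop 3.1.2 and the aws-java-sdk-bundle version declared by its POM (verify the exact bundle version on mvnrepository):

// build.sbt sketch: swap Spark's bundled Hadoop 2.x client for Hadoop 3.1.2 plus its S3A module.
libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "2.4.3").exclude("org.apache.hadoop", "hadoop-client"),
  ("org.apache.spark" %% "spark-sql"  % "2.4.3").exclude("org.apache.hadoop", "hadoop-client"),
  "org.apache.hadoop" % "hadoop-client"        % "3.1.2",
  "org.apache.hadoop" % "hadoop-aws"           % "3.1.2",
  "com.amazonaws"     % "aws-java-sdk-bundle"  % "1.11.271"  // assumed from hadoop-aws 3.1.2 POM; verify
)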
