Spark Executor calculation, Cluster Size and Exit status 143 error

I have 3 GB of data (400,000 rows) in a table, and I need to create a GCP Dataproc cluster. I need to determine:

  1. Cluster size (e.g. 1 master, 3 workers with 2 cores and 8 GB RAM each)
  2. executor memory
  3. executor memory overhead
  4. executor cores
  5. executor instances
  6. driver memory

I tried to apply the following settings based on the Google guidance:

  1. Cluster size (1 master, 4 workers with 16 cores and 64 GB RAM each)
  2. executor memory - 19g
  3. executor memory overhead - 2g
  4. executor cores - 5
  5. executor instances - 3
  6. driver memory - 2g

I am still getting the exit status 143 error.
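Exit status 143 means the executor received SIGTERM, typically because YARN killed the container for exceeding its memory allocation. A common rule-of-thumb sizing (not an official Dataproc formula, and the function name below is illustrative) reserves 1 core and 1 GB per node for OS/Hadoop daemons, caps executors at ~5 cores, and subtracts the YARN overhead (max of 384 MB or 10% of executor memory) from each executor's heap. A minimal sketch applying that to the 4-worker, 16-core, 64 GB cluster above:

```python
def size_executors(workers, cores_per_node, ram_gb_per_node,
                   cores_per_executor=5):
    """Rule-of-thumb Spark-on-YARN executor sizing (assumption, not a
    Dataproc-published formula)."""
    usable_cores = cores_per_node - 1      # leave 1 core per node for daemons
    usable_ram = ram_gb_per_node - 1       # leave 1 GB per node for daemons
    executors_per_node = usable_cores // cores_per_executor
    # One executor slot is left for the YARN ApplicationMaster / driver.
    total_executors = executors_per_node * workers - 1
    mem_per_executor = usable_ram / executors_per_node
    # YARN default overhead: max(384 MB, 10% of executor memory).
    overhead = max(0.384, 0.10 * mem_per_executor)
    executor_memory = int(mem_per_executor - overhead)
    return {
        "spark.executor.instances": total_executors,
        "spark.executor.cores": cores_per_executor,
        "spark.executor.memory": f"{executor_memory}g",
        "spark.executor.memoryOverhead": f"{int(overhead) or 2}g",
    }

# The 4-worker, 16-core, 64 GB configuration from the question:
print(size_executors(workers=4, cores_per_node=16, ram_gb_per_node=64))
```

Under these assumptions the cluster supports about 11 executors of 5 cores and 18g heap plus 2g overhead; note that 19g + 2g per executor can already brush against a node's YARN memory limit, which is consistent with containers being killed with 143.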

Answers

No answers yet




Related questions
how to use phoenix5.0 with spark 3.0 preview

case "phoenix" => {
  outputStream.foreachRDD(rdd => {
    val spark = SparkSession.builder().config(rdd.sparkContext.getConf).getOrCreate()
    val ds = spark.createDataFrame(rdd, ...

Using multiple credentials with the same S3 bucket

I'm using Spark 2.1.1 with Hadoop 2.7.3, and I'm consuming data from different S3 buckets in one pipeline.
