
According to this answer, Python worker instances are spawned on the executor nodes when functions like foreachPartition or mapPartitions are used. How are the memory and compute capacities of these instances determined? Can we control them through some configuration property?

  • Like all other configuration properties? Commented Feb 13, 2023 at 18:07
  • @mazaneicha, the closest property in the list is spark.executor.pyspark.memory, but even its description does not explicitly say whether it applies to the Python instances created by the executor. Also, is there a similar parameter for cores? Commented Feb 13, 2023 at 18:39
  • Python will be able to use all spark.executor.cores cores. Commented Feb 13, 2023 at 18:49
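Putting the comments together, a minimal sketch of how these two knobs might be set at submit time (spark.executor.pyspark.memory and spark.executor.cores are real Spark configuration properties; the job script name my_job.py is a hypothetical placeholder):

```shell
# spark.executor.pyspark.memory (Spark 2.4+) caps the memory usable by each
# executor's Python worker processes; the Python workers share the executor's
# CPU allocation, so there is no separate "python cores" setting beyond
# spark.executor.cores.
spark-submit \
  --conf spark.executor.memory=4g \
  --conf spark.executor.pyspark.memory=2g \
  --conf spark.executor.cores=4 \
  my_job.py
```

If spark.executor.pyspark.memory is left unset, the Python workers are not explicitly capped by Spark and simply compete for the node's remaining memory.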
