I'm new to Scala and Spark, and I'm currently working on a Scala Spark jobs project. One thing that frustrates me is that I don't know how to debug the code in IntelliJ the way I do with Java.
After I imported the Scala project, I noticed that the spark-jobs folder was not marked as a source code folder, even though other subfolders in the same module are:
```
-- utility        (marked as source code folder)
-- event-sender   (marked as source code folder)
-- spark-jobs     (not marked as source code folder)
   -- src
      -- main
         -- resources
         -- scala
            -- com
               -- example
                  -- spark
                     -- jobs
```
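For context, I believe the repository is an sbt multi-project build along these lines. This is just my own sketch of the layout, not our actual build definition; the project names mirror the folders above and the `dependsOn` wiring is my guess:

```scala
// build.sbt — simplified sketch of what I assume our multi-project
// build looks like (not the real build definition)
lazy val utility = (project in file("utility"))

lazy val eventSender = (project in file("event-sender"))

lazy val sparkJobs = (project in file("spark-jobs"))
  .dependsOn(utility) // assumed dependency, for illustration only
```

As far as I know, IntelliJ derives the source roots from this build when importing, which may be why the marking differs between modules.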
When I checked the Spark job I'm working on, I saw that there is no main method:

```scala
class DailyExport(
  env: String
)(implicit sc: SparkContext, sqlContext: SQLContext, logger: SparkJobLogger)
    extends JobAudit with PartitionedWriter {

  def run(): Unit = ...
}

object DailyExport extends App with SparkJobParameters {
  {
    for {
      env <- getStringParameter("environment", 0, args)
    } yield {
      val jobConfig = SparkJobConfig.fromConfig.exportConfig
      ...
      new DailyExport(
        jobConfig = jobConfig
      ).run()
    }
  }.fold(
    error => {
      println(s"Some provided parameters are wrong: $error")
      sys.exit(-1)
    },
    identity
  )
}
```
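Coming from Java, I would have expected an explicit entry point along these lines (a minimal sketch; `DailyExportRunner` and the argument handling are hypothetical, not part of our codebase):

```scala
// Minimal sketch of the Java-style entry point I'm used to: an explicit
// main method that IntelliJ can discover directly as a run target.
// DailyExportRunner is a hypothetical name, not from our codebase.
object DailyExportRunner {
  def main(args: Array[String]): Unit = {
    val env = args.headOption.getOrElse("dev") // simplistic argument handling
    println(s"would build and run DailyExport for env=$env here")
  }
}
```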
However, there is a main method defined in the `App` trait:

```scala
trait App extends DelayedInit {
  ...
  @deprecatedOverriding("main should not be overridden", "2.11.0")
  def main(args: Array[String]) = {
    this._args = args
    for (proc <- initCode) proc()
    if (util.Properties.propIsSet("scala.time")) {
      val total = currentTime - executionStart
      Console.println("[total " + total + "ms]")
    }
  }
}
```
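If I understand this correctly, an object extending `App` has its whole constructor body deferred via `DelayedInit` and executed only when the inherited main runs, as in this toy example:

```scala
// Toy example of how (I believe) App supplies main: the object body is
// captured via DelayedInit and executed when App.main(args) is invoked,
// which also makes `args` available inside the body.
object Hello extends App {
  println("Hello, " + args.headOption.getOrElse("world"))
}
// e.g. `scala Hello Spark` should print: Hello, Spark
```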
I then right-clicked the job I'm working on and chose 'Run...', but it complained:

```
Error: Could not find or load main class com.example.spark.jobs.DailyExport
```

This is so different from Java. Can anyone tell me how to debug it?