I'm trying to write a generic Dataset[T] reader in order to avoid calling dataframe.as[..] after every read. There is encoder support for primitive types and case classes, so I was thinking of something like:
    def read[T <: Product](sql: String): Dataset[T] = {
      import sparkSession.implicits._
      val sqlContext = sparkSession.sqlContext
      val df: DataFrame = sqlContext.read.option("query", sql).load()
      df.as[T]
    }

But I'm getting an 'Unable to find encoder for type stored in a Dataset' error. Is it possible to do something like that?
Second attempt:
    def read[T <: Product](sql: String): Dataset[T] = {
      import sparkSession.implicits._
      innerRead(sql)
    }

    private def innerRead[T <: Product : Encoder](sql: String): Dataset[T] = {
      val sqlContext = sparkSession.sqlContext
      val df: DataFrame = sqlContext.read.option("query", sql).load()
      df.as[T]
    }

This ends with a type mismatch (found Encoder[Nothing], required Encoder[T]).
I also tried importing only newProductEncoder, but it ended the same way.
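For context, a sketch of the variant I would expect to compile: the Encoder context bound has to sit on the public method, so the implicit is demanded from the call site, where `import sparkSession.implicits._` is in scope (this assumes the same `sparkSession` and a data source that honours the `query` option, as above):

    import org.apache.spark.sql.{DataFrame, Dataset, Encoder}

    def read[T <: Product : Encoder](sql: String): Dataset[T] = {
      // Encoder[T] is now supplied implicitly by the caller,
      // instead of T being inferred as Nothing inside this method.
      val df: DataFrame = sparkSession.sqlContext.read.option("query", sql).load()
      df.as[T]
    }

Call site, with a hypothetical case class:

    import sparkSession.implicits._
    case class User(id: Long, name: String)
    val users: Dataset[User] = read[User]("SELECT id, name FROM users")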