Spark does not support nesting of dataframes (or datasets or RDDs): a column cannot itself contain a dataframe.
You can, however, break your problem down into two separate steps.
First, you need to parse the JSON and build a case class consisting entirely of types Spark supports. This part has nothing to do with Spark, so let's assume you've coded it as:
def buildMyCaseClass(json: String): MyCaseClass = { ... }
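As a minimal sketch of what such a parser might look like, here is a standard-library-only version. The schema (`name`/`age` fields) and the regex-based extraction are illustrative assumptions; in practice you would likely use a JSON library such as circe or json4s.

```scala
// Hypothetical target schema; all fields are Spark-supported types.
case class MyCaseClass(name: String, age: Int)

def buildMyCaseClass(json: String): MyCaseClass = {
  // Naive field extraction; assumes flat, well-formed input
  // like {"name":"Ada","age":36}.
  val nameRe = "\"name\"\\s*:\\s*\"([^\"]*)\"".r
  val ageRe  = "\"age\"\\s*:\\s*(\\d+)".r
  val name = nameRe.findFirstMatchIn(json).map(_.group(1)).getOrElse("")
  val age  = ageRe.findFirstMatchIn(json).map(_.group(1).toInt).getOrElse(0)
  MyCaseClass(name, age)
}
```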
Then, you need to transform your dataframe so that the string column becomes a struct column. The easiest way to do this is with a UDF: when a Scala UDF returns a case class, Spark encodes it as a struct.
val builderUdf = udf(buildMyCaseClass _)
df.withColumn("myCol", builderUdf('myCol))
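Putting the two steps together, here is a hedged end-to-end sketch. It assumes a local SparkSession, a string column named myCol, and the hypothetical name/age schema; the inline regex parser stands in for whatever JSON library you actually use.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

// Hypothetical target schema (assumption, not from the original question).
case class MyCaseClass(name: String, age: Int)

object JsonToStruct {
  // Step 1: plain Scala JSON parsing, no Spark involved.
  def buildMyCaseClass(json: String): MyCaseClass = {
    val nameRe = "\"name\"\\s*:\\s*\"([^\"]*)\"".r
    val ageRe  = "\"age\"\\s*:\\s*(\\d+)".r
    val name = nameRe.findFirstMatchIn(json).map(_.group(1)).getOrElse("")
    val age  = ageRe.findFirstMatchIn(json).map(_.group(1).toInt).getOrElse(0)
    MyCaseClass(name, age)
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("json-to-struct")
      .getOrCreate()
    import spark.implicits._

    val df = Seq("""{"name":"Ada","age":36}""").toDF("myCol")

    // Step 2: wrap the parser in a UDF; the returned case class
    // becomes a struct<name:string,age:int> column.
    val builderUdf = udf(buildMyCaseClass _)
    val parsed = df.withColumn("myCol", builderUdf($"myCol"))
    parsed.printSchema()

    spark.stop()
  }
}
```

After the transformation you can address nested fields directly, e.g. parsed.select($"myCol.name").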