I am playing around with Apache Spark and the Azure Cosmos DB connector in Scala, and I was wondering whether anyone has examples or insight on how to write a DataFrame back to a collection in Cosmos DB. Currently I can connect to one collection, read the data, and manipulate it, but I want to write the results back to a different collection inside the same database.
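For context, I read from the source collection roughly like this (the endpoint, key, and names below are placeholders for my real values):

```scala
import com.microsoft.azure.cosmosdb.spark.schema._
import com.microsoft.azure.cosmosdb.spark.config.Config

// Config pointing at the source collection; credentials are placeholders
val readConfig = Config(Map(
  "Endpoint"   -> "https://<my-account>.documents.azure.com:443/",
  "Masterkey"  -> "<my-master-key>",
  "Database"   -> "MyDatabase",
  "Collection" -> "SourceCollection"
))

// Load the collection as a DataFrame; the manipulation happens from here
val sourceData = spark.read.cosmosDB(readConfig)
```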
I created a writeConfig that contains my endpoint, master key, database, and the collection I want to write to.
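Roughly, it looks like this (again with placeholder values):

```scala
// Config pointing at the target collection in the same database;
// the endpoint and key are placeholders for my real values
val writeConfig = Config(Map(
  "Endpoint"   -> "https://<my-account>.documents.azure.com:443/",
  "Masterkey"  -> "<my-master-key>",
  "Database"   -> "MyDatabase",
  "Collection" -> "TargetCollection"
))
```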
I then tried writing to it with the following line:

```scala
manipulatedData.toJSON.write.mode(SaveMode.Overwrite).cosmosDB(writeConfig)
```

This runs fine and does not throw any errors, but nothing shows up in the target collection.
I went through the documentation I could find at https://github.com/Azure/azure-cosmosdb-spark, but did not have much luck finding any examples of writing data back to the database.
If there is an easier way to write to DocumentDB/Cosmos DB than what I am doing, I am open to any options.
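For example, based on the README I wondered whether saving through CosmosDBSpark.save directly would behave any differently; something along these lines (untested on my end):

```scala
import com.microsoft.azure.cosmosdb.spark.CosmosDBSpark

// Alternative save path from the connector's README, using the same
// writeConfig as above -- I have not verified this behaves differently
CosmosDBSpark.save(manipulatedData, writeConfig)
```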
Thanks for any help.