I am trying the code below, but it throws an error that I am unable to understand:

df.registerTempTable("Temp_table")
spark.sql("Update Temp_table set column_a='1'")

Currently, Spark SQL does not support UPDATE statements on temporary views. The workaround is to create a Delta Lake / Iceberg table from your Spark DataFrame and run the SQL UPDATE directly against that table.
For the Iceberg implementation, refer to: https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-format-iceberg.html
Alternatively, if you do not need a SQL UPDATE at all, you can use .withColumn() to overwrite the column values on the DataFrame itself.