
I am trying the code below, but it throws an error that I am unable to understand:

df.registerTempTable("Temp_table")
spark.sql("UPDATE Temp_table SET column_a = '1'")
  • Please share the error traceback as text. Commented Aug 19, 2022 at 7:30
  • This is the main error I am getting now: '"message": "An error occurred while calling o69.sql.\n: java.lang.UnsupportedOperationException: UPDATE TABLE is not supported temporarily."' Any workaround, please? Commented Aug 19, 2022 at 7:47
  • Use the DataFrame API (.withColumn()) to overwrite the column. Commented Aug 19, 2022 at 7:48
  • Can you please share an example? If possible, with multiple joins; I could not find a good example of this. Commented Aug 19, 2022 at 7:51
  • Then your stated question is different from your problem at hand. Maybe ask a new question with your problem statement. Commented Aug 19, 2022 at 7:52

1 Answer


Currently Spark SQL does not support UPDATE statements on temp views. The workaround is to create a Delta Lake / Iceberg table from your Spark DataFrame and execute your SQL query directly on that table.

For an Iceberg implementation, refer to: https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-format-iceberg.html
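For the Delta Lake route, the idea can be sketched as below. This is an untested configuration sketch, assuming the delta-spark package is installed and on the classpath; the table name, path, and column are hypothetical.

```python
from pyspark.sql import SparkSession

# A session wired up for Delta Lake (assumes the delta-spark package is available).
spark = (
    SparkSession.builder
    .appName("delta-update")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config(
        "spark.sql.catalog.spark_catalog",
        "org.apache.spark.sql.delta.catalog.DeltaCatalog",
    )
    .getOrCreate()
)

# Persist the DataFrame as a Delta table instead of a temp view.
df.write.format("delta").mode("overwrite").saveAsTable("temp_table")

# Delta tables support SQL UPDATE, unlike temp views.
spark.sql("UPDATE temp_table SET column_a = '1'")
```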
