Linked Questions
31 questions linked to/from Unpivot in Spark SQL / PySpark
4 votes
3 answers
4k views
Does Spark support melt and dcast? [duplicate]
We use melt and dcast to convert data from wide->long and long->wide format. Refer to http://seananderson.ca/2013/10/19/reshape.html for more details. Either Scala or SparkR is fine. I've gone through ...
2 votes
3 answers
4k views
Pyspark - Combine Multiple Columns of Data into a Single Column Spread Out Across Rows [duplicate]
I have a pyspark data frame with multiple columns as follows: name col1 col2 col3 A 1 6 7 B 2 7 6 C 3 8 5 D 4 9 ...
0 votes
1 answer
4k views
Exploding StructType as MapType Spark [duplicate]
Converting StructType to MapType in Spark. Schema: event: struct (nullable = true) | | event_category: string (nullable = true) | | event_name: string (nullable = true) | | properties: ...
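One route to the conversion asked about here, sketched with PySpark's create_map and assuming a hypothetical DataFrame df carrying the event struct from the excerpt (only the two string fields are mapped; the truncated properties field is left out):

```python
from pyspark.sql import functions as F

# `df` is assumed to have an `event` struct column as in the excerpt.
# create_map wants keys and values of uniform types, so only the two
# string fields are included here.
df_with_map = df.withColumn(
    "event_map",
    F.create_map(
        F.lit("event_category"), F.col("event.event_category"),
        F.lit("event_name"), F.col("event.event_name"),
    ),
)
```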
0 votes
0 answers
3k views
How can I convert columns to rows in a PySpark dataframe without using pandas? [duplicate]
I have already gone through the above answers and posted my concerns in comments, so please do not close this before answering my comments. I went through some of the answers available but none was ...
0 votes
0 answers
3k views
How to explode several columns into rows in Spark SQL [duplicate]
I am using Spark SQL 2.2.0 and DataFrame/DataSet API. I need to explode several columns one per row. I have: +------+------+------+------+------+ |col1 |col2 |col3 |col4 |col5 | +------+------+-...
3 votes
2 answers
2k views
Flip a Dataframe [duplicate]
I am working on Databricks using Python 2. I have a PySpark dataframe like: |Germany|USA|UAE|Turkey|Canada... |5 | 3 |3 |42 | 12.. which, as you can see, consists of hundreds of columns and ...
0 votes
1 answer
930 views
Equivalent of R's reshape2::melt() in Scala? [duplicate]
I have a data frame and I would like to use Scala to explode rows into multiple rows using the values in multiple columns. Ideally I am looking to replicate the behavior of the R function melt(). All ...
1 vote
1 answer
349 views
pyspark how to unpivot the csv with two header lines [duplicate]
How can I use pyspark or pandas to achieve the transformation below? Thanks a lot. The source file is a CSV with the following info: ... expected: ...
93 votes
10 answers
136k views
How to pivot Spark DataFrame?
I am starting to use Spark DataFrames and I need to be able to pivot the data to create multiple columns out of one column with multiple rows. There is built-in functionality for that in Scalding and I ...
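For reference, a minimal pivot sketch using the DataFrame API Spark has shipped since 1.6; the column names id, key, and value are placeholders, and an active SparkSession is assumed:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical long-format data: one row per (id, key, value).
long_df = spark.createDataFrame(
    [("a", "x", 1), ("a", "y", 2), ("b", "x", 3)],
    ["id", "key", "value"],
)

# Pivot distinct `key` values into columns, one output row per `id`.
wide_df = long_df.groupBy("id").pivot("key").agg(F.first("value"))
wide_df.show()
```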
68 votes
6 answers
65k views
How to melt Spark DataFrame?
Is there an equivalent of the Pandas melt function in Apache Spark, in PySpark or at least in Scala? Until now I was running a sample dataset in Python, and now I want to use Spark for the entire dataset.
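A minimal melt sketch using stack() in selectExpr, which works on older Spark releases (Spark 3.4+ also adds DataFrame.unpivot/melt); the column names are placeholders and an active SparkSession is assumed:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical wide DataFrame: an id column plus three value columns.
wide_df = spark.createDataFrame(
    [("A", 1, 6, 7), ("B", 2, 7, 6)],
    ["name", "col1", "col2", "col3"],
)

# stack(n, 'label1', col1, ..., 'labelN', colN) emits n rows per input row.
long_df = wide_df.selectExpr(
    "name",
    "stack(3, 'col1', col1, 'col2', col2, 'col3', col3) as (variable, value)",
)
long_df.show()
```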
51 votes
9 answers
95k views
Transpose column to row with Spark
I'm trying to transpose some columns of my table to rows. I'm using Python and Spark 1.5.0. Here is my initial table: +-----+-----+-----+-------+ | A |col_1|col_2|col_...| +-----+-------------------...
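One way to express that column-to-row transposition without a SQL string is to explode an array of structs; a sketch assuming a hypothetical DataFrame with an id column A and numeric columns col_1 and col_2, plus an active SparkSession:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical input resembling the excerpt's table.
df = spark.createDataFrame([(1, 0.0, 0.6), (2, 0.6, 0.7)], ["A", "col_1", "col_2"])
value_cols = [c for c in df.columns if c != "A"]

# Wrap every value column in a (key, val) struct, collect the structs into
# an array, and explode so each column becomes its own row.
kv = F.explode(F.array(*[
    F.struct(F.lit(c).alias("key"), F.col(c).alias("val")) for c in value_cols
])).alias("kv")

long_df = df.select("A", kv).select("A", "kv.key", "kv.val")
long_df.show()
```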
3 votes
2 answers
7k views
How to unpivot a large spark dataframe?
I have seen a few solutions to unpivot a Spark dataframe when the number of columns is reasonably low and the column names can be hardcoded. Do you have a scalable solution to unpivot a ...
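When the value columns cannot be hardcoded, the stack() expression can be generated from df.columns; a sketch assuming a hypothetical DataFrame df whose id column is named id and whose remaining columns share a common type:

```python
# `df` is assumed to exist; everything except `id` gets unpivoted.
id_col = "id"
value_cols = [c for c in df.columns if c != id_col]

# Build "stack(n, 'c1', `c1`, 'c2', `c2`, ...)" from the column list.
pairs = ", ".join(f"'{c}', `{c}`" for c in value_cols)
stack_expr = f"stack({len(value_cols)}, {pairs}) as (column_name, value)"

long_df = df.selectExpr(id_col, stack_expr)
```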
3 votes
3 answers
7k views
Convert columns to rows in Spark SQL
I have some data like this: ID Value1 Value2 Value40 101 3 520 2001 102 29 530 2020 I want to take this data and convert it into KV-style pairs instead: ID ValueVv ValueDesc 101 3 Value1 101 520 Value2 ...
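The same reshaping can be written directly in Spark SQL; a sketch assuming the excerpt's data lives in a DataFrame df registered as a temporary view, with an active SparkSession:

```python
# `df` is assumed to hold the ID/Value1/Value2/Value40 data from the excerpt.
df.createOrReplaceTempView("t")

# stack emits the label first and the value second, hence (ValueDesc, ValueVv).
kv_df = spark.sql("""
    SELECT ID,
           stack(3, 'Value1', Value1, 'Value2', Value2, 'Value40', Value40)
               AS (ValueDesc, ValueVv)
    FROM t
""")
kv_df.show()
```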
3 votes
2 answers
2k views
Transpose DataFrame single row to column in Spark with scala
I saw this question here: Transpose DataFrame Without Aggregation in Spark with scala and I wanted to do exactly the opposite. I have this DataFrame with a single row, with values that are strings, ...
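For a single-row DataFrame of string values, collecting the row and rebuilding a two-column DataFrame is often the simplest route; the column names here are placeholders and an active SparkSession is assumed:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical single-row DataFrame of string values.
df = spark.createDataFrame([("a", "b", "c")], ["col1", "col2", "col3"])

# Collect the one row and pair each column name with its value.
row = df.first()
pairs = [(name, row[name]) for name in df.columns]

transposed = spark.createDataFrame(pairs, ["column_name", "value"])
transposed.show()
```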
0 votes
1 answer
4k views
stack() in spark sql - Runtime Exception
(Using Apache Spark version 1.6) I referred to the link below to attempt the unpivot feature: unpivot in spark-sql/pyspark. The issue is that I get a runtime exception when executing: df.select($"A", ...
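stack() requires the values filling each output position to share a type, and mismatched columns are one frequent cause of errors with it (whether that is this poster's exact exception is not clear from the excerpt). A sketch that casts to a common type before stacking, with hypothetical columns and an active SparkSession assumed:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical DataFrame with an integer and a double value column.
df = spark.createDataFrame([("G", 4, 2.0)], ["A", "B", "C"])

# Casting both stacked columns to string sidesteps the type-mismatch error.
unpivoted = df.selectExpr(
    "A",
    "stack(2, 'B', cast(B as string), 'C', cast(C as string)) as (col, val)",
)
unpivoted.show()
```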