
I want to change a column mapping to use append behavior. Is there a better way to customize column mappings with the Spark Cassandra Connector in Java than the following?

    ColumnName song_id = new ColumnName("song_id", Option.empty());
    CollectionColumnName key_codes = new ColumnName("key_codes", Option.empty()).append();

    List<ColumnRef> collectionColumnNames = Arrays.asList(song_id, key_codes);
    scala.collection.Seq<ColumnRef> columnRefSeq = JavaApiHelper.toScalaSeq(collectionColumnNames);

    javaFunctions(songStream)
        .writerBuilder("demo", "song", mapToRow(PianoSong.class))
        .withColumnSelector(new SomeColumns(columnRefSeq))
        .saveToCassandra();

This is taken from a Spark Streaming code sample.

1 Answer


Just build your column refs with CollectionColumnName directly, which has the constructor:

    case class CollectionColumnName(
        columnName: String,
        alias: Option[String] = None,
        collectionBehavior: CollectionBehavior = CollectionOverwrite) extends ColumnRef

You can rename the column by setting alias, and you can change the insert behavior with collectionBehavior, which accepts any of the following case objects:


    /** Insert behaviors for Collections. */
    sealed trait CollectionBehavior
    case object CollectionOverwrite extends CollectionBehavior
    case object CollectionAppend extends CollectionBehavior
    case object CollectionPrepend extends CollectionBehavior
    case object CollectionRemove extends CollectionBehavior

This means you can just do:

    CollectionColumnName appendColumn = new CollectionColumnName(
        "ColumnName",
        Option.empty(),
        CollectionAppend$.MODULE$);

This looks a bit more Java-like and is a bit more explicit. Did you have any other goals for this code?
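
Putting it together, here is a minimal sketch of the question's save call using the explicit constructor instead of append(). It reuses the songStream, the demo keyspace, the song table, and the PianoSong class from the question, and the import locations are my assumption of the standard connector packages, so double-check them against the connector version you use:

    import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;
    import static com.datastax.spark.connector.japi.CassandraStreamingJavaUtil.javaFunctions;

    import java.util.Arrays;
    import java.util.List;

    import scala.Option;

    import com.datastax.spark.connector.CollectionAppend$;
    import com.datastax.spark.connector.CollectionColumnName;
    import com.datastax.spark.connector.ColumnName;
    import com.datastax.spark.connector.ColumnRef;
    import com.datastax.spark.connector.SomeColumns;
    import com.datastax.spark.connector.util.JavaApiHelper;

    // Regular column: written as-is on every save.
    ColumnName songId = new ColumnName("song_id", Option.empty());

    // Collection column built explicitly with append behavior;
    // equivalent to new ColumnName("key_codes", Option.empty()).append().
    CollectionColumnName keyCodes = new CollectionColumnName(
        "key_codes",
        Option.empty(),
        CollectionAppend$.MODULE$);

    // Convert the Java list of column refs to the Scala Seq the connector expects.
    List<ColumnRef> columns = Arrays.<ColumnRef>asList(songId, keyCodes);
    scala.collection.Seq<ColumnRef> columnRefSeq = JavaApiHelper.toScalaSeq(columns);

    // Same writer as in the question, now with the explicit column selector.
    javaFunctions(songStream)
        .writerBuilder("demo", "song", mapToRow(PianoSong.class))
        .withColumnSelector(new SomeColumns(columnRefSeq))
        .saveToCassandra();

The only real difference from the question's code is that the collection behavior is spelled out in the constructor rather than applied via append(), which makes the intent visible at the call site.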
