
Conversation

@Yohahaha (Contributor) commented Jan 13, 2026

Purpose

Add streaming write support to the Fluss Spark connector.

Linked issue: close #xxx

Brief change log

Tests

org.apache.fluss.spark.SparkStreamingTest

API and Format

Documentation

@Yohahaha (Contributor, Author) commented:
@YannByron @wuchong


/** An interface that extends Spark's [[StreamingWrite]]. */
trait FlussStreamingWrite extends StreamingWrite with Serializable {
  override def useCommitCoordinator(): Boolean = false
Contributor commented:

Why do we have to disable the commit coordinator?

Contributor Author commented:

Kafka's streaming write disables it too, and so does our batch writer.

We will re-enable it when needed.
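For context, Spark's commit coordinator defaults to on: the driver authorizes exactly one task attempt per partition to commit, and a sink can opt out by overriding the flag. A minimal self-contained sketch of that switch, using a hypothetical `MockStreamingWrite` interface in place of Spark's real `StreamingWrite` (names assumed, not the actual Spark API):

```java
// Stand-in for Spark's StreamingWrite: the default is to use the
// driver-side commit coordinator, which lets only one task attempt
// per partition commit its output.
interface MockStreamingWrite {
    default boolean useCommitCoordinator() { return true; }
}

// Sketch of the PR's choice: opt out of the coordinator, matching
// Kafka's streaming write; re-enabling later just means removing
// this override.
public class FlussStreamingWriteSketch implements MockStreamingWrite {
    @Override
    public boolean useCommitCoordinator() { return false; }

    public static void main(String[] args) {
        System.out.println(new FlussStreamingWriteSketch().useCommitCoordinator()); // prints: false
    }
}
```

Opting out is reasonable when the downstream system already serializes or deduplicates commits itself, so the extra driver round trip buys nothing.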


val result = rowsWithType.zip(expectInput).forall {
  case (flussRowWithType, expect) =>
    flussRowWithType._1.equals(expect._1) && flussRowWithType._2.getLong(
Contributor commented:

Can we use Assertions.assertThat to check the result? Then we wouldn't have to print the mismatched data ourselves.

Contributor Author commented:

This just follows the Spark unit tests, which print the whole data set when there is a mismatch.
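The trade-off the reviewer is pointing at: a Boolean `forall` check only tells you "false", so the test must dump the whole data set itself, whereas an assertion library like AssertJ's `assertThat(actual).containsExactlyElementsOf(expected)` names the offending element in the failure message. A plain-JDK sketch of the difference (no AssertJ dependency; helper names are hypothetical):

```java
import java.util.List;

public class AssertSketch {
    // Boolean-style check, like the forall in the test above:
    // on failure you only learn "false".
    static boolean allEqual(List<String> actual, List<String> expected) {
        if (actual.size() != expected.size()) return false;
        for (int i = 0; i < actual.size(); i++) {
            if (!actual.get(i).equals(expected.get(i))) return false;
        }
        return true;
    }

    // Assertion-style check, similar in spirit to AssertJ's assertThat:
    // fails with the offending index and values, so nothing extra
    // needs to be printed.
    static void assertEqual(List<String> actual, List<String> expected) {
        for (int i = 0; i < Math.min(actual.size(), expected.size()); i++) {
            if (!actual.get(i).equals(expected.get(i))) {
                throw new AssertionError("element " + i + ": expected <"
                    + expected.get(i) + "> but was <" + actual.get(i) + ">");
            }
        }
        if (actual.size() != expected.size()) {
            throw new AssertionError("size mismatch: expected "
                + expected.size() + " but was " + actual.size());
        }
    }

    public static void main(String[] args) {
        List<String> expected = List.of("a", "b", "c");
        List<String> actual = List.of("a", "x", "c");
        System.out.println(allEqual(actual, expected)); // prints: false
        try {
            assertEqual(actual, expected);
        } catch (AssertionError e) {
            System.out.println(e.getMessage()); // names the mismatching index and values
        }
    }
}
```

Either style is workable; the author's choice mirrors Spark's own unit tests, which print the full data on mismatch.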

@Yohahaha (Contributor, Author) commented:

Any more comments? @YannByron cc @wuchong
