1 change: 1 addition & 0 deletions docs/source/user-guide/latest/expressions.md
@@ -117,6 +117,7 @@ Expressions that are not Spark-compatible will fall back to Spark by default and
| DayOfYear | `dayofyear` | Yes | |
| WeekOfYear | `weekofyear` | Yes | |
| Quarter | `quarter` | Yes | |
| PreciseTimestampConversion | `window_time` | Yes | Only supports conversions between TimestampType and LongType |

## Math Expressions

@@ -215,7 +215,8 @@ object QueryPlanSerde extends Logging with CometExprShim {
classOf[WeekDay] -> CometWeekDay,
classOf[DayOfYear] -> CometDayOfYear,
classOf[WeekOfYear] -> CometWeekOfYear,
classOf[Quarter] -> CometQuarter)
classOf[Quarter] -> CometQuarter,
classOf[PreciseTimestampConversion] -> CometPreciseTimestampConversion)

private val conversionExpressions: Map[Class[_ <: Expression], CometExpressionSerde[_]] = Map(
classOf[Cast] -> CometCast)
23 changes: 21 additions & 2 deletions spark/src/main/scala/org/apache/comet/serde/datetime.scala
@@ -21,8 +21,8 @@ package org.apache.comet.serde

import java.util.Locale

import org.apache.spark.sql.catalyst.expressions.{Attribute, DateAdd, DateDiff, DateFormatClass, DateSub, DayOfMonth, DayOfWeek, DayOfYear, GetDateField, Hour, LastDay, Literal, MakeDate, Minute, Month, NextDay, Quarter, Second, TruncDate, TruncTimestamp, UnixDate, UnixTimestamp, WeekDay, WeekOfYear, Year}
import org.apache.spark.sql.types.{DateType, IntegerType, StringType, TimestampType}
import org.apache.spark.sql.catalyst.expressions.{Attribute, DateAdd, DateDiff, DateFormatClass, DateSub, DayOfMonth, DayOfWeek, DayOfYear, GetDateField, Hour, LastDay, Literal, MakeDate, Minute, Month, NextDay, PreciseTimestampConversion, Quarter, Second, TruncDate, TruncTimestamp, UnixDate, UnixTimestamp, WeekDay, WeekOfYear, Year}
import org.apache.spark.sql.types.{DateType, IntegerType, LongType, StringType, TimestampType}
import org.apache.spark.unsafe.types.UTF8String

import org.apache.comet.CometSparkSessionExtensions.withInfo
@@ -586,3 +586,22 @@ object CometDateFormat extends CometExpressionSerde[DateFormatClass] {
}
}
}

object CometPreciseTimestampConversion extends CometExpressionSerde[PreciseTimestampConversion] {
override def getSupportLevel(expr: PreciseTimestampConversion): SupportLevel = {
(expr.fromType, expr.toType) match {
case (TimestampType, LongType) | (LongType, TimestampType) =>
Compatible()
case _ =>
Unsupported(Some(s"PreciseTimestampConversion from ${expr.fromType} to ${expr.toType}"))
}
}

override def convert(
expr: PreciseTimestampConversion,
inputs: Seq[Attribute],
binding: Boolean): Option[ExprOuterClass.Expr] = {
    // Both types are i64 micros in Arrow, so no conversion is needed; return the child directly.
exprToProtoInternal(expr.child, inputs, binding)
Member (comment on lines +604 to +605):
👍
}
}
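The no-op `convert` above rests on a representation fact: Spark (like Arrow) stores a `TimestampType` value physically as a signed 64-bit count of microseconds since the Unix epoch, so converting between `TimestampType` and `LongType` changes only the logical type, never the stored bits. A minimal plain-Scala sketch of that round trip, with no Spark dependency (the object and method names here are illustrative, not part of Comet):

```scala
import java.time.Instant
import java.time.temporal.ChronoUnit

// Spark's TimestampType is physically an i64 of microseconds since the
// Unix epoch, which is why PreciseTimestampConversion between
// TimestampType and LongType can pass the child value through unchanged.
object MicrosRoundTrip {
  // Encode an Instant the way Spark stores a timestamp: epoch microseconds.
  def toMicros(i: Instant): Long =
    ChronoUnit.MICROS.between(Instant.EPOCH, i)

  // Decode epoch microseconds back to an Instant.
  def fromMicros(micros: Long): Instant =
    Instant.EPOCH.plus(micros, ChronoUnit.MICROS)

  def main(args: Array[String]): Unit = {
    val ts = Instant.parse("2023-01-01T12:05:00Z")
    val micros = toMicros(ts)
    // The round trip is lossless at microsecond precision.
    assert(fromMicros(micros) == ts)
    println(micros) // prints 1672574700000000
  }
}
```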
@@ -0,0 +1,33 @@
-- Licensed to the Apache Software Foundation (ASF) under one
-- or more contributor license agreements. See the NOTICE file
-- distributed with this work for additional information
-- regarding copyright ownership. The ASF licenses this file
-- to you under the Apache License, Version 2.0 (the
-- "License"); you may not use this file except in compliance
-- with the License. You may obtain a copy of the License at
--
-- http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing,
-- software distributed under the License is distributed on an
-- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-- KIND, either express or implied. See the License for the
-- specific language governing permissions and limitations
-- under the License.


statement
CREATE TABLE test_window_time(time timestamp, value int) USING parquet

statement
INSERT INTO test_window_time VALUES (timestamp('2023-01-01 12:00:00'), 1), (timestamp('2023-01-01 12:05:00'), 2), (timestamp('2023-01-01 12:15:00'), 3), (NULL, 4)

-- spark_answer_only: window() uses unsupported CreateNamedStruct and KnownNullable

-- basic window_time with tumbling window
query spark_answer_only
Member:
Both queries use spark_answer_only. Is there a reason these can't run with the default mode that verifies native execution? If there's another operator in the plan that causes fallback, could you note which one?

Contributor Author:
Thanks @andygrove for the review. While this PR supports native execution, Spark's window() introduces unsupported operations that trigger a full fallback, so we temporarily use spark_answer_only for result verification. I'll add an inline comment to document this. If I have misunderstood, please correct me.

Member:
Yes, windowed aggregates are marked as incompatible. Maybe you can enable them in the test with spark.comet.operator.WindowExec.allowIncompatible=true? You may need to check the exact config key, but it should be something like this.

Contributor Author:
Thanks @andygrove. I dug into this and tried enabling spark.comet.operator.WindowExec.allowIncompatible=true, but the test still fails if I remove spark_answer_only.

The root cause appears to be at the expression level: Spark's TimeWindowing analyzer rule automatically expands window() into a CreateNamedStruct wrapped inside a KnownNullable expression. Since Comet doesn't currently support KnownNullable natively, the planner is forced to fall back to Spark anyway.
https://github.com/search?q=repo%3Aapache%2Fspark+KnownNullable&type=code

Fully supporting native struct creation and KnownNullable would be a broader effort outside the scope of this PR, which focuses on the PreciseTimestampConversion issue for window_time.

Because of this, it might be better to keep spark_answer_only here and open a follow-up issue for those specific expressions. I'll update the inline comment to explicitly document the exact cause of the fallback for future reference. Let me know your thoughts.
SELECT max(window_time(window)), sum(value) FROM (SELECT window(time, '10 minutes') AS window, value FROM test_window_time) GROUP BY window

-- window_time with sliding window
query spark_answer_only
SELECT max(window_time(window)), count(value) FROM (SELECT window(time, '10 minutes', '5 minutes') AS window, value FROM test_window_time) GROUP BY window
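The queries above rely on the documented semantics of Spark's window_time: it returns the window's exclusive end bound minus one microsecond, so every row in the 12:00-12:10 tumbling window reports 12:09:59.999999. A plain-Scala sketch of those semantics, with no Spark dependency (the `tumblingWindow` and `windowTime` helpers are illustrative names, not Spark's actual TimeWindowing implementation):

```scala
import java.time.Instant
import java.time.temporal.ChronoUnit

object WindowTimeSketch {
  // Assign an event time to its width-minute tumbling window [start, end),
  // working in epoch microseconds as Spark does.
  def tumblingWindow(t: Instant, minutes: Long): (Instant, Instant) = {
    val micros = ChronoUnit.MICROS.between(Instant.EPOCH, t)
    val width  = minutes * 60L * 1000000L
    val start  = micros - Math.floorMod(micros, width)
    (Instant.EPOCH.plus(start, ChronoUnit.MICROS),
     Instant.EPOCH.plus(start + width, ChronoUnit.MICROS))
  }

  // window_time semantics: the exclusive end bound minus one microsecond,
  // i.e. the largest event time that still belongs to the window.
  def windowTime(window: (Instant, Instant)): Instant =
    window._2.minus(1, ChronoUnit.MICROS)

  def main(args: Array[String]): Unit = {
    val w = tumblingWindow(Instant.parse("2023-01-01T12:05:00Z"), 10)
    println(windowTime(w)) // prints 2023-01-01T12:09:59.999999Z
  }
}
```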