feat: introduce byte, short, decimal, timestampNTZ types #816
Conversation
force-pushed from e851e20 to 96ecdba
force-pushed from e51437a to 898c4be
force-pushed from 5472292 to 5bcadee
force-pushed from 789742e to b81a02a
common/src/main/scala/org/neo4j/spark/converter/DataConverter.scala
DataTypes.createArrayType(DataTypes.TimestampType, false) -> "LIST<LOCAL DATETIME NOT NULL>",
DataTypes.createArrayType(DataTypes.TimestampType, true) -> "LIST<LOCAL DATETIME NOT NULL>",
DataTypes.createArrayType(DataTypes.TimestampType, false) -> "LIST<ZONED DATETIME NOT NULL>",
DataTypes.createArrayType(DataTypes.TimestampType, true) -> "LIST<ZONED DATETIME NOT NULL>",
question: we cannot really have null zoned datetimes in lists, can we?
I'd assume so as well. No idea what that would look like or represent if it were the case.
In that case, we cannot call createArrayType with true. AFAIK, Cypher does not support nulls in lists.
Hmm... confused about this, to be honest. When I did durations I added both {true, false} because timestamps already had them.
What are the consequences of removing it? I'll remove it and see if all tests still pass in the meantime.
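For context, a minimal sketch (not part of the PR) of what the second argument to DataTypes.createArrayType controls on the Spark side; the object name and comments are illustrative, not the connector's code:

```scala
import org.apache.spark.sql.types.DataTypes

// Illustrative only: shows what the second argument to createArrayType means.
object ContainsNullSketch {
  def main(args: Array[String]): Unit = {
    // createArrayType(elementType, containsNull): `containsNull` says whether
    // the Spark array column may hold null elements.
    val noNullElements   = DataTypes.createArrayType(DataTypes.TimestampType, false)
    val nullableElements = DataTypes.createArrayType(DataTypes.TimestampType, true)

    // Only the containsNull = false variant can honestly be declared as a Cypher
    // `LIST<ZONED DATETIME NOT NULL>`; the true variant admits null elements,
    // which Cypher lists do not allow.
    println(noNullElements.containsNull)   // false
    println(nullableElements.containsNull) // true
  }
}
```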
common/src/main/scala/org/neo4j/spark/converter/TypeConverter.scala
test-support/src/main/scala/org/neo4j/spark/SparkConnectorScalaBaseTSE.scala
This commit formats the code with the Black formatter.
force-pushed from 17f0c3c to aaddd82
Adds support for natively writing these Spark types to Neo4j: byte, short, decimal, and timestampNTZ.
BREAKING CHANGE:
Previously, writing a Timestamp would drop the timezone information.
Now the value is normalized to UTC instead. If you want to keep the old behaviour,
use TimestampNTZType, or be mindful of how you read the timestamp back.
If you interpret the time zone consistently when reading and writing, you should not notice a difference.
If you actually intend to work with LocalDateTime, that mapping has moved: it now corresponds to TimestampNTZType.
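A minimal sketch of the difference from the writer's point of view, assuming a Spark version with TimestampNTZ support and the connector's usual org.neo4j.spark data source; the connection options, labels, and object name below are illustrative, not taken from the PR:

```scala
import java.sql.Timestamp
import java.time.LocalDateTime
import org.apache.spark.sql.SparkSession

object TimestampWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    // TimestampType column: after this change it is written as a zoned datetime
    // normalized to UTC, rather than silently dropping the zone.
    val zoned = Seq(Timestamp.valueOf("2024-01-01 12:00:00")).toDF("eventAt")

    // TimestampNTZType column: keeps the old "wall clock, no zone" semantics
    // and maps to LOCAL DATETIME on the Neo4j side.
    val local = Seq(LocalDateTime.parse("2024-01-01T12:00:00")).toDF("eventAt")

    zoned.write
      .format("org.neo4j.spark")
      .mode("Append")
      .option("url", "bolt://localhost:7687")
      .option("labels", ":Event")
      .save()
    // Writing `local` the same way would produce a LOCAL DATETIME property instead.
  }
}
```

In short: pipelines that round-trip through Spark with a consistent session time zone should see no difference after the UTC normalization; TimestampNTZType is the right choice only when the wall-clock value itself is the contract.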