object SparkDateTimeUtils extends SparkDateTimeUtils
Value Members
-
final
def
!=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
def
##(): Int
- Definition Classes
- AnyRef → Any
-
final
def
==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
final
val
TimeZoneUTC: TimeZone
- Definition Classes
- SparkDateTimeUtils
-
def
anyToDays(obj: Any): Int
Converts a Java object to days.
- obj
Either an object of java.sql.Date or java.time.LocalDate.
- returns
The number of days since 1970-01-01.
- Definition Classes
- SparkDateTimeUtils
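Example (a hedged sketch; the import path org.apache.spark.sql.catalyst.util.SparkDateTimeUtils is assumed here and may differ between Spark versions):
  import java.sql.Date
  import java.time.LocalDate
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location of this object

  // Either java.sql.Date or java.time.LocalDate is accepted.
  val d1: Int = SparkDateTimeUtils.anyToDays(LocalDate.of(1970, 1, 2))   // 1 day after the epoch
  val d2: Int = SparkDateTimeUtils.anyToDays(Date.valueOf("1970-01-02")) // same day, via java.sql.Date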
-
def
anyToMicros(obj: Any): Long
Converts a Java object to microseconds.
- obj
Either an object of java.sql.Timestamp or java.time.{Instant, LocalDateTime}.
- returns
The number of micros since the epoch.
- Definition Classes
- SparkDateTimeUtils
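Example (hedged sketch, same assumed import path as above):
  import java.sql.Timestamp
  import java.time.Instant
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  // java.sql.Timestamp, java.time.Instant and java.time.LocalDateTime are accepted.
  val m1: Long = SparkDateTimeUtils.anyToMicros(Instant.parse("1970-01-01T00:00:01Z")) // 1000000L
  val m2: Long = SparkDateTimeUtils.anyToMicros(Timestamp.valueOf("1970-01-01 00:00:01")) // depends on the JVM time zone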
-
final
def
asInstanceOf[T0]: T0
- Definition Classes
- Any
-
def
clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()
-
def
convertTz(micros: Long, fromZone: ZoneId, toZone: ZoneId): Long
Converts the timestamp micros from one timezone to another.
Time-zone rules, such as daylight savings, mean that not every local date-time is valid for the toZone time zone, thus the local date-time may be adjusted.
- Definition Classes
- SparkDateTimeUtils
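Example (hedged sketch, same assumed import path as above; the exact direction of the wall-clock shift is as documented above, not asserted here):
  import java.time.ZoneId
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  val utc   = ZoneId.of("UTC")
  val paris = ZoneId.of("Europe/Paris")
  // 2020-01-01 00:00:00 UTC expressed as microseconds since the epoch.
  val micros = 1577836800L * 1000000L
  // Shift between the UTC and Europe/Paris wall-clock representations of this timestamp.
  val shifted: Long = SparkDateTimeUtils.convertTz(micros, utc, paris)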
-
def
daysToLocalDate(days: Int): LocalDate
Obtains an instance of java.time.LocalDate from the epoch day count.
- Definition Classes
- SparkDateTimeUtils
-
def
daysToMicros(days: Int, zoneId: ZoneId): Long
Converts days since 1970-01-01 at the given zone ID to microseconds since 1970-01-01 00:00:00Z.
- Definition Classes
- SparkDateTimeUtils
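Example (hedged round-trip sketch, same assumed import path as above):
  import java.time.ZoneId
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  val zone = ZoneId.of("UTC")
  val days = 19000                                                 // some day count since 1970-01-01
  val micros: Long = SparkDateTimeUtils.daysToMicros(days, zone)   // midnight of that day in UTC, in micros
  val back:   Int  = SparkDateTimeUtils.microsToDays(micros, zone) // round-trips to the original day count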
-
final
def
eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
def
equals(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
-
def
finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws( classOf[java.lang.Throwable] )
-
def
fromJavaDate(date: Date): Int
Converts a local date at the default JVM time zone to the number of days since 1970-01-01 in the hybrid calendar (Julian + Gregorian) by discarding the time part. The resulting days are rebased from the hybrid to the Proleptic Gregorian calendar. The rebasing is performed via the UTC time zone for simplicity because the difference between the two calendars is the same in any given time zone as in the UTC time zone.
Note: The date is shifted by the offset of the default JVM time zone for backward compatibility with Spark 2.4 and earlier versions. The goal of the shift is to get a local date derived from the number of days that has the same date fields (year, month, day) as the original date at the default JVM time zone.
- date
It represents a specific instant in time based on the hybrid calendar which combines Julian and Gregorian calendars.
- returns
The number of days since the epoch in Proleptic Gregorian calendar.
- Definition Classes
- SparkDateTimeUtils
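Example (hedged sketch, same assumed import path as above):
  import java.sql.Date
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  // java.sql.Date.valueOf interprets the string in the default JVM time zone,
  // matching the shift this method compensates for.
  val days: Int = SparkDateTimeUtils.fromJavaDate(Date.valueOf("2020-01-01"))
  // For modern dates the hybrid and Proleptic Gregorian calendars agree, so days == 18262.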
-
def
fromJavaTimestamp(t: Timestamp): Long
Converts an instance of java.sql.Timestamp to the number of microseconds since 1970-01-01T00:00:00.000000Z. It extracts date-time fields from the input, builds a local timestamp in the Proleptic Gregorian calendar from the fields, and binds the timestamp to the system time zone. The resulting instant is converted to microseconds since the epoch.
The conversion is performed via the system time zone because it is used internally in java.sql.Timestamp while extracting date-time fields.
The goal of the function is to have the same local date-time in the original calendar - the hybrid calendar (Julian + Gregorian) - and in the target calendar, which is the Proleptic Gregorian calendar; see SPARK-26651.
- t
It represents a specific instant in time based on the hybrid calendar which combines Julian and Gregorian calendars.
- returns
The number of micros since epoch from java.sql.Timestamp.
- Definition Classes
- SparkDateTimeUtils
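Example (hedged sketch, same assumed import path as above):
  import java.sql.Timestamp
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  // Timestamp.valueOf builds the value from local wall-clock fields in the JVM time zone,
  // which is exactly the representation this method preserves across calendars.
  val ts = Timestamp.valueOf("2020-01-01 00:00:00")
  val micros: Long = SparkDateTimeUtils.fromJavaTimestamp(ts)
  // equals 1577836800000000L only if the JVM's default time zone is UTC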
-
def
fromJavaTimestampNoRebase(t: Timestamp): Long
Converts an instance of java.sql.Timestamp to the number of microseconds since 1970-01-01T00:00:00.000000Z.
- t
An instance of java.sql.Timestamp.
- returns
The number of micros since epoch from java.sql.Timestamp.
- Definition Classes
- SparkDateTimeUtils
-
final
def
getClass(): Class[_]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
getLocalDateTime(micros: Long, zoneId: ZoneId): LocalDateTime
- Attributes
- protected
- Definition Classes
- SparkDateTimeUtils
-
def
getTimeZone(timeZoneId: String): TimeZone
- Definition Classes
- SparkDateTimeUtils
-
def
getZoneId(timeZoneId: String): ZoneId
- Definition Classes
- SparkDateTimeUtils
-
def
hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
-
def
instantToMicros(instant: Instant): Long
Gets the number of microseconds since the epoch of 1970-01-01 00:00:00Z from the given instance of java.time.Instant. The epoch microsecond count is a simple incrementing count of microseconds where microsecond 0 is 1970-01-01 00:00:00Z.
- Definition Classes
- SparkDateTimeUtils
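Example (hedged round-trip sketch, same assumed import path as above):
  import java.time.Instant
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  val instant = Instant.parse("2020-01-01T00:00:00Z")
  val micros: Long  = SparkDateTimeUtils.instantToMicros(instant) // 1577836800000000L
  val back: Instant = SparkDateTimeUtils.microsToInstant(micros)  // round-trips to the same instant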
-
final
def
isInstanceOf[T0]: Boolean
- Definition Classes
- Any
-
def
localDateTimeToMicros(localDateTime: LocalDateTime): Long
- Definition Classes
- SparkDateTimeUtils
-
def
localDateToDays(localDate: LocalDate): Int
Converts the local date to the number of days since 1970-01-01.
- Definition Classes
- SparkDateTimeUtils
-
def
microsToDays(micros: Long, zoneId: ZoneId): Int
Converts microseconds since 1970-01-01 00:00:00Z to days since 1970-01-01 at the given zone ID.
- Definition Classes
- SparkDateTimeUtils
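Example (hedged sketch, same assumed import path as above):
  import java.time.ZoneId
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  val utc = ZoneId.of("UTC")
  // 2020-01-01T12:00:00Z in microseconds; the time-of-day part is discarded.
  val noonMicros = 1577880000L * 1000000L
  val days: Int = SparkDateTimeUtils.microsToDays(noonMicros, utc) // 18262, i.e. 2020-01-01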
-
def
microsToInstant(micros: Long): Instant
Obtains an instance of java.time.Instant using microseconds from the epoch of 1970-01-01 00:00:00Z.
- Definition Classes
- SparkDateTimeUtils
-
def
microsToLocalDateTime(micros: Long): LocalDateTime
- Definition Classes
- SparkDateTimeUtils
-
def
microsToMillis(micros: Long): Long
Converts the timestamp to milliseconds since epoch. In Spark, timestamp values have microsecond precision, so this conversion is lossy.
- Definition Classes
- SparkDateTimeUtils
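Example (hedged sketch of the lossy conversion, same assumed import path as above):
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  val micros = 1234567L                                           // 1.234567 seconds after the epoch
  val millis: Long = SparkDateTimeUtils.microsToMillis(micros)    // 1234: the sub-millisecond part is lost
  val roundTrip: Long = SparkDateTimeUtils.millisToMicros(millis) // 1234000, not the original value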
-
def
millisToMicros(millis: Long): Long
Converts milliseconds since the epoch to microseconds.
- Definition Classes
- SparkDateTimeUtils
-
final
def
ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
-
final
def
notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
final
def
notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
-
def
parseTimestampString(s: UTF8String): (Array[Int], Option[ZoneId], Boolean)
Trims and parses a given UTF8 timestamp string to the corresponding timestamp segments, time zone id and whether it is just time without a date. The following formats are allowed:
[+-]yyyy*
[+-]yyyy*-[m]m
[+-]yyyy*-[m]m-[d]d
[+-]yyyy*-[m]m-[d]d
[+-]yyyy*-[m]m-[d]d [h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]
[+-]yyyy*-[m]m-[d]dT[h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]
[h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]
T[h]h:[m]m:[s]s.[ms][ms][ms][us][us][us][zone_id]
where zone_id should have one of the forms:
- Z - Zulu time zone UTC+0
- +|-[h]h:[m]m
- A short id, see https://docs.oracle.com/javase/8/docs/api/java/time/ZoneId.html#SHORT_IDS
- An id with one of the prefixes UTC+, UTC-, GMT+, GMT-, UT+ or UT-,
and a suffix in the formats:
- +|-h[h]
- +|-hh[:]mm
- +|-hh:mm:ss
- +|-hhmmss
- Region-based zone IDs in the form area/city, such as Europe/Paris
- returns
timestamp segments, time zone id and whether the input is just time without a date. If the input string can't be parsed as timestamp, the result timestamp segments are empty.
- Definition Classes
- SparkDateTimeUtils
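Example (hedged sketch, same assumed import path as above; UTF8String is assumed to be org.apache.spark.unsafe.types.UTF8String):
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location
  import org.apache.spark.unsafe.types.UTF8String

  val (segments, zoneId, justTime) =
    SparkDateTimeUtils.parseTimestampString(UTF8String.fromString("2020-01-01 12:30:45.123456 UTC"))
  // segments holds the parsed date-time fields, zoneId is Some(UTC) here,
  // and justTime is false because the input carries a date part.
  // An unparseable input yields empty segments rather than an exception.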
-
def
stringToDate(s: UTF8String): Option[Int]
Trims and parses a given UTF8 date string to a corresponding Int value. The return type is Option in order to distinguish between 0 and null. The following formats are allowed:
[+-]yyyy*
[+-]yyyy*-[m]m
[+-]yyyy*-[m]m-[d]d
[+-]yyyy*-[m]m-[d]d
[+-]yyyy*-[m]m-[d]d *
[+-]yyyy*-[m]m-[d]dT*
- Definition Classes
- SparkDateTimeUtils
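Example (hedged sketch, same assumed import paths as above):
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location
  import org.apache.spark.unsafe.types.UTF8String

  // Leading and trailing whitespace is trimmed before parsing.
  val ok:  Option[Int] = SparkDateTimeUtils.stringToDate(UTF8String.fromString(" 2020-01-01 ")) // Some(18262)
  val bad: Option[Int] = SparkDateTimeUtils.stringToDate(UTF8String.fromString("not a date"))   // None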
-
def
stringToDateAnsi(s: UTF8String, context: SQLQueryContext = null): Int
- Definition Classes
- SparkDateTimeUtils
-
def
stringToTimestamp(s: UTF8String, timeZoneId: ZoneId): Option[Long]
Trims and parses a given UTF8 timestamp string to a corresponding Long value. The return type is Option in order to distinguish between 0L and null. Please refer to parseTimestampString for the allowed formats.
- Definition Classes
- SparkDateTimeUtils
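Example (hedged sketch, same assumed import paths as above):
  import java.time.ZoneId
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location
  import org.apache.spark.unsafe.types.UTF8String

  val utc = ZoneId.of("UTC")
  // The zone id argument supplies the time zone used when the string carries none.
  val ts:  Option[Long] = SparkDateTimeUtils.stringToTimestamp(UTF8String.fromString("2020-01-01 00:00:00"), utc) // Some(1577836800000000L)
  val bad: Option[Long] = SparkDateTimeUtils.stringToTimestamp(UTF8String.fromString("oops"), utc)                // None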
-
def
stringToTimestampAnsi(s: UTF8String, timeZoneId: ZoneId, context: SQLQueryContext = null): Long
- Definition Classes
- SparkDateTimeUtils
-
def
stringToTimestampWithoutTimeZone(s: UTF8String, allowTimeZone: Boolean): Option[Long]
Trims and parses a given UTF8 string to a corresponding Long value representing the number of microseconds since the epoch. The result is independent of time zones.
If the input string contains a component associated with a time zone, the method returns None if allowTimeZone is set to false. If allowTimeZone is set to true, the method simply discards the time zone component. Enable the check to detect situations like parsing a timestamp with time zone as TimestampNTZType.
The return type is Option in order to distinguish between 0L and null. Please refer to parseTimestampString for the allowed formats.
- Definition Classes
- SparkDateTimeUtils
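Example (hedged sketch, same assumed import paths as above):
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location
  import org.apache.spark.unsafe.types.UTF8String

  val plain = UTF8String.fromString("2020-01-01 00:00:00")
  val zoned = UTF8String.fromString("2020-01-01 00:00:00 UTC")
  SparkDateTimeUtils.stringToTimestampWithoutTimeZone(plain, allowTimeZone = false) // Some(...)
  SparkDateTimeUtils.stringToTimestampWithoutTimeZone(zoned, allowTimeZone = false) // None: zone component rejected
  SparkDateTimeUtils.stringToTimestampWithoutTimeZone(zoned, allowTimeZone = true)  // zone component discarded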
-
final
def
synchronized[T0](arg0: ⇒ T0): T0
- Definition Classes
- AnyRef
-
def
toJavaDate(days: Int): Date
Converts days since the epoch 1970-01-01 in Proleptic Gregorian calendar to a local date at the default JVM time zone in the hybrid calendar (Julian + Gregorian). It rebases the given days from Proleptic Gregorian to the hybrid calendar at the UTC time zone for simplicity because the difference between the two calendars doesn't depend on any time zone. The result is shifted by the time zone offset in wall clock to have the same date fields (year, month, day) at the default JVM time zone as the input daysSinceEpoch in Proleptic Gregorian calendar.
Note: The date is shifted by the offset of the default JVM time zone for backward compatibility with Spark 2.4 and earlier versions.
- days
The number of days since 1970-01-01 in Proleptic Gregorian calendar.
- returns
A local date in the hybrid calendar as
java.sql.Date
from number of days since epoch.
- Definition Classes
- SparkDateTimeUtils
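Example (hedged sketch, same assumed import path as above):
  import java.sql.Date
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  // 18262 days after 1970-01-01 is 2020-01-01 in the Proleptic Gregorian calendar.
  val date: Date = SparkDateTimeUtils.toJavaDate(18262)
  // date.toString == "2020-01-01" in the default JVM time zone, thanks to the wall-clock shift.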
-
def
toJavaTimestamp(micros: Long): Timestamp
Converts microseconds since the epoch to an instance of java.sql.Timestamp by creating a local timestamp at the system time zone in the Proleptic Gregorian calendar, extracting date and time fields like year and hours, and forming a new timestamp in the hybrid calendar from the extracted fields.
The conversion is based on the JVM system time zone because java.sql.Timestamp uses the time zone internally.
The method performs the conversion via local timestamp fields to have the same date-time representation as year, month, day, ..., seconds in the original calendar and in the target calendar.
- micros
The number of microseconds since 1970-01-01T00:00:00.000000Z.
- returns
A java.sql.Timestamp from number of micros since epoch.
- Definition Classes
- SparkDateTimeUtils
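Example (hedged sketch, same assumed import path as above):
  import java.sql.Timestamp
  import org.apache.spark.sql.catalyst.util.SparkDateTimeUtils  // assumed location

  // 2020-01-01T00:00:00Z as microseconds since the epoch.
  val ts: Timestamp = SparkDateTimeUtils.toJavaTimestamp(1577836800000000L)
  // Rendered through the JVM system time zone; e.g. "2020-01-01 00:00:00.0" only when that zone is UTC.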
-
def
toJavaTimestampNoRebase(micros: Long): Timestamp
Converts microseconds since the epoch to an instance of java.sql.Timestamp.
- micros
The number of microseconds since 1970-01-01T00:00:00.000000Z.
- returns
A java.sql.Timestamp from number of micros since epoch.
- Definition Classes
- SparkDateTimeUtils
-
def
toString(): String
- Definition Classes
- AnyRef → Any
-
final
def
wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... )
-
final
def
wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws( ... ) @native()