object SparkUtils
Type Members
- implicit class DataFrameOps extends AnyRef
  Implicit class conversion for a Spark DataFrame, enhancing it with a prepareStream method that processes a streaming dataframe by adding windowing based on the provided windowing column.
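A minimal sketch of what such an implicit enhancement could look like; the parameter names and the window-duration argument below are assumptions for illustration, not the actual DataFrameOps signature.

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, window}

// Sketch of an implicit prepareStream enhancement: adds a tumbling window
// column over the given timestamp column. Parameter names are assumptions.
implicit class DataFrameOpsSketch(df: DataFrame) {
  def prepareStream(windowBy: String, windowDuration: String): DataFrame =
    df.withColumn("window", window(col(windowBy), windowDuration))
}

// Usage: streamingDf.prepareStream("event_time", "10 minutes")
```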
- implicit class DurationOps extends AnyRef
  Implicit class conversion for the Scala Duration type, enhancing it with the following methods:
  - convert a duration into a Spark interval string;
  - convert a duration into a string with short time-unit notation (e.g. '10s' or '3h'). Conversion is always made in terms of seconds.
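A sketch of the two described conversions, assuming conversion in terms of seconds as stated above; the method names are assumptions, not the actual DurationOps API.

```scala
import scala.concurrent.duration._

// Sketch of the DurationOps enhancements; method names are assumptions.
implicit class DurationOpsSketch(d: Duration) {
  // Spark interval string, expressed in seconds:
  def toSparkInterval: String = s"interval ${d.toSeconds} seconds"
  // Short notation with the seconds unit, e.g. 10.seconds -> "10s":
  def toShortString: String = s"${d.toSeconds}s"
}
```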
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- def getRowEncoder(schema: StructType)(implicit spark: SparkSession): ExpressionEncoder[Row]
  Gets a row encoder for the provided schema. The purpose of this method is to allow creating a row encoder for different versions of Spark, since the Encoders API changed in version 3.5.0. The Scala reflection API is used to invoke the proper row encoder constructor, thus supporting the different Encoders APIs.
  - schema
    Dataframe schema to construct the encoder for.
  - spark
    Implicit Spark session object.
  - returns
    Row encoder (expression encoder for Row).
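A rough sketch of the version-tolerant idea, assuming the encoder entry point is resolved at runtime rather than at compile time; the reflection target and method shown are illustrative assumptions, not the actual implementation.

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
import org.apache.spark.sql.types.StructType

// Sketch: look up the RowEncoder companion object and its apply method at
// runtime, so the code compiles against Spark versions on both sides of
// the 3.5.0 Encoders API change. Illustrative only.
def rowEncoderSketch(schema: StructType): ExpressionEncoder[Row] = {
  val companion = Class
    .forName("org.apache.spark.sql.catalyst.encoders.RowEncoder$")
    .getField("MODULE$")
    .get(null)
  val apply = companion.getClass.getMethod("apply", classOf[StructType])
  apply.invoke(companion, schema).asInstanceOf[ExpressionEncoder[Row]]
}
```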
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- def makeFileSystem(spark: SparkSession): Result[FileSystem]
  Creates a Hadoop FileSystem object from the provided Spark session.
  - spark
    SparkSession object.
  - returns
    Either a Hadoop FileSystem object or a list of errors.
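A sketch of the idea, using a plain Either in place of the project-specific Result type (an assumption about its shape):

```scala
import org.apache.hadoop.fs.FileSystem
import org.apache.spark.sql.SparkSession
import scala.util.Try

// Sketch: build a Hadoop FileSystem from the session's Hadoop configuration,
// capturing any failure as an error list. Result is approximated by Either.
def makeFileSystemSketch(spark: SparkSession): Either[List[String], FileSystem] =
  Try(FileSystem.get(spark.sparkContext.hadoopConfiguration))
    .toEither.left.map(e => List(e.getMessage))
```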
- def makeSparkSession(sparkConf: SparkConf, appName: Option[String]): Result[SparkSession]
  Creates a Spark session.
  - sparkConf
    Spark configuration object.
  - appName
    Optional application name.
  - returns
    Either a SparkSession object or a list of errors.
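A sketch of session creation from the given configuration and optional application name, again approximating the project-specific Result type with Either:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import scala.util.Try

// Sketch: configure the builder, apply the app name only when present,
// and capture any failure as an error list.
def makeSparkSessionSketch(sparkConf: SparkConf,
                           appName: Option[String]): Either[List[String], SparkSession] =
  Try {
    val builder = SparkSession.builder().config(sparkConf)
    appName.fold(builder)(builder.appName).getOrCreate()
  }.toEither.left.map(e => List(e.getMessage))
```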
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toDataType(typeString: String): DataType
  Matches a type string literal to the corresponding Spark DataType.
  - typeString
    Type string literal.
  - returns
    Spark DataType.
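A sketch of such a mapping; the exact set of type literals supported by SparkUtils is an assumption here:

```scala
import org.apache.spark.sql.types._

// Sketch: map type string literals to Spark DataTypes. The supported
// literals and the failure behaviour are illustrative assumptions.
def toDataTypeSketch(typeString: String): DataType = typeString.toLowerCase match {
  case "string"          => StringType
  case "integer" | "int" => IntegerType
  case "long"            => LongType
  case "double"          => DoubleType
  case "boolean"         => BooleanType
  case "timestamp"       => TimestampType
  case "date"            => DateType
  case other             => throw new IllegalArgumentException(s"Unsupported type: $other")
}
```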
- def toStorageLvlString(sLvl: StorageLevel): String
  Converts a Spark StorageLevel to the string that defines it. Note: the default .toString implementation does not convert a StorageLevel back to its original string.
  - sLvl
    Spark StorageLevel.
  - returns
    Original string definition of the StorageLevel.
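A sketch of the reverse mapping; the covered levels and the fallback are assumptions, since StorageLevel.toString yields a descriptive string rather than the constant name used to define the level:

```scala
import org.apache.spark.storage.StorageLevel

// Sketch: map common StorageLevel values back to their defining names.
def toStorageLvlStringSketch(sLvl: StorageLevel): String = sLvl match {
  case StorageLevel.NONE            => "NONE"
  case StorageLevel.DISK_ONLY       => "DISK_ONLY"
  case StorageLevel.MEMORY_ONLY     => "MEMORY_ONLY"
  case StorageLevel.MEMORY_AND_DISK => "MEMORY_AND_DISK"
  case StorageLevel.MEMORY_ONLY_SER => "MEMORY_ONLY_SER"
  case other                        => other.toString // fallback assumption
}
```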
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()