final case class HiveSourceConfig(id: ID, description: Option[NonEmptyString], schema: NonEmptyString, table: NonEmptyString, persist: Option[StorageLevel], partitions: Seq[HivePartition] = Seq.empty, options: Seq[SparkParam] = Seq.empty, keyFields: Seq[NonEmptyString] = Seq.empty, metadata: Seq[SparkParam] = Seq.empty) extends SourceConfig with Product with Serializable
Hive table source configuration.
- id: Source ID
- description: Source description
- schema: Hive schema
- table: Hive table
- persist: Spark storage level used to persist the dataframe during job execution.
- partitions: Sequence of partitions to read. The order of partition columns should correspond to the order in which they are defined in the Hive table DDL.
- options: Sequence of additional Spark options
- keyFields: Sequence of key fields (columns that identify a data row)
- metadata: List of metadata parameters specific to this source
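As a quick illustration, the configuration can be constructed as follows. This is a minimal sketch, not the library's actual setup code: it uses plain `String` aliases as stand-ins for the refined types `ID` and `NonEmptyString`, a stub for Spark's `StorageLevel`, and assumed field names for `HivePartition` and `SparkParam`, whose real definitions live elsewhere in the library.

```scala
object HiveSourceConfigExample {
  // Assumed stand-ins for the library's refined types and for Spark's
  // StorageLevel; the real definitions live elsewhere.
  type ID = String
  type NonEmptyString = String
  sealed trait StorageLevel
  case object MemoryOnly extends StorageLevel

  // Field names of HivePartition and SparkParam are assumptions for this sketch.
  final case class HivePartition(name: NonEmptyString, values: Seq[NonEmptyString])
  final case class SparkParam(name: String, value: String)

  // Simplified copy of the documented case class signature (trait mixins omitted).
  final case class HiveSourceConfig(
    id: ID,
    description: Option[NonEmptyString],
    schema: NonEmptyString,
    table: NonEmptyString,
    persist: Option[StorageLevel],
    partitions: Seq[HivePartition] = Seq.empty,
    options: Seq[SparkParam] = Seq.empty,
    keyFields: Seq[NonEmptyString] = Seq.empty,
    metadata: Seq[SparkParam] = Seq.empty
  )

  // Read schema "sales", table "transactions", restricted to one date partition.
  val cfg = HiveSourceConfig(
    id = "hive_tx_source",
    description = Some("Daily transactions"),
    schema = "sales",
    table = "transactions",
    persist = Some(MemoryOnly),
    partitions = Seq(HivePartition("load_date", Seq("2024-01-01"))),
    keyFields = Seq("tx_id")
  )
}
```

Note that only `partitions`, `options`, `keyFields` and `metadata` carry defaults, so `id`, `description`, `schema`, `table` and `persist` must always be supplied.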
Linear Supertypes
- HiveSourceConfig
- Product
- Equals
- SourceConfig
- JobConfigEntity
- Serializable
- Serializable
- AnyRef
- Any
Instance Constructors
- new HiveSourceConfig(id: ID, description: Option[NonEmptyString], schema: NonEmptyString, table: NonEmptyString, persist: Option[StorageLevel], partitions: Seq[HivePartition] = Seq.empty, options: Seq[SparkParam] = Seq.empty, keyFields: Seq[NonEmptyString] = Seq.empty, metadata: Seq[SparkParam] = Seq.empty)
  - id: Source ID
  - description: Source description
  - schema: Hive schema
  - table: Hive table
  - persist: Spark storage level used to persist the dataframe during job execution.
  - partitions: Sequence of partitions to read. The order of partition columns should correspond to the order in which they are defined in the Hive table DDL.
  - options: Sequence of additional Spark options
  - keyFields: Sequence of key fields (columns that identify a data row)
  - metadata: List of metadata parameters specific to this source
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- val description: Option[NonEmptyString]
  - Definition Classes: HiveSourceConfig → JobConfigEntity
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- val id: ID
  - Definition Classes: HiveSourceConfig → JobConfigEntity
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- val keyFields: Seq[NonEmptyString]
  - Definition Classes: HiveSourceConfig → SourceConfig
- val metadata: Seq[SparkParam]
  - Definition Classes: HiveSourceConfig → JobConfigEntity
- val metadataString: Option[String]
  - Definition Classes: JobConfigEntity
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- val options: Seq[SparkParam]
- val partitions: Seq[HivePartition]
- val persist: Option[StorageLevel]
  - Definition Classes: HiveSourceConfig → SourceConfig
- val schema: NonEmptyString
- val streamable: Boolean
  - Definition Classes: HiveSourceConfig → SourceConfig
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- val table: NonEmptyString
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
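Because `HiveSourceConfig` is a case class mixing in `Product` and `Equals`, instances compare by value rather than by reference, and `copy` produces updated instances without mutation. A minimal sketch, using plain `String` stand-ins for the refined types and omitting most fields:

```scala
object CaseClassSemantics {
  // Simplified stand-in for the real config (refined types replaced by String,
  // most fields and trait mixins omitted).
  final case class HiveSourceConfig(
    id: String,
    schema: String,
    table: String,
    keyFields: Seq[String] = Seq.empty
  )

  val a = HiveSourceConfig("src1", "sales", "transactions")
  val b = HiveSourceConfig("src1", "sales", "transactions")

  // Structural equality from the generated equals/hashCode.
  val sameByValue: Boolean = a == b

  // copy returns a new instance; the original is untouched.
  val keyed = a.copy(keyFields = Seq("tx_id"))
}
```

The `Serializable` mixin additionally allows configured sources to be shipped across JVMs, e.g. inside Spark tasks.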