
final case class AppSettings(executionDateTime: EnrichedDT, referenceDateTime: EnrichedDT, allowNotifications: Boolean, allowSqlQueries: Boolean, aggregatedKafkaOutput: Boolean, enableCaseSensitivity: Boolean, errorDumpSize: Int, outputRepartition: Int, metricEngineAPI: MetricEngineAPI, checkFailureTolerance: CheckFailureTolerance, storageConfig: Option[StorageConfig], emailConfig: Option[EmailConfig], mattermostConfig: Option[MattermostConfig], streamConfig: StreamConfig, encryption: Option[Encryption], sparkConf: SparkConf, isLocal: Boolean, isShared: Boolean, doMigration: Boolean, applicationName: Option[String], prependVars: String, loggingLevel: Level, versionInfo: VersionInfo) extends Product with Serializable

Application settings
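As a `case class`, `AppSettings` supports immutable updates via `copy`, which is convenient for deriving per-job variants of shared settings. A minimal sketch using a hypothetical stand-in with just a few of the fields listed below (the real class takes the full parameter list shown above):

```scala
// Illustrative stand-in for a few AppSettings fields; names match the
// signature above, but this is not the real Checkita class.
final case class AppSettingsLike(
  allowNotifications: Boolean,
  allowSqlQueries: Boolean,
  errorDumpSize: Int
)

val base  = AppSettingsLike(allowNotifications = true, allowSqlQueries = false, errorDumpSize = 10000)

// Case classes provide immutable updates via copy:
val quiet = base.copy(allowNotifications = false)

assert(!quiet.allowNotifications)
assert(quiet.errorDumpSize == 10000) // untouched fields are preserved
```

With the real class, `copy` works the same way for any subset of its fields.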

executionDateTime

Job execution date-time (actual time when job is started)

referenceDateTime

Reference date-time (for which the job is performed)

allowNotifications

Enables notifications to be sent from the DQ application

allowSqlQueries

Enables arbitrary SQL queries in virtual sources

aggregatedKafkaOutput

Enables sending aggregated messages for Kafka targets (one message per target type, except checkAlerts, for which one message per check alert is sent)
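The aggregation semantics can be sketched as follows; the grouping below is illustrative and not Checkita's actual Kafka writer:

```scala
// Messages destined for Kafka, tagged by target type (made-up data).
val targetMessages = List(
  ("metrics", "m1"),
  ("metrics", "m2"),
  ("checkAlerts", "a1")
)

// Aggregated mode: collapse all messages of a target type into one payload.
val aggregated: Map[String, List[String]] =
  targetMessages.groupBy(_._1).map { case (t, msgs) => t -> msgs.map(_._2) }

assert(aggregated("metrics") == List("m1", "m2"))
// checkAlerts remain one message per alert regardless of this flag.
assert(aggregated("checkAlerts") == List("a1"))
```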

enableCaseSensitivity

Enables case sensitivity for column names

errorDumpSize

Maximum number of errors to be collected per single metric.

outputRepartition

Sets the number of partitions used when writing outputs. By default, a single file is written.

metricEngineAPI

Metric processor API used to process metrics: either Spark RDD or Spark DF.

checkFailureTolerance

Failure tolerance that determines whether the application returns a failure status when some of the checks fail.
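A hedged sketch of how such a tolerance setting could map check results to an exit status; the level names below are illustrative only, the actual levels are defined by `CheckFailureTolerance`:

```scala
// Illustrative tolerance levels (not the real CheckFailureTolerance values).
sealed trait FailureTolerance
case object FailOnAny extends FailureTolerance // any failed check fails the run
case object IgnoreAll extends FailureTolerance // check failures never fail the run

def exitStatusIsFailure(tolerance: FailureTolerance, failedChecks: Int): Boolean =
  tolerance match {
    case FailOnAny => failedChecks > 0
    case IgnoreAll => false
  }

assert(exitStatusIsFailure(FailOnAny, failedChecks = 1))
assert(!exitStatusIsFailure(IgnoreAll, failedChecks = 3))
```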

storageConfig

Configuration of connection to Data Quality Storage

emailConfig

Configuration of connection to SMTP server

mattermostConfig

Configuration of connection to Mattermost API

streamConfig

Streaming settings (used in streaming applications only)

encryption

Encryption settings

sparkConf

Spark configuration parameters

isLocal

Boolean flag indicating whether the Spark application must be run locally.

isShared

Boolean flag indicating whether the Spark application is running within a shared Spark context.

doMigration

Boolean flag indicating whether the DQ storage database migration needs to be run prior to saving results.

applicationName

Name of the Checkita Data Quality Spark application

prependVars

Multiline HOCON string with variables to be prepended to configuration files during their parsing.
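For illustration, prepending variables lets plain HOCON substitutions in a configuration file resolve against values supplied at launch; the keys and values below are made up:

```scala
// Variables prepended to the config file before parsing (illustrative).
val prependVars = "basePath: \"/tmp/dq\"\nenv: \"dev\""

// Raw configuration file contents referencing ${env} via HOCON substitution.
val configText = "jobId: \"daily-\"${env}"

// The effective text handed to the HOCON parser:
val effectiveConfig = prependVars + "\n" + configText

assert(effectiveConfig.contains("env: \"dev\""))
assert(effectiveConfig.endsWith("${env}")) // substitution resolved at parse time
```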

loggingLevel

Application logging level

versionInfo

Information about application and configuration API versions.

Linear Supertypes
Serializable, Serializable, Product, Equals, AnyRef, Any

Instance Constructors

  1. new AppSettings(executionDateTime: EnrichedDT, referenceDateTime: EnrichedDT, allowNotifications: Boolean, allowSqlQueries: Boolean, aggregatedKafkaOutput: Boolean, enableCaseSensitivity: Boolean, errorDumpSize: Int, outputRepartition: Int, metricEngineAPI: MetricEngineAPI, checkFailureTolerance: CheckFailureTolerance, storageConfig: Option[StorageConfig], emailConfig: Option[EmailConfig], mattermostConfig: Option[MattermostConfig], streamConfig: StreamConfig, encryption: Option[Encryption], sparkConf: SparkConf, isLocal: Boolean, isShared: Boolean, doMigration: Boolean, applicationName: Option[String], prependVars: String, loggingLevel: Level, versionInfo: VersionInfo)


Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. val aggregatedKafkaOutput: Boolean
  5. val allowNotifications: Boolean
  6. val allowSqlQueries: Boolean
  7. val applicationName: Option[String]
  8. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  9. val checkFailureTolerance: CheckFailureTolerance
  10. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  11. val doMigration: Boolean
  12. val emailConfig: Option[EmailConfig]
  13. val enableCaseSensitivity: Boolean
  14. val encryption: Option[Encryption]
  15. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. val errorDumpSize: Int
  17. val executionDateTime: EnrichedDT
  18. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  19. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  21. val isLocal: Boolean
  22. val isShared: Boolean
  23. val loggingLevel: Level
  24. val mattermostConfig: Option[MattermostConfig]
  25. val metricEngineAPI: MetricEngineAPI
  26. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  27. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  28. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  29. val outputRepartition: Int
  30. val prependVars: String
  31. val referenceDateTime: EnrichedDT
  32. val sparkConf: SparkConf
  33. val storageConfig: Option[StorageConfig]
  34. val streamConfig: StreamConfig
  35. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  36. val versionInfo: VersionInfo
  37. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  38. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()

Inherited from Serializable

Inherited from Serializable

Inherited from Product

Inherited from Equals

Inherited from AnyRef

Inherited from Any
