case class TopNDFMetricCalculator(metricId: String, columns: Seq[String], maxCapacity: Int, targetNumber: Int) extends DFMetricCalculator with Product with Serializable
Linear Supertypes: Serializable, Serializable, Product, Equals, DFMetricCalculator, AnyRef, Any
Instance Constructors
- new TopNDFMetricCalculator(metricId: String, columns: Seq[String], maxCapacity: Int, targetNumber: Int)

Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- val columns: Seq[String]
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
- val emptyValue: Column
  The TopN metric returns an empty string as value and NaN as frequency when applied to an empty sequence (see the sketch below).
  - Attributes: protected
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
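The actual expression is not shown on this page; the following is only a minimal sketch of a value matching the description above, assuming the empty result is a single (value = "", frequency = NaN) entry wrapped in an array of structs. The name emptyValueSketch is hypothetical.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{array, lit, struct}

// Assumed shape only: one (value = "", frequency = NaN) entry,
// wrapped in an array to match the array(struct(string, double)) result type.
val emptyValueSketch: Column = array(struct(lit(""), lit(Double.NaN)))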
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def errorConditionExpr(implicit colTypes: Map[String, DataType]): Column
  If casting a value to StringType yields null, it signals that the value is not a string. Thus, the TopN computation cannot be incremented for this row; this is a metric increment failure (see the sketch below).
  - colTypes: Map of column names to their datatype.
  - returns: Spark row-level expression yielding a boolean result.
  - Attributes: protected
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
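A plausible sketch of an expression with this behaviour, assuming the row counts as an error when any metric column's cast to StringType is null. This is an assumed shape, not the library source; the helper name errorConditionSketch is hypothetical.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.StringType

// Assumed shape: the row is an error if any metric column cannot be cast to string.
// Assumes a non-empty column list; handling of originally-null values is not shown.
def errorConditionSketch(columns: Seq[String]): Column =
  columns.map(c => col(c).cast(StringType).isNull).reduce(_ or _)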
- def errorExpr(rowData: Column)(implicit colTypes: Map[String, DataType]): Column
  Error collection expression: collects row data in case of a metric error (see the sketch below).
  - rowData: Array of row data from columns related to this metric calculator (source keyFields + metric columns + window start time column for streaming applications).
  - colTypes: Map of column names to their datatype.
  - returns: Spark expression that will yield row data in case of a metric error.
  - Attributes: protected
  - Definition Classes: DFMetricCalculator
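A minimal sketch of an error collection expression of this shape, under the assumption that non-error rows yield null so they can be dropped by the downstream aggregation. The function name errorExprSketch is hypothetical.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.when

// Assumed shape: keep row data only when the error condition holds; otherwise null.
def errorExprSketch(errorCondition: Column, rowData: Column): Column =
  when(errorCondition, rowData)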
- def errorMessage: String
  Error message that will be returned when a column value cannot be cast to string.
  - returns: Metric increment failure message.
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
- def errors(implicit errorDumpSize: Int, keyFields: Seq[String], colTypes: Map[String, DataType]): Column
  Final metric errors aggregation expression: collects all metric errors into an array column. The size of the array is limited by the maximum allowed error dump size parameter (see the sketch below).
  - errorDumpSize: Maximum allowed number of errors to be collected per single metric.
  - keyFields: Sequence of source/stream key fields.
  - colTypes: Map of column names to their datatype.
  - returns: Spark expression that will yield an array of metric errors.
  - Definition Classes: DFMetricCalculator
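A sketch of an aggregation matching this description (assumed shape, not the library source): collect_list skips nulls, so only error rows contribute, and slice enforces the dump-size cap. The name errorsSketch is hypothetical.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{collect_list, slice}

// Assumed shape: gather per-row error payloads and cap the array at errorDumpSize entries.
def errorsSketch(perRowError: Column, errorDumpSize: Int): Column =
  slice(collect_list(perRowError), 1, errorDumpSize)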
- val errorsCol: String
  Name of the column that will store metric errors.
  - Definition Classes: DFMetricCalculator
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] )
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- val maxCapacity: Int
- val metricId: String
  Unlike RDD calculators, DF calculators are not grouped by type: a separate DF calculator instance is created for each metric defined in a DQ job. Thus, DF metric calculators can be linked to metric definitions by metricId (see the example below).
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
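For example, a hypothetical instantiation; the metric ID, column name, and parameter values below are illustrative and not taken from any real DQ job configuration.

// Hypothetical example: one calculator instance per metric definition,
// linked back to that definition via metricId.
val topNCalculator = TopNDFMetricCalculator(
  metricId     = "orders_top_statuses",  // illustrative metric ID
  columns      = Seq("order_status"),    // illustrative column
  maxCapacity  = 100,                    // capacity of the top-N sketch
  targetNumber = 10                      // number of top values to report
)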
- val metricName: MetricName
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native()
- def result(implicit colTypes: Map[String, DataType]): Column
  The result expression is overridden, since for the TopN metric the result is not a double value but an array of the top-N values from the column along with their occurrence frequencies (see the sketch below).
  - colTypes: Map of column names to their datatype.
  - returns: Spark expression that will yield a result of type array(struct(string, double)).
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
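An illustrative consumer-side sketch of how such a result could be unpacked. The result column name "top_n_result" and the helper unpackTopN are assumptions, not part of the library API.

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, explode}

// Assumed column name: "top_n_result" holds the array(struct(string, double)) result.
// explode yields one row per top-N entry; "top.*" expands the (value, frequency) struct fields.
def unpackTopN(df: DataFrame): DataFrame =
  df.select(explode(col("top_n_result")).as("top"))
    .select(col("top.*"))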
- val resultAggregateFunction: (Column) ⇒ Column
  Custom user aggregation function that finds the top-N values based on SpaceSaver (see the conceptual sketch below).
  - Attributes: protected
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
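The actual aggregation function is not shown on this page. The following plain-Scala sketch only illustrates the Space-Saving idea it is based on: keep at most a fixed number of counters; when the table is full, evict the minimum counter and let the new item inherit its count. It is not the SpaceSaver implementation used by the calculator.

// Conceptual sketch of the Space-Saving algorithm (Metwally et al.).
final case class SpaceSavingSketch(capacity: Int, counters: Map[String, Long] = Map.empty) {

  // Register one occurrence of `item`.
  def add(item: String): SpaceSavingSketch = counters.get(item) match {
    case Some(count) =>
      copy(counters = counters.updated(item, count + 1))             // already tracked: increment
    case None if counters.size < capacity =>
      copy(counters = counters.updated(item, 1L))                    // free slot: start tracking
    case None =>
      val (minItem, minCount) = counters.minBy { case (_, c) => c }
      copy(counters = counters - minItem + (item -> (minCount + 1))) // evict min, inherit its count
  }

  // Top-N tracked items by (approximate) count.
  def topN(n: Int): Seq[(String, Long)] =
    counters.toSeq.sortBy { case (_, count) => -count }.take(n)
}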
- val resultCol: String
  Name of the column that will store the metric result.
  - Definition Classes: DFMetricCalculator
- def resultExpr(implicit colTypes: Map[String, DataType]): Column
  Spark expression yielding a numeric result for the processed row. The metric will be incremented with this result using the associated aggregation function.
  - colTypes: Map of column names to their datatype.
  - returns: Spark row-level expression yielding a numeric result.
  - Attributes: protected
  - Definition Classes: TopNDFMetricCalculator → DFMetricCalculator
  - Note: the Spark expression MUST process a single row and must not aggregate multiple rows (see the example below).
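A generic illustration of the note above. These expressions are not taken from this calculator, and the column name "some_column" is a placeholder; they merely contrast a row-level expression with an aggregate one.

import org.apache.spark.sql.functions.{col, length, sum}

// Row-level: produces one value per input row, which is the shape resultExpr requires.
val rowLevelExpr  = length(col("some_column"))

// Aggregate: collapses many rows into one value and violates the note above.
val aggregateExpr = sum(col("some_column"))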
- def rowDataExpr(keyFields: Seq[String]): Column
  Row data collection expression: collects the values of the selected columns into an array for the row where a metric error occurred (see the sketch below).
  - keyFields: Sequence of source/stream key fields.
  - returns: Spark expression that will yield an array of row data for the columns related to this metric calculator.
  - Attributes: protected
  - Definition Classes: DFMetricCalculator
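A sketch of an expression matching this description (assumed shape, not the library source): every collected column is cast to string so the values fit into a single array. The helper name rowDataSketch is hypothetical.

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{array, col}
import org.apache.spark.sql.types.StringType

// Assumed shape: key fields plus metric columns, cast to string, packed into one array.
def rowDataSketch(keyFields: Seq[String], metricColumns: Seq[String]): Column =
  array((keyFields ++ metricColumns).distinct.map(c => col(c).cast(StringType)): _*)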
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- val targetNumber: Int
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()