
This function conducts a visual inference lineup check with a computer vision model. The result will be stored in self$check_result.
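For example, assuming myvi is an AUTO_VI object created with auto_vi() (as in the Examples below), the stored result can be retrieved after the check. A minimal sketch:

myvi$lineup_check()
result <- myvi$check_result  # the stored check result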

Usage

AUTO_VI$lineup_check(
  lineup_size = 20L,
  fitted_model = self$fitted_model,
  keras_model = self$keras_model,
  null_method = self$null_method,
  data = self$get_data(),
  node_index = self$node_index,
  extract_feature_from_layer = NULL
)

Arguments

lineup_size

Integer. Number of plots in a lineup.

fitted_model

Model. A fitted model object, e.g. the output of lm().

keras_model

Keras model. A trained computer vision model.

null_method

Function. A method to simulate residuals from the null hypothesis distribution. For lm, the recommended method is residual rotation, implemented by AUTO_VI$rotate_resid().

data

Data frame. The data used to fit the model. See also AUTO_VI$get_data().

node_index

Integer. An index indicating which node of the output layer contains the visual signal strength. This is particularly useful when the keras model has more than one output node.

extract_feature_from_layer

Character/Integer. The name or integer index of a layer from which to extract features. See the sketch following this argument list.
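As a hedged sketch, the arguments above can be overridden in a single call; the layer name "flatten" is hypothetical and should be replaced by a layer that actually exists in your keras model:

myvi$lineup_check(
  lineup_size = 16L,                      # use a 16-plot lineup instead of the default 20
  node_index = 1L,                        # read the visual signal strength from the first output node
  extract_feature_from_layer = "flatten"  # hypothetical layer name; also extract features from this layer
)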

Value

Returns the object itself.
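Because the method returns the object, the check can be run and its summary printed in a single expression; a minimal sketch:

print(myvi$lineup_check())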

Examples

keras_model <- try(get_keras_model("vss_phn_32"))
if (!inherits(keras_model, "try-error")) {
  myvi <- auto_vi(lm(dist ~ speed, data = cars), keras_model)

  myvi$lineup_check()
  myvi
}
#>  Generate null data.
#>  Generate null plots.
#>  Compute auxiliary inputs.
#>  Predict visual signal strength for 19 images.
#>  Predict visual signal strength for 1 image.
#> 
#> ── <AUTO_VI object>
#> Status:
#>  - Fitted model: lm
#>  - Keras model: (None, 32, 32, 3) + (None, 5) -> (None, 1)
#>     - Output node index: 1
#>  - Result:
#>     - Observed visual signal strength: 3.162 (p-value = 0.05)
#>     - Null visual signal strength: [19 draws]
#>        - Mean: 1.43
#>        - Quantiles: 
#>           ╔══════════════════════════════════════════╗
#>           ║  25%   50%   75%   80%   90%   95%   99% ║
#>           ║1.026 1.393 1.789 1.885 2.082 2.208 2.366 ║
#>           ╚══════════════════════════════════════════╝