Arguments
- prev
The condition's prevalence prev (i.e., the probability of the condition being TRUE).
- sens
The decision's sensitivity sens (i.e., the conditional probability of a positive decision provided that the condition is TRUE).
- spec
The decision's specificity spec (i.e., the conditional probability of a negative decision provided that the condition is FALSE).
Value
Overall accuracy acc as a probability (proportion).
A warning is provided for NaN values.
See acc for the definition of accuracy,
and accu for other accuracy metrics.
comp_accu_freq and comp_accu_prob
compute accuracy metrics from frequencies and probabilities, respectively.
Details
comp_acc uses probabilities (not frequencies) as
inputs and returns an exact probability (proportion)
without rounding.
Understanding the probability acc:
Definition:
acc is the (non-conditional) probability:
acc = p(dec_cor) = dec_cor/N
or the base rate (or baseline probability) of a decision being correct, but not necessarily positive.
acc values range from 0 (no correct decision/prediction) to 1 (perfect decision/prediction).
Computation:
acc can be computed in 2 ways:
(a) from prob: acc = (prev x sens) + [(1 - prev) x spec]
(b) from freq: acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)
When frequencies in freq are not rounded, (b) coincides with (a).
Perspective:
acc classifies a population of N individuals by accuracy/correspondence (acc = dec_cor/N).
acc is the "by accuracy" or "by correspondence" counterpart to prev (which adopts a "by condition" perspective) and to ppod (which adopts a "by decision" perspective).
Alternative names of acc: base rate of correct decisions, non-erroneous cases.
In terms of frequencies, acc is the ratio of dec_cor (i.e., hi + cr) divided by N (i.e., hi + mi + fa + cr):
acc = dec_cor/N = (hi + cr)/(hi + mi + fa + cr)
Dependencies:
acc is a feature of both the environment (true condition) and of the decision process or diagnostic procedure. It reflects the correspondence of decisions to conditions.
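A minimal R sketch of the two computation routes described above; the input values (prev = .10, sens = .200, spec = .300) and the population size N are assumptions chosen for illustration, not part of the package:

# assumed illustrative inputs:
prev <- .10; sens <- .200; spec <- .300; N <- 1000

# (a) from probabilities:
acc_prob <- (prev * sens) + ((1 - prev) * spec)

# (b) from unrounded frequencies:
hi <- N * prev * sens               # hits: condition TRUE, decision positive
mi <- N * prev * (1 - sens)         # misses: condition TRUE, decision negative
fa <- N * (1 - prev) * (1 - spec)   # false alarms: condition FALSE, decision positive
cr <- N * (1 - prev) * spec         # correct rejections: condition FALSE, decision negative
acc_freq <- (hi + cr)/(hi + mi + fa + cr)

all.equal(acc_prob, acc_freq)  # TRUE: (b) coincides with (a) when frequencies are not rounded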
See accu for other accuracy metrics
and several possible interpretations of accuracy.
See also
acc defines accuracy as a probability;
accu lists all accuracy metrics;
comp_accu_prob computes exact accuracy metrics from probabilities;
comp_accu_freq computes accuracy metrics from frequencies;
comp_sens and comp_PPV compute related probabilities;
is_extreme_prob_set verifies extreme cases;
comp_complement computes a probability's complement;
is_complement verifies probability complements;
comp_prob computes current probability information;
prob contains current probability information;
is_prob verifies probabilities.
Other functions computing probabilities:
comp_FDR(),
comp_FOR(),
comp_NPV(),
comp_PPV(),
comp_accu_freq(),
comp_accu_prob(),
comp_comp_pair(),
comp_complement(),
comp_complete_prob_set(),
comp_err(),
comp_fart(),
comp_mirt(),
comp_ppod(),
comp_prob(),
comp_prob_freq(),
comp_sens(),
comp_spec()
Other metrics:
acc,
accu,
comp_accu_freq(),
comp_accu_prob(),
comp_err(),
err
Examples
# ways to work:
comp_acc(.10, .200, .300) # => acc = 0.29
#> [1] 0.29
comp_acc(.50, .333, .666) # => acc = 0.4995
#> [1] 0.4995
# watch out for vectors:
prev.range <- seq(0, 1, by = .1)
comp_acc(prev.range, .5, .5) # => 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5
#> [1] 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5 0.5
# watch out for extreme values:
comp_acc(1, 1, 1) # => 1
#> [1] 1
comp_acc(1, 1, 0) # => 1
#> [1] 1
comp_acc(1, 0, 1) # => 0
#> [1] 0
comp_acc(1, 0, 0) # => 0
#> [1] 0
comp_acc(0, 1, 1) # => 1
#> [1] 1
comp_acc(0, 1, 0) # => 0
#> [1] 0
comp_acc(0, 0, 1) # => 1
#> [1] 1
comp_acc(0, 0, 0) # => 0
#> [1] 0
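# cross-check computation route (a) by hand (illustrative arithmetic added here,
# not a package call; values match the first two examples above):
(.10 * .200) + ((1 - .10) * .300) # => 0.29
(.50 * .333) + ((1 - .50) * .666) # => 0.4995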
