
Computes classification metrics by comparing predicted adjacency matrices to a ground truth binary network and visualizes the performance via a radar (spider) plot.

Usage

pscores(ground_truth, predicted_list, zero_diag = TRUE)

Arguments

ground_truth

A square binary adjacency matrix representing the ground truth network. Values must be 0 or 1. Only the upper triangle is used for evaluation.

predicted_list

A list of predicted adjacency matrices to evaluate. Each matrix must have the same dimensions and row/column names as ground_truth.

zero_diag

Logical. If TRUE (default), sets the diagonal of ground_truth to zero before evaluation, removing self-loops.
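
For illustration only, a minimal pair of inputs could look like the toy objects below (arbitrary names and random data, not shipped with the package):

set.seed(1)
genes <- paste0("gene", 1:5)
truth <- matrix(rbinom(25, 1, 0.3), 5, 5, dimnames = list(genes, genes))
pred1 <- matrix(rbinom(25, 1, 0.3), 5, 5, dimnames = list(genes, genes))
pred2 <- matrix(rbinom(25, 1, 0.3), 5, 5, dimnames = list(genes, genes))
# Each prediction is square, binary, and shares the dimensions and
# dimnames of `truth`, so a call such as pscores(truth, list(pred1, pred2))
# matches the documented input contract.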

Value

A list with one element:
Statistics: Data frame of evaluation metrics (TP, TN, FP, FN, TPR, FPR, Precision, F1, MCC) for each predicted matrix.

Details

For each predicted matrix, the confusion matrix is computed using the upper triangle (non-self edges). Metrics including True Positive Rate (TPR), False Positive Rate (FPR), Precision, F1-score, and Matthews Correlation Coefficient (MCC) are calculated.
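
As an illustrative sketch only (not the package's internal implementation), the metrics for a single prediction follow from the upper-triangle confusion matrix roughly as below, assuming `truth` and `pred` are binary adjacency matrices of matching dimensions:

ut <- upper.tri(truth)
tp <- sum(pred[ut] == 1 & truth[ut] == 1)
tn <- sum(pred[ut] == 0 & truth[ut] == 0)
fp <- sum(pred[ut] == 1 & truth[ut] == 0)
fn <- sum(pred[ut] == 0 & truth[ut] == 1)

tpr       <- tp / (tp + fn)                  # sensitivity / recall
fpr       <- fp / (fp + tn)
precision <- tp / (tp + fp)
f1        <- 2 * precision * tpr / (precision + tpr)
mcc       <- (tp * tn - fp * fn) /
  sqrt(as.numeric(tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))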

A radar plot is automatically generated summarizing the key scores across matrices.

Note

Requires the fmsb, dplyr, and tidyr packages.

Examples

data(count_matrices)
data(adj_truth)

# Infer gene regulatory networks from each count matrix using GENIE3
networks <- infer_networks(
    count_matrices_list = count_matrices,
    method = "GENIE3",
    nCores = 1
)

# Build weighted adjacency matrices and symmetrize the edge weights
wadj_list <- generate_adjacency(networks)
swadj_list <- symmetrize(wadj_list, weight_function = "mean")

# Threshold the weighted adjacencies into binary networks
binary_list <- cutoff_adjacency(
    count_matrices = count_matrices,
    weighted_adjm_list = swadj_list,
    n = 2,
    method = "GENIE3",
    quantile_threshold = 0.99,
    nCores = 1,
    debug = TRUE
)
#> [Method: GENIE3] Matrix 1 → Cutoff = 0.09865
#> [Method: GENIE3] Matrix 2 → Cutoff = 0.10124
#> [Method: GENIE3] Matrix 3 → Cutoff = 0.10253

# Score the binary networks against the ground truth and draw the radar plot
pscores_data <- pscores(adj_truth, binary_list)
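
The metrics are returned in the Statistics element of the result, so they can be inspected directly:

pscores_data$Statistics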