
ProbsCut: enhancing adversarial robustness via global probability constraints

04.24.26 | Higher Education Press



Deep neural networks (DNNs) have been shown to be vulnerable to adversarial examples. Adversarial training is the mainstream method for improving the adversarial robustness of DNNs: it augments the training set with adversarial examples and adopts an adversarial regularization loss. Existing adversarial training methods face the challenge of balancing accuracy and robustness.

To alleviate these issues, a research team led by Yun Li published their new research on 15 April 2026 in Frontiers of Computer Science, co-published by Higher Education Press and Springer Nature.

In the research, the team employs a bias-variance decomposition to analyze the generalization error in the adversarial setting. In this way, the heuristic search for a trade-off between accuracy and robustness is turned into an exploration of the optimal expected probability for each category. The proposed method, named ProbsCut, consists of two loss terms: a global loss and a local loss. The global loss concerns the relationship among different examples, while the local loss constrains the loss of each single example. The global loss can be combined with existing methods such as TRADES and MART.
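The overall structure described above can be sketched as a standard adversarial training objective plus a global probability term. This is a minimal NumPy sketch, not the paper's exact formulation: the function name `probscut_style_loss`, the weights `lam_local` and `lam_global`, and the per-category expected probabilities `q_opt` are all hypothetical placeholders for the quantities the article describes.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) between full probability vectors
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def bernoulli_kl(p, q, eps=1e-12):
    # "Single-element" KL between two scalar probabilities,
    # treating each as a Bernoulli distribution
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def probscut_style_loss(logits_clean, logits_adv, labels, q_opt,
                        lam_local=6.0, lam_global=1.0):
    """Hypothetical combined objective: cross-entropy on clean inputs,
    a TRADES-style term aligning clean and adversarial predictions,
    and a global term pulling the target-class probability toward the
    expected probability q_opt[y] for its category."""
    p_clean = softmax(logits_clean)
    p_adv = softmax(logits_adv)
    n = len(labels)
    ce = -np.log(p_clean[np.arange(n), labels] + 1e-12).mean()
    local = kl_div(p_clean, p_adv).mean()       # clean vs adversarial
    p_target = p_adv[np.arange(n), labels]      # target-class probabilities
    glob = bernoulli_kl(p_target, q_opt[labels]).mean()
    return ce + lam_local * local + lam_global * glob
```

Because the global term is a scalar KL per example, it can be added on top of an existing objective such as TRADES or MART without changing the rest of the training loop.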

In detail, the global loss is a single-element Kullback–Leibler divergence, whose inputs each have only one element; it reduces the difference between the probability of the target category and the globally optimal expected probability across all examples, and can be viewed as a first-order moment estimation that reduces variance. The local loss, in turn, aligns the probability vector of each legitimate example with that of its corresponding adversarial example. The optimal expected probability of each category is updated simultaneously with the model parameters.
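Updating the expected probability of each category alongside the model parameters can be done with a running first-order moment estimate. This is a plausible sketch only, assuming an exponential-moving-average update; the function name `update_expected_probs` and the `momentum` parameter are hypothetical, as the article does not specify the exact update rule.

```python
import numpy as np

def update_expected_probs(q, p_target, labels, num_classes, momentum=0.9):
    """Hypothetical running (first-order moment) estimate of the expected
    target-class probability per category, refreshed each training step
    alongside the model parameters.

    q         : current per-category expected probabilities, shape (C,)
    p_target  : target-class probabilities of the current batch, shape (N,)
    labels    : integer class labels of the batch, shape (N,)
    """
    q = q.copy()
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            # Blend the old estimate with the batch mean for this category
            q[c] = momentum * q[c] + (1 - momentum) * p_target[mask].mean()
    return q
```

For example, with `momentum=0.9`, a category whose stored estimate is 0.5 and whose batch-mean target probability is 1.0 moves to 0.55 after one update.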

In future work, more efficient methods for determining the optimal expected probability for each category will be explored.

Frontiers of Computer Science

10.1007/s11704-025-41225-3

Experimental study

Not applicable

ProbsCut: enhancing adversarial robustness via global probability constraints

15-Apr-2026

Keywords

Article Information

Contact Information

Rong Xie
Higher Education Press
xierong@hep.com.cn

Source

How to Cite This Article

APA:
Higher Education Press. (2026, April 24). ProbsCut: enhancing adversarial robustness via global probability constraints. Brightsurf News. https://www.brightsurf.com/news/86Z0O468/probscut-enhancing-adversarial-robustness-via-global-probability-constraints.html
MLA:
"ProbsCut: enhancing adversarial robustness via global probability constraints." Brightsurf News, Apr. 24 2026, https://www.brightsurf.com/news/86Z0O468/probscut-enhancing-adversarial-robustness-via-global-probability-constraints.html.