FARCLUSS: Fuzzy Adaptive Rebalancing and Contrastive Uncertainty Learning for Semi-Supervised Semantic Segmentation

Ebenezer Tarubinga · Jenifer Kalafatovich · Seong-Whan Lee
Department of Artificial Intelligence, Korea University, Seoul, Korea
Neural Networks 2026
Figure: Overview of the FARCLUSS framework, combining supervised learning on labeled data, unsupervised learning with fuzzy pseudo-labeling and uncertainty-aware weighting on unlabeled data, and lightweight prototype-based contrastive regularization.

80.3% mIoU · Pascal VOC Classic 1/2
81.0% mIoU · Cityscapes 1/2
79.0% mIoU · Pascal VOC Classic 1/4
4 novel components

Abstract

Semi-supervised semantic segmentation leverages unlabeled data to reduce annotation costs, but current pseudo-labeling approaches discard uncertain pixels, ignore class imbalance, and lack explicit feature-level regularization. We present FARCLUSS, a unified framework that transforms prediction uncertainty into a learning asset through four synergistic components: (1) Fuzzy Pseudo-Labeling preserves soft class distributions from top-K teacher predictions instead of forcing hard one-hot assignments; (2) Uncertainty-Aware Dynamic Weighting modulates pixel-wise loss contributions via normalized entropy so that confident pixels drive learning while noisy ones are dampened; (3) Adaptive Class Rebalancing dynamically adjusts per-class loss weights based on batch-level pseudo-label frequencies; and (4) a Lightweight Contrastive Regularization module that tightens intra-class feature clusters and pushes apart inter-class prototypes without adding heavy memory overhead. Extensive experiments on Pascal VOC 2012 and Cityscapes demonstrate consistent improvements over competitive baselines across multiple label partitions and backbone architectures.

Key Contributions

1. Fuzzy Pseudo-Labeling

Retains soft class distributions from top-K teacher predictions, preserving inter-class ambiguity instead of forcing hard one-hot pseudo-labels.
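The top-K truncation can be illustrated with a minimal NumPy sketch; the function name `fuzzy_pseudo_labels` and the choice of K are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def fuzzy_pseudo_labels(teacher_probs: np.ndarray, k: int = 3) -> np.ndarray:
    """Keep the top-K class probabilities per pixel and renormalize,
    yielding a soft (fuzzy) pseudo-label instead of a one-hot argmax.

    teacher_probs: (num_pixels, num_classes) softmax outputs.
    """
    # Indices of the K largest probabilities for each pixel.
    topk_idx = np.argpartition(teacher_probs, -k, axis=1)[:, -k:]
    mask = np.zeros_like(teacher_probs)
    np.put_along_axis(mask, topk_idx, 1.0, axis=1)
    kept = teacher_probs * mask
    # Renormalize so the retained mass sums to one per pixel.
    return kept / kept.sum(axis=1, keepdims=True)
```

For a pixel with teacher probabilities (0.5, 0.3, 0.15, 0.05) and K = 2, the soft target becomes (0.625, 0.375, 0, 0), preserving the ambiguity between the two leading classes.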

2. Uncertainty-Aware Weighting

Modulates each pixel's loss contribution using normalized entropy, so confident predictions drive learning while uncertain ones are dampened.

3. Adaptive Class Rebalancing

Dynamically adjusts per-class loss weights using batch-level pseudo-label frequencies to counteract class imbalance in unlabeled data.
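One simple way to realize batch-level rebalancing is inverse-frequency weighting over the pseudo-labels in the current batch; this sketch is an assumption about the general mechanism, not the paper's exact weighting rule.

```python
import numpy as np

def class_weights(pseudo_labels: np.ndarray, num_classes: int,
                  eps: float = 1e-6) -> np.ndarray:
    """Per-class loss weights from batch-level pseudo-label frequencies.

    Rare classes receive weights above 1, frequent classes below 1;
    weights are normalized to mean 1 so the overall loss scale is kept.
    In practice unseen classes (count 0) would need clipping.
    """
    counts = np.bincount(pseudo_labels.ravel(),
                         minlength=num_classes).astype(float)
    freq = counts / max(counts.sum(), 1)
    weights = 1.0 / (freq + eps)  # inverse frequency
    return weights / weights.mean()
```

For a batch whose pseudo-labels are 75% class 0 and 25% class 1, this yields weights (0.5, 1.5), boosting the gradient contribution of the minority class.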

4. Contrastive Regularization

Lightweight prototype-based contrastive loss that tightens intra-class feature clusters and separates inter-class prototypes without heavy memory overhead.
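The prototype-based idea can be sketched as an InfoNCE-style loss between pixel features and per-class mean prototypes; the temperature value and exact formulation here are illustrative assumptions.

```python
import numpy as np

def prototype_contrastive_loss(features: np.ndarray, labels: np.ndarray,
                               temperature: float = 0.1) -> float:
    """InfoNCE-style loss against class prototypes.

    features: (num_samples, dim) pixel embeddings.
    labels:   (num_samples,) class assignments.
    Each feature is pulled toward its own class prototype (mean feature)
    and pushed away from the other prototypes; no memory bank is needed.
    """
    classes = np.unique(labels)
    protos = np.stack([features[labels == c].mean(axis=0) for c in classes])
    # Cosine similarity between normalized features and prototypes.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    p = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    sim = f @ p.T / temperature
    sim -= sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    idx = np.searchsorted(classes, labels)  # position of each label in `classes`
    return float(-log_prob[np.arange(len(labels)), idx].mean())
```

When features of the same class already cluster tightly around their prototype, the loss is close to zero; mixing classes inside a cluster drives it up, which is the regularization signal.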

Quantitative Results

Pascal VOC 2012 — Classic (mIoU %)

Backbone (FARCLUSS, Ours)    1/16     1/8      1/4      1/2
ResNet-50                    72.90    76.18    76.50    77.69
ResNet-101                   76.40    78.20    79.00    80.30

Cityscapes (mIoU %)

Backbone (FARCLUSS, Ours)    1/16     1/8      1/4      1/2
ResNet-50                    75.20    77.50    78.00    79.60
ResNet-101                   77.20    78.80    80.00    81.00

All experiments use the DeepLabV3+ architecture with a mean-teacher EMA scheme.

Citation

@article{tarubinga2026farcluss,
  title     = {FARCLUSS: Fuzzy Adaptive Rebalancing and Contrastive
               Uncertainty Learning for Semi-Supervised Semantic
               Segmentation},
  author    = {Tarubinga, Ebenezer and Kalafatovich, Jenifer
               and Lee, Seong-Whan},
  journal   = {Neural Networks},
  year      = {2026}
}

Acknowledgement

This research was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant, funded by the Korea government (MSIT).