
BERT


paper · 2026-04-10

Finally, we extend the BERT architecture to incorporate multi-dimensional information, particularly user historical behavior, enabling the model to capture dynamic linguistic patterns and uncover implicit sarcastic cues in comments.
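The quoted sentence does not specify how the user-history signal is combined with BERT. Purely as an illustration, a minimal pure-Python sketch of one common fusion pattern: concatenating a comment embedding (standing in for BERT's [CLS] vector) with a user-history feature vector, then scoring the fused vector with a linear head. All names, dimensions, and the fusion strategy here are hypothetical, not taken from the paper.

```python
import math
import random

random.seed(42)

EMB_DIM = 8    # stand-in for the 768-dim BERT [CLS] embedding
HIST_DIM = 4   # hypothetical per-user historical-behavior features

def fuse(comment_emb, history_feats):
    # Simplest multi-dimensional fusion: vector concatenation
    return comment_emb + history_feats

def score(x, weights, bias):
    # Linear head on the fused vector, squashed to a probability
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> P(sarcastic)

comment_emb = [random.gauss(0, 1) for _ in range(EMB_DIM)]
history = [random.gauss(0, 1) for _ in range(HIST_DIM)]
weights = [random.gauss(0, 0.1) for _ in range(EMB_DIM + HIST_DIM)]

fused = fuse(comment_emb, history)
p = score(fused, weights, bias=0.0)
print(len(fused), 0.0 < p < 1.0)
```

In a real implementation the comment embedding would come from a fine-tuned BERT encoder and the history features from the user's past comments, with the head trained jointly; this sketch only shows the shape of the fusion step.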

GAN and LLM-Driven Data Augmentation for Chinese Sarcasm Detection