Leaderboard#

Tasks:
  • ASD: Argumentative Stance Detection

  • ARC: Argumentative Relation Classification

  • ACC: Argumentative Component Classification

  • AFC: Argumentative Fallacy Classification

Test classification performance on MAM datasets#

| Model | UKDebates (ASD) | M-Arg γ (ARC) | MM-USED (ASD) | MM-USED (ACC) | MMUSED-fallacy (AFC) |
|---|---|---|---|---|---|
| **Text Only** | | | | | |
| BiLSTM | .552 ± .047 | .120 ± .006 | .811 ± .004 | .663 ± .002 | .525 ± .113 |
| BERT | .654 ± .003 | .132 ± .004 | .824 ± .009 | .679 ± .004 | .594 ± .122 |
| RoBERTa | .692 ± .005 | .172 ± .015 | .839 ± .010 | .680 ± .001 | .615 ± .097 |
| **Audio Only** | | | | | |
| BiLSTM w/ MFCCs | .302 ± .047 | .003 ± .005 | .722 ± .027 | .527 ± .004 | .657 ± .000 |
| BiLSTM w/ Wav2Vec2 | .376 ± .023 | .000 ± .000 | .774 ± .008 | .596 ± .005 | .655 ± .117 |
| BiLSTM w/ HuBERT | .364 ± .012 | .024 ± .012 | .745 ± .009 | .566 ± .007 | .638 ± .000 |
| BiLSTM w/ WavLM | .393 ± .040 | .010 ± .010 | .772 ± .015 | .583 ± .002 | .652 ± .000 |
| Transformer w/ Wav2Vec2 | .440 ± .030 | .000 ± .000 | .771 ± .019 | .514 ± .000 | .567 ± .225 |
| Transformer w/ HuBERT | .425 ± .033 | .000 ± .000 | .765 ± .016 | .524 ± .004 | .629 ± .162 |
| Transformer w/ WavLM | .455 ± .004 | .000 ± .000 | .768 ± .005 | .526 ± .004 | .594 ± .217 |
| **Text Audio** | | | | | |
| BiLSTM w/ MFCCs | .528 ± .039 | .065 ± .014 | .807 ± .010 | .662 ± .006 | .572 ± .099 |
| BiLSTM w/ Wav2Vec2 | .533 ± .009 | .079 ± .014 | .808 ± .012 | .665 ± .004 | .505 ± .168 |
| BiLSTM w/ HuBERT | .409 ± .017 | .055 ± .020 | .807 ± .013 | .653 ± .003 | .456 ± .131 |
| BiLSTM w/ WavLM | .501 ± .022 | .084 ± .016 | .815 ± .006 | .667 ± .000 | .526 ± .174 |
| MM-BERT w/ Wav2Vec2 | .662 ± .004 | .153 ± .017 | .841 ± .005 | .677 ± .003 | .561 ± .114 |
| MM-BERT w/ HuBERT | .626 ± .003 | .160 ± .015 | .840 ± .006 | .677 ± .004 | .599 ± .128 |
| MM-BERT w/ WavLM | .654 ± .019 | .152 ± .008 | .836 ± .005 | .680 ± .004 | .580 ± .103 |
| MM-RoBERTa w/ Wav2Vec2 | .674 ± .009 | .178 ± .012 | .833 ± .006 | .678 ± .003 | .608 ± .126 |
| MM-RoBERTa w/ HuBERT | .624 ± .015 | .147 ± .004 | .837 ± .003 | .677 ± .008 | .576 ± .097 |
| MM-RoBERTa w/ WavLM | .687 ± .010 | .165 ± .018 | .837 ± .009 | .678 ± .003 | .624 ± .074 |
| CSA w/ Wav2Vec2 | .663 ± .014 | .137 ± .027 | .822 ± .002 | .693 ± .001 | .555 ± .118 |
| CSA w/ HuBERT | .632 ± .018 | .160 ± .015 | .813 ± .004 | .693 ± .001 | .582 ± .114 |
| CSA w/ WavLM | .655 ± .029 | .155 ± .030 | .833 ± .011 | .697 ± .001 | .535 ± .102 |
| Ensemble w/ Wav2Vec2 | .586 ± .015 | .011 ± .011 | .825 ± .004 | .681 ± .002 | .612 ± .134 |
| Ensemble w/ HuBERT | .531 ± .039 | .010 ± .004 | .822 ± .007 | .681 ± .003 | .611 ± .107 |
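The scores above are reported as mean ± standard deviation, presumably aggregated over repeated runs. A minimal sketch of that aggregation and of the table's `.xxx ± .xxx` formatting, using hypothetical per-run scores (not values from the leaderboard):

```python
import statistics

# Hypothetical per-run scores for a single model/task cell
# (illustrative values only, not taken from the table above).
run_scores = [0.505, 0.548, 0.603]

mean = statistics.mean(run_scores)
std = statistics.stdev(run_scores)  # sample standard deviation across runs

# Format as ".xxx ± .xxx", dropping the leading zeros to match the table.
cell = f"{mean:.3f} ± {std:.3f}".replace("0.", ".", 2)
print(cell)  # → .552 ± .049
```

Reporting the spread alongside the mean makes it easier to judge whether small differences between models (e.g. a few thousandths of a point) are meaningful or within run-to-run noise.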