A Deep Dive into Electra: Transfer Learning for Fine-Grained Text Classification on SST-2 - Submitted in lieu of a defense - Final Project Draft

MUHAMMAD ARYA FIKRIANSYAH

Basic Information

24.04.163
006.7
Scientific Work - Undergraduate Thesis (S1) - Reference

Pre-trained language models have rapidly reshaped text understanding and classification in Natural Language Processing (NLP). Electra, a language model introduced by Clark et al. in 2020, stands out for its distinctive pre-training approach, particularly in fine-grained text classification tasks. This research evaluates Electra's performance on fine-grained text classification, focusing on sentiment analysis with the SST-2 dataset. The study also aims to guide researchers and practitioners toward effective fine-tuning strategies and configuration settings. The results highlight the significance of gradual fine-tuning: unfreezing more encoder layers has a positive impact on model accuracy. This underscores Electra's potential for NLP tasks and the importance of a carefully designed fine-tuning process.
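The gradual fine-tuning described in the abstract can be sketched as a layer-unfreezing schedule: freeze the whole encoder, then progressively re-enable gradients for the top layers while the classification head stays trainable. The sketch below is illustrative only, not the thesis's actual code; it uses plain PyTorch modules as stand-ins for Electra's 12 encoder layers and its 2-class SST-2 head, and the function and variable names are assumptions.

```python
# Hedged sketch of gradual unfreezing for transformer fine-tuning.
# Stand-in modules are used instead of the real ElectraForSequenceClassification.
import torch.nn as nn

def unfreeze_top_layers(encoder_layers, classifier, num_unfrozen):
    """Freeze every encoder layer, then re-enable gradients for the top
    `num_unfrozen` layers and the classification head."""
    for layer in encoder_layers:
        for p in layer.parameters():
            p.requires_grad = False
    if num_unfrozen > 0:
        for layer in encoder_layers[-num_unfrozen:]:
            for p in layer.parameters():
                p.requires_grad = True
    for p in classifier.parameters():
        p.requires_grad = True

# Stand-in for a 12-layer encoder and a 2-class (SST-2) head.
encoder = nn.ModuleList([nn.Linear(8, 8) for _ in range(12)])
head = nn.Linear(8, 2)
unfreeze_top_layers(encoder, head, 4)  # first stage: train only the top 4 layers
```

In a real run, one would call this between training stages with an increasing `num_unfrozen` (e.g. 4, 8, then 12), re-creating the optimizer over the currently trainable parameters at each stage.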

Subject

DEEP LEARNING
WEB PROGRAMMING

Catalog

A Deep Dive into Electra: Transfer Learning for Fine-Grained Text Classification on SST-2 - Submitted in lieu of a defense - Final Project Draft
Indonesia

Circulation

Rp. 0
Rp. 0
No

Author

MUHAMMAD ARYA FIKRIANSYAH
Individual
Hilal Hudan Nuha
 

Publisher

Universitas Telkom, S1 Informatika
Bandung
2024

