24.04.163
006.7 - Multimedia systems
Scientific Work - Undergraduate Thesis (S1) - Reference
Deep Learning, Web Programming
<p>The landscape of Natural Language Processing (NLP) is being rapidly reshaped by the evolution and proliferation of pre-trained language models, which have redefined what is possible in text understanding and classification. Electra, a language model introduced by Clark et al. in 2020, stands out for its distinctive pre-training approach, particularly in fine-grained text classification tasks. This research evaluates Electra's performance on fine-grained text classification, focusing on sentiment analysis with the SST-2 dataset. The study also aims to guide researchers and practitioners by identifying effective fine-tuning strategies and configuration settings. The results highlight the value of gradual fine-tuning: unfreezing more layers positively impacts model accuracy. This underscores Electra's potential for NLP tasks and the importance of a carefully designed fine-tuning process.</p>
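The gradual fine-tuning described in the abstract can be sketched as a layer-unfreezing schedule: the classification head is trained first, then encoder layers are unfrozen from the top down in stages. The sketch below is a generic, framework-agnostic illustration of that idea; the stage size, layer count, and function names are illustrative assumptions, not the thesis's actual configuration.

```python
# Sketch of a gradual layer-unfreezing schedule for fine-tuning a
# transformer encoder such as Electra. All names and the stage size
# are illustrative; Electra-base has 12 encoder layers.

NUM_LAYERS = 12

def unfrozen_layers(stage: int, layers_per_stage: int = 4,
                    num_layers: int = NUM_LAYERS) -> list[int]:
    """Return indices of encoder layers trainable at a fine-tuning stage.

    Stage 0 trains only the classification head (no encoder layers);
    each later stage unfreezes `layers_per_stage` more layers, top first.
    """
    n = min(stage * layers_per_stage, num_layers)
    return list(range(num_layers - n, num_layers))

# Walk through the schedule stage by stage.
for stage in range(4):
    print(f"stage {stage}: trainable layers = {unfrozen_layers(stage)}")
```

In a PyTorch-style workflow, these indices would typically be used to set `requires_grad = True` on the matching layers' parameters before each training stage.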
1 of 1 total copies available
Name | MUHAMMAD ARYA FIKRIANSYAH |
Type | Individual |
Editor | Hilal Hudan Nuha |
Translator |
Name | Universitas Telkom, S1 Informatika |
City | Bandung |
Year | 2024 |
Rental fee | IDR 0.00 |
Daily fine | IDR 0.00 |
Type | Non-Circulation |