General Information

Code

24.04.163

Classification

006.7 - Multimedia systems

Type

Scientific Work - Undergraduate Thesis (S1) - Reference

Subject

Deep Learning, Web Programming

Viewed

32 times

Other Information

Abstract

In the dynamic landscape of Natural Language Processing (NLP), pre-trained language models have evolved rapidly and reshaped the boundaries of text understanding and classification. Electra, a language model introduced by Clark et al. in 2020, stands out for its distinctive pre-training approach, particularly in fine-grained text classification tasks. This research evaluates Electra's performance on fine-grained text classification, focusing on sentiment analysis with the SST-2 dataset. The study also aims to guide researchers and practitioners by identifying effective fine-tuning strategies and configuration settings. The results highlight the significance of gradual fine-tuning: unfreezing more layers positively impacts model accuracy. This underscores Electra's potential for NLP tasks and the importance of a careful fine-tuning process.
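The gradual fine-tuning the abstract describes can be sketched as a top-down unfreezing schedule. The function below is a minimal, self-contained illustration, not the thesis's actual procedure: the function name, the two-layers-per-epoch pace, and the 12-layer count (as in a base-sized Electra encoder) are all assumptions made here for demonstration.

```python
def unfreeze_schedule(num_layers: int, epoch: int, layers_per_epoch: int = 2) -> set:
    """Return the indices of encoder layers that are trainable at `epoch`.

    Layers thaw top-down: the layers closest to the classification head
    (highest indices) are unfrozen first, and more layers are added each
    epoch until the whole encoder is trainable.
    """
    unfrozen = min(num_layers, (epoch + 1) * layers_per_epoch)
    return set(range(num_layers - unfrozen, num_layers))


# Example: a hypothetical 12-layer encoder.
# Epoch 0 trains only the top two layers; by epoch 5 all layers train.
print(unfreeze_schedule(12, 0))  # {10, 11}
print(unfreeze_schedule(12, 5))  # all indices 0..11
```

In a real training loop, the returned set would decide which layers have `requires_grad` enabled; layers outside the set keep their pre-trained weights fixed for that epoch.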

Collection & Circulation

1 of 1 copies available


Author

Name MUHAMMAD ARYA FIKRIANSYAH
Type Individual
Editor Hilal Hudan Nuha
Translator

Publisher

Name Universitas Telkom, S1 Informatika
City Bandung
Year 2024

Circulation

Rental fee IDR 0.00
Daily fine IDR 0.00
Type Non-Circulating