
Chapter


Title

Fine-Tuning Fine-Tuned Models: Towards a Practical Methodology for Sentiment Analysis with Small In-Domain Supervised Dataset

Authors

[ 1 ] Institute of Computing Science, Faculty of Computing and Telecommunications, Poznan University of Technology | [ P ] employee

Scientific discipline (Law 2.0)

[2.3] Information and communication technology

Year of publication

2025

Chapter type

chapter in monograph / paper

Publication language

English

Abstract

EN Sentiment classifiers are typically built by annotating a relatively small data sample and fine-tuning a pre-trained language model. This approach overlooks the opportunity created by the emergence of open-source sentiment classifiers trained on large collections of supervised data from a variety of domains. These models often exhibit superior classification performance and can be used out-of-the-box, but their performance may still be degraded by domain shift. This could potentially be eliminated by annotating in-domain data and further fine-tuning the model, but fine-tuning of already fine-tuned models has not been investigated in the context of sentiment analysis and has often been unsuccessful for other NLP tasks. This paper presents an experimental analysis of this issue, studying the performance of three off-the-shelf sentiment classifiers fine-tuned using 14 different methods on customer reviews in three languages. The results show that fine-tuning already fine-tuned models on in-domain data leads to significant performance improvements. In particular, unsupervised domain adaptation techniques in this new setup outperform standard supervised fine-tuning of general-purpose language models.
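The core idea of continued fine-tuning under domain shift can be illustrated with a deliberately simplified sketch. The toy below is not the paper's method: a small logistic regression stands in for an off-the-shelf transformer sentiment classifier, synthetic Gaussian features stand in for review texts, and the shifted decision boundary stands in for the domain shift. It only shows the training pattern: train on a large source-domain set, then continue gradient updates from those weights on a small in-domain set.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, w=None, lr=0.5, epochs=200):
    """Logistic-regression training by gradient descent.

    Pass existing weights ``w`` to continue fine-tuning an
    already-trained model instead of starting from scratch.
    """
    if w is None:
        w = np.zeros(X.shape[1])
    for _ in range(epochs):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w

def accuracy(X, y, w):
    return float(np.mean((sigmoid(X @ w) >= 0.5) == y))

# Large "source-domain" set: the label depends mainly on feature 0.
Xs = rng.normal(size=(1000, 3))
ys = (Xs[:, 0] > 0).astype(float)

# Small "in-domain" set with a domain shift:
# the decision boundary is rotated toward feature 1.
Xt = rng.normal(size=(60, 3))
yt = (0.3 * Xt[:, 0] + Xt[:, 1] > 0).astype(float)

w_src = train(Xs, ys)                  # "off-the-shelf" classifier
acc_before = accuracy(Xt, yt, w_src)   # used out-of-the-box on the new domain
w_ft = train(Xt, yt, w=w_src.copy())   # continued fine-tuning on in-domain data
acc_after = accuracy(Xt, yt, w_ft)

print(f"in-domain accuracy before: {acc_before:.2f}, after: {acc_after:.2f}")
```

With the weights initialized from the source-domain model, the few in-domain examples are enough to rotate the boundary toward the shifted distribution, which mirrors the paper's finding that further fine-tuning of already fine-tuned models pays off.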

Date of online publication

24.06.2025

DOI

10.1007/978-981-96-7005-5_1

URL

https://link.springer.com/chapter/10.1007/978-981-96-7005-5_1

Book

Neural Information Processing: 31st International Conference, ICONIP 2024, Auckland, New Zealand, December 2–6, 2024, Proceedings, Part XII

Presented on

31st International Conference on Neural Information Processing ICONIP 2024, 2-6.12.2024, Auckland, New Zealand

Ministry points / chapter

20

Ministry points / conference (CORE)

70
