Overview
Improving a Japanese Text-Generation Model through Fine-Tuning with BERT (University Research Project).
This was a natural language processing research project I conducted as my undergraduate thesis.
By fine-tuning BERT, a Transformer-based language model developed by Google, I built a Japanese text-generation model.
Through this project, I gained foundational knowledge of machine learning, developed an understanding of the Transformer architecture, and got hands-on experience applying what was then cutting-edge language-model technology, specifically BERT, to build a custom model.

