Title: Unsupervised Semantic Change Detection in Arabic
Type: Thesis
Authors: Dubossarsky, Haim; Sindi, Kenan
Date issued: 2023-10-23
Date deposited: 2024-02-01
URI: https://hdl.handle.net/20.500.14154/71355
Pages: 15
Language: English
Keywords: Natural Language Processing; Arabic NLP; Language Models; BERT; Data Science; Semantic Change; Unsupervised

Abstract: This study employs pretrained BERT models, namely AraBERT, CAMeLBERT (CA), and CAMeLBERT (MSA), to investigate semantic change in Arabic across distinct time periods. Analysis of word embeddings and cosine distance scores reveals variation in how well each model captures semantic shifts. The research highlights the significance of training data quality and diversity, while acknowledging limitations in data scope. The project's outcome, a list of the most stable and most changed words, contributes to Arabic NLP by shedding light on semantic change detection and suggesting model selection strategies and areas for future exploration.
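The abstract only names the method in passing, so the following is a minimal sketch of the cosine-distance approach it describes, assuming the HuggingFace transformers library and the public AraBERT checkpoint aubmindlab/bert-base-arabertv2; the helper functions and the per-period sentence lists are hypothetical illustrations, not the thesis's actual pipeline.

```python
# Hedged sketch: detect semantic shift of a word between two time periods by
# comparing its averaged contextual BERT embeddings with cosine distance.
# Model name, helpers, and data layout are assumptions, not the thesis's code.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "aubmindlab/bert-base-arabertv2"  # a CAMeLBERT checkpoint would also work
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()


def word_embedding(sentence: str, word: str):
    """Average the last-layer hidden states over the target word's subword span."""
    word_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    enc = tokenizer(sentence, return_tensors="pt", truncation=True)
    ids = enc["input_ids"][0].tolist()
    # Locate the word's subword-id sequence inside the tokenized sentence.
    for i in range(len(ids) - len(word_ids) + 1):
        if ids[i : i + len(word_ids)] == word_ids:
            with torch.no_grad():
                hidden = model(**enc).last_hidden_state[0]
            return hidden[i : i + len(word_ids)].mean(dim=0)
    return None  # target word not found in this sentence


def period_embedding(sentences, word):
    """Average the word's contextual embeddings over one period's usages."""
    vecs = [v for s in sentences if (v := word_embedding(s, word)) is not None]
    return torch.stack(vecs).mean(dim=0)


def cosine_distance(word, period_a_sentences, period_b_sentences) -> float:
    """Higher distance suggests a larger semantic shift between the two periods."""
    a = period_embedding(period_a_sentences, word)
    b = period_embedding(period_b_sentences, word)
    return 1.0 - torch.nn.functional.cosine_similarity(a, b, dim=0).item()
```

Ranking a vocabulary by this distance would yield lists of the most stable (lowest-distance) and most changed (highest-distance) words, the kind of output the abstract reports; comparing the rankings produced by AraBERT and the two CAMeLBERT variants is one way the influence of training data quality and diversity could surface.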