APPLICATION OF LINEAR TRANSFORMATIONS IN THE SCOPE OF LANGUAGE IN THE DIGITAL ERA
DOI:
https://doi.org/10.24853/fbc.9.1.1-12
Keywords:
linear transformation, word embedding, mathematics, algebra
Abstract
Linear transformation is a topic frequently encountered in the study of algebra, and several researchers in mathematics have used linear transformations in their work. Its use, however, is not confined to mathematics; it can also be applied in other fields, one of which is language. This paper reviews several studies in that area that employ linear transformations. In those studies, the researchers report that linear transformations make it easier to analyze the similarity of word meanings from their vector representations, and that the approach is more effective and efficient than other methods.
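To illustrate the idea summarized above, the following is a minimal sketch (not code from the paper) of how a linear transformation can relate two word-embedding spaces: a matrix W is fitted by least squares so that vectors from a source space, mapped by W, land near their counterparts in a target space, and word-meaning similarity is then measured with cosine similarity. The 3-dimensional toy vectors and the word pairs are invented for illustration; real embeddings would come from models such as word2vec or GloVe.

```python
# Minimal sketch: fit a linear transformation between two toy embedding spaces
# and compare word meanings via cosine similarity. All vectors are invented.
import numpy as np

# Hypothetical source-space and target-space embeddings for the same concepts.
X = np.array([[0.9, 0.1, 0.0],   # "raja"
              [0.8, 0.2, 0.1],   # "ratu"
              [0.1, 0.9, 0.2]])  # "buku"
Y = np.array([[1.0, 0.0, 0.1],   # "king"
              [0.9, 0.1, 0.2],   # "queen"
              [0.0, 1.0, 0.3]])  # "book"

# Fit the linear map W by least squares so that X @ W approximates Y.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def cosine(u, v):
    """Cosine similarity between two vectors; values near 1 suggest similar meaning."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Map a source vector into the target space and compare it with target vectors.
mapped = X[0] @ W
print(cosine(mapped, Y[0]))  # high: "raja" maps close to "king"
print(cosine(mapped, Y[2]))  # low: "raja" maps far from "book"
```

In the literature on cross-lingual embeddings, the mapping is often further constrained to be orthogonal, which preserves vector lengths and angles between word vectors.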
License
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).