English-Sinhala multilingual word embedding alignment resources
Updated Feb 15, 2025 · Jupyter Notebook
An implementation of knowledge distillation using contrastive learning (CLIP loss) to transfer knowledge from a large teacher model (OpenAI's CLIP) to a smaller multilingual student model (LaBSE) on a parallel English-Persian dataset.
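The distillation objective described above can be sketched as a symmetric contrastive (CLIP-style) loss: for a batch of parallel sentences, the student's embedding of each target-language sentence is trained to be most similar to the teacher's embedding of its English counterpart. The function below is a minimal NumPy illustration of that loss, not the repository's actual code; the names `teacher_emb`, `student_emb`, and the temperature value are assumptions.

```python
import numpy as np

def clip_distillation_loss(teacher_emb, student_emb, temperature=0.07):
    """Symmetric contrastive loss between teacher and student embeddings.

    Rows of the two matrices are assumed to be parallel sentence pairs,
    so the i-th student embedding should match the i-th teacher embedding.
    """
    # L2-normalize so dot products are cosine similarities
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    # similarity logits, scaled by a temperature (0.07 is an assumed default)
    logits = (s @ t.T) / temperature
    n = logits.shape[0]

    def xent(lg):
        # cross-entropy with the diagonal (matched pair) as the target class
        lg = lg - lg.max(axis=1, keepdims=True)  # numerical stability
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -logp[np.arange(n), np.arange(n)].mean()

    # average the student->teacher and teacher->student directions
    return 0.5 * (xent(logits) + xent(logits.T))
```

Correctly aligned batches yield a lower loss than shuffled ones, which is the signal that pushes the student's multilingual space toward the teacher's.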
A multilingual, topic-based language learning platform powered by generative AI. Learn real-life concepts (driving, digestion, emotion) in Spanish, French, Portuguese, and Italian, while science, math, and well-being ideas are woven in through natural everyday examples. Built for educational AI applications and generative-UX prototyping.