This project implements and compares two models for next-word prediction: a Recurrent Neural Network (RNN) built on Gated Recurrent Units (GRU) and a Transformer. Both models are trained on a Persian Wikipedia dataset, making them suitable for Persian text generation tasks.
You can use this software for applications such as building chatbots or generating stories. Our goal is to make it easy for you to experiment with these models.
Follow these steps to download and run the application.
To start using the project, visit this page to download:
- Find the version you want to download and click on it.
- Choose the file that matches your operating system.
- Click the file to download it.
To run this application, your system should meet the following requirements:
- Operating System: Windows 10 or later, macOS Catalina or later, or any recent Linux distribution.
- Memory (RAM): At least 4GB of RAM.
- Disk Space: At least 1GB of available storage space.
Once you have downloaded the file, follow these steps to install:
- Locate the downloaded file on your computer.
- Double-click the file to start the installation.
- Follow the on-screen instructions to complete the process.
After installation, you can run the application by:
- Finding the application’s shortcut on your desktop or in your applications folder.
- Double-clicking the shortcut to open it.
Once the application is open, you will see options to select the RNN or Transformer model for your project. Choose the model you wish to test, follow the prompts, and start generating text.
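Under the hood, text generation with either model boils down to repeated next-word prediction with top-k sampling (one of the techniques this project covers). The sketch below illustrates that loop; the `dummy` model is a stand-in that always favours token 7, whereas the real application uses the trained GRU or Transformer:

```python
import numpy as np

def top_k_sample(logits, k=5, rng=None):
    """Sample one token id from the k highest-scoring logits."""
    if rng is None:
        rng = np.random.default_rng(0)
    top = np.argsort(logits)[-k:]                 # indices of the k best tokens
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                          # softmax over the top-k only
    return int(rng.choice(top, p=probs))

def generate(model, prompt_ids, steps=3, k=5):
    """Extend a token-id prompt by `steps` sampled tokens."""
    ids = list(prompt_ids)
    for _ in range(steps):
        ids.append(top_k_sample(model(ids), k=k))
    return ids

# Stand-in "model": fixed logits that put all the weight on token 7.
vocab_size = 10
dummy = lambda ids: np.eye(vocab_size)[7] * 5.0

# With k=1 sampling is deterministic, so token 7 is chosen every step.
print(generate(dummy, [1, 2], steps=3, k=1))  # → [1, 2, 7, 7, 7]
```

Larger values of `k` trade determinism for variety: the model can pick any of the k most likely words, which keeps generated text from repeating itself.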
The project includes several features designed for ease of use and functionality:
- Next-Word Prediction: Generate the next words for a given input text.
- Model Comparison: Compare the RNN and Transformer models on the same data (for example, by perplexity).
- Dataset Utilization: Uses a curated Persian Wikipedia dataset for better context and relevance.
- User Interface: A simple interface that allows for straightforward input and model selection.
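Model comparison in language modeling is usually framed in terms of perplexity: the exponential of the average negative log-probability the model assigns to the true next words (lower is better). A minimal NumPy sketch, with made-up probabilities purely for illustration:

```python
import numpy as np

def perplexity(token_probs):
    """Perplexity = exp(mean negative log-probability of each true token)."""
    return float(np.exp(-np.mean(np.log(token_probs))))

# Probabilities each model assigned to the true next word (illustrative numbers).
rnn_probs         = [0.10, 0.25, 0.05, 0.20]
transformer_probs = [0.20, 0.30, 0.15, 0.25]

print(f"RNN perplexity:         {perplexity(rnn_probs):.2f}")
print(f"Transformer perplexity: {perplexity(transformer_probs):.2f}")
# Lower perplexity means the model was less "surprised" by the test text.
```

A model that always assigned probability 0.5 to the correct word would have a perplexity of exactly 2, as if it were choosing uniformly between two words at every step.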
If you're interested in understanding more about RNNs or Transformers, consider checking out the following resources:
- RNN and GRU Tutorial: An introduction to Recurrent Neural Networks and how GRUs work.
- Transformers Explained: A beginner-friendly explanation of the Transformer architecture and its applications.
- Persian NLP Tutorial: An introduction to natural language processing techniques specifically for Persian text.
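To connect the GRU material to code: a single GRU step combines an update gate, a reset gate, and a candidate state. The NumPy sketch below uses one common formulation with random illustrative weights; the dimensions and names are not taken from this project's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: returns the new hidden state."""
    z = sigmoid(Wz @ x + Uz @ h)             # update gate: how much to overwrite
    r = sigmoid(Wr @ x + Ur @ h)             # reset gate: how much old state to read
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h)) # candidate state
    return (1.0 - z) * h + z * h_tilde       # blend old state and candidate

rng = np.random.default_rng(0)
d_in, d_hid = 4, 3
# Six weight matrices: W* act on the input, U* on the hidden state.
params = [rng.standard_normal((d_hid, d_in if i % 2 == 0 else d_hid)) * 0.1
          for i in range(6)]

h = np.zeros(d_hid)
for x in rng.standard_normal((5, d_in)):     # run the cell over 5 input vectors
    h = gru_step(x, h, *params)
print(h.shape)                               # → (3,)
```

Because the new state is a convex blend of the old state and a tanh candidate, every component of `h` stays in (-1, 1), which is part of why GRUs train more stably than plain RNNs.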
We welcome contributions! If you'd like to help with this project, please follow these steps:
- Fork the repository.
- Create a new branch for your feature or bug fix.
- Make your changes and commit them with a clear message.
- Push your changes back to your fork.
- Submit a pull request to the original repository.
If you encounter any issues, feel free to contact us through our GitHub page. We’ll do our best to assist you.
For more information and community support, consider checking out relevant forums and discussion groups focused on machine learning and natural language processing.
The project covers several topics, including:
- Course project
- Encoder-decoder structures
- Generative models
- GRU
- Hazm
- Multi-head attention
- N-grams
- Perplexity
- Persian NLP
- Persian text generation
- Persian text preprocessing
- Positional encoding
- Recurrent neural networks (RNN)
- Top-k sampling
- Transformer architecture
- University project
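Several of these topics are easy to show concretely. For instance, the sinusoidal positional encoding from the original Transformer paper, which injects word-order information into the model, can be computed as:

```python
import numpy as np

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding:
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    pos = np.arange(max_len)[:, None]          # (max_len, 1) positions
    i = np.arange(0, d_model, 2)[None, :]      # even embedding dimensions
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # sines in even columns
    pe[:, 1::2] = np.cos(angles)               # cosines in odd columns
    return pe

pe = positional_encoding(max_len=50, d_model=16)
print(pe.shape)    # → (50, 16)
print(pe[0, :4])   # position 0 alternates sin(0)=0 and cos(0)=1
```

Each position gets a unique pattern of sines and cosines at different frequencies, so the attention layers can tell nearby words from distant ones without any recurrence.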
For additional information, see the documentation file included in the release.
Here are some helpful links to further explore the project:
- GitHub Repository
- Documentation
- More about RNNs and Transformers on Wikipedia.
Download Latest Release and dive into the world of text generation today.