```python
class AlyyanAhmed:
    def __init__(self):
        self.role = "AI/ML Engineer"
        self.focus_areas = [
            "Computer Vision 👁️",
            "Agentic LLMs 🤖",
            "Production MLOps ⚙️"
        ]
        self.expertise = {
            "pipeline": "research → training → deployment → monitoring",
            "infrastructure": ["CI/CD", "Docker", "Model Serving", "Edge AI"],
            "frameworks": ["LangChain", "LangGraph", "RAG", "Agentic Workflows"]
        }

    def now(self):
        return "shipping ML that actually works in production"
```
Featured Projects:
- End-to-end MLOps pipeline with version control, CI/CD, and live Hugging Face deployment with an interactive Gradio interface.
- Natural language to SQL with real-time analytics. Hybrid RAG + SQL architecture with a coaching-style feedback loop.
- Edge-optimised CNN for low-resource environments. Clinical-grade accuracy on mobile and embedded devices.
- Multi-class transfer learning with advanced augmentation. Full evaluation suite and detailed classification reports.
Other Notable Work:
- 🤖 Smart Cleaning Bot – Autonomous navigation with real-time edge inference (Raspberry Pi · Arduino · TFLite) – View
Result: 🏆 Top 12% Finish (0.9802 Balanced Accuracy OOF)
- Built a weighted ensemble using XGBoost (GPU) and LightGBM for multiclass irrigation prediction.
- Engineered advanced features including digit extraction, N-gram composites, and community-inspired "Magic Score" features.
- Applied in-fold multi-class Target Encoding with weighted augmentation from the original dataset.
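The weighted-ensemble step above can be sketched with plain NumPy. The weights and toy probability matrices below are illustrative placeholders, not the competition values; in practice the weights are tuned against out-of-fold scores:

```python
import numpy as np

def weighted_ensemble(proba_a, proba_b, w_a=0.6, w_b=0.4):
    """Blend two models' class-probability matrices and take the argmax.

    The weights here are illustrative; they would be tuned on OOF metrics.
    """
    blended = w_a * np.asarray(proba_a) + w_b * np.asarray(proba_b)
    return blended.argmax(axis=1)

# Toy matrices standing in for XGBoost / LightGBM predict_proba output
p_xgb = np.array([[0.7, 0.2, 0.1], [0.1, 0.5, 0.4]])
p_lgb = np.array([[0.6, 0.3, 0.1], [0.2, 0.2, 0.6]])
preds = weighted_ensemble(p_xgb, p_lgb)  # → array([0, 2])
```

Blending probabilities (rather than hard votes) lets a confident model outvote an uncertain one on a per-row basis.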
Result: 🏆 Top 10% Finish (0.917 AUC)
- Built a 2-Level Stacking Ensemble combining LightGBM, XGBoost, CatBoost, Ridge, and Pytabkit.
- Engineered advanced features including Nested Target Encoding, N-grams, and Z-scores.
- Maintained disciplined 10-fold CV with a Logistic Regression meta-learner.
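A minimal sketch of that stacking setup. Scikit-learn estimators (GradientBoosting, RandomForest, Ridge) stand in for the LightGBM/XGBoost/CatBoost/Pytabkit base learners, which are assumptions for portability; the Logistic Regression meta-learner and k-fold out-of-fold scheme match the description above:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression, RidgeClassifier

# Synthetic stand-in for the competition data
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("gbt", GradientBoostingClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("ridge", RidgeClassifier()),
    ],
    final_estimator=LogisticRegression(),  # level-2 meta-learner
    cv=10,  # base learners feed 10-fold out-of-fold predictions to level 2
)
stack.fit(X, y)
train_acc = stack.score(X, y)
```

The `cv=10` argument is what keeps the meta-learner honest: it only ever sees predictions made on folds the base models were not trained on.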
Result: 🎯 Score: 0.955 ROC-AUC
- Developed a CPU-optimized AutoGluon pipeline keeping training under 6 hours.
- Utilized L1 & L2 Stacking with Gradient Boosted Trees (XGB/LGBM/Cat).
- Executed strategic data alignment and sampling of external UCI datasets.
Result: 💡 Tabular Data Optimization
- Handled severe class imbalance using SMOTE and class weights.
- Trained robust XGBoost & Random Forest ensemble models.
- Automated hyperparameter tuning via Optuna.
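The class-weight half of that imbalance handling needs no library at all: the "balanced" heuristic (the same formula behind scikit-learn's `class_weight="balanced"`) upweights rare classes in inverse proportion to their frequency. SMOTE itself lives in imbalanced-learn and is omitted from this sketch:

```python
from collections import Counter

def balanced_class_weights(labels):
    """'balanced' heuristic: weight_c = n_samples / (n_classes * count_c)."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * cnt) for c, cnt in counts.items()}

# 9:1 imbalance → the minority class is weighted 9x heavier
weights = balanced_class_weights([0] * 90 + [1] * 10)  # → {0: 0.555..., 1: 5.0}
```

The resulting dict can be passed straight to XGBoost (via per-sample weights) or to Random Forest's `class_weight` parameter.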
```mermaid
timeline
    Jul 2023 - Sep 2023 : MACHINE LEARNING INTERN – Air University
                        : Implemented ML algorithms for anomaly detection
                        : Analyzed large-scale network traffic datasets
                        : Built predictive analytics for cybersecurity
                        : Supported AI model development for threat detection
    Jul 2025 - Sep 2025 : AI & ML INTERN – CSERA
                        : Built Vision + GenAI inference systems
                        : Enhanced detection pipelines using YOLOv8 + OpenCV
                        : Integrated MLOps workflows with PyTorch + MLflow + FastAPI
    Nov 2025 - Present : JR. AI ENGINEER – iOPTIME
                       : EV battery analytics ML systems (91% acc.)
                       : XGBoost, LSTM, Bi-LSTM, PINNs deployment
                       : End-to-end ML pipelines + APIs
                       : RAG-powered automated auditing platform
                       : Inference validation + diagnostic tooling
```
```mermaid
mindmap
  root((SKILLS USED))
    AI/ML
      PyTorch
      TensorFlow
      Scikit-learn
      XGBoost
      PINNs
    Computer Vision
      YOLOv8
      OpenCV
      Tracking
      Video Analytics
    Backend
      FastAPI
      Flask
      REST APIs
      RAG Systems
    MLOps
      MLflow
      Docker
      CI/CD
      Deployment
```
| Area | Details |
|---|---|
| 🤖 Agentic AI | Tool use, memory systems, multi-step planning |
| 📚 Industrial RAG | Hybrid RAG + SQL for enterprise scale |
| 🚀 Scalable ML | FastAPI, Docker, GPU/TPU deployments |
| 🤖 Robotics | Path planning, RL, control systems |
| 💻 Local LLMs | vLLM, Ollama, efficient inference |
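At its core, the "Hybrid RAG + SQL" pattern in the table above is a router: aggregate-style questions go to the warehouse, open-ended ones go to retrieval. A deliberately tiny stdlib sketch, where the table, documents, keyword routing, and word-overlap retrieval are all toy assumptions (a production system would have an LLM generate the SQL and a vector store do the retrieval):

```python
import sqlite3

# In-memory table standing in for the enterprise warehouse (assumption)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0)])

# Hypothetical knowledge-base documents for the RAG path
DOCS = {
    "onboarding": "How to onboard a new analyst to the BI stack.",
    "refunds": "Refund policy for enterprise customers.",
}

def answer(query: str):
    """Route aggregate questions to SQL, everything else to retrieval."""
    if any(k in query.lower() for k in ("total", "sum", "average", "count")):
        # Toy SQL path: a real system would generate the query with an LLM
        return conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
    # Naive retrieval: pick the doc sharing the most words with the query
    words = set(query.lower().split())
    return max(DOCS.values(), key=lambda d: len(words & set(d.lower().split())))
```

Keeping the two paths behind one `answer()` entry point is what lets the same chat interface serve both live analytics and policy lookups.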