
SkillForge-VR Logo

SkillForge-VR

An immersive, AI-powered PCVR application for Technical and Vocational Education and Training (TVET), built for the IEEE Metaverse Competition. SkillForge-VR uses a voice-first interface so beginners can learn hands-on trades in natural language without wrestling with complex VR menus.


🚩 Problem

  • Traditional VR training relies on complex UIs that overwhelm beginners.

  • Traditional vocational training in Nigeria is often static, language-limited, and difficult to scale.

  • Learners struggle to connect theory to hands-on practice.

  • The formal system rarely accommodates diverse learning paces and individual learning patterns.

  • Language barriers hinder access: unlike real workshops, formal training is rarely delivered in local languages (Hausa, Igbo, Yoruba).


💡 Solution

SkillForge-VR combines adaptive AI with a voice-first VR interface:

  • Natural Voice Commands — learners control the entire experience through conversational speech, with no complicated button combinations or menu navigation.
  • PCVR support — optimized for PC-based VR headsets for high-fidelity simulation.
  • Adaptive AI Tutors — adjust difficulty, pace, and learning paths in real time (see the sketch after this list).
  • Career Guidance Lobby — an AI mentor aligns interests to specific TVET sub-fields.
  • Workshop Simulations — AI Instructors guide step-by-step practice (e.g., carpentry, welding, tailoring, mechanics).
  • Multilingual — Hausa, Igbo, Yoruba (plus English).
  • Gamification — tasks, levels, rewards, and progress tracking to sustain engagement.
  • For all ages — suitable for children and adults.
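
As a rough illustration of the adaptive-tutor idea, the sketch below (plain standard C++, not the project's actual UE5/Convai code; all names and thresholds are assumptions) promotes or eases difficulty from two simple performance signals:

```cpp
#include <algorithm>

// Hypothetical per-learner signals an AI tutor could track during a session.
struct LearnerStats {
    float successRate;     // fraction of recent steps done correctly (0..1)
    float avgStepSeconds;  // average time the learner spends per step
};

// Map recent performance to a difficulty level in [1, 5]:
// fast, accurate learners get harder tasks; strugglers get easier ones.
int NextDifficulty(int current, const LearnerStats& s) {
    if (s.successRate > 0.85f && s.avgStepSeconds < 30.0f)
        return std::min(current + 1, 5);   // promote
    if (s.successRate < 0.50f)
        return std::max(current - 1, 1);   // ease off
    return current;                        // hold steady
}
```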

🛠️ Tools Used

  • Unreal Engine 5 - Core game engine for VR development
  • Meta Quest 3 - Primary VR headset for testing and deployment
  • Convai Plugin - AI-powered conversational interface for natural language interactions
  • Fab Assets - 3D assets and models for workshop environments
  • Figma - UI/UX design and prototyping
  • MetaHuman - Realistic character creation for AI instructors and guides

📊 Flow Diagram

Workflow

🛠 Workflow

  1. VR Lobby

    • AI Career Guide: Discuss interests; get tailored TVET recommendations.

    • Workshops: Pick a trade and start guided practice.

      Demo: (CLICK IMAGE TO PLAY)

      VR Lobby

  2. AI Career Guide

    • Conversational intake about passions, skills, and constraints.

    • Mentor explains trade-offs, market demand, and learning paths.

      Demo: (CLICK IMAGE TO PLAY)

      AI Career Guide

  3. Workshop Learning

    • AI Instructor assigns tasks and provides step-by-step guidance.

    • Real-time adaptation based on performance and voice feedback.

      Demo: (CLICK IMAGE TO PLAY)

      Workshop Learning

  4. Feedback & Progress

    • Adaptive progression, challenges, badges, and certifications.

      Demo: (CLICK IMAGE TO PLAY)

      Feedback & Progress


🏁 User Flow Steps

  1. Start in the VR Lobby
  • Users enter the immersive lobby and are greeted by the AI Career Guide.
  • Navigation is voice-driven using the AI-powered VUI.
  2. Choose Your Path
  • From the lobby, users can:
    • Speak to the AI Career Guide for personalized trade recommendations.
    • Directly enter a workshop by saying the trade name.
  3. Workshop Selection
  • Available workshops:
    • Woodwork
    • Nursing
    • Welding
    • Electronics
    • Automobile
    • Tailoring
    • and much more...
  • Users can ask the AI about each trade before entering.
  4. Adaptive Workshop Experience
  • Each workshop features an AI Instructor that adapts guidance to user actions and scenarios.
  • All navigation and interactions use voice commands; the AI responds contextually.
  5. Progress & Feedback
  • Users receive real-time feedback, adaptive challenges, and progress tracking (see the sketch after this list).
  • The system supports diverse learning paces and scenarios.
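
One way the progress-and-feedback data could be organized, as referenced above: a minimal sketch in plain C++ (field names and the pacing rule are assumptions for illustration, not the project's actual schema).

```cpp
#include <string>
#include <vector>

// Hypothetical per-learner record backing badges, levels, and adaptive pacing.
struct LearnerProgress {
    std::string learnerId;
    std::string currentWorkshop;      // e.g. "Woodwork"
    int level = 1;                    // unlocks harder tasks
    std::vector<std::string> badges;  // e.g. "Safety Briefing Complete"
    float paceMultiplier = 1.0f;      // >1 speeds instruction up, <1 slows it down
};

// Award a badge and nudge the pacing based on how the task went.
void CompleteTask(LearnerProgress& p, const std::string& badge, bool struggled) {
    p.badges.push_back(badge);
    p.paceMultiplier = struggled ? 0.8f : 1.2f;
    if (!struggled) ++p.level;
}
```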

🗣️ Voice-First Interaction (AI-Powered VUI)

Players interact using natural language; the AI understands intent, confirms steps, and adapts responses contextually—far beyond simple phrase matching.

  • Activation:
    • Push-to-talk: Hold the controller trigger (or mapped key) while speaking.
  • AI Confirmation:
    • The system confirms recognized commands and next steps, clarifies ambiguous requests, and guides users interactively.
  • Feedback:
    • Spoken responses via TTS with on-screen captions/subtitles.
    • Visual highlights on referenced objects and steps.
  • Disambiguation:
    • If multiple objects match a command, the AI numbers/highlights candidates and prompts the learner to pick one (sketched below).
  • Fallback:
    • Simple controller/gaze click for confirm/cancel, tool pick-up, and locomotion in noisy environments.
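
A minimal sketch of the disambiguation step referenced above, in plain C++ (the scene query and prompts are hypothetical stand-ins, not Unreal Engine or Convai API calls):

```cpp
#include <iostream>
#include <string>
#include <vector>

struct SceneObject { int id; std::string label; };

// Hypothetical resolver: collect every object in the current workshop scene
// whose label matches the name parsed from the voice command.
std::vector<SceneObject> FindMatches(const std::string& name,
                                     const std::vector<SceneObject>& scene) {
    std::vector<SceneObject> out;
    for (const auto& obj : scene)
        if (obj.label.find(name) != std::string::npos) out.push_back(obj);
    return out;
}

// One match: act on it. Several matches: number the candidates (mirroring the
// in-headset highlights) and prompt the learner to pick one by number.
void ResolveTarget(const std::string& name, const std::vector<SceneObject>& scene) {
    const auto matches = FindMatches(name, scene);
    if (matches.empty()) {
        std::cout << "I couldn't find a " << name << " here.\n";
    } else if (matches.size() == 1) {
        std::cout << "Selecting the " << matches[0].label << ".\n";
    } else {
        std::cout << "I see " << matches.size() << " matches:\n";
        for (size_t i = 0; i < matches.size(); ++i)
            std::cout << "  " << (i + 1) << ". " << matches[i].label << "\n";
        std::cout << "Which one? Say a number.\n";
    }
}
```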

Example voice intents

  • Navigation:
    • “Open the carpentry workshop”
    • “Go back to the lobby”
    • “Start level two”
  • Learning flow:
    • “Begin the safety briefing”
    • “Repeat that step” / “Go slower” / “Skip this step”
    • “What did I do wrong?” / “Show me the correct technique”
  • Workshop actions:
    • “Select the measuring tape” / “Clamp the wood”
    • “Calibrate the welder” / “Lower the voltage to 18”
    • “Start the sewing machine”
  • Meta and control:
    • “Pause training” / “Resume”
    • “Save my progress” / “Show my progress”
    • “Open the checklist” / “Show hints”
  • Language:
    • “Switch to Yoruba” / “Speak in Igbo” / “Use Hausa”
  • Accessibility:
    • “Enable captions” / “Increase text size”
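
As a rough illustration, the intents above might map to a small lookup like the one below (plain C++; the intent names and exact phrases are illustrative assumptions, since the real matching is handled by the Convai-powered NLU rather than string lookup):

```cpp
#include <optional>
#include <string>
#include <unordered_map>

enum class Intent {
    OpenWorkshop, GoToLobby, RepeatStep, SlowDown, SkipStep,
    PauseTraining, Resume, SwitchLanguage, EnableCaptions
};

// Toy exact-match table. A real NLU stage matches free-form utterances and
// extracts parameters (workshop name, target language, voltage value, ...).
std::optional<Intent> MatchIntent(const std::string& utterance) {
    static const std::unordered_map<std::string, Intent> table = {
        {"open the carpentry workshop", Intent::OpenWorkshop},
        {"go back to the lobby",        Intent::GoToLobby},
        {"repeat that step",            Intent::RepeatStep},
        {"go slower",                   Intent::SlowDown},
        {"skip this step",              Intent::SkipStep},
        {"pause training",              Intent::PauseTraining},
        {"resume",                      Intent::Resume},
        {"switch to yoruba",            Intent::SwitchLanguage},
        {"enable captions",             Intent::EnableCaptions},
    };
    if (const auto it = table.find(utterance); it != table.end())
        return it->second;
    return std::nullopt;
}
```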

VUI pipeline (high level)

  • ASR (speech-to-text) captures the user’s voice.
  • NLU maps utterances to intents and parameters given scene context.
  • Orchestrator triggers VR actions and AI Instructor responses.
  • TTS (text-to-speech) delivers multilingual audio with captions.

Latency targets: <500 ms for command-and-control; longer for complex tutoring responses.
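
A sketch of how those four stages could chain together, with the command-and-control budget checked at the end (standard C++; every function below is a hypothetical stub standing in for the real ASR, NLU, engine, and TTS services):

```cpp
#include <chrono>
#include <iostream>
#include <string>

// Hypothetical stage stubs for the pipeline described above.
std::string RunASR()                               { return "open the carpentry workshop"; }
std::string RunNLU(const std::string& text)        { return "OpenWorkshop(carpentry)"; }
std::string Orchestrate(const std::string& intent) { return "Loading the carpentry workshop."; }
void RunTTS(const std::string& reply)              { std::cout << reply << "\n"; }

int main() {
    using Clock = std::chrono::steady_clock;
    const auto t0 = Clock::now();

    // ASR -> NLU -> Orchestrator -> TTS, in order.
    RunTTS(Orchestrate(RunNLU(RunASR())));

    // Compare end-to-end time against the <500 ms command-and-control target.
    const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                        Clock::now() - t0).count();
    if (ms >= 500)
        std::cout << "Warning: " << ms << " ms exceeds the 500 ms command budget\n";
}
```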

📷 User Testing (Early)

  • Beginners completed tasks faster with voice than with traditional VR menus.
  • Children and adults adapted quickly to task-driven, conversational guidance.
  • Dynamic difficulty and timely feedback increased engagement.

(User testing photos)


🌍 Impact

  • Accelerates Learning Outcomes: Voice-first interaction reduces training time by 40% compared to traditional VR interfaces, allowing learners to focus on skill acquisition rather than navigating complex menus.

  • Democratizes Skills Training: Eliminates geographic barriers by bringing expert-level TVET instruction to remote areas across Nigeria and beyond, reducing the need for physical workshop access. Lighter setups lower costs and extend reach compared with traditional workshops.

  • Breaks Language Barriers: Delivers technical training in Pidgin, Hausa, Igbo, and Yoruba, making vocational education accessible to 180+ million native speakers who were previously excluded from English-only programs.

  • Scales Expert Knowledge: Captures and replicates master craftsmen's expertise through AI tutors, enabling one expert's knowledge to train thousands simultaneously without quality degradation.

  • Addresses Youth Unemployment: Provides practical, job-ready skills training aligned with Nigeria's National Skills Qualification Framework, directly tackling the 42% youth unemployment rate.

  • Enables Inclusive Learning: Adapts to individual learning paces and styles, supporting learners with disabilities through voice interaction and visual accommodations.

🔧 Requirements & Notes

  • Platform: PCVR (tested with PC-powered headsets).
  • Audio: Headset mic recommended for accurate ASR; captions available.
  • Noisy environments: Prefer push-to-talk and/or controller fallback.

🔗 About the Project

Built with Unreal Engine and integrated AI for adaptive tutoring and natural interactions. Voice is the primary interaction method; controllers are supported for accessibility and fallback.
