
DA NANG UNIVERSITY OF MEDICAL TECHNOLOGY & PHARMACY IN THE AI-QA PROJECT


Faculty Members from Danang University of Medical Technology and Pharmacy Participate in Workshop on AI Applications in Quality Assurance in Higher Education


From 26 to 28 November 2025, faculty members from Danang University of Medical Technology and Pharmacy participated in Workshop 1: “AI Applications in Quality Assurance in Higher Education”, held in Ho Chi Minh City and Dong Nai under the InnoAIQA-VN Project (2025–2026), within the framework of the DIES – National Multiplication Training (NMT) Programme. The workshop was conducted in person, gathering experts, administrators, and lecturers from higher education institutions across the country.
On the first day, the workshop took place at the Administration Building of Vietnam National University, Ho Chi Minh City (VNU-HCM), focusing on an overview of AI and its strategic directions in quality assurance (QA) in higher education. The sessions highlighted the role of AI in analysing labour market demands, forecasting disciplinary trends, developing learning outcomes based on the Outcome-Based Education (OBE) approach, and supporting curriculum design, course syllabi, examination and assessment rubrics, and digital learning resources.
The second day featured a field visit to Lac Hong University (Dong Nai). Here, participants engaged in direct discussions and experience sharing on implementing AI in academic management and quality assurance. The visit also included tours of exemplary facilities such as the AI Lab, Studio, and Open Lab, along with demonstrations of AI-driven projects currently applied effectively in teaching and scientific research.
On the final day, the programme continued with in-depth sessions on AI applications in monitoring and analysing Self-Assessment Reports (SARs), External Quality Assessment (EQA) reports, developing QA-support chatbots, and fostering continuous improvement. Group discussions and project idea presentations also strengthened professional networking and collaboration among participating institutions.
Participation in this workshop provided valuable opportunities for the faculty members of Danang University of Medical Technology and Pharmacy to keep abreast of emerging trends, enhance professional competencies, and gradually adopt AI-based solutions to support quality assurance and quality enhancement in the near future.
📌 The workshop “AI Applications in Quality Assurance in Higher Education” is a meaningful initiative that helps promote innovative thinking and modern QA practices, contributing to the sustainable development of higher education in the digital era.



STUDY TOUR AT LAC HONG UNIVERSITY

 

Learning from Practical Applications of AI at Lac Hong University

The second day of Workshop 1, themed “AI Applications in Quality Assurance in Higher Education,” held in Ho Chi Minh City and Dong Nai as part of the InnoAIQA-VN Project (2025–2026) under the DIES – National Multiplication Training (NMT) Programme, left a profound impression on us through the opportunity to observe and learn directly at Lac Hong University. For our group of lecturers from Danang University of Medical Technology and Pharmacy, this was not merely a routine study visit; it was an opportunity to gain deeper insight into the practical role of Artificial Intelligence (AI) in quality assurance (QA) and quality enhancement (QE) in higher education.

Right from the beginning of the visit, the sharing session by Dr. Lam Thanh Hien, Rector of Lac Hong University, helped us better understand how a higher education institution develops strategic directions linked with digital transformation and AI integration. What impressed us most was that AI was not implemented in a fragmented or trend-following manner; instead, it was systematically integrated into core functions such as academic administration, quality assurance, scientific research, and student support services. This approach demonstrates the importance of strategic thinking when adopting emerging technologies in education.

The presentation on the learning outcomes measurement system delivered by Dr. Le Phuong Truong provided us with valuable insights. The use of AI to collect and analyse data makes the evaluation of learning outcomes more objective, transparent, and evidence-based. This sparked many reflections on the potential shift from manual processes toward a data-driven approach in our own institution’s QA activities.

Our visit to various learning and research spaces—including the Studio, AI Lab, and Open Lab—was a highly visual and practical experience. It became clear that open learning environments with strong technological infrastructure play a crucial role in motivating both lecturers and students to actively engage with AI in teaching, learning, and research. This is an indispensable factor for the effective and sustainable adoption of AI in higher education.

The afternoon sessions, focusing on AI applications in scientific research and data analytics, continued to offer substantial value. The use of AI to examine research novelty, identify research gaps, support academic writing and editing, and analyse indicators such as dropout rates or course performance demonstrated how AI can serve as a powerful tool supporting lecturers in both teaching and research.

At the end of the second learning day at Lac Hong University, we realized that AI is not a destination but a means to enhance educational quality when implemented appropriately. The practical insights gained from this workshop have become a valuable reference point, enabling us to further reflect on, propose, and gradually apply suitable AI-based solutions to teaching and quality assurance at Danang University of Medical Technology and Pharmacy in the near future.


CHANGE PROJECT


AI-Enabled Quality Assurance Dashboard for Danang University of Medical Technology and Pharmacy

1. Project Title

          “Leveraging Artificial Intelligence to Develop a Quality Assurance Monitoring Dashboard at Danang University of Medical Technology and Pharmacy.”

2. Background and Rationale

    In recent years, Danang University of Medical Technology and Pharmacy has strengthened its Quality Assurance (QA) system to comply with national and regional accreditation standards and to advance continuous improvement. Despite these efforts, several systemic challenges persist:

    (1) Fragmented and manual data management: Survey data, student learning outcomes, and accreditation evidence are stored across multiple platforms. Data compilation—primarily via Excel and Word—is labour-intensive and susceptible to human error.

    (2) Delayed analysis and decision-making: Periodic reporting limits access to real-time data, reducing the institution’s ability to identify emerging issues and implement timely interventions.

    (3) Lack of comprehensive data visualization and early-warning mechanisms: The absence of an institutional KPI Dashboard restricts leadership’s ability to gain a holistic view of academic quality during governance meetings.

    (4) Limited transparency and information accessibility: Faculty members and students have insufficient access to QA data, limiting engagement in quality enhancement processes.

    Opportunity for innovation: The rapid advancement of Artificial Intelligence (AI) offers considerable potential for modernizing QA practices. AI can automate survey analytics, extract insights from qualitative feedback, visualize performance indicators, and issue early-warning alerts. These functions contribute to enhanced transparency and strengthen evidence-based decision-making across academic units.

    Consequently, the project “Leveraging AI to Develop a QA Monitoring Dashboard” aims to directly address current limitations while supporting the institution’s transition toward a data-driven quality culture.
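To make the envisioned survey analytics concrete, the following minimal sketch (Python, standard library only) illustrates one way free-text survey comments could be auto-tagged and aggregated. The keyword lists and function names here are hypothetical illustrations, not part of the project; a production system would use a trained NLP model rather than keyword matching.

```python
from collections import Counter

# Hypothetical keyword lists for illustration only; a real system
# would replace this rule-based stand-in with a trained NLP model.
NEGATIVE = {"unclear", "outdated", "boring", "difficult", "slow"}
POSITIVE = {"helpful", "clear", "engaging", "practical", "excellent"}

def tag_comment(comment: str) -> str:
    """Label a free-text survey comment as positive/negative/neutral."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    neg = len(words & NEGATIVE)
    pos = len(words & POSITIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def summarize(comments: list[str]) -> Counter:
    """Aggregate sentiment tags across all comments for a course."""
    return Counter(tag_comment(c) for c in comments)

comments = [
    "The lectures were clear and practical.",
    "Materials felt outdated and the pacing was slow.",
    "Average course overall.",
]
print(summarize(comments))
# Counter({'positive': 1, 'negative': 1, 'neutral': 1})
```

Even such a simple aggregation shows how per-course sentiment counts could feed a dashboard tile automatically, replacing manual compilation in Excel.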

3. Project Objectives

3.1. General Objective

    To design and implement an AI-enhanced Quality Assurance Dashboard that provides real-time data, supports informed decision-making, and promotes transparency across the university.

3.2. Specific Objectives

    (1) Integrate AI tools to analyse student survey data and learning outcomes, reducing reporting time by ≥50%; complete the pilot analytical module by February 2026.

    (2) Develop a QA Dashboard featuring at least 10 automatically updated KPIs; conduct internal testing by March 2026.

    (3) Promote data-informed decision-making, with ≥80% of QA staff and ≥50% of faculty administrators utilizing the Dashboard by April 2026.

    (4) Enhance transparency and stakeholder engagement: at least 70% of faculty members and 40% of students gain access to QA data; satisfaction levels increase by ≥20%.

4. Scope of Implementation

    - Scale: A pilot phase will be implemented in 1–2 academic faculties, followed by institution-wide deployment.

    - Stakeholder Groups:

        + University Leadership: Gains access to integrated visual analytics to support strategic governance.

        + Quality Assurance Office/Center: Reduces workload in data consolidation and reporting; enhances analytical capacity; strengthens quality enhancement processes.

        + Academic Departments and Faculty Members: Access course-level student feedback and learning outcome data to support timely pedagogical improvement.

        + Students: Benefit from increased access to synthesized, transparent QA information, fostering trust and engagement.

5. Key Activities (PDCA Framework)

Phase | Timeline     | Core Activities
PLAN  | Nov–Dec 2025 | Conduct QA data mapping; assess stakeholder needs; define institutional KPIs; review IT infrastructure readiness.
DO    | Jan–Feb 2026 | Design AI models (ML, NLP); standardize and integrate datasets; develop Dashboard architecture.
CHECK | Mar 2026     | Build prototype Dashboard; integrate data from pilot faculties; conduct pilot testing and collect evaluative feedback.
ACT   | Apr 2026     | Optimize based on feedback; finalize implementation report; scale to institution-wide deployment; propose mechanisms for sustainability.

6. Expected Results

6.1. Outputs

    (1) AI-enhanced QA Dashboard with functionalities including:

        - Monitoring student satisfaction using survey analytics and NLP-based text analysis.

        - Tracking student learning outcomes (pass rates, on-time graduation rates) with disaggregation by program, faculty, and course.

        - Real-time monitoring of program accreditation and self-assessment progress.

        - Visualization of compliance with Course Learning Outcomes (CLOs) and Program Learning Outcomes (PLOs), supporting curriculum review and improvement.

    (2) Automated analytical reports, including:

        - Reports synthesized from student surveys, faculty evaluations, and academic performance data.

        - Trend analysis and predictive risk alerts (e.g., sudden increases in course failure rates).

        - Early-warning notifications sent to leadership and QA units when indicators exceed threshold values.

    (3) Comprehensive user documentation, detailing data entry protocols, report interpretation, user authorization levels, and system maintenance procedures.
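The early-warning notifications described above can be sketched as a simple threshold check. This is an illustration only: the KPI names, threshold values, and comparison directions below are hypothetical placeholders, not institutional policy; the actual Dashboard would load them from configurable QA rules.

```python
from dataclasses import dataclass

@dataclass
class KpiReading:
    name: str
    value: float

# Hypothetical thresholds for illustration; the real Dashboard would
# load these from institutional QA policy rather than hard-coding them.
THRESHOLDS = {
    "course_failure_rate": 0.20,   # alert if more than 20% fail
    "survey_response_rate": 0.60,  # alert if fewer than 60% respond
}

# Direction of comparison: True means "alert when value exceeds threshold".
ALERT_IF_ABOVE = {"course_failure_rate": True, "survey_response_rate": False}

def check_alerts(readings: list[KpiReading]) -> list[str]:
    """Return warning messages for KPIs that breach their thresholds."""
    alerts = []
    for r in readings:
        limit = THRESHOLDS.get(r.name)
        if limit is None:
            continue  # unmonitored indicator: skip
        above = ALERT_IF_ABOVE[r.name]
        if (above and r.value > limit) or (not above and r.value < limit):
            alerts.append(f"ALERT: {r.name} = {r.value:.0%} (threshold {limit:.0%})")
    return alerts

readings = [
    KpiReading("course_failure_rate", 0.27),
    KpiReading("survey_response_rate", 0.72),
]
for msg in check_alerts(readings):
    print(msg)
# ALERT: course_failure_rate = 27% (threshold 20%)
```

In practice, the generated messages would be routed to leadership and QA units through the Dashboard’s notification channel rather than printed.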

6.2. Outcomes

    - Enhanced institutional transparency: Leadership, faculty, and students gain access to objective and timely QA data, fostering shared responsibility for quality enhancement.

    - Improved decision-making capacity: Real-time data visualization enables timely adjustments to curriculum, teaching practices, and strategic planning.

    - Strengthened data-driven QA culture: The institution transitions from manual reporting to AI-supported smart governance.

    - Scalability and transferability: The Dashboard model can be expanded to additional units and adapted by other higher education institutions seeking similar innovations.

7. Potential Risks and Mitigation Strategies

Risk/Barrier | Impact | Likelihood | Mitigation Strategy
Limited AI capacity | Delays and suboptimal analytical performance | Medium | Engage external mentors; collaborate with IT units; provide short-term internal capacity-building.
Fragmented or inconsistent data | Integration challenges and inaccurate analysis | High | Standardize data structures; conduct pilot implementation; establish institutional data governance protocols.
Resistance to change | Low adoption rate | Medium | Implement communication strategies; provide training workshops; adopt phased implementation.
Sustainability concerns | System underutilization | Medium | Establish a QA–AI taskforce; secure annual funding; collaborate with the InnoAIQA-VN network.

8. Support Required from the InnoAIQA-VN Program

8.1. Technical Support

    - Guidance on AI tools for analysing student surveys, learning outcomes, and QA evidence.

    - Consultation on selecting and integrating appropriate data visualization platforms aligned with institutional IT capacity.

8.2. Professional Support

    - Continuous mentorship throughout the design, implementation, piloting, and scaling phases.

    - Advice on ethical standards related to AI in higher education, particularly concerning data privacy, confidentiality, and governance.

8.3. Networking and Knowledge Sharing

    - Connections with national and international experts in AI and QA to exchange best practices.

    - Participation in the program’s professional learning network to share progress, obtain feedback, and access emerging practices in AI-enabled QA.

    - Support for disseminating project outcomes to other higher education institutions, especially those with limited technological resources.
