Bridging User Experience, Feedback Systems, and Instructional Design
- Nancy Puga Leal

April 5, 2026
As I continue developing my instructional design prototype focused on dyslexia-informed, data-driven professional learning, this week’s readings on user experience (UX) and feedback systems from Qualtrics pushed me to think more intentionally about how educators interact with learning environments, and how their experiences can shape both design and outcomes.
One of the most powerful takeaways from The Essential Website Experience & UX Playbook is the emphasis on capturing user feedback at critical interaction points. Strategies such as site exit surveys, help surveys, and page-level feedback highlight a key principle: meaningful design is not static; it evolves through user voice (Qualtrics, 2020). This aligns directly with my research plan’s foundation in design-based research (DBR), where iterative cycles of feedback and refinement are essential (Sandoval, 2013).

For example, site exit surveys, which are triggered when users leave a platform, “get straight to the questions you care about” (Qualtrics, 2020). In my Canvas-based professional learning course, this idea translates into understanding why educators disengage from modules or fail to complete tasks. Are they overwhelmed by the volume of content (Cognitive Load Theory)? Are the instructions unclear? Or do they not perceive the learning as relevant to their classroom context (Situated Learning)? Embedding exit-type reflections could provide critical insight into these questions and strengthen the alignment between design features and teacher needs.
Another point worth examining is how help surveys mirror the importance of responsive, just-in-time support. My research emphasizes formative assessment as a continuous feedback loop, and help surveys function in much the same way, adapting to user needs in real time (Qualtrics, 2020; Wilson, 2005). This reinforces my high-level conjecture map: structured, responsive supports can mediate teacher learning processes, particularly reflection and instructional decision-making.
Perhaps most impactful is the idea of page-level feedback surveys, which ask users directly what is missing or unclear. This resonates deeply with my goal of improving teacher capacity in data literacy and dyslexia-informed practices. If educators can indicate where confusion exists within specific modules (e.g., MTSS planning or interpreting screening data), I can iteratively refine those learning experiences to better support clarity and application (Qualtrics, 2020).
The second Qualtrics resource on employee engagement further expands this thinking by positioning feedback as a driver of organizational growth, creativity, and productivity (Qualtrics, 2020). In many ways, educators in professional development environments function similarly to employees within an organization. Their engagement, sense of empowerment, and perception of value directly influence outcomes. The template’s emphasis on honest, psychologically safe feedback connects to my study’s ethical commitments and the importance of creating a learning environment where teachers feel comfortable sharing challenges and uncertainties.
This also challenges me to think beyond traditional pre/post assessments. My research design already includes quantitative and qualitative data sources (e.g., surveys, reflections, artifacts), but integrating engagement-focused survey items, such as perceptions of collaboration, confidence, and instructional relevance, could provide a more holistic understanding of teacher growth (Qualtrics, 2020).
Ultimately, these readings reinforce a critical insight: effective instructional design is deeply intertwined with user experience design. This connection calls for considering the learner’s perspective throughout the design process, ensuring that educational experiences are not only informative but also engaging and user-friendly. My prototype is an evolving system, continuously shaped and refined through teacher interaction, constructive feedback, and the lived experiences of educators and learners. This iterative process brings diverse viewpoints into the design, which enriches the learning environment and enhances its relevance.
By embedding UX-informed feedback mechanisms into the design, I can strengthen the connection between my theoretical frameworks, conjecture mapping, cognitive load theory, and situated learning, and their real-world application. Conjecture mapping provides a visual representation of the hypotheses that underpin my instructional strategies, clarifying how design elements are expected to interact within the learning experience. Cognitive load theory stresses managing the amount of information presented so that learners are neither overwhelmed nor under-stimulated, which is critical for knowledge retention and skill acquisition. Situated learning, in turn, grounds educational experiences in real-world scenarios and authentic tasks. Integrating these concepts into the design process supports a more cohesive and impactful learning experience, while the feedback mechanisms both assess the effectiveness of the instructional strategies and identify areas for improvement, fostering an adaptive learning environment that evolves in response to user needs.
In essence, the interplay between instructional design and user experience design is crucial for developing educational products that are both theoretically sound and practically applicable. This holistic approach keeps learners engaged, motivated, and able to apply what they have learned in real-life situations. As I continue to refine my prototype, I remain committed to prioritizing user experience, recognizing that it is through this lens that we can transform educational practices and enhance learning outcomes for all participants.
This reflection pushes my work forward by reminding me that the success of my design will not only be measured by outcomes, but by how well it listens, adapts, and responds to the educators it is built to serve.
References
Berryhill, M., Reggio, K., Skinner, K., Piestrzynski, L., & Barrera IV, E. S. (2025). A case study of a new dyslexia course created for university teacher preparation programs. Annals of Dyslexia, 75(3), 592–615. https://doi.org/10.1007/s11881-025-00341-2
Qualtrics LLC. (2020). The essential website experience & UX playbook. Qualtrics.
Qualtrics LLC. (2020). Creating a survey in Qualtrics using a template. Qualtrics.
Reed, D. K., & Zhang, H. (2025). The impact of teacher professional development on the grade 3 reading performance of students with characteristics of dyslexia. Annals of Dyslexia, 75(3), 616–636. https://doi.org/10.1007/s11881-025-00338-x
Sandoval, W. (2013). Conjecture mapping: An approach to systematic educational design research. The Journal of the Learning Sciences, 1–19.
Wilson, B. G. (2005). Broadening our foundation for instructional design: Four pillars of practice. Educational Technology, 45(2), 10–15.