Through its Education Innovation Grants, the Jameel World Education Lab (J-WEL) at MIT Open Learning aspires to develop the building blocks, ideas, and connections that power global transformation in learning. J-WEL grants support educational innovations across a rich variety of fields, including linguistics, mechanical engineering, literature, architecture, physics, management, political science, and more. More than $5.8 million in funding has been awarded to MIT researchers since 2017.
As part of an ongoing series, we are taking a closer look at each 2023 grantee’s projects. In the spotlight today is John Liu, principal investigator of the Learning Engineering and Practice Group (LEAP), a lecturer in the Department of Mechanical Engineering and a digital learning scientist at MIT Open Learning. His project, “Developing an ACT-R and error-based cognitive architecture for the development of virtual reality hands-on training,” aims to utilize extended reality (XR) – a broad term that encompasses virtual reality (VR), augmented reality (AR), and mixed reality (MR) – as a promising instructional tool for manufacturing workforce training.
Hands-on manufacturing skills are crucial to compete in today’s technologically dependent world. However, the pace of advancement in manufacturing technology is creating a widening gap between skills supply and demand. While conventional training is effective, it requires significant resources, is not scalable, and has a reduced ‘shelf life,’ according to Liu. At the same time, macroeconomic shifts, including recent efforts to reroute strategic supply chains through U.S. allies and reshore manufacturing, are leading to surges in demand for engineers and technicians.
Liu says that XR presents a compelling opportunity to address large-scale demand for hands-on STEM skills because of the technology’s three-dimensional immersion, scalability, responsiveness, and ability to capture and process user data. In particular, XR intelligent tutoring systems (ITS) can provide customized feedback and craft individual learning trajectories, and could support students in non-traditional pathways, including online credential programs, community colleges, and vocational education institutions.
What excites you most about your project?
The potential to help millions of people who could pick up in-demand skills to get good jobs in the manufacturing industry.
We anticipate that an XR ITS based on the proposed cognitive framework will lead to better cognitive, affective, and psychomotor outcomes for introductory manufacturing students in high schools, community colleges, and universities. These competencies in turn will better enable students to enter and succeed in high-demand manufacturing careers. A successful XR module is scalable and could be offered as an example of how XR can address the widening gap between skills supply and demand.
What problem or challenge is your project trying to solve?
We are proposing to build and demonstrate a prototype of a virtual reality intelligent tutoring system (ITS) based on Adaptive Control of Thought—Rational (ACT-R) and a cognitive framework of errors.
ACT-R is a cognitive architecture that attempts to explain how the human mind works. It is based on the idea that people act rationally, and it aims to model the human brain in order to predict and analyze human behavior. In theory, each task that humans can perform should consist of a series of discrete operations. Most of ACT-R's basic assumptions are also inspired by progress in cognitive neuroscience, and ACT-R can be described as a way of specifying how the brain is organized so that individual processing modules can produce cognition.
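To make the "series of discrete operations" idea concrete, here is a minimal toy sketch of a production-rule system in the spirit of ACT-R. This is not the actual ACT-R software; the drilling-related rules and the `step` function are illustrative assumptions only.

```python
# Toy illustration (NOT the ACT-R implementation) of the core idea:
# cognition modeled as discrete production rules that fire when their
# conditions match the current contents of working memory.

def step(memory, rules):
    """Fire the first rule whose condition matches; return updated memory."""
    for condition, action in rules:
        if condition(memory):
            return action(memory)
    return memory  # no rule matched; memory unchanged

# Hypothetical rules for a drilling subtask: check the bit, then start the spindle.
rules = [
    (lambda m: not m.get("bit_checked"),
     lambda m: {**m, "bit_checked": True}),
    (lambda m: m.get("bit_checked") and not m.get("spindle_on"),
     lambda m: {**m, "spindle_on": True}),
]

memory = {}
for _ in range(2):
    memory = step(memory, rules)
# After two steps, both subgoals have been completed in order.
```

In a full architecture, modules for perception and motor action would supply and consume the contents of this working memory; here a plain dictionary stands in for that machinery.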
In ITS design, ACT-R has been successfully applied to cognitive learning, such as computer-based language or math tutoring. We're not aware of any work for VR environments or for learning psychomotor skills. Using a pre- and post-assessment, we will measure cognitive, affective, and psychomotor learning outcomes. A successful development of this new type of VR ITS would be a novel contribution to the field of immersive learning. The VR module we create can support both engineering students and students in non-traditional pathways, such as those in online credential programs, community colleges, and vocational education institutions.
What does success look like for your project, and what milestones are you aiming to achieve?
We plan on generating:
- A taxonomy for mechanical drilling errors categorized by their cognitive mechanisms and potential instructional solutions.
- An XR ITS design based on the ACT-R cognitive architecture. We would populate the architecture – particularly the perceptual-motor modules – using the taxonomy for drilling errors.
- A VR training module to train for manual drilling skills with two types of ITS – a traditional ITS and one based on our cognitive framework.
- Empirical results that compare the learning outcomes from students who train using traditional vs novel XR ITS.
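The first deliverable, an error taxonomy keyed to cognitive mechanisms, could be represented as a simple lookup structure that an ITS queries when it detects a mistake. The sketch below is a hypothetical illustration; the error categories, mechanisms, and feedback strings are assumptions for demonstration, not the project's actual taxonomy.

```python
# Hypothetical sketch of a drilling-error taxonomy mapping an observed error
# to a hypothesized cognitive mechanism and an instructional response.
# All entries are illustrative assumptions, not the project's real taxonomy.
from dataclasses import dataclass

@dataclass
class ErrorEntry:
    error: str        # observable mistake in the VR environment
    mechanism: str    # hypothesized cognitive source of the error
    feedback: str     # instructional response the ITS could deliver

taxonomy = [
    ErrorEntry("wrong spindle speed",
               "declarative: forgot speed/feed relationship",
               "show the speed-feed chart for this material"),
    ErrorEntry("drill wanders off mark",
               "procedural: skipped the center-punch step",
               "prompt the learner to repeat the center-punch step"),
]

def lookup_feedback(error, taxonomy):
    """Return the instructional feedback for a recognized error, or None."""
    for entry in taxonomy:
        if entry.error == error:
            return entry.feedback
    return None
```

Structuring the taxonomy this way would let the same error data populate the perceptual-motor side of the cognitive architecture and drive the tutoring feedback.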
The milestone we're working toward right now is user-testing a small prototype to explore the advantages of the ACT-R-based ITS versus a traditional ITS.
What role does collaboration play in the development and implementation of your project?
The project team is diverse! We have an instructional designer, mechanical engineering students, a UI/UX designer, and a software developer. Having different perspectives across the team allows us to create a more dynamic end product that will better serve learners.