Research Spotlight: Researchers Use Xbox Technology for Engineering Education
A team of University of Akron researchers is using Microsoft's Kinect™ sensor device for Xbox to show how it can transition from the family room to the classroom. Dr. Yang Yun, associate professor of biomedical engineering; Dr. Philip Allen, professor of psychology; and Dr. Yingcai Xiao, associate professor of computer science, are developing an educational tool that uses Kinect™ to noninvasively stimulate the visual center of students' brains to help them comprehend engineering principles.
The National Science Foundation awarded the researchers $150,000 for the project, titled "Interactive Learning to Stimulate the Brain's Visual Center and to Enhance Memory Retention."
During the project's two-year span, the researchers will use Kinect™ to capture students' gestures, allowing them to interact virtually and in real time with educational materials. The system's natural interface, which connects the visual and motor domains, suits engineering education better than traditional language-based teaching with visual supplementation, the researchers say.
Interactive visual approach helpful to students
"The existing teaching method requires students to process inherently visual-spatial information into language-based codes," says Allen, who adds that this methodology also disrupts the brain's visual component with other inputs, such as oral information, and lowers learning performance. In contrast, the proposed system presents learning materials interactively with rich details, establishing the interactive visual approach as the primary manner of learning.
"Our project is a series of 'video games with a purpose,'" Allen says. "We hope that it will not only be easier to learn and remember information encoded in this manner, but also that it will make engineering exciting to more students."
The project integrates biomedical engineering education with computer science and the biological mechanisms associated with cognition and memory. Kinect™ is a new generation of human-computer interface (HCI) termed a natural user interface (NUI). Equipped with a 3D camera, it captures users' motions and tracks their gestures, Xiao explains.
"With our system, students will be able to use their hands to grab virtual viruses and use their feet to kick virtual bacteria. The full-body interaction allows students to travel into cells to explore various cell subunits and to dive into nanosized molecules, such as DNA and enzymes, to figure out their structures," Xiao says.
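The full-body mapping Xiao describes could be sketched as a per-frame gesture-to-action classifier. This is a minimal illustrative sketch only: the joint names, coordinates, and reach threshold are hypothetical placeholders, not the actual Kinect™ SDK API or the team's implementation.

```python
def classify_interaction(joints, targets, reach=0.3):
    """Map tracked body-joint positions to in-game actions.

    joints:  dict of joint name -> (x, y, z) position in meters (hypothetical names)
    targets: dict of virtual object name -> (x, y, z) position in meters
    Returns a list of (action, object) pairs for the current frame.
    """
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    actions = []
    for name, pos in targets.items():
        # A hand close to a virtual virus counts as a "grab" ...
        if any(dist(joints[h], pos) < reach for h in ("hand_left", "hand_right")):
            actions.append(("grab", name))
        # ... and a foot close to a virtual bacterium counts as a "kick".
        elif any(dist(joints[f], pos) < reach for f in ("foot_left", "foot_right")):
            actions.append(("kick", name))
    return actions

# One simulated skeleton frame and two virtual objects.
joints = {
    "hand_left": (0.0, 1.2, 1.0), "hand_right": (0.5, 1.2, 1.0),
    "foot_left": (0.1, 0.0, 1.0), "foot_right": (0.4, 0.0, 1.0),
}
targets = {"virus_1": (0.5, 1.3, 1.0), "bacterium_1": (0.1, 0.1, 1.0)}
print(classify_interaction(joints, targets))  # → [('grab', 'virus_1'), ('kick', 'bacterium_1')]
```

In a real NUI pipeline the sensor would stream a skeleton frame like this many times per second, and each frame would be classified the same way.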
Learning process could be transformed
When applied to engineering education, this new tool can improve instructors' ability to teach effectively and positively transform the learning process for students, especially for at-risk students, according to Yun.
"If our research team appropriately designs and integrates the education materials into the Kinect™ system, the learning experience can be similar to playing games such as World of Warcraft and Halo. Students will have a natural affinity toward learning and will engage, interact and participate with the educational materials," Yun says.
The researchers will begin their study in January by adopting Kinect™ in two existing UA biomedical engineering courses related to gene therapy, one at the freshman level and the other at the senior level. They will add a third class that uses and promotes Kinect™ as a research and teaching tool.
Their goals are not only to improve educational instruction for K-12 and undergraduate students, but also to encourage students to choose a major in engineering. After they develop the teaching tool, the researchers plan to distribute 18 Kinect™ systems and companion educational materials to McKinley, North and Firestone high schools.
"An important mission of our research team is to reach students from economically disadvantaged backgrounds for whom a career in engineering has traditionally been out of reach," Yun says.