NSF Awards: 2017042
We aim to increase participation and interest in groups traditionally underrepresented in the educational and career pathways of computer science (CS), including women and students from some minority groups, who often exhibit lower confidence in STEM-related abilities relative to other students. This project explores how embodied experience in extended reality can support human-centered, collaborative computing through the development of an Embodied Coding Environment (ECE) for the creation of interactive experiences.

The ECE comprises tools that support drawing and gestures, smart selection, a smart search/command bar, a text editor with syntax highlighting, grouping and movement tools for organizing elements in space, and the ability to save and share projects through the cloud. It allows users to be immersed in their problem space and to contextualize their code within that space. It also offers an abstract annotation system in which 'annotations' take the form of 3D drawn lines, hand gestures, spoken comments, quick 3D models, and more, since free-form diagramming or whiteboarding can play a key role in understanding and breaking down problems and in working through the design of algorithms to solve them.

Finally, coders can use spatial signals from hand and controller tracking to directly select locations in space and to specify movements, control signals, and other parameters through gestural time-series input. Controller and hand movements can be recorded and visualized as annotations or data within the ECE, and these data can be linked as input to programming nodes to drive a variety of processes. Our next goal is to field-test this system with high school CS learners from underrepresented backgrounds.
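As a concrete illustration of that last idea, the following minimal Python sketch shows how a recorded controller path could be linked as the input of a programming node. All class and method names here are hypothetical stand-ins, not the ECE's actual API:

    # Hypothetical sketch: a controller-movement time series linked as the
    # input of a programming node. Names are illustrative, not from the ECE.
    from __future__ import annotations
    from dataclasses import dataclass, field
    from typing import Callable, Optional

    Sample = tuple[float, float, float, float]  # (time, x, y, z)

    @dataclass
    class GestureRecording:
        """A time series of controller samples captured while recording."""
        samples: list[Sample] = field(default_factory=list)

        def record(self, t: float, x: float, y: float, z: float) -> None:
            self.samples.append((t, x, y, z))

    @dataclass
    class ProgrammingNode:
        """A node whose process is driven by a linked gesture recording."""
        name: str
        process: Callable[[Sample], None]
        source: Optional[GestureRecording] = None

        def run(self) -> None:
            if self.source is None:
                return
            for sample in self.source.samples:  # replay the gesture as data
                self.process(sample)

    # Usage: a recorded hand path drives an object's position over time.
    path = GestureRecording()
    path.record(0.0, 0.0, 1.2, 0.5)
    path.record(0.1, 0.1, 1.3, 0.5)
    mover = ProgrammingNode("move-object",
                            lambda s: print(f"t={s[0]:.1f} pos={s[1:]}"),
                            path)
    mover.run()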
Ying Wu
Project Scientist
Welcome to the Embodied Coding Environment!
This NSF-funded project explores how the affordances of 3D space in Virtual and Augmented Reality can be leveraged to support computational concept learning. This work is motivated by embodied learning theory, which centers on the idea that learners’ abilities to understand and reason about functions, algorithms, conditionals, and other abstract computational concepts stem in part from more fundamental sensorimotor and perceptual experiences of the physical world. Key features of the Embodied Coding Environment include the following:
1) a novel visual-spatial XR representation of coding that allows immersion in the problem and design spaces (a toy sketch of this idea follows the list);
2) whiteboarding/annotation tools situated in a shared environment with code activities; and
3) gesture and movement paths for the direct specification of program instrumentation and data.
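To make feature 1 more tangible, here is a deliberately tiny sketch (hypothetical Python, not the ECE's real data model) of code elements that carry positions in 3D space:

    # Hypothetical sketch of feature 1: program elements as nodes that
    # occupy positions in the room, so they can be grouped and moved.
    from dataclasses import dataclass

    @dataclass
    class CodeNode:
        label: str                             # e.g. a function or operator
        position: tuple[float, float, float]   # (x, y, z) in room coordinates

    @dataclass
    class Wire:
        source: CodeNode                       # data flows source -> target
        target: CodeNode

    # A two-node program laid out next to the objects it manipulates.
    emitter = CodeNode("spawn_sphere", (0.0, 1.5, 2.0))
    gravity = CodeNode("apply_gravity", (0.5, 1.5, 2.0))
    link = Wire(emitter, gravity)

The point is simply that each program element carries a location, so spatial grouping and movement tools can operate on code the way they operate on any other object in the scene.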
Our current goal centers on testing the impact of 3D spatial coding in the classroom. The research team is building partnerships with San Diego high schools and developing customized coding lessons to be implemented in the Embodied Coding Environment.
Your comments and feedback are greatly appreciated!
https://embodiedcode.net/
https://insight.ucsd.edu/
Dan Roy
Research Scientist, Interest-based Learning Mentor, Learning Game Designer
Thanks for sharing Embodied Code. It looks like an intriguing tool to explore. A few questions about the role embodied cognition plays in making learning more accessible:
-How much of the value of embodied learning in CS comes from making the abstract concrete?
-Is this a question you explored or found in the literature?
-Does making the abstract concrete without embodiment capture much of the value?
A few questions about the value of sensorimotor and perceptual experiences of the physical world:
-To the extent that it makes learning easier, could it be from reducing cognitive load?
-Maybe processing certain ideas without using the body takes more mental effort, to the point that it becomes less efficient.
-Should we think of embodied cognition like adding a GPU to a task that's currently relying fully on a CPU?
About the kinds of topics best suited to embodied cognition:
-Does EC help with anything abstract and challenging?
-Does it mainly help with spatial concepts?
-How much overlap is there between spatial concepts and challenging abstract concepts?
Have you already tested Embodied Code with users? What reactions have you seen? Any insights into efficacy, and for which topics in particular?
How would you like to build on Embodied Code going forward, if you get the chance?
Looking forward to discussing more!
Victor Minces
Hey Ying! Can't wait to try it. Should we have an in-person meeting now that we can? Our flow-based music programming language is ready to use; I can show it to you as well.
Ying Wu
Project Scientist
Hi Victor! Yes -- we should get together in person soon. Perhaps next week or the week after that? I will see if Robert, Tommy, and Amy are available for a meeting -- because I'm sure they would love to see your music programming language as well.
Lorna Quandt
Asst. Professor, Educational Neuroscience
Hello team! I love this project! I also work in VR development, using gestures and sign language, driven by theories of embodied cognition. So we are speaking the same language.
There is so much potential here to build embodied coding--in my team, we have discussed the idea of using sign language as well, so that deaf signers can code in VR space using a mixture of sign, text, and gesture. We have not yet developed anything like that, but we think it would be a very cool idea and I see you're thinking along those same lines already.
Your environment appears powerful and also rather complex. How do you teach users how to use the commands and execute actions in the space? Do you have any sense of how long it takes for people to become familiar with the coding environment? I would also be curious if principles of universal design can be applied to this framework to make it more accessible to people with disabilities and/or neurodivergence. Have you thought about that at all? Thanks for the interesting video!
Ying Wu
Thanks for these comments, Lorna! We have created several tutorials offering guidance on the use of the system, and we have shared our platform with partners in the San Diego Unified School District. We are hoping to gain feedback from teachers and other educators to ensure that our system is user-friendly and suitable for the coding activities we would like to engage learners in.
Your question about neurodivergence and/or disabilities is very interesting and opens up new considerations for our group. Indeed, one of our developers is very interested in coding through sign language. It would be great if we could bring our groups together to chat sometime! Feel free to email me: yingchoon@gmail.com or message me on Discord: YingChoonWu#9772.
Marcelo Worsley
Assistant Professor
That was a very informative video. I really appreciate the different modes of interaction with your coding platform. One question I have is whether or not you all incorporate sound into the experiences. Would participants hear the created spheres fall? Also, how are you handling version control? How easy is it for students to undo or redo previous actions?
Ying Wu
Project Scientist
Thanks, Marcelo, for your suggestions and questions. With respect to version control, we haven't implemented a version control system or undo feature yet, but the plan is to save the code to a server that uses git for version control, so users can undo and redo by stepping through automated commits (when those commits will be made is still up for discussion). Additionally, the Unity engine has its own undo system that, once utilized by our system, will allow users to undo smaller actions.

With respect to sound, you hit on an excellent topic that Victor Minces (above) and I have discussed. In short, the answer is yes: sound can and will be incorporated. Our longer-term plan is to customize the environment for particular users and classroom objectives; depending on the activities that the platform will be supporting for a particular group, we will add sound and build out the environment.
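To sketch the general idea in Python (illustrative only, not our actual implementation, and assuming each project is saved as files in a local git working tree):

    # Minimal sketch: undo/redo by stepping through automated git commits.
    import subprocess

    def git(repo: str, *args: str) -> str:
        """Run a git command in the given working tree; return its output."""
        result = subprocess.run(["git", "-C", repo, *args],
                                check=True, capture_output=True, text=True)
        return result.stdout.strip()

    class AutoVersioner:
        """Undo/redo over automated 'auto-save' commits."""

        def __init__(self, repo: str):
            self.repo = repo
            self.undone: list[str] = []  # commits popped by undo, kept for redo

        def save(self, message: str = "auto-save") -> None:
            git(self.repo, "add", "-A")
            git(self.repo, "commit", "--allow-empty", "-m", message)
            self.undone.clear()          # a new save invalidates the redo stack

        def undo(self) -> None:
            self.undone.append(git(self.repo, "rev-parse", "HEAD"))
            git(self.repo, "reset", "--hard", "HEAD~1")

        def redo(self) -> None:
            if self.undone:
                git(self.repo, "reset", "--hard", self.undone.pop())

Because git keeps the popped commits reachable through the reflog, resetting back to a saved hash restores them cleanly; Unity's own undo stack would then handle the smaller, in-session actions.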