An Embodied, Augmented Reality Coding Platform for Pair Programming
https://embodiedcode.net/

Presenters:

  • Ying Wu (Lead Presenter), Project Scientist, UC San Diego (https://insight.ucsd.edu/our-team/)
  • Amy Eguchi, Associate Teaching Professor, UC San Diego (https://eds.ucsd.edu/discover/people/faculty/eguchi.html)
  • Monica Sweet, Co-Director of Research and Evaluation, UC San Diego
  • Robert Twomey, Assistant Professor, University of Nebraska Lincoln (http://roberttwomey.com)

Public Discussion


  • Ying Wu

    Lead Presenter
    Project Scientist
    May 10, 2022 | 11:45 a.m.

    Welcome to the Embodied Coding Environment

    This NSF-funded project explores how the affordances of 3D space in Virtual and Augmented Reality can be leveraged to support computational concept learning.  This work is motivated by embodied learning theory, which centers on the idea that learners’ abilities to understand and reason about functions, algorithms, conditionals, and other abstract computational concepts stem in part from more fundamental sensorimotor and perceptual experiences of the physical world.  Key features of the Embodied Coding Environment include the following:

    1) a novel visual-spatial XR representation of coding, allowing immersion in the problem and design spaces (a toy sketch of this idea follows the list);

    2) whiteboarding/annotation tools situated in a shared environment with code activities; and

    3) gesture and movement paths for the direct specification of program instrumentation and data.
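
    To make the first feature concrete, here is a toy sketch of the underlying idea: program elements become nodes with positions in 3D space, and execution flows along the links drawn between them. This is illustrative only, not our actual Unity implementation, and all names in it are hypothetical:

    from dataclasses import dataclass, field

    @dataclass
    class CodeNode:
        name: str                                   # e.g. "start", "spawn_sphere"
        position: tuple                             # (x, y, z) placement in the scene
        action: callable = None                     # behavior to run when the node fires
        next: list = field(default_factory=list)    # outgoing links to other nodes

    def run(node, context):
        """Execute a node, then follow its outgoing links."""
        if node.action:
            node.action(context)
        for successor in node.next:
            run(successor, context)

    # Two nodes wired together in space: "start" triggers "spawn_sphere".
    start = CodeNode("start", (0.0, 1.5, 0.0))
    spawn = CodeNode("spawn_sphere", (0.5, 1.5, 0.3),
                     action=lambda ctx: ctx.setdefault("spheres", []).append((0.5, 1.5, 0.3)))
    start.next.append(spawn)

    state = {}
    run(start, state)
    print(state)    # {'spheres': [(0.5, 1.5, 0.3)]}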

    Our current goal centers on testing the impact of 3D spatial coding in the classroom. The research team is building partnerships with San Diego high schools and developing customized coding lessons to be implemented in the Embodied Coding Environment.

    Your comments and feedback are greatly appreciated!

    https://embodiedcode.net/

    https://insight.ucsd.edu/

     
  • Dan Roy

    Facilitator
    Research Scientist, Interest-based Learning Mentor, Learning Game Designer
    May 11, 2022 | 06:26 a.m.

    Thanks for sharing Embodied Code. It looks like an intriguing tool to explore. A few questions about the role embodied cognition plays in making learning more accessible:
    -How much of the value of embodied learning in CS comes from making the abstract concrete?
    -Is this a question you explored or found in the literature?
    -Does making the abstract concrete without embodiment capture much of the value?

    A few questions about the value of sensorimotor and perceptual experiences of the physical world:
    -To the extent that it makes learning easier, could it be from reducing cognitive load?
    -Maybe processing certain ideas without using the body takes more mental effort to the extent that it becomes less efficient.
    -Should we think of embodied cognition like adding a GPU to a task that's currently relying fully on a CPU?

    About the kinds of topics best suited to embodied cognition:
    -Does EC help with anything abstract and challenging? 
    -Does it mainly help with spatial concepts?
    -How much overlap is there between spatial concepts and challenging abstract concepts?

    Have you already tested Embodied Code with users? What reactions have you seen? Any insights into efficacy, and for which topics in particular?

    How would you like to build on Embodied Code going forward, if you get the chance?

    Looking forward to discussing more!

     
  • Victor Minces

    Researcher
    May 11, 2022 | 08:51 p.m.

    Hey Ying! Can't wait to try it. Should we have an in-person meeting now that we can? Our flow-based music programming language is ready to use; I can show it to you as well.

     

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2022 | 11:22 a.m.

    Hi Victor!  Yes -- we should get together in person soon.  Perhaps next week or the week after that?  I will see if Robert, Tommy, and Amy are available for a meeting -- because I'm sure they would love to see your music programming language as well.

  • Lorna Quandt

    Facilitator
    Asst. Professor, Educational Neuroscience
    May 12, 2022 | 09:38 a.m.

    Hello team! I love this project! I also work in VR development, using gestures and sign language, driven by theories of embodied cognition. So we are speaking the same language. 

    There is so much potential here to build embodied coding--in my team, we have discussed the idea of using sign language as well, so that deaf signers can code in VR space using a mixture of sign, text, and gesture. We have not yet developed anything like that, but we think it would be a very cool idea and I see you're thinking along those same lines already. 

    Your environment appears powerful and also rather complex. How do you teach users how to use the commands and execute actions in the space? Do you have any sense of how long it takes for people to become familiar with the coding environment? I would also be curious if principles of universal design can be applied to this framework to make it more accessible to people with disabilities and/or neurodivergence. Have you thought about that at all? Thanks for the interesting video!

     
  • Ying Wu

    Researcher
    May 12, 2022 | 11:13 a.m.

    Thanks for these comments, Lorna! We have created several tutorials offering guidance on the use of the system, and we have shared our platform with partners in the San Diego Unified School District. We are hoping to gather feedback from teachers and other educators to ensure that our system is user-friendly and suitable for the coding activities we would like to engage learners in.

    Your question about neurodivergence and/or disabilities is very interesting and opens up new considerations for our group. Indeed, one of our developers is very interested in coding through sign language. It would be great if we could bring our groups together to chat sometime! Feel free to email me (yingchoon@gmail.com) or message me on Discord: YingChoonWu#9772.

     
  • Marcelo Worsley

    Facilitator
    Assistant Professor
    May 12, 2022 | 04:16 p.m.

    That was a very informative video. I really appreciate the different modes of interaction with your coding platform. One question I have is whether or not you all incorporate sound into the experiences. Would participants hear the created spheres fall? Also, how are you handling version control? How easy is it for students to undo or redo previous actions?

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2022 | 06:55 p.m.

    Thanks, Marcelo, for your suggestions and questions. With respect to version control, we haven't implemented a version control system or undo feature yet, but the plan is to save the code to a server that uses git, so users can undo and redo across automated commits (when those commits will be made is still up for discussion). Additionally, the Unity Engine has its own undo system that, once integrated with ours, will allow users to undo smaller actions. With respect to sound -- you hit on an excellent topic that Victor Minces (above) and I have discussed. In short, the answer is yes: sound can and will be incorporated. Our longer-term plan is to customize the environment for particular users and classroom objectives. Depending on the activities that the platform will support for a particular group, we will add sound and build out the environment.
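
    To sketch what the git-backed plan could look like in practice, here is a minimal outline. It is illustrative only, not our actual design: the repository layout, commit cadence, and function names are all hypothetical, and undo is modeled as reverting the latest automated snapshot, so reverting the revert acts as redo.

    import subprocess

    def git(repo, *args):
        """Run a git command inside the snapshot repository."""
        result = subprocess.run(["git", "-C", repo, *args],
                                check=True, capture_output=True, text=True)
        return result.stdout

    def save_snapshot(repo, message="auto-save"):
        """Automated commit: stage everything, snapshot only if something changed."""
        git(repo, "add", "-A")
        if git(repo, "status", "--porcelain").strip():
            git(repo, "commit", "-m", message)

    def undo(repo):
        """Undo the most recent snapshot by committing its inverse;
        calling undo() again on that revert commit acts as redo."""
        git(repo, "revert", "--no-edit", "HEAD")

    In this model, save_snapshot() would run after each meaningful edit in the environment, and undo() whenever the user steps back.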