  1. Lucy Yetman-Michaelson
  2. http://www.linkedin.com/in/lucy-yetman-michaelson-b1a56a232
  3. Junior Laboratory Associate
  4. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  5. https://mindhive.science/
  6. New York University
  1. Kim Burgas
  2. Independent consultant
  3. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  4. https://mindhive.science/
  5. New York University
  1. Suzanne Dikker
  2. http://www.suzannedikker.net
  3. Associate Research Scientist
  4. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  5. https://mindhive.science/
  6. New York University
  1. Alex Han
  2. Assistant Research Scientist
  3. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  4. https://mindhive.science/
  5. New York University
  1. Camillia Matuk
  2. https://steinhardt.nyu.edu/people/camillia-matuk
  3. Assistant Professor
  4. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  5. https://mindhive.science/
  6. New York University
  1. Yury Shevchenko
  2. https://yuryshevchenko.com/about
  3. Post-doc
  4. Crowdsourcing neuroscience: An interactive cloud-based citizen science platform for high school students, teachers, and researchers
  5. https://mindhive.science/
  6. University of Konstanz
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Ekundayo Shittu

    Higher Ed Faculty
    May 10, 2022 | 09:59 a.m.

    The use of citizen science in this project is impressive. The interactive platform is quite instructive for harnessing the differences in learning styles. Very good -- I voted!

  • Lucy Yetman-Michaelson

    Lead Presenter
    Junior Laboratory Associate
    May 12, 2022 | 03:35 p.m.

    Hi Ekundayo, thank you so much for watching our video and for your kind words! 

  • Brian Smith

    Facilitator
    Professor/Associate Dean of Research
    May 10, 2022 | 10:16 a.m.

    Thanks for sharing your work!

    I'm curious about the block-based interface to create research studies. It seems like a great way to scaffold the experimentation process. Do students use this a lot in practice? Do you notice them moving away from these tools and gradually creating designs on their own? Or maybe there's no need to fade the scaffolds? Do students use the block-based interface in their work like a spreadsheet, data visualization, or other tools that help make them more productive?

    I'm also wondering how teachers help the students identify their interests and define studies to address them. It looks like the teachers have formed a tight working group. Are there strategies they've discovered and shared to help kids become more independent in their research?

  • Lucy Yetman-Michaelson

    Lead Presenter
    Junior Laboratory Associate
    May 12, 2022 | 03:34 p.m.

    Thank you for your thoughtful comment, Brian!

    The block-based design approach is a key component of the MindHive platform because, as you mentioned, we find it to be an effective way to scaffold the experimentation process for students. Students construct their studies out of a series of tasks and surveys, and the data they collect is stored and presented according to that same block structure.
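
    To make the block structure a little more concrete, here is a rough sketch in Python (purely illustrative; the names and fields are simplifications for this discussion, not MindHive's actual data model) of how a study composed of task and survey blocks might be represented, with collected data organized by block:

        # Illustrative sketch only -- not MindHive's actual data model.
        from dataclasses import dataclass, field
        from typing import Dict, List


        @dataclass
        class Block:
            """One building block of a study: a validated task or survey."""
            name: str                      # e.g. "Stroop Task" or "Big Five Inventory"
            kind: str                      # "task" or "survey"
            params: Dict[str, str] = field(default_factory=dict)


        @dataclass
        class Study:
            """A student study is an ordered sequence of blocks."""
            title: str
            blocks: List[Block] = field(default_factory=list)

            def collect(self, participant_id: str, responses: Dict[str, dict]) -> dict:
                """Store one participant's data keyed by block, mirroring the study structure."""
                return {
                    "participant": participant_id,
                    "data": {b.name: responses.get(b.name, {}) for b in self.blocks},
                }


        # Example: a study assembled from two validated instruments.
        study = Study("Attention and personality", [
            Block("Stroop Task", "task"),
            Block("Big Five Inventory", "survey"),
        ])
        record = study.collect("apple-river-cloud", {"Stroop Task": {"mean_rt_ms": 612}})

    The key point of the sketch is simply that the study's block structure and the collected data's structure mirror each other, which is part of what makes the interface a natural scaffold.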

    Our program values iteration, and we try to guide students toward the many tools that already exist in the field of brain and behavior research (i.e., tasks and surveys that have been tested and validated by scientists and researchers, such as the Stroop Task and the Big Five Inventory). We do, however, want to offer students the flexibility to explore their interests and think outside the box, so the MindHive platform also allows students to create their own tasks and surveys. During our most recent implementation (Spring 2022), we found that students were particularly interested in designing their own tasks and surveys. Helping students balance creativity, feasibility, and validity in their study design and data collection is an ongoing challenge, and we would really appreciate any insights the STEM for All community might have on this topic!

    After each implementation of the MindHive program, we ask students to complete a survey in which we assess any changes in learning outcomes and collect their feedback on the program. In this survey, we also ask students to list their interests in brain and behavior research topics (e.g., memory, music, attention, mental health). Based on the results of these surveys and the general student feedback we receive during mentorship meetings, we continually add tasks and surveys to the platform to suit students' needs. For example, during the Spring 2021 implementation, we found that students were interested in designing music-related studies, so we added the Barcelona Music Reward Questionnaire (BMRQ) and the Goldsmiths Musical Sophistication Index (Gold-MSI) to MindHive's task and survey bank; both have since been used in student-designed studies during the Spring 2022 implementation.

    The teachers we have worked with have been absolutely amazing partners in this project, and they are always on the lookout for unique ways of motivating students to explore their own interests. They have noted that students are most enthusiastic when the research process can be tied to current events and/or students’ own lived experiences.

  • Amy Alznauer

    Facilitator
    Lecturer
    May 10, 2022 | 10:35 a.m.

    Good morning!

    You kick off this video in such a powerful way, naming the problem: how science has been conducted behind "thick laboratory walls," shielding students, educators, and the general public from the vital and inspiring world of scientific inquiry. It is so hard to break through those walls, and often projects designed to do that are pale imitations of real research. But your MindHive platform seems to offer real tools to do just that. So, could you maybe say a little about your take on the open science movement? (For those who are new to this term: "Open science is the movement to make scientific research and its dissemination accessible to all levels of society, amateur or professional.")

    I hope you don't mind if I link to your site: https://mindhive.science/ The projects listed on this page seem to be mostly designed by high school students, but on your About page and in this video you say that this is also for community groups interested in doing research toward policy goals. Could you say a little about how these different groups use the platform?

    This next question might go beyond the scope of this discussion page, but I'd also love to hear about a particularly successful project – how it was conceived, how the MindHive tools allowed the study to be conducted, and the conclusions of the study.

    Finally, could you talk about how you are measuring the impact of the MindHive project on students and educators (and possibly on community groups), and what exactly you are measuring?

  • Lucy Yetman-Michaelson

    Lead Presenter
    Junior Laboratory Associate
    May 12, 2022 | 03:38 p.m.

    Hi Amy, Thank you for your insightful comments!

    Yes, open science and citizen science have been the driving forces behind the MindHive program. We hope to encourage greater accountability and equity in psychology and neuroscience research by reframing the research process and the relationship between scientists and the public for students early on. 

    Our program strives to both practice and teach the six main open science tenets (Fecher and Friesike 2014): make knowledge freely available to all platform users (Democratic), make the science process more efficient and goal-oriented (Pragmatic), make science accessible to everyone (Public), create and maintain tools and services (Infrastructure), measure the scientific impact of research (Measurement), and support community inclusion and commitment (Community). For example, we have made the MindHive platform project completely open-source, anonymized data from MindHive studies can be accessed on the platform by authorized users, and all the educational research data is made available via open access data repositories such as The Open Science Framework and the Qualitative Data Repository. 

     

    Regarding your question about community-based projects on MindHive -- for this video, we wanted to focus on student-teacher-scientist partnerships, but, indeed, a big part of our project's goal involves providing a platform on which communities can conduct grassroots research for policy change. During our team's STEM for All video last year, we discussed The Brownsville Sentiment Equity Project, an example of a community-scientist partnership, in which we worked with a team of researchers from UC Berkeley, the NGO Public Sentiment, and a working group of community representatives from Brownsville, Brooklyn to conduct a six-month longitudinal study to assess and address the needs of the Brownsville community during the early stages of the COVID-19 pandemic.

    This spring, we implemented the MindHive program in an environmental science classroom. One student group created a study investigating how living conditions (rural vs. urban environments) might impact views on climate change and willingness to engage in pro-environmental behaviors. We envision expanding this study to create a student-scientist-community partnership, where students from a rural community and students from an urban community collaborate with a non-profit organization addressing climate change to assess the factors that might influence attitudes towards climate change and identify strategies for encouraging pro-environmental behavior and policies.  

     

    Regarding measuring impact -- Great question! There are multiple ways that we measure impact. Before and after implementation, we ask students to complete a survey where we measure their STEM engagement, competency, and self-efficacy to assess any self-reported changes in these areas. In the survey given after implementation is complete, we also ask questions aimed at assessing the efficacy and usability of the MindHive platform and curriculum. We also analyze student artifacts on the MindHive platform using a qualitative approach to assess changes in STEM competency and the efficacy of the program. These artifacts include student assignments, journal entries, study proposals, study protocols, and peer review entries. 
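
    As a rough illustration of what the pre/post comparison looks like in practice (a simplified sketch with made-up numbers, not our actual analysis pipeline), one could compare matched pre- and post-survey scores like this:

        # Illustrative only: paired comparison of hypothetical pre/post self-efficacy scores.
        from scipy import stats

        pre  = [3.1, 2.8, 3.5, 3.0, 2.9, 3.4]   # hypothetical 1-5 Likert means before the program
        post = [3.6, 3.0, 3.9, 3.2, 3.3, 3.8]   # the same (hypothetical) students afterwards

        mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
        t, p = stats.ttest_rel(post, pre)        # paired t-test on within-student change
        print(f"mean change = {mean_change:.2f}, t = {t:.2f}, p = {p:.3f}")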

    Finally, we conduct meetings and interviews with teachers and students during and after the MindHive program to gain a more comprehensive view of the strengths and challenges of the program as well as what can be improved in the future. We currently have a journal article in review in which we discuss the impact the student peer review process had on student study design abilities and STEM competencies (Matuk, C., Yetman-Michaelson, L., Martin, R., Vasudevan, V., Burgas, K., Davidesco, I., Shevchenko, Y., & Dikker, S. (in revision). Open science in the classroom: Impacts on students’ study design and peer review abilities in human brain and behavior research. Submitted to Instructional Science.).

  • Amy Alznauer

    Facilitator
    Lecturer
    May 13, 2022 | 02:16 p.m.

    I'm checking back in here and see that my response to your response never got posted! So let me just say again - thank you so much for these wonderful answers! I am even more enamored of your project now.

    I love the clarity and breadth of this statement: "Our program strives to both practice and teach the six main open science tenets (Fecher and Friesike 2014): make knowledge freely available to all platform users (Democratic), make the science process more efficient and goal-oriented (Pragmatic), make science accessible to everyone (Public), create and maintain tools and services (Infrastructure), measure the scientific impact of research (Measurement), and support community inclusion and commitment (Community)."

    Thank you too for linking to your previous community-centered project and also detailing the ways in which you are measuring impact. It was wonderful to see that this incredible project is thriving in all directions. I am super impressed by all.

    One small question - have you ever had homeschool or prison groups get involved? Two disparate groups (I know!), but related in the sense that they both operate outside the traditional school environment.

  • Suzanne Dikker

    Co-Presenter
    Associate Research Scientist
    May 13, 2022 | 02:51 p.m.

    hi Amy, 

    It is so wonderful to receive your comments! (I might just print this for motivation to overcome any hurdles we encounter going forward :)

    As for your question - we have not included these groups yet, but one of our schools coincidentally proposed a study in which they wanted to survey (previously) incarcerated individuals. If you know of any projects serving these groups that we might draw inspiration from, we'd love to hear about them!

  • Amy Alznauer

    Facilitator
    Lecturer
    May 13, 2022 | 05:42 p.m.

    Oh interesting! Well, I teach at Northwestern University and my husband is in the philosophy dept. there, and one of his colleagues started this program: https://sites.northwestern.edu/npep/  which has been absolutely amazing. Not sure if this is the kind of program you'd ever partner with but would be happy to make an introduction for you. 

    As for homeschool groups, my daughter has been homeschooling for middle school so we've met up with a bunch of groups in the Chicago area and this seems like the kind of thing they'd love, so I'm at least going to send out the links to the Mindhive site.

  • Suzanne Dikker

    Co-Presenter
    Associate Research Scientist
    May 13, 2022 | 06:17 p.m.

    Wow, what an inspiring and important project. We will certainly look into connecting with them in some way.
    And do let me know if any of the homeschooling groups in Chicago are interested! You can reach us at info@mindhive.science or through the Teacher page (https://mindhive.science/teachers; we monitor the form daily).

     
  • Andee Rubin

    Facilitator
    Senior Scientist
    May 11, 2022 | 11:59 a.m.

    Thanks for this fascinating video.  It definitely led me to want to know more about the specific ways that your platform has been used.  Like Amy, I was very curious about how community groups might use the platform to do research toward policy goals.  I was also curious about the "brain research" part of the description of MindHive.  I understand how students might create surveys and disseminate them through the platform, but I was less clear how they could do research on brain function.  (Perhaps that cap in the background of Dr. Dikker's video is involved?)

    As a data science educator, I'm also very curious what kind of support you provide to students on the data-relevant aspects of research, especially survey research, such as: designing survey questions, choosing a sample (or knowing that you don't have a random sample, so can't make generalizations), visualizing data and interpreting data visualizations.  I agree with Amy that hearing about a particular example would be very informative!

  • Suzanne Dikker

    Co-Presenter
    Associate Research Scientist
    May 13, 2022 | 03:21 p.m.

    Hi Andee, Great to hear from you and thank you for your great questions and comments! 

    For your community-based organization question, let me partly recycle Lucy's response to Amy -- for this video we wanted to focus on student-teacher-scientist partnerships, but, indeed, a big part of our project's goal involves providing a platform on which communities can conduct grassroots research for policy change. During our team's STEM for All video last year (https://stemforall2021.videohall.com/presentations/2157), we discussed The Brownsville Sentiment Equity Project (https://www.publicsentiment.org/brownsville/), an example of a community-scientist partnership in which we are collaborating with a team of researchers from UC Berkeley, the NGO Public Sentiment, and a working group of community representatives from Brownsville, Brooklyn. Data collected throughout the COVID-19 pandemic shows that food security is a leading stressor among Brownsville residents, which inspired us to form subsequent collaborations with food-based organizations and researchers to develop and test a locally responsive solution.

    In terms of "brain research," the MindHive curriculum emphasizes the link between neuroscience/neuroanatomy and behavior, and how many brain researchers conduct behavioral studies to probe brain function. Having said that, one of our future development goals is indeed to (re)integrate 'real' hands-on neuroscience and physiology data (yes, like the ones in the background ;)), incorporating insights from our BrainWaves curriculum and tools (https://wp.nyu.edu/brainwaves/).

    As for the data engagement: that too is something we are working hard on at the moment! We have integrated a lesson about survey design and sample size/choice, and we are developing an elaborate task/survey bank that generates data in formats that are easy to download and analyze. We are also working on functionality to integrate "student-as-participant" and "student-as-scientist" data engagement tools into the platform. This approach is inspired by projects like the Great Brain Experiment (https://www.thegreatbrainexperiment.com/) and the Person Project (https://www.thepersonproject.org/), where participants can explore their own data points in relation to the group after taking part in a task/survey.
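
    As a loose sketch of the "student-as-participant" idea (purely illustrative Python, assuming a plain list of scores rather than any particular MindHive data format), the core of it is simply letting a participant see where their own score falls within the group:

        # Illustrative sketch: place one participant's score within the group distribution.
        import statistics

        group_scores = [42, 55, 61, 48, 70, 53, 66, 59]   # hypothetical task scores from a study
        my_score = 61                                      # the participant's own score

        percentile = 100 * sum(s <= my_score for s in group_scores) / len(group_scores)
        print(f"Group mean: {statistics.mean(group_scores):.1f}")
        print(f"Your score of {my_score} sits at roughly the {percentile:.0f}th percentile.")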

    Our 'sandbox study' for this project is a risk-taking and adolescence study designed by Prof. Robb Rutledge (https://mindhive.science/studies/risk-taking). This study consists of a gambling task with three conditions that vary in how much you can win or lose, plus a single survey question ("How anxious do you feel right now?") asked both before and after the task. In our lessons, we discuss the motivations behind the study and how to look at the data (mean differences in risk-taking between conditions, correlations between risk-taking and anxiety). Students can then 'clone' the study and, for example, add their own survey, compare age groups, etc.
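
    To give a flavor of the kind of analysis students walk through with this sandbox study (an illustrative sketch only; the column names and numbers below are made up, not the study's actual export format):

        # Illustrative sketch -- the column names and values are made up for this example.
        import pandas as pd

        df = pd.DataFrame({
            "participant":    ["p1", "p1", "p1", "p2", "p2", "p2", "p3", "p3", "p3"],
            "condition":      ["gain", "loss", "mixed"] * 3,
            "risk_taken":     [0.62, 0.35, 0.50, 0.71, 0.40, 0.55, 0.48, 0.30, 0.41],
            "anxiety_change": [-1, -1, -1, 2, 2, 2, 0, 0, 0],   # post minus pre anxiety rating
        })

        # Mean proportion of risky choices in each condition.
        print(df.groupby("condition")["risk_taken"].mean())

        # Correlation between a participant's overall risk-taking and their change in anxiety.
        per_person = df.groupby("participant").agg(
            risk_taken=("risk_taken", "mean"),
            anxiety_change=("anxiety_change", "first"),
        )
        print(per_person["risk_taken"].corr(per_person["anxiety_change"]))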

    We are exploring different options for data visualization and annotation (we recently discovered JASP, for example: https://jasp-stats.org/) and would welcome any (browser-based) suggestions you might have!

  • Russanne Low

    Researcher
    May 11, 2022 | 06:19 p.m.

    I have so many questions I don't know where to begin! I can see how the topics of exploration would be of great interest to student participants, and how the growing data archive can be explored and used by students as they build interest and excitement in data literacy. I was wondering what IRB issues you encountered in the creation of this platform and in the collection of data about students, and how these challenges were solved. I am personally very excited about this innovative project and will be following up. Can you provide a sense of scale -- how many teachers and classes are engaged in this project? Any project that stimulates teachers to participate in weekly meetings is doing something right that we can learn from. Great presentation!

  • Lucy Yetman-Michaelson

    Lead Presenter
    Junior Laboratory Associate
    May 12, 2022 | 04:53 p.m.

    Hi Russanne! Thank you for your comments and questions! I can really feel the enthusiasm! :)  We would love for you to follow up! Feel free to reach out to us at info@mindhive.science and explore our platform at mindhive.science.

    These are great questions -- First and foremost, our platform is GDPR-compliant and we have an elaborate Privacy Policy that went through rigorous scrutiny with the NYU data management offices. For our educational research, we have IRB approval from NYU and any school district that requires it. Scientists are responsible for obtaining IRB approval with their own institutions, and while schools do not typically have IRBs, we discuss ethics approval with teachers at the onset of the program, and it is also an integral part of the curriculum so that students are maximally aware of ethics considerations. Finally, mentors are tasked with guiding students and teachers away from studies that may be physically or mentally harmful.

    To support consent procedures, the platform features an environment for IRB protocols where you can upload your consent forms/language, which you can then add to your study. When participants take part in your study, they will then be asked to complete the appropriate consent form. This system provides necessary flexibility because some of the communities we work with require separate IRB approvals and consent forms. From a data point of view, all data is anonymized across the platform where relevant. Students’ names can be seen by their teachers and peers on their assignments and proposal cards for collaboration and feedback purposes, but their names are decoupled from their research data (every user is assigned a data identifier consisting of a random 3-word combination).

    Additionally, every user's data is clearly marked based on their responses to the consent forms (i.e., you can easily see whether an anonymous participant consented to have all of their data from a study used, or whether they consented to participate but not to have their data used in the analysis, etc.).
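
    To illustrate the kind of bookkeeping described above (a simplified sketch of our own for this discussion, not the platform's actual implementation), one could imagine pseudonymous identifiers and consent flags being attached to each record like this:

        # Simplified illustration only -- not MindHive's actual implementation.
        import random

        WORDS = ["amber", "river", "falcon", "maple", "comet", "harbor", "willow", "ember"]

        def make_identifier(rng: random.Random) -> str:
            """A random three-word identifier that stands in for the participant's name."""
            return "-".join(rng.sample(WORDS, 3))

        def tag_record(responses: dict, full_data_use: bool, rng: random.Random) -> dict:
            """Store responses under a pseudonymous ID, carrying the consent decision alongside."""
            return {
                "participant_id": make_identifier(rng),
                "consent": "full_data_use" if full_data_use else "participation_only",
                "responses": responses,
            }

        record = tag_record({"q1": 4, "q2": "often"}, full_data_use=True, rng=random.Random())
        print(record["participant_id"], record["consent"])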

    In terms of scale, our most recent implementation (Spring 2022) included 5 teachers across 4 schools (5 classes of about 90 students total). All four schools in this past implementation happen to be in New York City, but in past implementations, we have worked with schools across the country, and our goal has always been to build connections between schools beyond state and regional boundaries.

  • Lucy Yetman-Michaelson

    Lead Presenter
    Junior Laboratory Associate
    May 12, 2022 | 03:45 p.m.

    Thanks so much to all who have watched and/or commented on our video! It's a real pleasure to be a part of this event! 

    For teachers who are interested in collaborating with us in the future or just want to learn a bit more about the MindHive program, feel free to ask us any questions and/or visit the For Teachers page on our website: https://mindhive.science/teachers 

  • Ravanasamudram Uma

    Higher Ed Faculty
    May 17, 2022 | 04:41 p.m.

    What an innovative idea! I plan to explore your website more. How do you handle IRB issues since the participants are from across the globe (?) and you are storing the survey data?

  • Oludare Owolabi

    Higher Ed Faculty
    May 17, 2022 | 06:37 p.m.

    Great