  1. Renee Cole, Professor, University of Iowa
     https://chem.uiowa.edu/people/renee-s-cole
  2. Juliette Lantz, Associate Dean of Curriculum, Professor, Drew University
  3. Kathryn Mauger-Sonnek, Graduate Research Assistant, University of Iowa
  4. Suzanne Ruder, Professor, Virginia Commonwealth University

Collaborative Research: Eliciting and Assessing Process Skills in STEM

NSF Awards: 1524936, 1524399, 1524965

2021

Grades 9-12, Undergraduate, Graduate

Skills such as communication, teamwork, critical thinking, and problem solving are frequently cited as important outcomes for STEM degree programs and are also part of the science practices promoted by the NGSS. The Enhancing Learning by Improving Process Skills in STEM (ELIPSS) project has developed materials and strategies that support the facilitation of these skills. These materials include rubrics that can be used to assess student interactions and completed work, and to provide students with feedback and suggestions for improvement that further their progress.

These feedback rubrics and accompanying implementation strategies have been employed at a wide variety of institutions, across a broad range of STEM disciplines and class sizes. Approaches have been developed for providing students feedback and support in both in-person and virtual classroom settings. Assessments can be completed by students, instructors, or classroom learning assistants, and rubrics can be completed in many formats, including paper, electronic platforms, and student response systems. A summary of implementations and commentary on successful entry strategies from early adopters will be highlighted. Additional tools for instructors are available at www.ELIPSS.com.

Discussion from the 2021 STEM For All Video Showcase (12 posts)
  • Renee Cole

    Lead Presenter
    Professor
    May 11, 2021 | 09:50 a.m.

    Thank you for viewing the ELIPSS Project video. Our project supports instructors in facilitating and assessing more than content knowledge, in an effort to intentionally develop transferable skills such as information processing, collaboration, interpersonal communication, and critical thinking. Working with a team of STEM educators at several different institutions, we have developed a series of rubrics to support the facilitation of and feedback on transferable skills, along with a series of implementation strategies for their use. You can view these resources at elipss.com.

     

    We are especially interested in discussion and feedback on the following: What are the drivers and barriers for the explicit facilitation and assessment of transferable skills in the classroom? What additional tools or resources could we develop to support adoption and implementation of the rubrics?

     
  • Kirstin Milks

    Facilitator
    Science Teacher
    May 11, 2021 | 11:27 a.m.

    Hi ELIPSS team! I'm a high school science teacher who loves using POGILs (https://pogil.org/) and finds them transformational as an active learning tool, so I was very excited to see your work here on building tools to help students and instructors track and get feedback on developing process skills!

    I'm wondering if you've taken a look at the POGIL framework to see where the overlaps are in your materials. There's a great community of POGIL users at the HS and 2/4 year levels, and you might be able to reach new audiences in that community to increase the impact of this project!

    I'm also delighted by the self-assessment strategy -- helping learners become metacognitive of their learning process has such deep value! I am wondering about whether you've tracked information about the overlap/relationship between expert feedback (instructor) and inexpert feedback (students' self-assessment) using the rubrics. It would be cool to get some ideas of how instructors can weave these two practices throughout a course -- and to see which types of weavings are high-leverage!

  • Rick Moog

    Higher Ed Faculty
    May 11, 2021 | 03:57 p.m.

    Kirstin - Although I am not one of the presenters of this video, the developers of these wonderful tools are all colleagues and active participants in The POGIL Project! So you are right on the mark when you say that these rubrics seem to be excellent matches with POGIL activities. Many POGIL instructors have found them to be useful in classrooms that are implementing POGIL. And make sure that you visit The POGIL Project video also!

     
  • Suzanne Ruder

    Co-Presenter
    Professor
    May 11, 2021 | 04:49 p.m.

    Kirstin, The rubrics were designed to be used in any setting where transferable or process skills are being developed. They do align with POGIL, and many of our collaborators in the video use the POGIL pedagogy in their classrooms. We have additional videos on our website that provide further details on how instructors have implemented the rubrics in different ways. More videos will be coming in the next few weeks in addition to the ones already on the site.

     
  • Megan Davis

    Higher Ed Administrator
    May 11, 2021 | 12:19 p.m.

    As a professional development specialist, I was especially interested in the remark that self-assessment of process skills helped students in their job/med school interviews.

    What frequency/dosage do you recommend so that this self-awareness and metacognition becomes second nature? 

  • Suzanne Ruder

    Co-Presenter
    Professor
    May 12, 2021 | 12:33 p.m.

    Megan,

    I generally have students do some sort of self-reflection after each class session. Sometimes it involves completing part of a rubric for the skill we are highlighting, with a justification. Other times the reflection involves tying the skill to the concept, usually asking them to write out how they solved the problem or what steps they took to get to an answer. 

    These reflections can also be added to a quiz or team assignment (generally as bonus points). Students must provide a justification that matches the skill in question. For example, if asked about critical thinking and they mention they worked well together, they do not get credit for that part. We found that their awareness of each skill improved over the course of a semester, particularly with feedback on these reflections and with getting completed rubrics back if they were asked to reflect on them.

     
  • Juliette Lantz

    Co-Presenter
    Associate Dean of Curriculum, Professor
    May 13, 2021 | 07:19 p.m.

    I focused on one skill for about 2-3 weeks, discussing aspects of that skill and having students in their learning teams rotate through the management of the self-assessment on that skill as part of their weekly products. Each time, I asked the team to indicate how they could improve, and then followed up to ask if they'd taken those steps. I often had them interact with just one or two categories each week.

  • Sarah Haavind

    Facilitator
    Senior Research Project Manager
    May 11, 2021 | 11:21 p.m.

    Thank you for sharing your great work on the ELIPSS Project! Kirstin, Megan, and I are all noticing your use of the rubrics for self-assessment. I have also found self-assessment a valued alternate lens for learners to re-view their own work. In addition, I often engage classes in cycles of peer-reviewing one another's work using a rubric. I have found that it is often easier for students to see what might be missing or in need of further attention in someone else's work than in their own. At the same time, giving that feedback often opens up new ways of seeing their own work more critically, to the reviewer's own benefit. I ensure peer feedback is a constructive experience by asking that reviewers first point out strengths they notice, then questions or concerns they have, paired with suggestions for next steps or possible solutions. The rubric ensures reviewers aren't just freely reacting to another's work but instead reflecting from more neutral ground. I'd be curious about your thoughts.

  • Suzanne Ruder

    Co-Presenter
    Professor
    May 13, 2021 | 02:20 p.m.

    Sarah,

    I use undergraduate LAs to assess student skills during active learning classes. Generally, student self-assessment scores align with the LA assessments; however, the students' reflections on why they gave themselves a particular score are often lacking on the first attempt. After getting feedback from the LA on specific observable characteristics and suggestions for improvement, the student self-reflections improved as the semester progressed. LAs found the feedback rubrics to be really helpful because they gave them very specific things to look for as they observed their groups. I sometimes ask students to check the things they did well and then list two things they can improve upon, then go over the most commonly chosen items with the whole class.

     
  • May 12, 2021 | 10:43 a.m.

    It is so interesting to see how an improvement in a rubric can make a difference for educators and students. I am glad to see more tools being used to solve real-life problems that support students.

    I invite you to provide feedback to our video: https://stemforall2021.videohall.com/presentati...

  • May 13, 2021 | 11:19 a.m.

    Nice to see these in action!

    I love the idea of a laminated rubric where the instructor can tick off some observations as fast formative feedback. Following Sarah Haavind's note, it seems like this could be used for groups' self-assessment too: at the end of group time (or in the middle!), circle some behaviors you have used today and check a couple of others you will try to use next.

     
  • Suzanne Ruder

    Co-Presenter
    Professor
    May 13, 2021 | 02:27 p.m.

    Sandra,

    Our team of faculty has used variations of the rubric formats for student self-assessment. Laminated and paper forms work well in smaller classes, and electronic formats work better in large classes. I have used personal response devices to have students list the things they did well and several they need to improve on (this can be a numbered list, and they just enter the answers as numbers for the two parts of the question). I found that it is actually better to do this in the middle of an active class, because at the end we often run out of time or the students are rushing to get to the next class.

     
