Grades 10–11 Math/Science: Project-Based Instruction, Video 3 (Day 9)

This is the third video documenting a multi-day project in a grades 10-11 course called Phylgebrics (Physics I and Algebra II combined). This is Day 9 of the project.

An observer scored this sample based on a classroom observed at Manor New Tech High School in Manor, Texas. An edited video of the class observed is below. Since the video is shorter than the full class period, we provide a lesson graph that describes the objectives and agenda for the class and aligns the actual minutes of the class with the minutes of the video.

Download the lesson graph for Video 3 (Day 9) (pdf). 

Download the completed instrument for Video 3 (Day 9) (pdf).

Video Link: 

http://vimeo.com/98560772

I. Background Information

Teacher: NA

School: Manor New Tech High School

Date of Observation: NA

Start and End Time of Observation: NA

Date of Post Interview: NA

Method of Post-Interview: Face to face

Subject Observed: Phylgebrics (Physics I and Algebra II combined class)

Grade Level: 10 and 11

Course Level (Regular or Advanced/Accelerated): Regular

Observer: UTOP Expert

II. Lesson Overview

In a paragraph or two, describe the lesson you observed. Include where the lesson fits into the overall unit of study. Be sure to include enough detail to provide a context for your ratings of the lesson and also to allow you to recall the details of the lesson when needed in the future.

The purpose of this lesson was for students to continue working in groups on calculations required for completion of their project. Specifically, students applied what they learned in a gravity calculation workshop held during the previous class session in order to analyze gravitational forces in their solar system.

Students completed calculations, had their work checked by the instructors, and earned credit when the physics teacher applied “stamps” in their notebooks, indicating successful completion of assigned tasks. Simultaneously, the mathematics teacher worked with student groups as they began to graph the orbits of the stars and planets in their solar systems using the computer program Geometer’s Sketchpad.

During this class, the instructors worked collaboratively to manage the classroom, answer student questions, and check work. During one part of the class session, the physics instructor held a workshop with one member from each group to discuss the next section of the project, explaining expectations within the project rubric. During the majority of the class period (90 minutes), both instructors continued working with groups and individual students as requested.

III. Rating Scales

1. Classroom Environment

1.1 Classroom Engagement. Rating: 4

Indicator: The classroom environment facilitated by the teacher encouraged students to generate ideas, questions, conjectures, and/or propositions that reflected engagement or exploration with important mathematics and science concepts.

Evidence for 1.1

Throughout this lesson, students asked questions of their group members, classmates, and their instructors. The primary focus of this lesson was determining the equation for and focal points of ellipses that represent the orbit of a planet with a habitable zone in a solar system designed to meet criteria defined in the project rubric.
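
For reference, the standard form the students were working toward (general content knowledge rather than a transcription from the video): an ellipse centered at (h, k) with a horizontal major axis has the equation (x − h)²/a² + (y − k)²/b² = 1, with a > b, and its focal points lie at (h ± c, k), where c² = a² − b². In a Keplerian orbit the star sits at one focus of the ellipse, so finding the foci is what ties the graphing task to the physical model of the solar system.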

The algebra teacher held a workshop to scaffold student learning and support their progress in this task [12:42–23:08]. The graphing task and related classroom activities provided many opportunities for students to collaborate with peers, explain their reasoning, and reflect on their learning by writing required portions of the project report during the class period.

1.2 Classroom Interactions. Rating: 5

Indicator: Interactions reflected collegial working relationships among students (e.g., students worked together productively and talked with each other about the lesson).
*It’s possible that this indicator was not applicable to the observed lesson. You may rate NA in this case.

Evidence for 1.2

At the beginning of this lesson [1:35–1:50], one student explained to a classmate how to write calculation results in scientific notation with the appropriate sign on the exponent (positive or negative, depending on the magnitude of the number). This same student later left her group to discuss her calculations with another class member across the room [5:02–6:21]. When one student in a group made a simple calculation error, she asked, “One half is 1.5?” The other group member corrected her, saying, “No, it’s .5” [2:45–3:03].
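
A brief illustration of the convention being discussed (the specific values in the exchange are not audible, so the numbers below are hypothetical, physics-flavored examples): in scientific notation, a number with magnitude greater than 1 carries a positive exponent, while a number between 0 and 1 carries a negative exponent. For example, 299,800,000 m/s = 2.998 × 10⁸ m/s (positive exponent), and 0.0000000000667 N·m²/kg² = 6.67 × 10⁻¹¹ N·m²/kg² (negative exponent).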

When some group members were confused, one student referenced their notes and tried to find an equation from the previous workshop that would explain what they needed to consider: “Do you have them? [Student pointing to notes] See these were extra rules” [3:42–4:10].

Group work and roles were well established and followed by the students. During the workshop, the instructor pointed out the requirements of the rubric that other group members could be working on while one of their group created the graphs using Geometer’s Sketchpad [7:52–8:17].

1.3 Classroom On-Task. Rating: 5

Indicator: The majority of students were on task throughout the class.

Evidence for 1.3

In this edited video (27 minutes of a 90-minute class session), it appears that at least 90% of students were consistently on task throughout the lesson. For example, at 12:03–12:10, the camera provides a shot of the entire back row of students working collaboratively within their groups, and all appear on task even as the instructor focused attention on one particular student group. Another example [5:41–6:23] shows almost the entire class on task with the exception of one student who was talking about students having “pizza . . . next door.”

1.4 Classroom Management. Rating: 5

Indicator: The teacher’s classroom management strategies enhanced the classroom environment.

Evidence for 1.4

During this lesson, students were consistently moving around the room to attend workshops or talk to other students to collaborate or get the help they needed to complete tasks and activities for the project. This lesson took place during the last month of the school year and it was evident that clear, well-established norms for behavioral expectations existed. There were no verbal reprimands of students regarding behavior during the video. Throughout the lesson, including during the workshop, the instructors had no management issues to deal with as all students were on task, asking questions and taking notes [17:22–17:24].

1.5 Classroom Organization. Rating: 5

Indicator: The classroom is organized appropriately such that students can work in groups easily and get to lab materials as needed, and the teacher can move to each student or student group.

Evidence for 1.5

The physical classroom was organized appropriately as students and instructors were able to move around the room easily to collaborate with individuals or groups. During the small workshops, all students could see the images from the digital projector, hear the instructor, and see what he was writing and drawing.

The instructors had made thoughtful decisions about the setup of the digital materials necessary for the class period. Students could easily access materials needed in the project briefcase and electronic organizational system, notably the rubric and hint pages of content. The warm-up questions and student responses were web-based, so instructors were able to provide students with instant and ongoing feedback.

1.6 Classroom Equity. Rating: 5

Indicator: The classroom environment established by the teacher reflected attention to issues of access, equity, and diversity for students (e.g., cooperative learning, language-appropriate strategies and materials, attentiveness to student needs).

Evidence for 1.6

During small group work time, students worked cooperatively on the tasks needed to complete the day’s project activities. The instructors made many moves to provide assistance, recognizing and supporting various student learning styles. For example, at 0:21–1:30, the instructor used a small white board to demonstrate the necessary calculations for two students. At 4:29–4:31, a student double-checked her work on a digital feedback form, which allowed her to receive individual feedback without her work being shared with the whole group.

At 25:50, a student worked one-on-one with another student and, when checking his work, offered suggestions for correction without reprimand or embarrassment. At 6:27, the instructor began a short workshop to clarify any student misconceptions about the project rubric expectations and deliverables. At 8:15–8:30, the instructor offered suggestions for various routes to complete the entire project: “If one of your group members is working on calculations, this [indicating an alternative task] is a stamp you can get.”

Synthesis Rating for Classroom Environment

1: Classroom culture is non-interactive or non-productive.
2: Classroom culture is productive and interactive only occasionally.
3: Classroom culture is adequately productive and interactive.
4: Classroom culture is often productive and interactive, with some collegial interactions.
5: Classroom culture is consistently collegial, interactive, and productive.

The synthesis rating for Classroom Environment is 5: Classroom culture is consistently collegial, interactive, and productive.

2. Lesson Structure

2.1 Lesson Sequence. Rating: 4

Indicator: The lesson was well organized and structured (e.g., the objectives of the lesson were clear to students, and the sequence of the lesson was structured to build understanding and maintain a sense of purpose).

Evidence for 2.1

The sequence of this lesson was well structured and organized. The beginning of the lesson was designated as time for students to continue working on their projects, complete a warm-up, and clarify, in a workshop, any misconceptions regarding the expectations defined by the project rubric.

During this first workshop, the physics instructor let students know of a second workshop opportunity planned for later in the class period. As promised, later in the class session, the mathematics instructor held a workshop for students who needed some help with ellipse equations. Time allotted at the end of the lesson allowed students to continue working on their projects while the instructors circulated and worked one-on-one with students who needed assistance.

2.2 Lesson Importance. Rating: 4 

Indicator: The structure of the lesson allowed students to engage with and/or explore important concepts in mathematics or science (instead of focusing on techniques that may only be useful on exams).

Evidence for 2.2

This lesson was designed for students to continue working on a project that integrated both math and science concepts. In this particular portion of the project, students were calculating the size of ellipses, using equations to determine and then graph a planetary orbit that included a habitable zone for life. The structure of this lesson, planned with two workshops to clarify both project rubric expectations and required mathematical concepts, allowed students to engage with significant content for most of the class. There was, however, potential for missed opportunities to engage all students with the math concepts: the group work structure and project expectations allowed for a division of labor, so one group member could do most of the calculation and graphing work while other group members accomplished other tasks.

2.3 Lesson Assessments. Rating: 4

Indicator: The structure of the lesson included opportunities for the instructor to gauge student understanding.

Evidence for 2.3

There were consistent opportunities for both instructors to gauge student understanding throughout this lesson. The instructors circulated throughout the room when not conducting workshops, frequently checking student work or asking students to share or explain their calculations. Students were given multiple opportunities to “re-submit” their answers to the warm-up questions and subsequent problems.

This part of the lesson and project design provided rich opportunities for students to correct and learn from their mistakes. Recognizing that students were struggling with their ellipse calculations, the instructor planned a direct-teach approach for the mathematics workshop on calculating ellipse equations and graphing the results, demonstrating several example calculations.

The structure of this part of the lesson allowed students to take notes on the material but provided little opportunity for discussion to reveal their thinking or potential misconceptions. However, the instructor did allow time to monitor student work on these concepts after the workshop in order to gauge student progress and understanding.

2.4 Lesson Investigation. Rating: 4

Indicator: The lesson included an investigative or problem-based approach to important concepts in mathematics or science.

Evidence for 2.4

This lesson was part of a three-week project. The lesson plan was designed so students worked in their small groups at their own pace to complete the multiple project tasks. The primary task for this lesson included ellipse calculations for the orbits of planets in their solar system, including at least one that contained a habitable zone. The overall project and specific lesson structure required each student group to design a set of planets in a solar system that met unique specifications, indicating an investigative approach.

2.5 Lesson Resources. Rating: 4

Indicator: The teacher obtained and employed resources appropriate for the lesson.

 

Evidence for 2.5

The resources used in this lesson were part of the classroom setup at this high-tech high school. Each student was working on a computer. Although scientific calculators were available, students could also use their mobile devices (phones and tablets) and/or the calculator on the computers [0:00–0:10]. At times, the instructors used small white boards to work out equations with students when necessary. During one of the workshops, the instructor used the digital projector and Geometer’s Sketchpad.

2.6 Lesson Reflection. Rating: 5

Indicator: The teacher was critical and reflective about his/her practice after the lesson, recognizing the strengths and weaknesses of his/her instruction.
*This indicator may be rated NA if you do not have access to a teacher interview or teacher commentary.


Evidence for 2.6

In the post-observation interview reflecting on the entire project, one teacher commented on how surprised he was that students struggled with translating the ellipses that described their planets’ orbits off the origin. He thought this skill would be aided by the use of Geometer’s Sketchpad, but he still found that he had to add workshops and group tutoring sessions to walk students through this process.

During the same interview, the physics instructor described the grading rubric and its flexibility. “So, yes, the rubric is aligned, does that mean it’s the law? Yes and no. There are some teams that are ready to go even further than what we say, so sometimes they’ll express things that I think are valid and that kind of align to our standards or are just valid to pursue, and I know they have the time. So we can negotiate what a rubric, what, how a part of the rubric will be assessed if they’re going to take that new direction. And I’ll structure that choice.”

Synthesis Rating for Lesson Structure

1: Lesson was very poorly structured to assist student learning.
2: Lesson was poorly structured to assist student learning.
3: Lesson was adequately structured to assist student learning.
4: Lesson was well structured to assist student learning.
5: Lesson was expertly structured to assist student learning.

The synthesis rating for Lesson Structure is 4: Lesson was well structured to assist student learning.

3. Implementation

3.1 Implementation Questioning. Rating: 3

Indicator: The teacher used questioning strategies to encourage participation, check on skill development, and facilitate intellectual engagement and productive interaction with students about important science and mathematics content and concepts.


Evidence for 3.1

During the lesson, both instructors asked appropriate procedural questions while helping students work through problems whether individually or in small groups. For example, from 0:20–1:30 the instructor worked through an equation with students.

Rather than working the problem out for students to copy, she consistently asked the students questions to move them along to the next step in the solution procedure. “What did we do to the distance in that problem?” “What was the change, do you remember?” “So, what was changed, the masses? The distances? What was going on?” “What are we going to replace that d by?” “So when I simplify, what do I get?”

She continued to check for student understanding by asking, “Do you have any questions?” [1:58–2:01]. This questioning of students as they moved through the calculation steps occurred again [2:05–2:25]: “What’s .95 in scientific notation?” “How’d you get 4? That’s right.”

During the workshop on ellipses, the instructor displayed a box on the screen and began typing a formula, briefly checking for student understanding by asking questions such as “And that’s what?” He expected students to answer, but when no one did, he answered his own questions [16:51–17:10]. During these types of teacher-student interactions, the instructors did check on student progress and encourage participation, but they rarely challenged students with higher-level questioning or probed for deeper understanding.

3.2 Implementation Involvement. Rating: 4

Indicator: The teacher involved all students in the lesson (calling on non-volunteers, facilitating student–student interaction, checking in with hesitant learners, etc.).

Evidence for 3.2

This lesson contained two brief instructor-led workshops as well as a reasonable amount of student group work time. During the workshops, the instructors fielded student questions and asked questions of the entire group, but they did not call on individual students or non-volunteers. During the student group work time, the instructors circulated throughout the room and checked in with individual students as well as small groups. The established group norms setting expectations for collaborative student-student interaction were clearly in place, and students were observed continually interacting with each other in their groups throughout the lesson.

3.3 Implementation Modification. Rating: 4

Indicator: The teacher used formative assessment effectively to be aware of the progress of all students and modified the lesson appropriately when formative assessment demonstrated that students did not understand.

Evidence for 3.3

During the lesson, both instructors held workshops to scaffold student learning and ensure their ability to accomplish required tasks of the project. The physics instructor conducted a workshop to explain components of the rubric and offered suggestions on ways the group could collaborate effectively to complete the project. For example, she suggested activities that could be completed by some group members while another group member was working on the Geometer’s Sketchpad graphs of the ellipses that defined their planetary orbits [8:08]: “Those are some easy stamps you might want to go for if someone else is working on graphing those ellipses.” Later [9:00–9:10], the instructor offered a clarification and strategy for students to frame their answers to meet the rubric requirements.

During the workshop on ellipses [14:50–15:08], the instructor realized that students might not understand his questions and stated, “Maybe we don’t know what we are looking for in a focal point, so I’ll make up one,” then continued demonstrating sample calculations for the students.

When not holding workshops, the instructors circulated the classroom, consistently checking the progress of individual students and student groups. During these interactions, the instructors were gaining a sense of student understanding, acknowledging progress on assigned tasks by giving credits or stamps, and assisting students with hints and suggestions for approaches to completing the various portions of the project [24:29–25:08]. 

3.4 Implementation Timing. Rating: 4

Indicator: An appropriate amount of time was devoted to each part of the lesson.

Evidence for 3.4

There were no issues with timing during this lesson. During both workshops, students spent a sufficient amount of time working in small groups with the instructors. Though there was not a formal wrap-up at the end of this lesson, the instructor reminded students to save their work at the end of class (not shown in video). The lack of a wrap-up was not a detriment to this lesson, as students were going to continue working on the project for the rest of the week.

3.5 Implementation Connections. Rating: 3

Indicator: The instructional strategies and activities used in this lesson clearly connected to students’ prior knowledge and experience.

Evidence for 3.5

The class began with a warm-up reviewing how to calculate the force of gravity for planets in their solar system from a previous lesson. Later, during the workshop clarifying some of the project rubric expectations, one of the instructors pointed out assigned tasks that students could complete with previously developed materials [7:52–8:17]. At 26:50, an instructor reminded a student of information that was presented in the ellipse workshop. He asked the student if he had attended the workshop, because the group’s graphs were not correctly arranged around the star. 

3.6 Implementation Safety. Rating: NA

Indicator: The teacher’s instructional strategies included safe, environmentally appropriate, and ethical implementation of laboratory procedures and/or classroom activities.
*This indicator may be rated NA if there were no relevant activities during the lesson.

Evidence for 3.6

There were no activities requiring instructional strategies related to safety.

Synthesis Rating for Implementation

1: Very poor lesson implementation
2: Poor lesson implementation
3: Adequate lesson implementation
4: Good lesson implementation
5: Excellent lesson implementation

The synthesis rating for Implementation is 4: Good lesson implementation.

4. Mathematics/Science Content

4.1 Content Significance. Rating: 3

Indicator: The mathematics or science content chosen was significant, worthwhile, and developmentally appropriate for this course (includes the content standards covered, as well as examples and activities chosen by the teacher).

Evidence for 4.1

The content (ellipse and conic section calculations, calculating the magnitude of gravitational force, and using scientific notation) was grade-level appropriate for this integrated Physics and Algebra II course [TEKS 2A.05(b), PHY (05)(B), PHY (02)(H)].

This project served as a way to introduce concepts of conic sections, but the tasks involved in designing a solar system with three planets of different sizes and different orbits were challenging and complex. Although the content was certainly significant, the complexity of the project may have been developmentally inappropriate for the students. Looking at the lesson as a whole, students clearly struggled with some of the mathematics content and had yet to master some of the skills required to successfully complete all aspects of the project as designed.

4.2 Content Fluency. Rating: 3/4

Indicator: Content communicated through direct and non-direct instruction by the teacher is consistent with deep knowledge and fluency with the mathematics or science concepts of the lesson (e.g., fluent use of examples, discussions, and explanations of concepts, etc.).

Evidence for 4.2

For the score of 3: During the class period, there were multiple instances where the instructor communicated content in a way that may have confused students. During the workshop on ellipses, the term “center” was used to mean multiple things: the center of the star and the center of the ellipses [12:57–13:30]. The explanation of how to find the center of the ellipse [16:00] relied on the concept of a midpoint, but it was presented as an average because the two points had the same y-values. While this is a concept covered in geometry and can be explained using the idea of averaging, the instructor missed an opportunity to use correct vocabulary and connect to geometry content.
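
For reference on the point about the center (standard geometry rather than material from the video): the center of an ellipse is the midpoint of its two foci, and also of the two vertices of the major axis. For two points (x₁, y) and (x₂, y) that share a y-value, the midpoint formula ((x₁ + x₂)/2, (y₁ + y₂)/2) reduces to averaging the x-coordinates, which is the “average” explanation the instructor gave; naming it as a midpoint would have made the connection to geometry content explicit.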

Later, at 19:18, when describing how to calculate a using a² = b² + c², a student asked, “Isn’t that the Pythagorean theorem?” The teacher said, “No, the letters are different from the Pythagorean theorem. It’s the same concept as the Pythagorean theorem but it’s not a triangle. They’re arranged differently.” This quick explanation failed to explore how the equation is derived and may have led to confusion between the relationship among the side lengths of a right triangle and the relationship among the major radius, minor radius, and the distance from the focal point to the center of an ellipse.
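
A short clarification of the relationship in question (standard content included to sharpen the observer’s point, not a statement made in the lesson): for an ellipse with semi-major axis a, semi-minor axis b, and center-to-focus distance c, the defining relationship is a² = b² + c², equivalently c² = a² − b². It does in fact come from a right triangle: the legs run from the center to a co-vertex (length b) and from the center to a focus (length c), and the hypotenuse runs from the co-vertex to that focus and has length a, because the distances from any point on the ellipse to the two foci sum to 2a. Making that triangle explicit would have answered the student’s question directly.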

For the score of 4: At the beginning of the lesson [0:21], the physics instructor explained the parts of the acceleration-due-to-gravity equation using a white board. In order to address student misconceptions, she asked a variety of questions to probe student thinking and elicit their understanding. For example, she asked, “What did we do to the distance in this problem? What was changed? What’s going on? What are we going to replace that d by?” In this same discussion, she simplified the calculation in student-friendly terms [1:26]: “I’m just replacing the d, the whole thing squared. So when I simplify what is in the bottom, what do, now you just need to figure that out.”
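
A minimal worked version of the substitution being discussed (the exact numbers from the exchange are not shown, so the doubling scenario below is illustrative): the acceleration due to gravity at distance d from a body of mass M is g = GM/d². If the distance doubles, replacing d with 2d gives GM/(2d)² = GM/(4d²), one quarter of the original value; halving the distance gives four times the original value. The instructor’s questions (“What are we going to replace that d by?” and “So when I simplify, what do I get?”) walk students through exactly this kind of substitution and simplification.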

When discussing the tasks described in the project rubric, the instructor described the various types of orbits in many different ways. When one student asked how to approach planetary gravitational force calculations, the instructor referenced a prior class and offered a suggestion by saying, “Here is what I told [other] people that were struggling with this. You can use our solar system as an example” [9:51–10:52].

4.3 Content Accuracy. Rating: 5

Indicator: Teacher written and verbal content information was accurate.

Evidence for 4.3

There were no errors in the instructors’ verbal or written content information.

4.4 Content Assessments. Rating: 5

Indicator: Formal assessments used by teacher (if available) were consistent with content objectives (homework, lab sheets, tests, quizzes, etc.).
*It’s possible that this indicator was not applicable to the observed lesson. You may rate NA in this case.

Evidence for 4.4

During the lesson, one instructor conducted a workshop on the project rubric, and the rubric was used as an assessment throughout the period and throughout the three weeks of the project. The rubric was well designed to evaluate student understanding of concepts central to the project and lesson itself. The rubric also provided challenges through the advanced criteria, portions that student groups could choose to complete for extra points. The instructors circulated the classroom when not conducting workshops to check on student work and awarded credit (“stamps”) for completed portions of the rubric.

The lesson also included a warm-up activity, a problem set reviewing gravitational force calculations; students needed to apply this knowledge to their own solar systems in order to complete this portion of the project rubric. Because student responses to the warm-up questions were submitted electronically, the instructor was able to grade them quickly, provide feedback that helped students learn from their mistakes, and allow them to correct their work.

4.5 Content Abstraction. Rating: 4

Indicator: Elements of mathematical/scientific abstraction were used appropriately (e.g., multiple forms of representation in science and mathematics classes include verbal, graphic, symbolic, visualizations, simulations, models of systems and structures that are not directly observable in real time or by the naked eye, etc.).
*It’s possible that this indicator was not applicable to the observed lesson. You may rate NA in this case.

Evidence for 4.5

Abstraction played a major role in the project overall, as students were designing and graphing planetary orbits in a solar system using mathematical concepts of conic sections. During this lesson, students were calculating and graphing the elliptical orbit of a planet around a star — an orbit that provided a habitable zone on the planet. This portion of the project engaged students in exploring and discussing the details of their graphical representation in the context of their solar system, especially how the orbit of the planet defined the habitable zone.

The challenge students faced when working with numbers of the size required for calculating and graphing elliptical orbits in a tool such as Geometer’s Sketchpad was illustrated [27:18] when the mathematics instructor found that, because a student didn’t convert units properly, the orbit “got really big . . . so big that you don’t even really see your star when you zoom out and see your orbits.” The student did not grasp the magnitude of these orbits and stated that he simply “moved it because it looked wrong.” Although the Sketchpad tool can be helpful for graphing these solar systems, it was apparent that some students still struggled to comprehend the relative sizes of components in their model of the solar system.
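
A rough numerical illustration of the scale problem described here (the student’s actual values and the specific unit error are not shown, so these figures are hypothetical): an Earth-like orbit has a semi-major axis of about 1.5 × 10¹¹ m, while a Sun-like star has a radius of about 7 × 10⁸ m, so the orbit is already roughly 200 times wider than the star. If orbital distances are entered in kilometers while the star is drawn in meters (a factor-of-1,000 mismatch), the plotted orbit becomes on the order of 200,000 star radii across, and the star is effectively invisible at any zoom level that shows the whole orbit.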

4.6 Content Relevance. Rating: 2

Indicator: During the lesson, it was made explicit to students why the content is important to learn.

Evidence for 4.6

While not explicitly discussed, the content was important for students to learn and relevant in the context of completing the project. However, beyond the requirements to accomplish specific tasks in order to design and define their solar systems, the significance of the science and mathematics concepts was not discussed or elaborated on within the context of either discipline.

4.7 Content Interconnections. Rating: 5

Indicator: Appropriate connections were made to other areas of mathematics or science and/or to other disciplines (including non-school contexts).

Evidence for 4.7

The entire project, in fact the entire Phylgebrics course curriculum, was based on integrating learning standards for High School Physics and Algebra II. Although the connections between these two disciplines were not overtly discussed in this video segment, the instructors had prepared, and students continually referred to, a project rubric checklist in which the learning standards, content topics, tasks, and activities for each subject were explicitly described.

4.8 Content Societal Impact. Rating: 2

Indicator: During the lesson, there was discussion about the content topic’s role in history or current events.

Evidence for 4.8

The instructor briefly mentioned the scientific debate and the controversial decision to remove Pluto’s designation as a planet in our own solar system, but there was no significant or extended discussion that made the societal impact of this explicit [10:00–10:23].

Synthesis Rating for Mathematics/Science Content

1: Students learning inaccurate content knowledge.
2: Students learning superficial content knowledge.
3: Students learning adequate content knowledge.
4: Students learning good content knowledge.
5: Students learning deep, fluid content knowledge.

The synthesis rating for Mathematics/Science Content is 3: Students learning adequate content knowledge.