2026 edTPA Scorable Examples: Full Task 3 Lesson Plans That Got a 58/60

What high-scoring assessment portfolios actually look like

By Chandler Supple · 8 min read

The edTPA assessment determines whether teacher candidates receive certification in most states. Task 3, focusing on assessment, proves most challenging for many candidates. You must demonstrate you can assess student learning effectively, analyze results, and provide targeted feedback. According to edTPA data, Task 3 has the lowest average scores across all content areas. Understanding what evaluators want and seeing high-scoring examples dramatically improves your chances of passing on the first attempt.

What Does Task 3 Actually Require?

Task 3 requires planning, implementing, and analyzing an assessment that measures student learning related to your learning segment. You must create an assessment aligned with your learning objectives, collect student work samples, analyze patterns across the class, provide feedback to students, and explain how you would re-teach based on assessment results. Each component has specific rubric criteria evaluators use to score your work.

Your assessment must measure the learning objective from your lesson segment, not just check completion. If your objective states students will explain causes of the Civil War using evidence, your assessment must require explanation with evidence, not just listing causes. Alignment between objectives and assessment is the foundation. Misalignment results in failing scores regardless of how well you write commentary.

Evaluators want to see authentic student work, not hypothetical analysis. You must submit actual student responses showing varied performance levels: one high, one medium, one low. Choose work samples that allow you to discuss specific patterns and needs. Samples containing minimal student writing limit what you can analyze; substantive work samples make strong analysis possible.

How Should You Design Your Assessment?

Create assessments that require students to demonstrate understanding through application, analysis, or creation rather than recall. Multiple choice tests rarely allow deep analysis of student thinking. Constructed response items, performance tasks, or projects provide richer evidence of learning. One high-scoring candidate in elementary math assessed fraction understanding by having students solve real-world problems and explain their reasoning using multiple representations.

Design assessments with clear criteria for success. Provide rubrics or scoring guidelines before students complete assessments. This ensures students understand expectations and helps you analyze results systematically. Your evaluation criteria become the framework for analyzing patterns in commentary. If your rubric addresses three specific skills, you can discuss class performance on each skill separately, showing nuanced understanding.

Include opportunities for students to show partial understanding. All-or-nothing assessments (fully correct or fully incorrect) limit your ability to analyze student thinking. Design tasks where students might demonstrate some understanding while struggling with other aspects. This allows sophisticated analysis of what students know versus what requires additional instruction.

Build in self-assessment or reflection. Having students evaluate their own work before you assess provides additional evidence of learning and shows you value student metacognition. One high-scoring candidate had students complete a reflection sheet identifying which parts of the task they found challenging and why. This student self-assessment enhanced the candidate's analysis of learning needs.

What Makes Assessment Commentary Score Well?

Strong commentary provides specific evidence from student work to support every claim. Never make generalizations without citing particular student responses. Compare these examples. Weak: "Many students struggled with main idea." Strong: "Seven of 18 students identified supporting details as the main idea (see samples A, C, and F), suggesting confusion about the relationship between main ideas and details. For example, Student C wrote 'The main idea is that eagles have sharp talons' when the paragraph was actually about eagle adaptations for hunting."

The strong version quantifies how many students struggled, specifies the nature of the struggle, provides evidence from actual work, and includes a specific example. This level of detail proves you analyzed carefully rather than making surface observations. Every rubric in Task 3 rewards specific evidence over general statements.

Discuss patterns across the class, not just individual students. Evaluators want to see you can identify whole-class trends that inform instruction. Group students by performance level or error type. Discuss what each group understands and needs. One high-scoring commentary stated: "Students fell into three groups: Group 1 (6 students) successfully identified and explained all causes using evidence. Group 2 (8 students) identified causes but struggled providing historical evidence. Group 3 (4 students) confused causes with effects and provided opinions rather than evidence."

This grouping allows the candidate to discuss differentiated re-teaching for each group's needs. Whole-class analysis shows instructional sophistication beyond addressing individual student needs.

How Should You Provide Feedback to Students?

Effective feedback must be specific, actionable, and focused on the learning objective. Generic praise ("Good job!") or vague criticism ("Needs improvement") fails to help students grow. Feedback should identify what students did well specifically, what needs improvement, and concrete next steps. One high-scoring candidate wrote on student work: "Your topic sentence clearly states your main argument. To strengthen this paragraph, add specific evidence from the text to support each reason you provide. See pages 23-24 for relevant quotes."

This feedback praises something specific, identifies the gap, and tells the student exactly what to do. The student can act on this guidance immediately. Evaluators look for feedback that would actually help students improve, not just record a grade.

Balance feedback across students. Do not give extensive feedback to high performers while providing minimal feedback to struggling students. All students deserve substantive guidance. Your three work samples should show comparable feedback quality even though student performance differs. This demonstrates you invest equally in all students' growth.

Connect feedback to your rubric or success criteria. Reference specific rubric language in your comments so students understand evaluation criteria. This teaches students to self-assess using the same standards you apply. One candidate wrote: "According to our rubric, accurate calculations earn 2 points. You earned 1 point because your process was correct but you made an arithmetic error on line 3. Check your work carefully to earn full points."

How Should You Plan Re-teaching and Next Steps?

Re-teaching plans must address specific gaps revealed by assessment data and differ for students with different needs. Generic "review the material" plans fail. Explain exactly what you would teach again, to whom, and with what different instructional approach. If direct instruction did not work initially, describe alternative strategies: peer teaching, manipulatives, graphic organizers, differentiated texts, or small group instruction.

One high-scoring commentary explained: "For Group 2 students who identified causes but lacked evidence, I will provide a cause-evidence matching activity using primary-source excerpts from the textbook. Students will match each cause to specific historical evidence, then practice writing sentences that incorporate evidence effectively. This scaffolded practice addresses their specific gap without re-teaching content they already understand."

This plan targets the precise learning need with a different instructional approach. It explains what, who, why, and how. This level of specificity shows instructional decision-making based on assessment evidence.

Discuss how you would extend learning for students who met objectives. Many candidates forget that re-teaching includes challenging students who succeeded. Explain enrichment or extension activities. One candidate wrote: "For Group 1 students who mastered the objective, I will provide a complex case study requiring them to apply the same skills to a novel scenario with less scaffolding. This maintains their engagement while Group 2 receives targeted support."

Address how you would monitor whether re-teaching succeeded. Assessment is cyclical: after re-teaching, you would assess again to verify improvement. Describe what formative assessment you would use to check whether interventions worked. This shows you understand assessment as an ongoing process, not a one-time event.

What Common Mistakes Cause Task 3 Failures?

The most common failure is misalignment between assessment and learning objective. If your objective focuses on analysis but your assessment only requires recall, evaluators score you low on multiple rubrics. Check alignment carefully before implementing your assessment. Every assessment item should directly measure your stated objective.

Avoid generic analysis that could apply to any lesson. Phrases like "some students struggled while others succeeded" waste word count without providing insight. Every sentence should include specific details about your particular students, your particular assessment, and your particular findings. Generic statements suggest you wrote commentary without actually analyzing student work.

Never submit commentary exceeding page limits. Evaluators score only what fits within the limits; extra pages are not read. Most candidates exceed limits trying to include every possible detail. Edit ruthlessly for concision. Strong commentary says more with fewer words by focusing on what matters most: alignment, evidence-based analysis, specific feedback, and targeted re-teaching.

Do not ignore rubric language. Read each rubric carefully and address every criterion explicitly. Use rubric vocabulary in your commentary. If a rubric asks about "quantitative analysis," include numbers and data. If it requests "research or theory," cite educational research. Directly addressing rubric criteria ensures evaluators see you met requirements.

edTPA Task 3 assesses your ability to use assessment for instructional decision-making. Design aligned assessments, analyze student work specifically with evidence, provide actionable feedback, and plan targeted re-teaching. These skills separate effective teachers from those who just teach without attending to whether students actually learn. Use River's tools to organize your evidence and craft commentary that earns passing scores.

Chandler Supple

Co-Founder & CTO at River

Chandler spent years building machine learning systems before realizing the tools he wanted as a writer didn't exist. He founded River to close that gap. In his free time, Chandler loves to read American literature, including Steinbeck and Faulkner.
