Presenters: Kirk Ross, Educational Consultant, ODE; Larry Early, Associate Director of Assessment, ODE.
Summary: The purpose of the presentation was to increase awareness of the many considerations involved in moving toward a computer-based assessment system, to explain how stakeholders can prepare for the change, and to engage participants in discussion of related topics.
Presenters addressed the following questions.
Why computer-based assessments?
- They are a better fit for where we are going with instruction. New standards have technology built into them. Instruction will drive the assessments, rather than the other way around. Want assessments to be seamless with instruction, with students unaware that they are taking tests.
- Assessments will be oriented to college- and career-readiness skills and based on real-world experiences. They will ask students to show what they know and can do.
What are technology-enhanced items?
- They may be of different types, will ask for a response or action, include media, and be interactive. They may be of various complexities, but must have fidelity, and use an automated or human scoring method.
- Will include various types of items including short answer, multiple choice, longer open-response and performance-based items. Expect to have examples to show students and teachers in advance of their use during pilots.
What are the benefits for Ohio of computer-based assessments?
Ohio should be able to reduce its costs for the statewide assessments, although the state will have to run parallel paper-and-pencil and computer tests, since not all districts have sufficient technology capacity and some families will object to computer assessments. Internet safety will be a consideration: during testing, students will have no access to the Internet for anything other than the assessment.
What steps is the state taking toward the goal of a statewide computer-based assessment system?
- Conducted a needs analysis to create a detailed assessment-design roadmap.
- Piloting assessments.
- Looking at options to reduce costs to make technology available to all districts. Also keeping a long-term view on technology needs.
- Communicating with districts on an ongoing basis. Kirk Ross sends email communications to district technology staff and responds to phone and email questions regularly. Contact him at email@example.com.
What can districts do to prepare for the new assessments?
- Anticipate ongoing change – design and implement a plan for ongoing needs analysis, including system, development, capacity building and communication.
- Increase awareness, perhaps have a school-level team.
- Change behaviors – be ready to embrace technology in learning process, actively get engaged in implementation process and be willing to make necessary changes.
For More Information: Click here to download the presentation.
Presenters: Rachel Vannatta Reinhart, Toni Sondergeld, Center of Assessment and Evaluation Services, Bowling Green State University
Summary: For teachers who will not be evaluated by Ohio’s value-added system, LEAs will still need to have a means of assessing student growth. This presentation was aimed at helping LEAs think about developing and piloting assessments that can be used to create student learning objectives for these teachers.
Districts could use assessments created by vendors or by the LEA itself. LEA-created assessments help with teacher ownership and better alignment of the assessment with what the LEA is actually teaching. But for student growth measures to be effective, they need to be based on detailed, measurable objectives at specified learning levels.
Local measures must be standards-based, use standardized administrative practices, involve teacher contributions, have inter-rater reliability, be externally validated by content experts, use assessment refinement, influence practice, and have three years of trend data available.
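One of the criteria above, inter-rater reliability, can be quantified. The sketch below (percent agreement plus Cohen's kappa, which corrects for chance agreement) uses invented rubric scores from two hypothetical raters; it is an illustration, not part of the presentation:

```python
# Hypothetical rubric scores (scale 1-4) given by two raters
# to the same ten student responses.
rater_a = [3, 2, 4, 3, 1, 2, 4, 3, 2, 3]
rater_b = [3, 2, 4, 2, 1, 2, 4, 3, 3, 3]

n = len(rater_a)

# Percent agreement: fraction of responses scored identically.
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Cohen's kappa: discounts the agreement expected by chance,
# estimated from each rater's marginal score distribution.
categories = set(rater_a) | set(rater_b)
p_chance = sum(
    (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
)
kappa = (agreement - p_chance) / (1 - p_chance)

print(f"agreement={agreement:.2f}, kappa={kappa:.2f}")
```

A high percent agreement with a much lower kappa signals that raters may simply be clustering on the same popular score rather than applying the rubric consistently.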
The challenge is to create assessments that are comprehensive, time-efficient, aligned with instruction and curriculum, reliable and valid, that measure student growth, and that remain objective even when using constructed-response items.
To support quality objectives, assessments can use multiple-choice items. The speakers apply 20 “golden rules” in assessment development; one example is to avoid “all of the above” and “none of the above” as answer choices, since these responses are not useful for diagnostic purposes. The development process also involves following a checklist:
- Make response options plausible and real.
- For constructed response items, align item instructions with grading criteria using a grading rubric.
- Finalize assessment by writing directions, developing standard administration and grading practices.
- Teachers need to review their assessments and also have a peer review. Examine the alignment between the instructional blueprint and the assessment. Is it balanced to reflect areas of instruction in the right proportions, as well as the appropriate cognitive levels?
- Identify and eliminate sources of error by interviewing students.
- Have an expert review.
- Conduct pretest data analysis: administer the assessment, analyze items and generate student reports that can be used to develop student learning objectives.
The desired outcome is valid and reliable assessments.
Presenters: Cindy Mullen and Jen Carey, third-grade teachers with Hartville Elementary School in Lake Local Schools, Stark County.
Summary: Two of the six third-grade teachers at this elementary building described what they are doing together to collaborate on developing lesson plans that are differentiated for students of various abilities, are tied to Common Core and new Ohio standards in multiple disciplines and build student computer skills. They described two month-long units the six teachers have taught for several years. Their success is high: 85 percent of their OAA scores were in the advanced or accelerated levels, despite the fact that a high number of students are from low-income families.
- Teachers throughout Lake Local Schools are required to have 30 minutes of collaboration time each day. The third-grade teachers at Hartville also spent time during the summer to jump start their projects. The teachers believe that no one has time to do it all by themselves, nor can teachers fit in everything if they teach only one lesson at a time. So they work together as a team and prepare instruction that incorporates multiple content areas.
- Teachers design the units by identifying essential questions about what they want students to learn. Then they choose how they will learn information by selecting engaging activities that make real-world connections.
- All students answer the same essential question(s), which are placed in student packets. Teachers differentiate these lessons according to depth of writing and research expectations, which are explained in the low-, medium- and high-level student packets. Teachers also encourage students to help each other, and they do so easily. Hartville classrooms are less about teachers talking in lecture style than about giving students activities where they learn by doing and working together.
- The first project focused on these questions: “What is an oil spill?”; “What is our responsibility to be a part of the solution to this problem?”; and “Why should we be socially responsible in our world?”
- Students read books and websites and viewed videos. They also did an activity in which they created an oil spill in a pie plate, in an environment they built with animals and vegetation, and had to figure out how to contain the spill and clean it up with the various tools provided. Students later wrote about which cleaning method worked best and wrote thank-you notes to those who donated materials for the exercise. This unit integrated social studies, science, reading and math (students made charts).
- Another unit supported new standards stressing research skills. Students answered one of two essential questions: “Why is your animal able to survive in its habitat?” (for middle and struggling readers and writers) or “Why is your biome unique, and how would it affect the planet if it were destroyed?” (for advanced students).
Presenters: Kelli Wohlgamuth, Cincinnati Public Schools; Virginia Ressa, Ohio Department of Education
Summary: Formative Instructional Practices (FIP) combine four things: establishing clear learning targets, collecting evidence of student learning, providing students with effective feedback and preparing students to take ownership of their learning. Research shows that FIP schools make great gains in student learning, partly because students become more aware of and responsible for where their learning stands in comparison to the goals. This session addresses collecting evidence of student learning, particularly designing effective assessments.
First, teachers should consider which assessment method will yield the most accurate information for a given learning target; what they need to do to ensure the evidence they collect can be used formatively; what is necessary to ensure the evidence collected matches what they taught; and what might be ways to document formative evidence.
There are four requirements in the design of an effective assessment:
- Select the assessment methods to match the learning targets you are teaching.
- Use an appropriate sample size.
- Write or select only high quality assessment items, tasks and scoring rubrics.
- Control for bias.
These are covered in detail in the book Classroom Assessment for Student Learning (2012) by Stiggins, Arter, Chappuis and Chappuis.
Also, click here for the session PowerPoint, which contains much deeper detail on these four assessment requirements.
Presenters: Judith Monseur and Brenda Price, Ohio Department of Education
Summary: This session focused on the Resident Educator Summative Assessment that Year 3 Ohio Resident Educators may take. It also outlined provisions for Year 4 Resident Educators.
In Year 3, Resident Educators are inducted into the full life of teaching through reflective practice. They have developed an understanding of teaching and learning and are preparing to demonstrate their professional growth through the Resident Educator Summative Assessment (RESA).
The RESA’s purpose is to assess Resident Educators’ skills and practices, measured by proficiency on the Ohio Standards for the Teaching Profession. The assessment is being developed through Stanford University, Teachscape and the Ohio Department of Education.
For the assessment, Resident Educators will use an online platform to submit lesson plans, instructional materials, analysis of student learning, student work samples, video clips of lessons, and written teacher reflections as evidence of their teaching performance. These will be reviewed and scored by trained assessors who have five or more years of teaching experience and professional development and coaching experience. Assessment planners will analyze field test data being collected this school year to determine the requirements that will be necessary for third-year Resident Educators to pass the RESA.
Resident educators (REs) who take RESA in Year 3 are required to have a licensed educator to help facilitate their preparation. They are not required to have a state-trained mentor to support them, though the person who has been their mentor may also qualify to serve as their facilitator. Required facilitation training will be online beginning in late summer 2013. When selecting a facilitator, districts should look for someone who holds a professional educator license, understands the Resident Educator Program and is already familiar with facilitation and questioning strategies.
If RESA participants are unsuccessful in passing all parts of RESA in 2013-2014, they may retake deficient portions in 2014-2015.
Schools and districts will determine whether their third-year Resident Educators are ready for the assessment. If not, these teachers can wait until Year 4 to take the test. If the district determines a Resident Educator should wait until Year 4 to test, it still must provide specific supports to the Resident Educator in Year 3. The RE must be assigned a trained mentor, and the mentor will follow a Year 3 Best Practices timeline similar to that of Year 2. The RE will complete a Year 3 Formative Progress Review in spring 2014 and begin RESA in fall 2014.
Costs to implement RESA, such as use of the software platform, site licenses, assessors and facilitation training, are paid by ODE.
Resident Educators who successfully pass RESA in Year 3 will be supported as Year 4 REs through learning communities in their schools and districts as they deepen their content knowledge, collaborate with colleagues and accept teacher leadership responsibilities.
More information about professional learning opportunities for Year 4 REs will be provided in fall 2013.
Upon successful completion of RESA and all Year 4 requirements, REs may apply for the 5-year professional license.
Presenters: Gary Barber, Assistant Superintendent; Peggy McMurray, Principal; Angie Pollock, Director of Academic Achievement; Big Walnut Schools; Erin Barr, Intervention Specialist and president of the Big Walnut Education Association
Summary: Big Walnut is a Race to the Top district that is piloting the Ohio Teacher Evaluation System (OTES) this school year with 25 volunteer teachers and will implement OTES during the 2013-2014 school year. The district and the Big Walnut Education Association worked together to transition teachers to the OTES process as smoothly and positively as possible, choosing to view OTES as “a stepping stone” to better practice rather than as a “stumbling block” for schools and teachers. The district’s strategy was to get started early; work collaboratively, both internally and with its ESC; simplify information for teachers; and share newly received information as soon as possible.
The district followed this timeline:
- Opening day professional development for teachers on OTES and Student Learning Objectives
- Agreed with the local teachers’ union to establish an Evaluation Committee containing an equal number of administrators and educators.
- Formed SLO writing teams and called a late-start day so teachers could work on SLOs in their teams. Result: each teacher was able to submit an SLO in October.
- Delivered a condensed, two-day version of ODE’s OTES training for evaluators to the Evaluation Committee.
- Evaluation Committee mapped out teacher professional development, professional growth/improvement plans and class walkthrough guidelines and created a Google document on which principals could record—and teachers could respond to—notes from the class walkthroughs. This facilitated reflective conversations about the walkthroughs between principals and teachers.
- District used a waiver day to allow teachers to work on SLOs tied to the new standards.
- Evaluation Committee developed OTES packets for teachers using ODE and locally developed materials and developed a video and FAQ document for teachers to go over in groups.
- Committed to spend most of the professional development time for this year to OTES-related topics.
- OTES packets distributed to staff by Evaluation Committee members in small meetings. Later, an administrator/teacher team from the committee answered follow-up questions in small groups.
- All questions collected in small groups were answered in writing by Evaluation Committee members and emailed to all district teaching staff.
- All teachers in the district then received 3.5 hours of OTES training.
For More Information: Big Walnut is glad to share all the tools and documents it developed through its pilot programs with other districts. Email firstname.lastname@example.org; erinbarr@email@example.com; firstname.lastname@example.org; or email@example.com; or visit www.bigwalnut.k12.oh.us.
A more detailed discussion of student growth measures can be found here.