_Chapter 2 - The Basics: Base Indicators_
To determine ratings under the standard accountability procedures, the 2006 accountability rating system for Texas public schools and districts uses four base indicators:
- spring 2006 performance on the Texas Assessment of Knowledge and Skills (TAKS),
- spring 2006 performance on the State-Developed Alternative Assessment II (SDAA II),
- the Completion Rate I for the class of 2005, and
- the 2004-05 Annual Dropout Rate for grades 7 and 8.
The TAKS indicator is the percent of students who met the passing standard on the test. It is calculated as the number of students who met the TAKS student passing standard divided by the number tested. Results for the English version of the TAKS (grades 3-11) and the Spanish version (grades 3-6) are summed across grades for each subject. Results for each subject tested are evaluated separately to determine ratings.
Who is evaluated for TAKS: Districts and campuses that test students on any TAKS subject. The subjects and grades tested are:
- Reading/ELA - Reading is tested in grades 3-9; English language arts (ELA) is tested in grades 10 & 11. This is a combined indicator: it includes all students tested on and passing either the TAKS reading test or the TAKS English language arts test. Results from the first two administrations of the grade 3 and grade 5 TAKS reading tests are included. See Reading/ELA Combined and Student Success Initiative in Other Information below.
- Writing - Writing is tested in grades 4 & 7.
- Social Studies - Social Studies is tested in grades 8, 10, & 11.
- Mathematics - Mathematics is tested in grades 3-11. Results from the first two administrations of the grade 5 TAKS mathematics test are included. See Student Success Initiative in Other Information below.
- Science - Science is tested in grades 5, 8, 10, & 11. (Performance on the grade 8 science test will not be used for accountability purposes until 2008.)
Standard: The Academically Acceptable standard varies by subject, while the Recognized and Exemplary standards are the same for all subjects:
- Exemplary - For every subject, at least 90% of the tested students pass the test.
- Recognized - For every subject, at least 70% of the tested students pass the test.
- Academically Acceptable - Varies by subject:
- Reading/ELA - At least 60% of the tested students pass the test.
- Writing - At least 60% of the tested students pass the test.
- Social Studies - At least 60% of the tested students pass the test.
- Mathematics - At least 40% of the tested students pass the test.
- Science - At least 35% of the tested students pass the test.
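The per-subject thresholds above can be summarized as a simple lookup. The following Python sketch is illustrative only and is not part of the official rating procedures; it checks a single subject-level measure against the standards, while actual ratings depend on all base indicators, minimum size rules, and the other provisions in this manual.

```python
# Illustrative sketch of the 2006 TAKS per-subject standards (not official logic).
# Thresholds are the percent of tested students who must pass the subject.

ACCEPTABLE_BY_SUBJECT = {
    "Reading/ELA": 60,
    "Writing": 60,
    "Social Studies": 60,
    "Mathematics": 40,
    "Science": 35,
}

def highest_standard_met(subject: str, percent_passing: int) -> str:
    """Return the highest rating standard that a single TAKS measure satisfies."""
    if percent_passing >= 90:
        return "Exemplary"
    if percent_passing >= 70:
        return "Recognized"
    if percent_passing >= ACCEPTABLE_BY_SUBJECT[subject]:
        return "Academically Acceptable"
    return "Below Academically Acceptable"

# Example: a 42% passing rate in mathematics meets the Academically Acceptable standard.
print(highest_standard_met("Mathematics", 42))
```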
Student Groups: Performance is evaluated for All Students and the following student groups: African American, Hispanic, White, and Economically Disadvantaged.
Methodology:
number of students passing [TAKS subject] ÷ number of students tested in [TAKS subject]

Minimum Size Requirements:
- All Students. These results are always evaluated regardless of the number of examinees. However, districts and campuses with a small number of total students tested on TAKS will receive Special Analysis. See Chapter 6 - Special Issues and Circumstances for more detailed information about Special Analysis.
- Student Groups.
- Any student group with fewer than 30 students tested is not evaluated.
- If there are 30 to 49 students within the student group and the student group comprises at least 10% of All Students, it is evaluated.
- If there are at least 50 students within the student group, it is evaluated.
- Student group size is calculated subject by subject. For this reason the number of student groups evaluated will sometimes vary. For example, an elementary school with grades 3, 4, & 5 tested may have enough Hispanic students to be evaluated on reading and mathematics, but not enough to be evaluated on writing (tested in grade 4 only) or science (tested in grade 5 only).
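The calculation and the minimum size rules above can be illustrated with a short Python sketch. The function and field names below are hypothetical, and the sketch omits Special Analysis and all other special provisions; it simply shows the percent-passing computation and the student-group size test.

```python
import math

# Hypothetical illustration of the TAKS subject calculation and the
# student-group minimum size rules; not the official processing logic.

def percent_passing(num_passing: int, num_tested: int) -> int:
    """Percent of tested students meeting the passing standard, rounded to a whole number (halves round up)."""
    if num_tested == 0:
        return 0
    return math.floor(100.0 * num_passing / num_tested + 0.5)

def group_is_evaluated(group_tested: int, all_tested: int) -> bool:
    """Apply the student-group minimum size rules for a single TAKS subject."""
    if group_tested >= 50:
        return True
    if group_tested >= 30:
        # 30-49 tested: evaluated only if the group is at least 10% of All Students
        # (student-group percents are rounded to whole numbers).
        return math.floor(100.0 * group_tested / all_tested + 0.5) >= 10
    return False

# Example: 45 Hispanic students tested out of 400 All Students tested in mathematics.
print(group_is_evaluated(45, 400))   # True: 45 students is 11% of All Students
print(percent_passing(27, 45))       # 60
```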
Year of Data: 2005-06
Data Source: Pearson Educational Measurement
Other Information:
- TAKS Grade 8 Science. For the first time in 2006, grade 8 students were assessed in TAKS Science. Performance on that assessment will not be incorporated into the accountability system until 2008. See Chapter 17 - Preview of 2007 and Beyond.
- Hurricanes Katrina-Rita Indicator (KRI). The performance of students displaced by Hurricane Katrina and/or Hurricane Rita who tested in Texas school districts in 2005-06 is not included in the indicators used for 2006 accountability ratings. For more information, see Appendix I.
- Student Success Initiative. For grades 3 and 5 reading and grade 5 mathematics performance, a cumulative percent passing is calculated by combining the first and second administrations of the TAKS. The results include performance on the Spanish versions of these tests.
- Special Education. Performance of special education students who take the TAKS is included in the TAKS indicator.
- Testing Window. Results for students given a make-up test within the testing window are included in the accountability indicators.
- Reading/ELA Combined. Reading (grades 3-9) and ELA (grades 10 & 11) results are combined and evaluated as a single subject. This affects districts and campuses that offer both grade 9 and grades 10 and/or 11. In these cases, counts of reading and ELA students who met the standard are summed and divided by the total number taking reading or ELA.
- TAKS Spanish. The TAKS tests are given in Spanish in reading and mathematics for grades 3, 4, 5, and 6; writing in grade 4; and science in grade 5. Performance on these tests is combined with performance on the English-language TAKS for the same subject to determine a rating.
- Student Passing Standards. To determine whether the student counts as a passer, the student must meet the passing standard adopted by the State Board of Education (SBOE) for the current year. Please note the following:
- For 2006, the student passing standard is panel recommendation (PR) for students in all grades and all subjects, except grade 8 science.
- The TAKS grade 8 science passing standard for 2006 is lower while it is phased in. Performance on this test will not be part of the accountability system until 2008.
- Some 11th graders who have repeated a grade may have a passing standard other than PR, depending on which standard was in place when they first entered 10th grade.
- Sum of All Grades Tested. Results for each subject are summed across grades. This refers to the grades tested at the particular campus or district. For example, the percent passing for TAKS reading in an elementary school with a grade span of K-5 is calculated as:
number of students who passed the reading test in grades 3, 4, & 5 ÷ number of students who took the reading test in grades 3, 4, & 5
- Exit-level TAKS. The performance of all juniors tested for the first time during the primary spring administration (ELA in February; mathematics, science, and social studies in April) is included in determining accountability ratings. Some juniors are tested at other times, and may have their performance included, if the conditions described below are met.
- Rising Juniors. In June 2005, a number of students entering 11th grade took the TAKS exit-level tests before beginning their junior year as part of a pilot study for the state assessment program. The performance of these students is included with the performance of other juniors taking the exit-level test if:
- they were entering their junior year in 2005-06;
- they were taking the exit-level TAKS for the first time in June 2005; and
- they passed all four assessments at that time.
Students tested in June who failed a test and then retested during the primary administration will have only the performance on the retested subject included with that of other juniors. Results are subsetted by comparing with October 2004 enrollment.
- October 2005 administration. In October 2005, some juniors eligible for early graduation took the TAKS. The performance of these students is included with the performance of other juniors taking the exit-level test if:
- they were juniors at the time of testing;
- they were taking the exit-level TAKS for the first time in October 2005; and
- they passed all four assessments at that time.
Students tested in October who failed any of the tests could retest in the spring; however, neither the October performance nor the spring retest performance is included in the accountability calculations. Results are not adjusted for mobility.
- December 2005 administration. In December of 2005, some students (juniors and seniors) took an on-line version of the exit-level TAKS. The performance of these students is not included with the performance of other juniors taking the exit-level test.
- Students Tested. Only those answer documents marked "Score" are included; answer documents coded "Absent," "Exempt," or "Other" are excluded. For example, results for limited English proficient students taking a linguistically accommodated TAKS or SDAA II mathematics test are not included in the state accountability system.
- Rounding of Met Standard Percent. The Met Standard calculations are expressed as a percent, rounded to whole numbers, with halves rounding up. For example, 49.877% is rounded to 50%; 79.4999% is rounded to 79%; and 89.5% is rounded to 90%. (A short sketch of this rounding rule follows this list.)
- Rounding of Student Group Percent. The Student Group calculations are expressed as a percent, rounded to whole numbers. For example, 9.5% is rounded to 10%.
- Prior Year TAKS. For purposes of determining Required Improvement and to allow for comparison across years, TAKS performance is also provided for 2005. It will not match last year's reports exactly, due to the different student passing standards used for the exit-level TAKS in 2005 and 2006. To determine whether a student counts as a passer, the student must meet the passing standard adopted by the State Board of Education (SBOE) for the current year. For 2005 the student passing standard was 1 standard error of measurement (SEM) below panel recommendation for students in grade 11. For 2006, the passing standard for grade 11 is panel recommendation. The 2005 performance for grade 11 was recalculated to show how many would have passed at the 2006 standard. This provides an accurate comparison of performance across the two years.
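Because halves round upward in the Met Standard calculation (89.5% becomes 90%), a language's default rounding may not reproduce the published figures; Python's built-in round(), for instance, rounds halves to the nearest even number. The helper below is a minimal sketch of half-up rounding, not code from the official processing system.

```python
import math

def met_standard_percent(num_passing: int, num_tested: int) -> int:
    """Express passers as a whole-number percent of those tested, rounding halves up (e.g., 89.5 -> 90)."""
    pct = 100.0 * num_passing / num_tested
    return math.floor(pct + 0.5)

# Examples matching the rounding illustrations above (hypothetical counts chosen to produce those percents).
print(met_standard_percent(49877, 100000))    # 50  (49.877%)
print(met_standard_percent(794999, 1000000))  # 79  (79.4999%)
print(met_standard_percent(895, 1000))        # 90  (89.5%)
```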
The State-Developed Alternative Assessment II (SDAA II) assesses special education students in grades 3-10 who are receiving instruction in the state's curriculum but for whom the TAKS is not an appropriate measure of academic progress. Tests are given in reading/ELA, writing, and mathematics, on the same schedule as TAKS.
A single performance indicator is evaluated for SDAA II. The indicator sums across grades tested (3-10) and across subjects. This indicator is not based on the number of students tested but on the number of tests taken. It is calculated as the number of tests meeting ARD committee expectations divided by the number of SDAA II tests for which ARD expectations were established. Students who take multiple SDAA II tests are included multiple times (for each and every SDAA II test they take).
Who is evaluated for SDAA II: Districts and campuses that test students on any SDAA II subject.
Standard:
- Exemplary - Results on at least 90% of tests taken meet ARD expectations.
- Recognized - Results on at least 70% of tests taken meet ARD expectations.
- Academically Acceptable - Results on at least 50% of tests taken meet ARD expectations.
Student Groups: Performance for the percent Meeting ARD Expectations is evaluated for All Students only. Student group performance is not evaluated separately.
Methodology:
number of SDAA II tests meeting ARD expectations ÷ number of SDAA II tests taken
Minimum Size Requirements:
- SDAA II performance is evaluated for districts and campuses with results from 30 or more tests (summed across grades and subjects). Depending on grade level, an individual student might be counted as many as three times if he or she takes SDAA II tests in reading, writing, and mathematics. In this case, the minimum size requirement of 30 tests could represent as few as 10 students.
- There is no Special Analysis done on SDAA II performance.
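Because the SDAA II indicator counts tests rather than students, a student who takes several SDAA II tests contributes one record per test. The sketch below is a hypothetical illustration of that tally and of the 30-test minimum; the record layout is an assumption, not the official data format.

```python
import math

# Hypothetical SDAA II results: one entry per test taken, so a student tested in
# reading, writing, and mathematics appears three times.
sdaa_results = [
    {"student": "S1", "subject": "reading",     "met_ard_expectations": True},
    {"student": "S1", "subject": "writing",     "met_ard_expectations": False},
    {"student": "S1", "subject": "mathematics", "met_ard_expectations": True},
    {"student": "S2", "subject": "reading",     "met_ard_expectations": True},
]

tests_taken = len(sdaa_results)
tests_met = sum(1 for result in sdaa_results if result["met_ard_expectations"])

# The indicator is evaluated only when at least 30 tests were taken
# (summed across grades and subjects).
if tests_taken >= 30:
    percent_met = math.floor(100.0 * tests_met / tests_taken + 0.5)
    print(f"{percent_met}% of SDAA II tests met ARD expectations")
else:
    print("Fewer than 30 SDAA II tests: indicator not evaluated")
```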
Year of Data: 2006 (Spring SDAA II Administration)
Data Source: Pearson Educational Measurement
Other Information:
- TAKS-I. For the first time in 2006, students served in special education may take the new Texas Assessment of Knowledge and Skills Inclusive (TAKS-I) in subjects and grades where the SDAA II is not available. TAKS-I performance is not used in determining the accountability ratings in 2006, but will be shown on the AEIS reports released in the fall.
- Students Tested on both SDAA II and TAKS. In some cases, students may take both the SDAA II and TAKS. For example, a grade 6 student may take the TAKS for mathematics, and the SDAA II for reading. In this case, the student's TAKS performance is included with the TAKS indicators and the SDAA II performance is included with the SDAA II indicator.
- Hurricanes Katrina-Rita Indicator (KRI). The performance of students displaced by Hurricane Katrina and/or Hurricane Rita who tested in Texas school districts in 2005-06 is not included in the indicators used for 2006 accountability ratings. For more information, see Appendix I.
- Rounding of Met ARD Expectation Percent. The Met ARD Expectation calculations are expressed as a percent, rounded to whole numbers. For example, 49.877% is rounded to 50%; 79.4999% is rounded to 79%; and 89.5% is rounded to 90%.
Accountability Subset
For the TAKS and SDAA II indicators, only the performance of students enrolled on the PEIMS fall "as-of" date of October 28, 2005, is considered in the ratings. This is referred to as the accountability subset (sometimes also called the October subset or the mobility adjustment). This adjustment is not applied to any other base indicator.
Students who move from district to district are excluded from both the campus's and the district's TAKS and SDAA II results. Students who move from campus to campus within a district are kept in the district's results but are excluded from the results of both campuses. No campus is held accountable for students who move between campuses after the PEIMS "as-of" date and before the date of testing, even if they stay within the same district. The subsets are determined as follows:
Campus-level accountability subset: If a student was reported in membership at one campus on October 28, 2005, but moves to another campus before the TAKS or SDAA II test, that student's performance is removed from the accountability results for both campuses, whether the campuses are in the same district or different districts. Campuses are held accountable only for those students reported to be enrolled in the campus in the fall and tested in the same campus in the second semester.
District-level accountability subset: If a student was in one district on October 28, 2005, but moved to another district before the TAKS or SDAA II test, that student's performance is taken out of the accountability subset for both districts. However, if the student moved from campus to campus within the district, his or her performance is included in that district's results, even though it does not count for either campus. This means that district performance results do not match the sum of the campus performance results.
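At its core, the subset logic compares where a student was enrolled on the fall snapshot date with where the student tested. The sketch below is a simplified reading of the two rules above; the field names are hypothetical, and real processing involves additional cases, such as the Student Success Initiative situations shown in the table that follows.

```python
# Simplified sketch of the accountability subset rules (hypothetical field names).

def in_campus_subset(fall_campus: str, test_campus: str) -> bool:
    """A campus is held accountable only for students enrolled there in the fall and tested there in the spring."""
    return fall_campus == test_campus

def in_district_subset(fall_district: str, test_district: str) -> bool:
    """A district keeps a result whenever the student stayed in the district, even if the campus changed."""
    return fall_district == test_district

# Example 3 from the table below: a student moves from campus Y to campus Z within district A.
print(in_campus_subset("Y", "Z"))     # False: neither campus is held accountable
print(in_district_subset("A", "A"))   # True: district A is still held accountable
```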
Examples of how the accountability subset criteria are applied are provided below. Note that these apply to both SDAA II and TAKS performance results. For more information, see Tables 30, 31, and 32 in Appendix D - Data Sources.
| Student Situation | In Whose Accountability Subset? |
| --- | --- |
| General | |
| 1. Grade 9 student is enrolled at campus A in the fall and tests there on TAKS reading in February and mathematics in April. | This student's results affect the rating of both campus A and the district. |
| 2. Grade 6 student is enrolled in district A in the fall and moves to district B at the semester break. The student is tested on TAKS reading and mathematics in April. | This student's results do not affect the rating of any campus or district. Results are reported to district B. |
| 3. Grade 6 student is enrolled at campus Y (district A) in the fall and then moves to campus Z (district A) at the semester break. The student is tested on TAKS reading and mathematics in April. | This student's results do not affect the rating of campus Y or Z, but they do affect district A. Results for both tests are reported to campus Z. |
| 4. Grade 6 student is reported in enrollment in district A at campus Z, but is withdrawn for home schooling on November 10th. Parents re-enroll the student at the same campus on April 1. The student is tested in TAKS reading and mathematics in April. | Performance on both tests is reported and included in the ratings evaluation for campus Z and district A. Enrollment on the "as-of" date and testing in the same campus and district are the criteria for determining the accountability subset. |
| Mobility between Writing and other tests | |
| 5. Grade 4 student enrolls in campus A in the fall and takes the TAKS writing test there in February. The student then transfers to campus B in the same district and tests on TAKS reading and mathematics in April. | This student's results do not affect the rating of campus A or B. Although writing was assessed at the same campus where the student was enrolled in the fall, the writing results are reported to campus B, where the student tested last. The results affect the district rating. Results for all tests are reported to campus B. |
| 6. Grade 4 student enrolls in campus A in the fall and takes the writing TAKS there in February. The student then transfers to campus B in a different district and tests on TAKS reading and mathematics in April. | This student's results do not affect the rating of either campus or district. Test results are reported to the campus where the student tested last. Results for all tests are reported to campus B. |
| 7. Grade 7 student is reported in enrollment in district A and takes the writing test in that district at campus Y. In March, the student transfers to district B and takes the remaining Grade 7 TAKS tests there. The answer documents submitted by district B use different name spellings than did the one submitted by district A. | To the test contractor these are two different students. Performance on the student's writing test is reported to district A and counts toward its rating and the rating of campus Y. The student's results in reading and mathematics are reported to district B but do not contribute to the rating of either the district or the campus where the student tested because the student was not there in the fall. |
| 8. Grade 7 student is reported in enrollment in district A at campus Z. The student takes the writing test in that district at campus Z in February. In March, the student moves out of state. | Performance on the student's writing test counts toward the rating of district A and the rating of campus Z. |
| Grades 3 and 5 Reading; Grade 5 Mathematics (Student Success Initiative) (See Tables 30 and 31 in Appendix D - Data Sources for further information.) | |
| 9. Grade 3 student takes reading in February at campus A where she was enrolled in the fall, passes the test and moves to campus B (in the same district) where, in April, she takes and fails the mathematics test. | This student's results do not affect the rating of campus A or B. The reading results from the February test are reported to campus A and the mathematics results are reported to campus B. Results from both tests affect the district. |
| 10. Grade 5 student takes reading on February 21st at campus A where he was enrolled in the fall, and fails the test. In March he moves to campus B (in the same district) where he retests in April and passes reading, mathematics, and science. | This student's results do not affect the rating of campus A or B. The February reading results are reported to campus A, while the mathematics, science, and second reading results are reported to campus B. Results from the reading, mathematics, and science tests affect the district. |
| 11. Grade 3 student enrolls in campus A in the fall, but then moves to campus B (in the same district) in December. On February 21st the student takes the reading test there, and passes. In early April the student moves back to campus A, where he takes and passes the mathematics test. | This student's reading results do not affect the rating of campus A or B, but the math results affect the rating of campus A. The reading results from the February test are reported to campus B, and the math results are reported to campus A. Results from both reading and mathematics tests affect the district. |
| 12. Grade 3 student takes TAKS reading in February at the campus where she was enrolled in the fall. She fails the test. In March, the student moves out of state. She does not take TAKS mathematics. | This student's TAKS reading results do not affect the rating for the campus or district. |
| 13. Grade 5 student takes TAKS reading in February at the campus where she was enrolled in the fall, and passes the test. On April 4th she takes the TAKS mathematics test but fails. The following week, the student moves to another district, where she takes TAKS science and retests in mathematics and fails again. | This student's TAKS reading, mathematics, and science results do not affect the rating for any campus or district. |
| 14. Grade 5 student takes TAKS reading in February at the campus where she was enrolled in the fall, and passes the test. On April 4th she takes the TAKS mathematics test but fails. The following week, the student and her family move out of state. She does not take TAKS science or retest in mathematics. | The three subjects are handled differently: Science: She did not test in science at all, so there are no results to attribute. Reading: She did not need to retest in reading; however, the fact that she did not take the science test in mid-April establishes her as mobile, so her reading results are taken out of the accountability subset. Mathematics: There are no results available for her in May, nor are there answer documents for any of the mathematics passers, as there is no other TAKS test given at that time. For this reason, the April performance on mathematics is retained and will affect the rating of this campus and district. |
| Spanish TAKS | |
| 15. A grade 6 student's LPAC committee directs that she be tested in reading on the Spanish TAKS and in mathematics on the English TAKS. She remains at the same campus the entire year. | Performance on both tests is reported and included in the rating evaluation for the campus and district. Results on both English and Spanish versions of the TAKS contribute to the overall passing rate. |
| Both SDAA II and TAKS | |
| 16. The ARD committee for a grade 6 student in special education directs that she be tested in reading on the SDAA II and in mathematics on the TAKS. She remains at the same campus the entire year. | Performance on both tests is reported and included in the rating evaluation for the campus and district. This student's reading results are included with the SDAA II performance, and the mathematics results contribute to the TAKS results. |
| 17. Grade 3 student takes TAKS reading in February and fails the test. Her ARD committee decides she should take the SDAA II reading in April, on which she meets ARD expectations. She also takes TAKS mathematics and passes. She remains at the same campus the entire year. | This student's TAKS reading (failure) and mathematics (passing) results will affect the TAKS performance for the campus and the district. The SDAA II reading results (passing) will affect the SDAA II indicator for the campus and district. |
Completion Rate I is a longitudinal rate that shows the percent of students who first attended ninth grade in the 2001-02 school year and who have completed high school or are continuing their education four years later. Known as the 2001-02 cohort, these students' progress was tracked over the four years using data provided to TEA by districts and data available in the statewide General Educational Development (GED) database.
The definition of a "completer" has changed for the 2006 accountability year. Beginning this year, Completion Rate I is used for accountability purposes under the standard procedures. Under this definition, students who attain a GED certificate are no longer considered completers; only students who received a high school diploma with their class (or earlier) and students who re-enrolled in the fall of 2005 count as completers. Note that Completion Rate II remains in use under the Alternative Education Accountability (AEA) procedures. See Part 2 of this manual for more information on AEA procedures.
Who is evaluated for Completion Rate I:
- Districts and campuses that serve grades 9, 10, 11, and/or 12.
- Use of District Rate. A completion rate is evaluated for any campus that served students in grades 9, 10, 11, and/or 12 in the fall of the 2004-05 school year. However, a completion rate is calculated only for campuses or districts that have offered grades 9 through 12 since 2001-02. When a campus serves only some of those grades (for example, a senior high school that serves only grades 11 and 12), the district's completion rate is attributed to that campus because the campus does not have a completion rate of its own. Campuses that have been in existence for fewer than five years are also evaluated using their district's completion rate.
Standard:
- Exemplary - Completion Rate I of 95.0% or more.
- Recognized - Completion Rate I of 85.0% or more.
- Academically Acceptable - Completion Rate I of 75.0% or more.
Student Groups: Performance is evaluated for All Students and the following student groups: African American, Hispanic, White, and Economically Disadvantaged.
Methodology:
(number of graduates from the class of 2005 + number of students from the class continuing in high school in fall 2005) ÷ number in class*
*See Appendix D for the definition of number in class.
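A minimal sketch of the calculation, assuming simple cohort counts (actual cohort construction, transfer adjustments, and leaver coding are more involved; see Appendix D):

```python
import math

def completion_rate_i(graduates: int, continuers: int, number_in_class: int) -> float:
    """Completion Rate I: graduates plus continuing students as a percent of the class, rounded to one decimal place (this sketch rounds halves up)."""
    rate = 100.0 * (graduates + continuers) / number_in_class
    return math.floor(rate * 10 + 0.5) / 10

# Example with hypothetical counts: 180 graduates and 12 continuers in a class of 220.
print(completion_rate_i(180, 12, 220))   # 87.3
```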
Minimum Size Requirements:
- All Students. These results are evaluated if:
- there are at least 10 students in the class and
- there are at least 5 dropouts.
- Student Groups. These results are evaluated if there are at least 5 dropouts within the student group and:
- there are 30 to 49 students within the student group and the student group comprises at least 10% of All Students; or
- there are at least 50 students within the student group.
Years of Data: Continued enrollment in 2005-06, graduating class of 2005, grade 11 of 2003-04, grade 10 of 2002-03, grade 9 of 2001-02. (Results are based on the original cohort, whether the students remain on grade level or not.)
Data Source: PEIMS submission 1 enrollment data, 2001-02 through 2005-06; PEIMS submission 1 leaver data, 2002-03 through 2005-06; PEIMS submission 3 attendance data, 2001-02 through 2004-05; and General Educational Development records as of March 1, 2006.
Other Information:
- Transfers. Any student who transfers into the cohort is added to it, and any student who transfers out of the cohort is subtracted from it.
- Rounding. All calculations are expressed as a percent, rounded to one decimal place. For example, 74.875% is rounded to 74.9%, not 75%. However, student group percents (used for the minimum size requirements) are always rounded to whole numbers.
- Special Education. The completion status of special education students is included in this measure.
For accountability purposes, the annual dropout rate is used to evaluate campuses and districts with students in grades 7 and/or 8. This is a one-year measure: the number of dropouts is summed across the two grades and divided by the number of students in attendance across the two grades.
Who is evaluated for Annual Dropout Rate: Districts and campuses that serve students in grades 7 and/or 8.
Standard:
- Exemplary - An Annual Dropout Rate of 0.2% or less.
- Recognized - An Annual Dropout Rate of 0.7% or less.
- Academically Acceptable - An Annual Dropout Rate of 1.0% or less.
Student Groups: Performance is evaluated for All Students and the following student groups: African American, Hispanic, White, and Economically Disadvantaged.
Methodology:
number of official dropouts in grades 7-8 ÷ number of grade 7-8 students who were in attendance at any time during the school year
Minimum Size Requirements:
- All Students. These results are evaluated if:
- there are at least 10 students in grades 7-8 and
- there are at least 5 dropouts.
- Student Groups. These results are evaluated if there are at least 5 dropouts within the student group and:
- there are 30 to 49 students within the student group and the student group comprises at least 10% of All Students; or
- there are at least 50 students within the student group.
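Under simplified assumptions (hypothetical counts; dropout identification and exclusions happen upstream, as described under Official Dropouts below), the rate calculation and the All Students evaluation criteria look roughly like this:

```python
import math

def annual_dropout_rate(dropouts: int, students_in_attendance: int) -> float:
    """Grade 7-8 annual dropout rate as a percent, rounded to one decimal place (this sketch rounds halves up)."""
    rate = 100.0 * dropouts / students_in_attendance
    return math.floor(rate * 10 + 0.5) / 10

def all_students_evaluated(dropouts: int, students_in_attendance: int) -> bool:
    """All Students results are evaluated only with at least 10 students and at least 5 dropouts."""
    return students_in_attendance >= 10 and dropouts >= 5

# Example with hypothetical counts: 6 dropouts among 850 grade 7-8 students.
print(annual_dropout_rate(6, 850))      # 0.7
print(all_students_evaluated(6, 850))   # True
```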
Year of Data: 2004-05
Data Source: PEIMS submission 1 enrollment data 2004-05; PEIMS submission 1 leaver data, 2005-06; PEIMS submission 3 attendance data, 2004-05.
Other Information:
- Official Dropouts. "Official" dropouts are reported dropouts who are not excluded by TEA's automated check. See Appendix D - Data Sources for more information.
- Cumulative Attendance. A cumulative count of students is used in the denominator. This method for calculating the dropout rate neutralizes the effects of mobility by including in the denominator every student ever reported in attendance at the campus or district throughout the school year, regardless of length of stay.
- Rounding. All calculations are expressed as a percent, rounded to one decimal place. For example, 2.49% is rounded to 2.5%, and 0.25% is rounded to 0.3%. However, student group percents (used for the minimum size requirements) are always rounded to whole numbers.
- Special Education. Dropouts served in special education are included in this measure.